3D-QSAR analysis of MCD inhibitors by CoMFA and CoMSIA.
Pourbasheer, Eslam; Aalizadeh, Reza; Ebadi, Amin; Ganjali, Mohammad Reza
2015-01-01
Three-dimensional quantitative structure-activity relationship (3D-QSAR) models were developed for a series of compounds acting as malonyl-CoA decarboxylase (MCD) inhibitors using the CoMFA and CoMSIA methods. The statistical parameters for the CoMFA (q² = 0.558, r² = 0.841) and CoMSIA (q² = 0.615, r² = 0.870) models were derived from a training set of 38 compounds on the basis of the selected alignment. The external predictive abilities of the models were evaluated using a test set of nine compounds. The CoMSIA model was found to be more predictive than the CoMFA model. Based on the CoMFA and CoMSIA contour maps, features that can enhance the activity of compounds as MCD inhibitors were identified and used to design new compounds with better inhibitory activity.
QSAR studies on triazole derivatives as SGLT inhibitors via CoMFA and CoMSIA
NASA Astrophysics Data System (ADS)
Zhi, Hui; Zheng, Junxia; Chang, Yiqun; Li, Qingguo; Liao, Guochao; Wang, Qi; Sun, Pinghua
2015-10-01
Forty-six sodium-dependent glucose cotransporter-2 (SGLT-2) inhibitors with hypoglycemic activity were selected to develop three-dimensional quantitative structure-activity relationship (3D-QSAR) models using comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA). A training set of 39 compounds was used to build the models, which were then evaluated by a series of internal and external cross-validation techniques; a test set of 7 compounds was used for external validation. The CoMFA model gave a q² value of 0.792 and an r² value of 0.985. The best CoMSIA model gave a q² value of 0.633 and an r² value of 0.895 based on a combination of steric, electrostatic, hydrophobic, and hydrogen-bond acceptor fields. The predictive correlation coefficients (r²pred) of the CoMFA and CoMSIA models were 0.872 and 0.839, respectively. Analysis of the contour maps from each model provided insight into the structural requirements for more active SGLT inhibitors, and on the basis of the models 8 new SGLT inhibitors were designed and their activities predicted.
Gupte, Amol; Buolamwini, John K
2009-01-15
3D-QSAR (CoMFA and CoMSIA) studies were performed on human equilibrative nucleoside transporter 1 (hENT1) inhibitors displaying Ki values ranging from 10,000 to 0.7 nM. Both CoMFA and CoMSIA analyses gave reliable models with q² values > 0.50 and r² values > 0.92. The models were validated for stability and robustness using group validation and bootstrapping techniques, and for predictive ability using an external test set of nine compounds. The high predictive r² values for the test set (0.72 for the CoMFA model and 0.74 for the CoMSIA model) indicate that the models can serve as a useful tool for activity prediction of newly designed nucleoside transporter inhibitors. The CoMFA and CoMSIA contour maps identify features important for good binding affinity at the transporter and can thus guide the design of potential equilibrative nucleoside transporter inhibitors.
Sivan, Sree Kanth; Manga, Vijjulatha
2012-02-01
Multiple receptor conformation docking (MRCD) and clustering of docked poses allow seamless incorporation of receptor-bound conformations for a wide range of ligands with varied structural scaffolds. The accuracy of the approach was tested on a set of 120 cyclic urea molecules with HIV-1 protease inhibitory activity, using 12 high-resolution X-ray crystal structures and one NMR-resolved conformation of HIV-1 protease extracted from the Protein Data Bank. A cross-validation was performed on 25 structurally varied non-cyclic urea HIV-1 protease inhibitors. Comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA) models were generated from 60 training-set molecules using the leave-one-out cross-validation method, giving q²loo values of 0.598 and 0.674 and non-cross-validated correlation coefficients (r²) of 0.983 and 0.985 for CoMFA and CoMSIA, respectively. The predictive ability of these models was determined using a test set of 60 cyclic urea molecules, which gave predictive correlation coefficients (r²pred) of 0.684 and 0.64 for CoMFA and CoMSIA, respectively, indicating good internal predictive ability. The 25 non-cyclic urea molecules were then taken as a test set to check the external predictive ability of the models, giving a remarkable outcome with r²pred of 0.61 and 0.53 for CoMFA and CoMSIA, respectively. The results clearly show that this method is useful for performing 3D-QSAR analysis on molecules with different structural motifs.
3D-QSAR modeling and molecular docking studies on a series of 2,5-disubstituted 1,3,4-oxadiazoles
NASA Astrophysics Data System (ADS)
Ghaleb, Adib; Aouidate, Adnane; Ghamali, Mounir; Sbai, Abdelouahid; Bouachrine, Mohammed; Lakhlifi, Tahar
2017-10-01
3D-QSAR studies (comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA)) were performed on novel 2,5-disubstituted 1,3,4-oxadiazole analogues as antifungal agents. The CoMFA and CoMSIA models, built from 13 training-set compounds, give q² values of 0.52 and 0.51, respectively, and r² values of 0.92 for both. The adapted alignment method with suitable parameters resulted in reliable models. The contour maps produced by the CoMFA and CoMSIA models were employed to derive a three-dimensional quantitative structure-activity relationship. Based on this study, a set of new molecules with high predicted activities was designed, and Surflex docking confirmed the stability of the designed molecules in the receptor.
Ul-Haq, Zaheer; Ashraf, Sajda; Al-Majid, Abdullah Mohammed; Barakat, Assem
2016-04-30
Urease (EC 3.5.1.5) has been identified as a virulence factor in pathogenic microorganisms responsible for the development of different diseases in humans and animals. In continuation of our earlier study on Helicobacter pylori urease inhibition by barbituric acid derivatives, advanced 3D-QSAR (three-dimensional quantitative structure-activity relationship) studies were performed with the Comparative Molecular Field Analysis (CoMFA) and Comparative Molecular Similarity Indices Analysis (CoMSIA) methods. Different partial charges were calculated to examine their effects on the predictive ability of the developed models. The best CoMFA and CoMSIA models were achieved using MMFF94 charges. The developed CoMFA model gives significant results, with a cross-validated correlation coefficient (q²) of 0.597 and a correlation coefficient (r²) of 0.897. Moreover, five fields, i.e., steric, electrostatic, hydrophobic, H-bond acceptor, and H-bond donor, were used to produce a CoMSIA model with q² and r² of 0.602 and 0.98, respectively. The generated models were further validated using an external test set; both display good predictive power with r²pred ≥ 0.8. Analysis of the CoMFA and CoMSIA contour maps provided detailed insight for promising modifications of the barbituric acid derivatives toward enhanced biological activity.
CoMSIA and Docking Study of Rhenium Based Estrogen Receptor Ligand Analogs
Wolohan, Peter; Reichert, David E.
2007-01-01
OPLS all-atom force field parameters were developed in order to model a diverse set of novel rhenium-based estrogen receptor ligands whose relative binding affinities (RBA) to the estrogen receptor alpha isoform (ERα), with respect to 17β-estradiol, were available. The binding properties of these novel rhenium-based organometallic complexes were studied with a combination of Comparative Molecular Similarity Indices Analysis (CoMSIA) and docking. A total of 29 estrogen receptor ligands, consisting of 11 rhenium complexes and 18 organic ligands, were docked inside the ligand-binding domain (LBD) of ERα using the program GOLD. The top-ranked pose was used to construct CoMSIA models from a training set of 22 of the estrogen receptor ligands, selected at random. In addition, scoring functions from the docking runs and the polar volume (PV) were studied to investigate their ability to predict RBA for ERα. A partial least-squares analysis combining the CoMSIA steric, electrostatic, and hydrophobic indices with the polar volume proved sufficiently predictive, with a correlation coefficient, r², of 0.94 and a leave-one-out cross-validated correlation coefficient, q², of 0.68. Analysis of the scoring functions from GOLD showed particularly poor correlation with RBA for ERα, which did not improve when the rhenium complexes were removed to leave only the organic ligands. The combined CoMSIA and polar volume model correctly ranked the ligands in order of increasing RBA for ERα, illustrating the utility of this method as a prescreening tool in the development of novel rhenium-based estrogen receptor ligands. PMID:17280694
Nayana, M Ravi Shashi; Sekhar, Y Nataraja; Nandyala, Haritha; Muttineni, Ravikumar; Bairy, Santosh Kumar; Singh, Kriti; Mahmood, S K
2008-10-01
In the present study, a series of 179 quinoline and quinazoline heterocyclic analogues exhibiting inhibitory activity against gastric (H+/K+)-ATPase was investigated using the comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA) methods. Both models exhibited good correlation between the calculated 3D-QSAR fields and the observed biological activities of the respective training-set compounds. The optimal CoMFA and CoMSIA models yielded significant leave-one-out cross-validation coefficients, q², of 0.777 and 0.744, and conventional correlation coefficients, r², of 0.927 and 0.914, respectively. The predictive ability of the generated models was tested on a set of 52 compounds spanning a broad activity range; CoMFA and CoMSIA yielded predicted activities for the test-set compounds with r²pred of 0.893 and 0.917, respectively. These validation tests not only revealed the robustness of the models but also demonstrated that, for our models, r²pred based on the mean activity of the test-set compounds can accurately estimate external predictivity. The factors affecting activity were analyzed carefully according to the standard-coefficient contour maps of the steric, electrostatic, hydrophobic, acceptor, and donor fields derived from CoMFA and CoMSIA. These contour plots identified several key features that explain the wide range of activities. The results offer important structural insight for designing novel inhibitors for peptic-ulcer treatment prior to their synthesis.
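The remark above about r²pred computed against a mean activity points at a definitional subtlety: r²pred = 1 − PRESS/SD, where SD is the sum of squared deviations of the test-set activities from a reference mean, taken in most 3D-QSAR reports as the training-set mean but, as here, sometimes as the test-set mean. A small illustrative sketch follows; the pIC50 numbers are toy values invented for the example, not data from any of the papers.

```python
import numpy as np

def r2_pred(y_test, y_pred, ref_mean):
    """Predictive r^2 = 1 - PRESS / SD, with SD taken about ref_mean.

    ref_mean may be the training-set mean activity (the common choice)
    or the test-set mean activity (the variant discussed above).
    """
    y_test = np.asarray(y_test, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    press = float(((y_test - y_pred) ** 2).sum())  # prediction error sum of squares
    sd = float(((y_test - ref_mean) ** 2).sum())   # spread about the reference mean
    return 1.0 - press / sd

# Toy pIC50 values (hypothetical, for illustration only).
y_train_mean = 6.0
y_test = [5.2, 6.8, 7.4, 6.1]
y_pred = [5.4, 6.5, 7.1, 6.3]
print(r2_pred(y_test, y_pred, y_train_mean))            # training-mean variant
print(r2_pred(y_test, y_pred, float(np.mean(y_test))))  # test-mean variant
```

The two variants differ whenever the test-set mean is offset from the training-set mean, which is why reports should state which reference mean they use.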
Sethi, Kalyan K; Verma, Saurabh M
2014-08-01
Drug design involves the design of small molecules that are complementary in shape and charge to the biomolecular target with which they interact and will therefore bind to it. Three-dimensional quantitative structure-activity relationship (3D-QSAR) studies were performed for a series of carbonic anhydrase IX inhibitors using comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA) techniques with the SYBYL 7.1 software. A set of 36 different aromatic/heterocyclic sulfamate inhibitors of carbonic anhydrase (CA, EC 4.2.1.1), specifically hCA IX, was chosen for this study. Conventional ligand-based 3D-QSAR studies were performed on the low-energy conformations, employing a database alignment rule. The ligand-based models gave q² values of 0.802 and 0.829 and r² values of 1.000 and 0.994 for CoMFA and CoMSIA, respectively, and their predictive ability was validated: the predictive r² values were 0.999 and 0.502 for CoMFA and CoMSIA, respectively. The steric, electrostatic, and hydrogen-bond acceptor (SEA) fields made the most significant contribution to the CoMSIA model. Docking of the inhibitors into the hCA IX active site using the Glide XP (Schrödinger) software revealed the vital interactions and binding conformations of the inhibitors. The CoMFA and CoMSIA field contour maps agree well with the structural characteristics of the binding pocket of the hCA IX active site, suggesting that the information rendered by the 3D-QSAR models and the docking interactions can guide the development of improved hCA IX inhibitors as leads for various metastatic cancers, including those of cervical, renal, breast, and head and neck origin.
3D-QSAR and docking studies of 3-Pyridine heterocyclic derivatives as potent PI3K/mTOR inhibitors
NASA Astrophysics Data System (ADS)
Yang, Wenjuan; Shu, Mao; Wang, Yuanqiang; Wang, Rui; Hu, Yong; Meng, Lingxin; Lin, Zhihua
2013-12-01
Phosphoinositide 3-kinase/mammalian target of rapamycin (PI3K/mTOR) dual inhibitors have attracted a great deal of interest in antitumor drug research. In order to design and optimize these dual inhibitors, two types of 3D-quantitative structure-activity relationship (3D-QSAR) studies, based on ligand alignment and on receptor alignment, were carried out using comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA). In the study based on ligand alignment, the models for PI3K (CoMFA: r² = 0.770, q² = 0.622; CoMSIA: r² = 0.945, q² = 0.748) and mTOR (CoMFA: r² = 0.850, q² = 0.654; CoMSIA: r² = 0.983, q² = 0.676) showed good predictability. In the study based on receptor alignment, the models for PI3K (CoMFA: r² = 0.745, q² = 0.538; CoMSIA: r² = 0.938, q² = 0.630) and mTOR (CoMFA: r² = 0.977, q² = 0.825; CoMSIA: r² = 0.985, q² = 0.728) also showed good predictability. The 3D contour maps and docking results suggested that different groups on the core parts of the compounds could enhance the biological activities. Finally, ten derivatives with good predicted activities were designed as potential PI3K/mTOR dual inhibitors.
Ai, Yong; Wang, Shao-Teng; Sun, Ping-Hua; Song, Fa-Jun
2010-01-01
CDK2/cyclin A has emerged over the years as an attractive drug target with diverse therapeutic potential. A computational strategy based on comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA), followed by molecular docking studies, was applied to a series of 4,5-dihydro-1H-pyrazolo[4,3-h]quinazoline derivatives as potent CDK2/cyclin A inhibitors. The CoMFA and CoMSIA models, built from 38 training-set molecules, gave r²cv values of 0.747 and 0.518 and r² values of 0.970 and 0.934, respectively. 3D contour maps generated by the CoMFA and CoMSIA models were used to identify the key structural requirements responsible for the biological activity. Molecular docking was applied to explore the binding mode between the ligands and the receptor. The information obtained from these molecular modeling studies may help in the design of novel CDK2/cyclin A inhibitors with the desired activity. PMID:21152296
Xie, Huiding; Chen, Lijun; Zhang, Jianqiang; Xie, Xiaoguang; Qiu, Kaixiong; Fu, Jijun
2015-01-01
B-Raf kinase is an important target in the treatment of cancer. In order to design and find potent B-Raf inhibitors (BRIs), 3D pharmacophore models were created using the Genetic Algorithm with Linear Assignment of Hypermolecular Alignment of Database (GALAHAD). The best pharmacophore model, which was used for effective alignment of the data set, contains two acceptor atoms, three donor atoms, and three hydrophobes. Subsequently, comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA) were performed on 39 imidazopyridine BRIs to build three-dimensional quantitative structure-activity relationship (3D-QSAR) models based on both pharmacophore and docking alignments. The CoMSIA model based on the pharmacophore alignment shows the best result (q² = 0.621, r²pred = 0.885). This 3D-QSAR approach provides significant insights useful for designing potent BRIs. In addition, the best pharmacophore model was used for virtual screening against the NCI2000 database; the hit compounds were further filtered by molecular docking, their biological activities were predicted using the CoMSIA model, and three potential BRIs with new scaffolds were obtained. PMID:26035757
3D-QSAR and molecular docking studies on HIV protease inhibitors
NASA Astrophysics Data System (ADS)
Tong, Jianbo; Wu, Yingji; Bai, Min; Zhan, Pei
2017-02-01
To better understand the chemical-biological interactions governing HIV protease inhibition, QSAR models of 34 cyclic-urea derivatives with inhibitory activity against HIV protease were developed. The quantitative structure-activity relationship (QSAR) model was built using the comparative molecular similarity indices analysis (CoMSIA) technique. The best CoMSIA model has cross-validated and non-cross-validated correlation coefficients (r²cv and r²ncv) of 0.586 and 0.931, respectively. The predictive ability of the CoMSIA model was further validated with a test set of 7 compounds, giving an r²pred value of 0.973. Docking studies were used to find the actual conformations of the chemicals in the active site of HIV protease, as well as their binding mode to the binding site of the enzyme. The information provided by the 3D-QSAR model and molecular docking may lead to a better understanding of the structural requirements of these cyclic-urea derivatives and help in the design of potential anti-HIV protease molecules.
Vijayaraj, Ramadoss; Devi, Mekapothula Lakshmi Vasavi; Subramanian, Venkatesan; Chattaraj, Pratim Kumar
2012-06-01
A three-dimensional quantitative structure-activity relationship (3D-QSAR) study has been carried out on 2,4-diamino-5-(substituted-benzyl)pyrimidine derivatives, inhibitors of Escherichia coli DHFR, to understand the structural features responsible for improved potency. To construct highly predictive 3D-QSAR models, comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA) methods were used. The derived models show statistically significant cross-validated (r²cv) and non-cross-validated (r²ncv) correlation coefficients. The final 3D-QSAR models were validated using structurally diverse test-set compounds. Analysis of the contour maps generated from the CoMFA and CoMSIA methods reveals that substitution of electronegative groups at the first and second positions, along with an electropositive group at the third position, of the R2 substituent significantly increases the potency of the derivatives. The results obtained from the CoMFA and CoMSIA study delineate the substituents on the trimethoprim analogues responsible for the enhanced potency and also provide valuable directions for the design of new trimethoprim analogues with improved affinity.
CoMFA and CoMSIA studies on C-aryl glucoside SGLT2 inhibitors as potential anti-diabetic agents.
Vyas, V K; Bhatt, H G; Patel, P K; Jalu, J; Chintha, C; Gupta, N; Ghate, M
2013-01-01
SGLT2 has become a target of therapeutic interest in diabetes research. CoMFA and CoMSIA studies were performed on C-aryl glucoside SGLT2 inhibitors (180 analogues) as potential anti-diabetic agents. Three different alignment strategies were used for the compounds. The best CoMFA and CoMSIA models were obtained with Distill rigid-body alignment of the training and test sets and were statistically significant, with cross-validated coefficients (q²) of 0.602 and 0.618, respectively, and conventional coefficients (r²) of 0.905 and 0.902, respectively. Both models were validated with a test set of 36 compounds, giving satisfactory predictive correlation coefficients (r²pred) of 0.622 and 0.584 for the CoMFA and CoMSIA models, respectively. A comparison with an earlier 3D-QSAR study on SGLT2 inhibitors shows that our 3D-QSAR models are better at predicting inhibitory activity. The CoMFA and CoMSIA models generated in this work can provide useful information for designing new compounds and help predict activity prior to synthesis.
Ai, Yong; Wang, Shao-Teng; Sun, Ping-Hua; Song, Fa-Jun
2011-01-01
Aurora kinases have emerged as attractive targets for the design of anticancer drugs. 3D-QSAR (comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA)) and Surflex docking studies were performed on a series of pyrrole-indoline-2-ones as Aurora A inhibitors. The CoMFA and CoMSIA models, built from 25 training-set inhibitors, gave r²cv values of 0.726 and 0.566 and r² values of 0.972 and 0.984, respectively. The adapted alignment method with suitable parameters resulted in reliable models. The contour maps produced by the CoMFA and CoMSIA models were employed to rationalize the key structural requirements responsible for the activity. Surflex docking studies revealed that the sulfo group, the secondary amine group on the indolin-2-one, and the carbonyl of the 6,7-dihydro-1H-indol-4(5H)-one group were significant for binding to the receptor, and some other essential features were also identified. Based on the 3D-QSAR and docking results, a set of new molecules with high predicted activities was designed. PMID:21673910
Wang, Zhanhui; Kai, Zhenpeng; Beier, Ross C.; Shen, Jianzhong; Yang, Xinling
2012-01-01
A three-dimensional quantitative structure-activity relationship (3D-QSAR) study of sulfonamide analogs binding a monoclonal antibody (MAbSMR) raised against sulfamerazine was carried out using Distance Comparison (DISCOtech), comparative molecular field analysis (CoMFA), and comparative molecular similarity indices analysis (CoMSIA). The affinities of the MAbSMR for 17 sulfonamide analogs, expressed as log10 IC50, were determined by competitive fluorescence polarization immunoassay (FPIA). The results demonstrated that the proposed pharmacophore model, containing two hydrogen-bond acceptors, two hydrogen-bond donors, and two hydrophobic centers, characterized the structural features of the sulfonamides necessary for MAbSMR binding. Removal of two outliers from the initial set of 17 sulfonamide analogs improved the predictability of the models. The 3D-QSAR models of the remaining 15 sulfonamides based on CoMFA and CoMSIA gave q²cv values of 0.600 and 0.523 and r² values of 0.995 and 0.994, respectively, indicating that both methods have significant predictive capability. Connolly surface analysis, which mainly focuses on steric force fields, was performed to complement the CoMFA and CoMSIA results. This novel study combining FPIA with pharmacophore modeling demonstrates that multidisciplinary research is useful for investigating antigen-antibody interactions and may also provide information required for the design of new haptens. PMID:22754368
NASA Astrophysics Data System (ADS)
Li, Wenlian; Xiao, Faqi; Zhou, Mingming; Jiang, Xuejin; Liu, Jun; Si, Hongzong; Xie, Meng; Ma, Xiuting; Duan, Yunbo; Zhai, Honglin
2016-09-01
A three-dimensional quantitative structure-activity relationship (3D-QSAR) study was performed on a series of 4-hydroxyamino α-pyranone carboxamide analogues using comparative molecular similarity indices analysis (CoMSIA). The purpose of the present study was to develop a satisfactory model providing reliable predictions for 4-hydroxyamino α-pyranone carboxamide analogues as anti-HCV (hepatitis C virus) agents. The statistical results and the validation of the optimum CoMSIA model were satisfactory. Furthermore, analysis of the contour maps helped to provide guidelines for identifying the structural requirements. The results of this study may therefore offer useful guidance for the development of anti-HCV drugs.
Sivan, Sree Kanth; Manga, Vijjulatha
2010-06-01
Nonnucleoside reverse transcriptase inhibitors (NNRTIs) are allosteric inhibitors of HIV-1 reverse transcriptase. Recently, a series of triazolinones and pyridazinones was reported as potent inhibitors of HIV-1 wild-type reverse transcriptase. In the present study, docking and 3D quantitative structure-activity relationship (3D-QSAR) studies involving comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA) were performed on 31 molecules. Ligands were built and minimized using the Tripos force field with Gasteiger-Hückel charges, then docked into the protein active site using GLIDE 4.0. The docked poses were analyzed, and the best poses were selected and aligned. CoMFA and CoMSIA fields were calculated using SYBYL 6.9. The molecules were divided into training and test sets, a PLS analysis was performed, and QSAR models were generated. The models showed good statistical reliability, evident from the r²ncv, q²loo, and r²pred values. The CoMFA model provides the most significant correlation of steric and electrostatic fields with the biological activities, while the CoMSIA model correlates steric, electrostatic, acceptor, and hydrophobic fields with the activities. The information rendered by the 3D-QSAR models prompted us to optimize the lead and design new potential inhibitors.
Ding, Lina; Wang, Zhi-Zheng; Sun, Xu-Dong; Yang, Jing; Ma, Chao-Ya; Li, Wen; Liu, Hong-Min
2017-08-01
Recently, histone lysine-specific demethylase 1 (LSD1) has been regarded as a promising anticancer target for novel drug discovery, and several structurally distinct small-molecule LSD1 inhibitors have been reported. In this work, we carried out a molecular modeling study on 6-aryl-5-cyano-pyrimidine fragment LSD1 inhibitors using three-dimensional quantitative structure-activity relationship (3D-QSAR) analysis, molecular docking, and molecular dynamics simulations. Comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA) were used to generate the 3D-QSAR models. The best CoMFA model has q² = 0.802 and r²ncv = 0.979, and the best CoMSIA model has q² = 0.799 and r²ncv = 0.982. The electrostatic, hydrophobic, and H-bond donor fields play important roles in the models. Molecular docking studies predict the binding mode and the interactions between the ligand and the receptor protein. Molecular dynamics simulations reveal that the ligand-receptor complex is stable at 300 K. Together, these results provide useful information for further drug design.
NASA Astrophysics Data System (ADS)
Murumkar, Prashant Revan; Zambre, Vishal Prakash; Yadav, Mange Ram
2010-02-01
A chemical-feature-based pharmacophore model was developed for tumor necrosis factor-α converting enzyme (TACE) inhibitors. A five-point pharmacophore model was developed, having two hydrogen-bond acceptors (A), one hydrogen-bond donor (D), and two aromatic rings (R) with discrete geometries as pharmacophoric features. The generated pharmacophore model was then utilized for in silico screening of a database and was validated using four compounds with proven TACE inhibitory activity that had been grafted into the database; these compounds mapped well onto the five listed pharmacophoric features. The validated pharmacophore model was also used for alignment of the molecules in the CoMFA and CoMSIA analyses. The contour maps of the CoMFA/CoMSIA models provide structural insight for improving the activity of potential novel TACE inhibitors. The pharmacophore model could be used for in silico screening of any commercial or in-house database to identify TACE-inhibiting lead compounds, and the leads so identified could be optimized using the developed CoMSIA model. The present work highlights the potential of two mutually complementary ligand-based drug design techniques (i.e., pharmacophore mapping and 3D-QSAR analysis), using TACE inhibitors as prototype biologically active molecules.
NASA Astrophysics Data System (ADS)
Wang, Fangfang; Zhou, Bo
2018-04-01
Protein tyrosine phosphatase 1B (PTP1B) is an intracellular non-receptor phosphatase implicated in signal transduction of the insulin and leptin pathways; PTP1B is therefore considered a potential target for treating type II diabetes and obesity. The present article develops three-dimensional quantitative structure-activity relationship (3D-QSAR) models for a series of compounds possessing PTP1B inhibitory activity, using comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA) techniques. The optimum template ligand-based models are statistically significant, with good CoMFA (r²cv = 0.600, r²pred = 0.6760) and CoMSIA (r²cv = 0.624, r²pred = 0.8068) values. Molecular docking was employed to elucidate the inhibitory mechanisms of this series of compounds against PTP1B. In addition, the CoMFA and CoMSIA field contour maps agree well with the structural characteristics of the binding pocket of the PTP1B active site. The knowledge of the structure-activity relationship and ligand-receptor interactions from the 3D-QSAR models and molecular docking will be useful for better understanding the mechanism of ligand-receptor interaction and for facilitating the development of novel compounds as potent PTP1B inhibitors.
Saraiva, Ádria P B; Miranda, Ricardo M; Valente, Renan P P; Araújo, Jéssica O; Souza, Rutelene N B; Costa, Clauber H S; Oliveira, Amanda R S; Almeida, Michell O; Figueiredo, Antonio F; Ferreira, João E V; Alves, Cláudio Nahum; Honorio, Kathia M
2018-04-22
In this work, a group of α-keto-based inhibitors of the cruzain enzyme with anti-Chagas activity was selected for a three-dimensional quantitative structure-activity relationship (3D-QSAR) study combined with molecular dynamics (MD). Firstly, statistical models based on Partial Least Squares (PLS) regression were developed employing comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA) descriptors. Validation parameters (q2 and r2) for the models were, respectively, 0.910 and 0.997 (CoMFA) and 0.913 and 0.992 (CoMSIA). In addition, external validation of the models using a test group revealed r2pred = 0.728 (CoMFA) and 0.971 (CoMSIA). The most relevant aspect of this study was the generation of molecular fields in both favorable and unfavorable regions based on the models developed. These fields are important for interpreting the modifications necessary to enhance the biological activities of the inhibitors. This analysis was restricted to the inhibitors in a fixed conformation, not interacting with their target, the cruzain enzyme. MD was therefore employed, taking into account important variables such as time and temperature. MD helped describe the behavior of the inhibitors, and the resulting properties were consistent with those generated by the 3D-QSAR study. © 2018 John Wiley & Sons A/S.
Chen, Ying; Cai, Xiaoyu; Jiang, Long; Li, Yu
2016-02-01
Based on the experimental data of octanol-air partition coefficients (KOA) for 19 polychlorinated biphenyl (PCB) congeners, two types of QSAR methods, comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA), were used to establish 3D-QSAR models with the Sybyl software, using structural parameters as independent variables and logKOA values as the dependent variable, to predict the KOA values of the remaining 190 PCB congeners. The whole data set (19 compounds) was divided into a training set (15 compounds) for model generation and a test set (4 compounds) for model validation. As a result, the cross-validation correlation coefficient (q(2)) obtained by the CoMFA and CoMSIA models (shuffled 12 times) was in the range of 0.825-0.969 (>0.5), the correlation coefficient (r(2)) obtained was in the range of 0.957-1.000 (>0.9), and the SEP (standard error of prediction) of the test set was within the range of 0.070-0.617, indicating that the models were robust and predictive. For a model randomly selected from the set, CoMFA analysis revealed that the percentages of the variance explained by the steric and electrostatic fields were 23.9% and 76.1%, respectively, while for CoMSIA the steric, electrostatic and hydrophobic fields explained 0.6%, 92.6%, and 6.8%, respectively. The electrostatic field was determined to be the primary factor governing logKOA. Correlation analysis of the relationship between the number of Cl atoms and the average logKOA values of PCBs indicated that logKOA values gradually increased as the number of Cl atoms increased. Simultaneously, related studies on PCB detection in the Arctic and Antarctic areas revealed that higher logKOA values indicate a stronger PCB migration ability. From the CoMFA and CoMSIA contour maps, logKOA decreased when substituents possessed electropositive groups at the 2-, 3-, 3'-, 5- and 6- positions, which could reduce the PCB migration ability.
These results are expected to be beneficial in predicting logKOA values of PCB homologues and derivatives and in providing a theoretical foundation for further elucidation of the global migration behaviour of PCBs. Copyright © 2015 Elsevier Inc. All rights reserved.
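The external-validation figures quoted in records like this one (predictive r2 over a held-out test set, and the SEP) follow a simple recipe. Below is a minimal sketch with made-up numbers; note that conventions differ between papers on whether the training-set or test-set mean appears in the denominator, and one common choice (training-set mean) is used here.

```python
# Hedged sketch of external test-set validation: predictive r^2 and the
# standard error of prediction (SEP). All values are illustrative only.
import math

y_train_mean = 5.0                       # mean activity of the training set
y_test = [5.2, 6.1, 4.8, 5.9]            # observed test-set values (made up)
y_pred = [5.0, 6.3, 4.9, 5.6]            # model predictions (made up)

press = sum((o - p) ** 2 for o, p in zip(y_test, y_pred))
sd = sum((o - y_train_mean) ** 2 for o in y_test)
r2_pred = 1.0 - press / sd               # external predictive correlation coefficient
sep = math.sqrt(press / len(y_test))     # standard error of prediction
print(round(r2_pred, 3), round(sep, 3))  # → 0.914 0.212
```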
Zhang, Shuqun; Hou, Bo; Yang, Huaiyu; Zuo, Zhili
2016-05-01
Acetylcholinesterase (AChE) is an important enzyme in the pathogenesis of Alzheimer's disease (AD). Comparative quantitative structure-activity relationship (QSAR) analyses of some huprine inhibitors of AChE were carried out using comparative molecular field analysis (CoMFA), comparative molecular similarity indices analysis (CoMSIA), and hologram QSAR (HQSAR) methods. Three highly predictive QSAR models were constructed successfully based on the training set. The CoMFA, CoMSIA, and HQSAR models have values of r(2) = 0.988, q(2) = 0.757, ONC = 6; r(2) = 0.966, q(2) = 0.645, ONC = 5; and r(2) = 0.957, q(2) = 0.736, ONC = 6, respectively. The predictabilities were validated using an external test set, and the predictive r(2) values obtained by the three models were 0.984, 0.973, and 0.783, respectively. The analysis was performed by combining the CoMFA and CoMSIA field distributions with the active site of AChE to further understand the vital interactions between huprines and the enzyme. On the basis of the QSAR study, 14 new potent molecules have been designed, and six of them are predicted to be more active than the most active compound 24 described in the literature. The final QSAR models could be helpful in the design and development of novel active AChE inhibitors.
Balupuri, Anand; Balasubramanian, Pavithra K; Cho, Seung J
2016-01-01
Checkpoint kinase 1 (Chk1) has emerged as a potential therapeutic target for design and development of novel anticancer drugs. Herein, we have performed three-dimensional quantitative structure-activity relationship (3D-QSAR) and molecular docking analyses on a series of diazacarbazoles to design potent Chk1 inhibitors. 3D-QSAR models were developed using comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA) techniques. Docking studies were performed using AutoDock. The best CoMFA and CoMSIA models exhibited cross-validated correlation coefficient (q2) values of 0.631 and 0.585, and non-cross-validated correlation coefficient (r2) values of 0.933 and 0.900, respectively. CoMFA and CoMSIA models showed reasonable external predictabilities (r2 pred) of 0.672 and 0.513, respectively. A satisfactory performance in the various internal and external validation techniques indicated the reliability and robustness of the best model. Docking studies were performed to explore the binding mode of inhibitors inside the active site of Chk1. Molecular docking revealed that hydrogen bond interactions with Lys38, Glu85 and Cys87 are essential for Chk1 inhibitory activity. The binding interaction patterns observed during docking studies were complementary to 3D-QSAR results. Information obtained from the contour map analysis was utilized to design novel potent Chk1 inhibitors. Their activities and binding affinities were predicted using the derived model and docking studies. Designed inhibitors were proposed as potential candidates for experimental synthesis.
Molecular docking and 3D-QSAR studies on inhibitors of DNA damage signaling enzyme human PARP-1.
Fatima, Sabiha; Bathini, Raju; Sivan, Sree Kanth; Manga, Vijjulatha
2012-08-01
Poly(ADP-ribose) polymerase-1 (PARP-1) operates in a DNA damage signaling network. Molecular docking and three-dimensional quantitative structure-activity relationship (3D-QSAR) studies were performed on human PARP-1 inhibitors. The docked conformation obtained for each molecule was used directly for 3D-QSAR analysis. Molecules were divided into a training set and a test set randomly in four different ways, and partial least squares analysis was performed to obtain QSAR models using comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA). The derived models showed good statistical reliability, evident from their r², q²(loo) and r²(pred) values. To obtain a consensus of predictive ability across all the models, the average regression coefficient r²(avg) was calculated; the CoMFA and CoMSIA models showed values of 0.930 and 0.936, respectively. Information obtained from the best 3D-QSAR model was applied to the optimization of the lead molecule and the design of novel potential inhibitors.
NASA Astrophysics Data System (ADS)
Doytchinova, Irini A.; Flower, Darren R.
2002-08-01
The 3D-QSAR CoMSIA technique was applied to a set of 458 peptides binding to the five most widespread HLA-A2-like alleles: A*0201, A*0202, A*0203, A*0206 and A*6802. Models comprising the main physicochemical properties (steric bulk, electron density, hydrophobicity and hydrogen-bond formation abilities) were obtained with acceptable predictivity (q2 ranged from 0.385 to 0.683). The use of coefficient contour maps allowed an A2-supermotif to be identified based on common favoured and disfavoured areas. The CoMSIA definition for the best HLA-A2 binder is as follows: hydrophobic aromatic amino acid at position 1; hydrophobic bulky side chains at positions 2, 6 and 9; non-hydrogen-bond-forming amino acids at position 3; small aliphatic hydrogen-bond donors at position 4; aliphatic amino acids at position 5; small aliphatic side chains at position 7; and small aliphatic hydrophilic and hydrogen-bond forming amino acids at position 8.
3D-QSAR and docking studies of flavonoids as potent Escherichia coli inhibitors
Fang, Yajing; Lu, Yulin; Zang, Xixi; Wu, Ting; Qi, XiaoJuan; Pan, Siyi; Xu, Xiaoyun
2016-01-01
Flavonoids are potential antibacterial agents. However, the key substituents and mechanism underlying their antibacterial activity have not been fully investigated. The quantitative structure-activity relationship (QSAR) and molecular docking of flavonoids as potent anti-Escherichia coli agents were investigated. Comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA) models were developed using the pIC50 values of the flavonoids. Cross-validated coefficient (q2) values for CoMFA (0.743) and CoMSIA (0.708) were achieved, illustrating high predictive capabilities. Selected descriptors for the CoMFA model were ClogP (logarithm of the octanol/water partition coefficient) and the steric and electrostatic fields, while ClogP and the electrostatic and hydrogen-bond donor fields were used for the CoMSIA model. Molecular docking results confirmed that half of the tested flavonoids inhibited DNA gyrase B (GyrB) by interacting with the adenosine triphosphate (ATP) pocket in the same orientation. Polymethoxyl flavones, flavonoid glycosides and isoflavonoids changed their orientation, resulting in a decrease in inhibitory activity. Moreover, docking results showed that the 3-hydroxyl, 5-hydroxyl, 7-hydroxyl and 4-carbonyl groups were crucial active substituents of flavonoids, interacting with key residues of GyrB, in agreement with the QSAR results. These results provide valuable information on the structural requirements of flavonoids as antibacterial agents. PMID:27049530
Gueto, Carlos; Ruiz, José L; Torres, Juan E; Méndez, Jefferson; Vivas-Reyes, Ricardo
2008-03-01
Comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA) were performed on a series of benzotriazine derivatives, as Src inhibitors. Ligand molecular superimposition on the template structure was performed by database alignment method. The statistically significant model was established of 72 molecules, which were validated by a test set of six compounds. The CoMFA model yielded a q(2)=0.526, non cross-validated R(2) of 0.781, F value of 88.132, bootstrapped R(2) of 0.831, standard error of prediction=0.587, and standard error of estimate=0.351 while the CoMSIA model yielded the best predictive model with a q(2)=0.647, non cross-validated R(2) of 0.895, F value of 115.906, bootstrapped R(2) of 0.953, standard error of prediction=0.519, and standard error of estimate=0.178. The contour maps obtained from 3D-QSAR studies were appraised for activity trends for the molecules analyzed. Results indicate that small steric volumes in the hydrophobic region, electron-withdrawing groups next to the aryl linker region, and atoms close to the solvent accessible region increase the Src inhibitory activity of the compounds. In fact, adding substituents at positions 5, 6, and 8 of the benzotriazine nucleus were generated new compounds having a higher predicted activity. The data generated from the present study will further help to design novel, potent, and selective Src inhibitors as anticancer therapeutic agents.
Vyas, V K; Gupta, N; Ghate, M; Patel, S
2014-01-01
In this study we designed novel substituted benzimidazole derivatives and predicted their absorption, distribution, metabolism, excretion and toxicity (ADMET) properties, based on a predictive 3D QSAR study of 132 substituted benzimidazoles as AngII-AT1 receptor antagonists. The two best-predicted compounds were synthesized and evaluated for AngII-AT1 receptor antagonism. Three different alignment tools for comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA) were used. The best 3D QSAR models were obtained using the rigid-body (Distill) alignment method. The CoMFA and CoMSIA models were found to be statistically significant, with leave-one-out correlation coefficients (q(2)) of 0.630 and 0.623, respectively, cross-validated coefficients (r(2)cv) of 0.651 and 0.630, respectively, and conventional coefficients of determination (r(2)) of 0.848 and 0.843, respectively. The 3D QSAR models were validated using a test set of 24 compounds, giving satisfactory predicted results (r(2)pred) of 0.727 and 0.689 for the CoMFA and CoMSIA models, respectively. We have identified some key features of substituted benzimidazole derivatives, such as lipophilicity and H-bonding at the 2- and 5-positions of the benzimidazole nucleus, respectively, for AT1 receptor antagonistic activity. We designed 20 novel substituted benzimidazole derivatives and predicted their activity. In silico ADMET properties were also predicted for these designed molecules. Finally, the compounds with the best predicted activity were synthesized and evaluated for in vitro angiotensin II-AT1 receptor antagonism.
3D QSAR based design of novel oxindole derivative as 5HT7 inhibitors.
Chitta, Aparna; Sivan, Sree Kanth; Manga, Vijjulatha
2014-06-01
To understand the structural requirements of 5-hydroxytryptamine (5HT7) receptor inhibitors and to design new ligands against the 5HT7 receptor with enhanced inhibitory potency, a three-dimensional quantitative structure-activity relationship study with comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA) was performed for a data set of 56 molecules consisting of oxindole, tetrahydronaphthalene and aryl ketone substituted arylpiperazinealkylamide derivatives. The derived models showed good statistical reliability in predicting the 5HT7 inhibitory activity of the molecules, based on molecular property fields such as the steric, electrostatic, hydrophobic, hydrogen-bond donor and hydrogen-bond acceptor fields. This is evident from statistical parameters such as conventional r2 and cross-validated q2 values of 0.985 and 0.743 for CoMFA and 0.970 and 0.608 for CoMSIA, respectively. The predictive ability of the models for 5HT7 antagonistic activity was validated using a test set of 16 molecules that were not included in the training set; the predictive r2 obtained for the test set was 0.560 for CoMFA and 0.619 for CoMSIA. Steric and electrostatic fields contributed most toward activity, forming the basis for the design of new molecules. Absorption, distribution, metabolism and elimination (ADME) calculations using QikProp 2.5 (Schrodinger 2010, Portland, OR) reveal that the molecules conform to Lipinski's rule of five in the majority of cases.
Kim, J; Lee, C; Chong, Y
2009-01-01
Influenza endonuclease has emerged as an attractive target of antiviral therapy for influenza infection. With the purpose of designing a novel antiviral agent with enhanced biological activity against influenza endonuclease, a three-dimensional quantitative structure-activity relationship (3D-QSAR) model was generated based on 34 influenza endonuclease inhibitors. The comparative molecular similarity indices analysis (CoMSIA) with a steric, electrostatic and hydrophobic (SEH) model showed the best correlative and predictive capability (q(2) = 0.763, r(2) = 0.969 and F = 174.785), which provided a pharmacophore composed of an electronegative moiety as well as a bulky hydrophobic group. The CoMSIA model was used as a pharmacophore query in a UNITY search of the ChemDiv compound library to give virtual active compounds. The 3D-QSAR model was then used to predict the activity of the selected compounds, which identified three compounds as the most likely inhibitor candidates.
Combined 3D-QSAR modeling and molecular docking study on azacycles CCR5 antagonists
NASA Astrophysics Data System (ADS)
Ji, Yongjun; Shu, Mao; Lin, Yong; Wang, Yuanqiang; Wang, Rui; Hu, Yong; Lin, Zhihua
2013-08-01
The beta chemokine receptor 5 (CCR5) is an attractive target for the pharmaceutical industry in the HIV-1, inflammation and cancer therapeutic areas. In this study, we have developed quantitative structure-activity relationship (QSAR) models for a series of 41 azacycle CCR5 antagonists using comparative molecular field analysis (CoMFA), comparative molecular similarity indices analysis (CoMSIA), and Topomer CoMFA methods. The cross-validated coefficient q2 values of the 3D-QSAR (CoMFA, CoMSIA, and Topomer CoMFA) methods were 0.630, 0.758, and 0.852, respectively; the non-cross-validated R2 values were 0.979, 0.978, and 0.990, respectively. Docking studies were also employed to determine the most probable binding mode. 3D contour maps and docking results suggested that bulky groups and electron-withdrawing groups on the core part would decrease antiviral activity. Furthermore, docking results indicated that H-bonds and π bonds were favorable for antiviral activity. Finally, a set of novel derivatives with predicted activities was designed.
NASA Astrophysics Data System (ADS)
Cho, Sehyeon; Choi, Min Ji; Kim, Minju; Lee, Sunhoe; Lee, Jinsung; Lee, Seok Joon; Cho, Haelim; Lee, Kyung-Tae; Lee, Jae Yeol
2015-03-01
A series of 3,4-dihydroquinazoline derivatives with anti-cancer activity against human lung cancer A549 cells was subjected to three-dimensional quantitative structure-activity relationship (3D-QSAR) studies using the comparative molecular similarity indices analysis (CoMSIA) approach. The most potent compound, 1, was used to align the molecules. As a result, the best prediction was obtained with CoMSIA combining the steric, electrostatic, hydrophobic, hydrogen-bond donor, and hydrogen-bond acceptor fields (q2 = 0.720, r2 = 0.897). This model was validated by an external test set of 6 compounds, giving a satisfactory predictive r2 value of 0.923, as well as by a scrambling stability test. The model should guide the design of potent 3,4-dihydroquinazoline derivatives as anti-cancer agents for the treatment of human lung cancer.
NASA Astrophysics Data System (ADS)
Zhao, Siqi; Zhang, Guanglong; Xia, Shuwei; Yu, Liangmin
2018-06-01
As a group of diversified frameworks, quinazoline derivatives display a broad range of biological functions, especially as anticancer agents. To investigate the quantitative structure-activity relationship, 3D-QSAR models were generated from 24 quinazoline-scaffold molecules. The experimental and predicted pIC50 values for both training and test set compounds showed good correlation, which proved the robustness and reliability of the generated QSAR models. The most effective CoMFA and CoMSIA models were obtained with a correlation coefficient r2ncv of 1.00 (both) and leave-one-out coefficients q2 of 0.61 and 0.59, respectively. The predictive abilities of CoMFA and CoMSIA were quite good, with predictive correlation coefficients (r2pred) of 0.97 and 0.91. In addition, the statistical results of CoMFA and CoMSIA were used to design new quinazoline molecules.
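The pIC50 values used as the dependent variable in models like the one above are simply negative base-10 logarithms of the molar IC50. A small worked example (the helper name is ours, for illustration):

```python
# pIC50 = -log10(IC50 in mol/L): higher pIC50 means a more potent inhibitor.
import math

def pic50(ic50_nM: float) -> float:
    """Return pIC50 for an IC50 given in nanomolar."""
    return -math.log10(ic50_nM * 1e-9)

print(round(pic50(100.0), 2))   # a 100 nM inhibitor → 7.0
```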
Gu, Wenwen; Chen, Ying; Li, Yu
2017-08-01
Based on the experimental subcooled liquid vapor pressures (PL) of 17 polychlorinated naphthalene (PCN) congeners, one type of three-dimensional quantitative structure-activity relationship (3D-QSAR) model, comparative molecular similarity indices analysis (CoMSIA), was constructed with the Sybyl software. A full factorial experimental design was used to obtain the final regulation scheme for PCNs, and PCN-2 was then modified to significantly lower its PL. The contour maps of the CoMSIA model showed that the migration ability of PCNs decreases when the Cl atoms at the 2-, 3-, 4-, 5-, 6-, 7- and 8-positions are replaced by electropositive groups. After modification of PCN-2, 12 types of new modified PCN-2 compounds were obtained with lnPL values two orders of magnitude lower than that of PCN-2. In addition, there are significant differences between the calculated total energies and energy gaps of the new modified compounds and those of PCN-2.
Wu, Mingwei; Li, Yan; Fu, Xinmei; Wang, Jinghui; Zhang, Shuwei; Yang, Ling
2014-09-01
Melanin concentrating hormone receptor 1 (MCHR1), a crucial regulator of energy homeostasis involved in the control of feeding and energy metabolism, is a promising target for the treatment of obesity. In the present work, the largest set to date of 181 quinoline/quinazoline derivatives as MCHR1 antagonists was subjected to both ligand- and receptor-based three-dimensional quantitative structure-activity relationship (3D-QSAR) analysis applying comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA). The optimal predictive CoMSIA model exhibited significant validity, with a cross-validated correlation coefficient (Q²) = 0.509, non-cross-validated correlation coefficient (R²(ncv)) = 0.841 and predicted correlation coefficient (R²(pred)) = 0.745. In addition, docking studies and molecular dynamics (MD) simulations were carried out for further elucidation of the binding modes of MCHR1 antagonists. MD simulations in both water and lipid bilayer systems were performed. We hope that the obtained models and information may help to provide insight into the interaction mechanism of MCHR1 antagonists and facilitate the design and optimization of novel antagonists as anti-obesity agents.
Pandey, Gyanendra; Saxena, Anil K
2006-01-01
A set of 65 flexible peptidomimetic competitive inhibitors (52 in the training set and 13 in the test set) of protein tyrosine phosphatase 1B (PTP1B) has been used to compare the quality and predictive power of 3D quantitative structure-activity relationship (QSAR) comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA) models for the three most commonly used conformer-based alignments, namely, cocrystallized conformer-based alignment (CCBA), docked conformer-based alignment (DCBA), and global minima energy conformer-based alignment (GMCBA). These three conformers of 5-[(2S)-2-({(2S)-2-[(tert-butoxycarbonyl)amino]-3-phenylpropanoyl}amino)3-oxo-3-pentylamino)propyl]-2-(carboxymethoxy)benzoic acid (compound number 66) were obtained from the X-ray structure of its cocrystallized complex with PTP1B (PDB ID: 1JF7), its docking studies, and its global minima by simulated annealing. Among the 3D QSAR models developed using the above three alignments, the CCBA provided the optimal predictive CoMFA model for the training set with cross-validated r2 (q2)=0.708, non-cross-validated r2=0.902, standard error of estimate (s)=0.165, and F=202.553 and the optimal CoMSIA model with q2=0.440, r2=0.799, s=0.192, and F=117.782. These models also showed the best test set prediction for the 13 compounds with predictive r2 values of 0.706 and 0.683, respectively. Though the QSAR models derived using the other two alignments also produced statistically acceptable models in the order DCBA>GMCBA in terms of the values of q2, r2, and predictive r2, they were inferior to the corresponding models derived using CCBA. Thus, the order of preference for the alignment selection for 3D QSAR model development may be CCBA>DCBA>GMCBA, and the information obtained from the CoMFA and CoMSIA contour maps may be useful in designing specific PTP1B inhibitors.
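The per-model fit statistics recurring in the abstract above (non-cross-validated r2, standard error of estimate s, and F) can all be recovered from observed versus fitted activities. Below is a hedged sketch with illustrative numbers (not the paper's data), using the usual regression formulas for a model with k latent components fitted on n training compounds:

```python
# Illustrative computation of the standard QSAR fit statistics:
# non-cross-validated r^2, standard error of estimate (SEE), and F.
import math

def fit_stats(y_obs, y_fit, k):
    """Return (r2, SEE, F) for a model with k components."""
    n = len(y_obs)
    rss = sum((o - f) ** 2 for o, f in zip(y_obs, y_fit))   # residual sum of squares
    mean = sum(y_obs) / n
    tss = sum((o - mean) ** 2 for o in y_obs)               # total sum of squares
    r2 = 1.0 - rss / tss
    see = math.sqrt(rss / (n - k - 1))                      # standard error of estimate
    f = (r2 / k) / ((1.0 - r2) / (n - k - 1))               # F statistic for the fit
    return r2, see, f

# Made-up activities for 6 compounds, model with 2 components.
r2, see, f = fit_stats([4.0, 5.0, 6.0, 7.0, 8.0, 9.0],
                       [4.1, 4.9, 6.2, 6.8, 8.1, 8.9], k=2)
print(round(r2, 3), round(see, 2), round(f))   # → 0.993 0.2 217
```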
Peddi, Saikiran Reddy; Sivan, Sree Kanth; Manga, Vijjulatha
2016-10-01
Anaplastic lymphoma kinase (ALK), a promising therapeutic target for the treatment of human cancers, is a receptor tyrosine kinase that instigates the activation of several signal transduction pathways. In the present study, in silico methods have been employed to explore the structural features and functionalities of a series of tetracyclic derivatives displaying potent inhibitory activity toward ALK. Initially, docking was performed using GLIDE 5.6 to probe the bioactive conformation of all the compounds and to understand the binding modes of the inhibitors. The docking results revealed that ligand interaction with Met 1199 plays a crucial role in the binding of inhibitors to ALK. Further, to establish a robust 3D-QSAR model using CoMFA and CoMSIA methods, the whole dataset was divided into three splits. The model obtained from Split 3 showed high accuracy (q2 of 0.700 and 0.682, r2 of 0.971 and 0.974, and r2pred of 0.673 and 0.811, respectively, for CoMFA and CoMSIA). The key structural requirements for enhancing the inhibitory activity were derived from the CoMFA and CoMSIA contours in combination with site map analysis. Substituting small electronegative groups at position 8 in place of either the morpholine or piperidine ring, and maintaining hydrophobic character at position 9 of the tetracyclic derivatives, can enhance the inhibitory potential. Finally, we performed molecular dynamics simulations to investigate the stability of protein-ligand interactions, and MM/GBSA calculations to compare the binding free energies of the co-crystal ligand and the newly designed molecule N1. Based on the coherence of the outcomes of the various molecular modeling studies, a set of 11 new molecules with potential predicted inhibitory activity was designed.
Ballu, Srilata; Itteboina, Ramesh; Sivan, Sree Kanth; Manga, Vijjulatha
2018-04-01
Staphylococcus aureus is a gram-positive bacterium. It is the leading cause of skin and respiratory infections, osteomyelitis, Ritter's disease, endocarditis, and bacteraemia in the developed world. Combined 3D-QSAR and molecular docking studies, validated by molecular dynamics simulations and in silico ADME prediction, were performed on isothiazoloquinolone inhibitors of methicillin-resistant Staphylococcus aureus. The three-dimensional quantitative structure-activity relationship (3D-QSAR) study was carried out using comparative molecular field analysis (CoMFA), with Q2 of 0.578 and R2 of 0.988, and comparative molecular similarity indices analysis (CoMSIA), with Q2 of 0.554 and R2 of 0.975. The predictive ability of these models was determined using a test set of molecules that gave acceptable predictive correlation (r2pred) values of 0.55 and 0.57 for CoMFA and CoMSIA, respectively. Docking simulations were employed to position the inhibitors in the protein active site to find the most probable binding mode and most reliable conformations. The developed models and docking methods provide guidance for designing molecules with enhanced activity. Copyright © 2017 Elsevier Ltd. All rights reserved.
QSAR analyses on avian influenza virus neuraminidase inhibitors using CoMFA, CoMSIA, and HQSAR
NASA Astrophysics Data System (ADS)
Zheng, Mingyue; Yu, Kunqian; Liu, Hong; Luo, Xiaomin; Chen, Kaixian; Zhu, Weiliang; Jiang, Hualiang
2006-09-01
The recent wide spread of the H5N1 avian influenza virus (AIV) in Asia, Europe and Africa and its ability to cause fatal infections in humans have raised serious concerns about a pending global flu pandemic. Neuraminidase (NA) inhibitors are currently the only option for treatment or prophylaxis in humans infected with this strain. However, drugs currently on the market often meet with rapidly emerging resistant mutants and have limited application owing to the inadequate supply of synthetic material. To extract helpful information for designing potent inhibitors with novel structures against NA, we used automated docking, CoMFA, CoMSIA, and HQSAR methods to investigate the quantitative structure-activity relationship for 126 NA inhibitors (NIs) with great structural diversity and a wide range of bioactivities against influenza A virus. Based on the binding conformations discovered via molecular docking into the crystal structure of NA, CoMFA and CoMSIA models were successfully built with cross-validated q2 values of 0.813 and 0.771, respectively. HQSAR was also carried out as a complementary study, in that the HQSAR technique does not require 3D information for these compounds and can provide a detailed molecular fragment contribution to the inhibitory activity. These models also show clearly how steric, electrostatic and hydrophobic properties, and individual fragments, affect the potency of NA inhibitors. In addition, the CoMFA and CoMSIA field distributions are found to be in good agreement with the structural characteristics of the corresponding binding sites. Therefore, the final 3D-QSAR models and the information on the inhibitor-enzyme interaction should be useful in developing novel potent NA inhibitors.
Binding site exploration of CCR5 using in silico methodologies: a 3D-QSAR approach.
Gadhe, Changdev G; Kothandan, Gugan; Cho, Seung Joo
2013-01-01
Chemokine receptor 5 (CCR5) is an important receptor used by human immunodeficiency virus type 1 (HIV-1) to gain viral entry into the host cell. In this study, we used a combined approach of comparative modeling, molecular docking, and three-dimensional quantitative structure-activity relationship (3D-QSAR) analyses to elucidate the detailed interaction of CCR5 with its inhibitors. A docking study of the most potent inhibitor from a series of compounds was done to derive the bioactive conformation. Parameters such as random selection, rational selection, different charges and grid spacing were utilized in the model development to check their effect on model predictivity. The final comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA) models were chosen based on the rational selection method, Gasteiger-Hückel charges and a grid spacing of 0.5 Å. Rational-selection models for CoMFA (q(2) = 0.722, r(2) = 0.884, Q(2) = 0.669) and CoMSIA (q(2) = 0.712, r(2) = 0.825, Q(2) = 0.522) were obtained with good statistics. Mapping of the contour maps onto the CCR5 interface led to a better understanding of the ligand-protein interaction. Docking analysis revealed that Glu283 is crucial for interaction. Two further amino acid residues, Tyr89 and Thr167, were identified as important in the ligand-protein interaction; no site-directed mutagenesis studies on these residues have been reported.
Design of novel quinazolinone derivatives as inhibitors for 5HT7 receptor.
Chitta, Aparna; Jatavath, Mohan Babu; Fatima, Sabiha; Manga, Vijjulatha
2012-02-01
To study the pharmacophore properties of quinazolinone derivatives as 5HT(7) inhibitors, the 3D QSAR methodologies Comparative Molecular Field Analysis (CoMFA) and Comparative Molecular Similarity Indices Analysis (CoMSIA) were applied; partial least squares (PLS) analysis was performed and QSAR models were generated. The derived models showed good statistical reliability in predicting the 5HT(7) inhibitory activity of the quinazolinone derivatives, based on molecular property fields such as the steric, electrostatic, hydrophobic, hydrogen-bond donor and hydrogen-bond acceptor fields. This is evident from statistical parameters such as q(2) (cross-validated correlation coefficient) values of 0.642 and 0.602 and r(2) (conventional correlation coefficient) values of 0.937 and 0.908 for CoMFA and CoMSIA, respectively. The predictive ability of the models for 5HT(7) antagonistic activity was validated using a test set of 26 molecules that were not included in the training set; the predictive r(2) obtained for the test set was 0.512 and 0.541, respectively. Further, the results of the derived models are illustrated by means of contour maps, which give insight into the interaction of the drug with the receptor. The molecular fields so obtained served as the basis for the design of twenty new ligands. In addition, ADME (absorption, distribution, metabolism and elimination) properties have been calculated in order to predict the relevant pharmaceutical properties, and the results are in conformity with the required drug-like properties.
Wang, Yuan; Wu, Mingwei; Ai, Chunzhi; Wang, Yonghua
2015-01-01
Presently, 151 widely diverse pyridinylimidazole-based compounds that show inhibitory activities against TNF-α release were investigated. By using the distance comparison technique (DISCOtech), comparative molecular field analysis (CoMFA), and comparative molecular similarity index analysis (CoMSIA) methods, the pharmacophore models and the three-dimensional quantitative structure-activity relationships (3D-QSAR) of the compounds were explored. The proposed pharmacophore model, including two hydrophobic sites, two aromatic centers, two H-bond donor atoms, two H-bond acceptor atoms, and two H-bond donor sites, characterizes the necessary structural features of TNF-α release inhibitors. Both the resultant CoMFA and CoMSIA models exhibited satisfactory predictability (with Q2 (cross-validated correlation coefficient) = 0.557, R2ncv (non-cross-validated correlation coefficient) = 0.740, R2pre (predicted correlation coefficient) = 0.749 and Q2 = 0.598, R2ncv = 0.767, R2pre = 0.860, respectively). Good consistency was observed between the 3D-QSAR models and the pharmacophore model, indicating that hydrophobic interactions and hydrogen bonds play crucial roles in the mechanism of action. The corresponding contour maps generated by these models provide more diverse information about the key intermolecular interactions of inhibitors with the surrounding environment. All these models have extended the understanding of imidazole-based compounds in the structure-activity relationship, and are useful for rational design and screening of novel 2-thioimidazole-based TNF-α release inhibitors.
NASA Astrophysics Data System (ADS)
Chen, Bohong; Zhu, Zhibo; Chen, Min; Dong, Wenqi; Li, Zhen
2014-03-01
A comparative molecular similarity indices analysis (CoMSIA) was performed on a set of 27 curcumin-like diarylpentanoid analogues with radical scavenging activities. A significant cross-validated correlation coefficient Q2 (0.784) and SEP (0.042) were obtained for CoMSIA, indicating the statistical significance of the correlation. We further adopted a rational approach to the selection of substituents at various positions in our scaffold, and identified the favored and disfavored regions for enhanced antioxidative activity. The results have been used as a guide to design compounds that, potentially, have better activity against oxidative damage.
Tsareva, Daria A; Osolodkin, Dmitry I; Shulga, Dmitry A; Oliferenko, Alexander A; Pisarev, Sergey A; Palyulin, Vladimir A; Zefirov, Nikolay S
2011-03-14
Two fast empirical charge models, the Kirchhoff Charge Model (KCM) and Dynamic Electronegativity Relaxation (DENR), had been developed in our laboratory previously for widespread use in drug design research. Both models are based on the electronegativity relaxation principle (Adv. Quantum Chem. 2006, 51, 139-156) and parameterized against ab initio dipole/quadrupole moments and molecular electrostatic potentials, respectively. As 3D QSAR studies comprise one of the most important fields of applied molecular modeling, they naturally became the first topic on which to test our charges and thus, indirectly, the assumptions laid down in the charge model theories, in a case study. Here these charge models are used in the CoMFA and CoMSIA methods and tested on five glycogen synthase kinase 3 (GSK-3) inhibitor datasets, relevant to our current studies, and one steroid dataset. For comparison, eight other charge models, ranging from ab initio through semiempirical to empirical, were tested on the same datasets. A complex analysis including correlation and cross-validation, charge robustness and predictability, as well as visual interpretability of the generated 3D contour maps, was carried out. As a result, our new electronegativity relaxation-based models both showed stable results, which in conjunction with the other benefits discussed render them suitable for building reliable 3D QSAR models. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Jójárt, Balázs; Martinek, Tamás A.; Márki, Árpád
2005-05-01
Molecular docking and 3D-QSAR studies were performed to determine the binding mode for a series of benzoxazine oxytocin antagonists taken from the literature. Structural hypotheses were generated by docking the most active molecule into the rigid receptor by means of AutoDock 3.05. The cluster analysis yielded seven possible binding conformations. These structures were refined by using constrained simulated annealing, and the further ligands were aligned in the refined receptor by molecular docking. A good correlation was found between the estimated ΔGbind and the pKi values for complex F. The Connolly-surface analysis and the CoMFA and CoMSIA models (q(2)CoMFA = 0.653, q(2)CoMSIA = 0.630, r(2)pred,CoMFA = 0.852, r(2)pred,CoMSIA = 0.815) confirmed the scoring function results. The structural features of the receptor-ligand complex and the CoMFA and CoMSIA fields are closely connected. These results suggest that receptor-ligand complex F is the most likely binding hypothesis for the studied benzoxazine analogs.
Lorca, Marcos; Morales-Verdejo, Cesar; Vásquez-Velásquez, David; Andrades-Lagos, Juan; Campanini-Salinas, Javier; Soto-Delgado, Jorge; Recabarren-Gajardo, Gonzalo; Mella, Jaime
2018-05-16
The wide tissue distribution of the adrenergic β3 receptor makes it a potential target for the treatment of multiple pathologies such as diabetes, obesity, depression, overactive bladder (OAB), and cancer. Currently, there is only one drug on the market, mirabegron, approved for the treatment of OAB. In the present study, we have carried out an extensive structure-activity relationship analysis of a series of 41 aryloxypropanolamine compounds based on three-dimensional quantitative structure-activity relationship (3D-QSAR) techniques. This is the first combined comparative molecular field analysis (CoMFA) and comparative molecular similarity index analysis (CoMSIA) study in a series of selective aryloxypropanolamines displaying anti-diabetes and anti-obesity pharmacological profiles. The best CoMFA and CoMSIA models presented r(2)ncv values of 0.993 and 0.984 and r(2)test values of 0.865 and 0.918, respectively. The results obtained were subjected to extensive external validation (q(2), r(2), r(2)m, etc.) and a final series of compounds was designed and their biological activity was predicted (best pEC50 = 8.561).
Feng, Taotao; Wang, Hai; Zhang, Xiaojin; Sun, Haopeng; You, Qidong
2014-06-01
Protein lysine methyltransferase G9a, which catalyzes methylation of lysine 9 of histone H3 (H3K9) and lysine 373 (K373) of p53, is overexpressed in human cancers. This suggests that small-molecule inhibitors of G9a might be attractive antitumor agents. Herein we report our efforts on the design of novel G9a inhibitors based on the 3D quantitative structure-activity relationship (3D-QSAR) analysis of a series of 2,4-diamino-7-aminoalkoxyquinazolines as G9a inhibitors. The 3D-QSAR model was generated from 47 compounds using docking-based molecular alignment. The best predictions were obtained with the CoMFA standard model (q2 = 0.700, r2 = 0.952) and the CoMSIA model combining steric, electrostatic, hydrophobic, hydrogen-bond donor and acceptor fields (q2 = 0.724, r2 = 0.960). The structural requirements of substituted 2,4-diamino-7-aminoalkoxyquinazolines for G9a inhibitory activity can be obtained by analysing the CoMSIA plots. Based on this information, six novel follow-up analogs were designed.
NASA Astrophysics Data System (ADS)
Aouidate, Adnane; Ghaleb, Adib; Ghamali, Mounir; Chtita, Samir; Choukrad, M'barek; Sbai, Abdelouahid; Bouachrine, Mohammed; Lakhlifi, Tahar
2017-07-01
A series of nineteen DHFR inhibitors was studied based on the combination of two computational techniques, namely three-dimensional quantitative structure-activity relationship (3D-QSAR) analysis and molecular docking. Comparative molecular field analysis (CoMFA) and comparative molecular similarity index analysis (CoMSIA) models were developed using 19 molecules with pIC50 values ranging from 9.244 to 5.839. The best CoMFA and CoMSIA models showed conventional determination coefficients R2 of 0.96 and 0.93 as well as leave-one-out cross-validated determination coefficients Q2 of 0.64 and 0.72, respectively. The predictive ability of these models was evaluated by external validation using a test set of five compounds, with predicted determination coefficients R2test of 0.92 and 0.94, respectively. The binding mode between this kind of compound and the DHFR enzyme, as well as the key amino acid residues, was explored by molecular docking simulation. Contour maps and molecular docking identified that the nature of the R1 and R2 substituents on the pyrazole moiety is an important feature for optimizing the binding affinity to the DHFR receptor. Given the good concordance between the CoMFA/CoMSIA contour maps and the docking results, the obtained information was used to design novel molecules.
QSAR and 3D QSAR of inhibitors of the epidermal growth factor receptor
NASA Astrophysics Data System (ADS)
Pinto-Bazurco, Mariano; Tsakovska, Ivanka; Pajeva, Ilza
This article reports quantitative structure-activity relationship (QSAR) and 3D QSAR models of 134 structurally diverse inhibitors of the epidermal growth factor receptor (EGFR) tyrosine kinase. Free-Wilson analysis was used to derive the QSAR model. It identified the substituents on the aniline ring, the polycyclic system, and the substituents at the 6- and 7-positions of the polycyclic system as the most important structural features. Comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA) were used in the 3D QSAR modeling. The steric and electrostatic interactions proved to be the most important for the inhibitory effect. Both the QSAR and 3D QSAR models led to consistent results. On the basis of the statistically significant models, new structures were proposed and their inhibitory activities were predicted.
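Free-Wilson analysis, as used above, is ordinary least squares on binary substituent indicators: each molecule is encoded by which substituent it carries at each position, and the fitted coefficients are additive group contributions to activity. The sketch below uses made-up substituents (R1=Cl, R1=OMe, R2=Me) and hypothetical pIC50 values, not the EGFR dataset.

```python
import numpy as np

# Hypothetical Free-Wilson matrix: columns are indicator variables
# [R1=Cl, R1=OMe, R2=Me]; R1=H and R2=H are the reference substituents.
X = np.array([
    [0, 0, 0],   # parent (R1=H, R2=H)
    [1, 0, 0],   # R1=Cl
    [0, 1, 0],   # R1=OMe
    [0, 0, 1],   # R2=Me
    [1, 0, 1],   # R1=Cl, R2=Me
    [0, 1, 1],   # R1=OMe, R2=Me
], dtype=float)
y = np.array([5.0, 5.8, 5.3, 5.4, 6.2, 5.7])   # hypothetical pIC50 values

# Intercept column = parent-compound contribution; solve by least squares.
A = np.hstack([np.ones((len(y), 1)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

for name, c in zip(["parent", "R1=Cl", "R1=OMe", "R2=Me"], coef):
    print(f"{name}: {c:+.2f}")
```

The largest positive coefficients flag the substituent positions that matter most, which is exactly the kind of conclusion the abstract draws for the aniline and 6-/7-position substituents.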
Liu, Jing; Li, Yan; Zhang, Shuwei; Xiao, Zhengtao; Ai, Chunzhi
2011-01-01
In recent years, great interest has been paid to the development of compounds with high selectivity for central dopamine (DA) D3 receptors, an interesting therapeutic target in the treatment of different neurological disorders. In the present work, based on a dataset of 110 collected benzazepine (BAZ) DA D3 antagonists with diverse kinds of structures, a variety of in silico modeling approaches, including comparative molecular field analysis (CoMFA), comparative molecular similarity indices analysis (CoMSIA), homology modeling, molecular docking and molecular dynamics (MD), were carried out to reveal the requisite 3D structural features for activity. Our results show that both the receptor-based (Q2 = 0.603, R2ncv = 0.829, R2pre = 0.690, SEE = 0.316, SEP = 0.406) and ligand-based 3D-QSAR models (Q2 = 0.506, R2ncv = 0.838, R2pre = 0.794, SEE = 0.316, SEP = 0.296) are reliable with proper predictive capacity. In addition, a combined analysis of the CoMFA and CoMSIA contour maps and MD results with a homology DA receptor model shows that: (1) ring-A, position-2 and the R3 substituent in ring-D are crucial in the design of antagonists with higher activity; (2) more bulky R1 substituents (at position-2 of ring-A) of antagonists may well fit in the binding pocket; (3) hydrophobicity, represented by MlogP, is important for building satisfactory QSAR models; (4) key amino acids of the binding pocket are CYS101, ILE105, LEU106, VAL151, PHE175, PHE184, PRO254 and ALA251. To the best of our knowledge, this work is the first report on 3D-QSAR modeling of the new fused BAZs as DA D3 antagonists. These results might provide information for a better understanding of the mechanism of antagonism and thus be helpful in designing new potent DA D3 antagonists.
Morales-Bayuelo, Alejandro; Ayazo, Hernan; Vivas-Reyes, Ricardo
2010-10-01
Comparative molecular similarity indices analysis (CoMSIA) and comparative molecular field analysis (CoMFA) were performed on a series of bicyclo[4.1.0]heptane derivatives as melanin-concentrating hormone receptor 1 antagonists (MCHR1 antagonists). Molecular superimposition of the antagonists on the template structure was performed by the database alignment method. A statistically significant model was established on sixty-five molecules and validated with a test set of ten molecules. The CoMSIA model yielded the best predictive model, with q(2) = 0.639, non-cross-validated R(2) of 0.953, F value of 92.802, bootstrapped R(2) of 0.971, standard error of prediction = 0.402, and standard error of estimate = 0.146, while the CoMFA model yielded q(2) = 0.680, non-cross-validated R(2) of 0.922, F value of 114.351, bootstrapped R(2) of 0.925, standard error of prediction = 0.364, and standard error of estimate = 0.180. CoMFA contour maps were employed to generate a pseudo-cavity for the LeapFrog calculation. The contour maps obtained from the 3D-QSAR studies were appraised for activity trends across the molecules analyzed. The results show the variability of the steric and electrostatic contributions that determine the activity of the MCHR1 antagonists. With these results we propose new antagonists that may be more potent than those previously reported; these novel antagonists were designed by adding highly electronegative groups to the di(i-C(3)H(7))N- substituent of the bicyclo[4.1.0]heptane scaffold, using the CoMFA model together with the LeapFrog technique for the molecular design. The data generated from the present study will further help to design novel, potent, and selective MCHR1 antagonists. Copyright (c) 2010 Elsevier Masson SAS. All rights reserved.
NASA Astrophysics Data System (ADS)
Lalit, Manisha; Gangwal, Rahul P.; Dhoke, Gaurao V.; Damre, Mangesh V.; Khandelwal, Kanchan; Sangamwar, Abhay T.
2013-10-01
A combined pharmacophore modelling, 3D-QSAR and molecular docking approach was employed to reveal structural and chemical features essential for the development of small molecules as LRH-1 agonists. The best HypoGen pharmacophore hypothesis (Hypo1) consists of one hydrogen-bond donor (HBD), two general hydrophobic (H), one hydrophobic aromatic (HYAr) and one hydrophobic aliphatic (HYA) feature. It exhibited a high correlation coefficient of 0.927, a cost difference of 85.178 bits and a low RMS value of 1.411. This pharmacophore hypothesis was cross-validated using a test set, a decoy set and the Cat-Scramble methodology. Subsequently, the validated pharmacophore hypothesis was used to screen small chemical databases. Further, 3D-QSAR models were developed based on the alignment obtained using substructure alignment. The best CoMFA and CoMSIA models exhibited excellent r(2)ncv values of 0.991 and 0.987, and r(2)cv values of 0.767 and 0.703, respectively. The r(2)pred values of 0.87 (CoMFA) and 0.78 (CoMSIA) showed that the predicted values were in good agreement with the experimental values. Molecular docking analysis reveals that π-π interaction with His390 and hydrogen-bond interaction with His390/Arg393 are essential for LRH-1 agonistic activity. The results from pharmacophore modelling, 3D-QSAR and molecular docking are complementary to each other and could serve as a powerful tool for the discovery of potent small molecules as LRH-1 agonists.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Xiaolin; Ye, Li; Wang, Xiaoxiang
2012-12-15
Several recent reports suggested that hydroxylated polybrominated diphenyl ethers (HO-PBDEs) may disturb thyroid hormone homeostasis. To illuminate the structural features underlying the thyroid hormone activity of HO-PBDEs and the binding mode between HO-PBDEs and the thyroid hormone receptor (TR), the hormone activity of a series of HO-PBDEs toward thyroid receptor β (TRβ) was studied using a combination of 3D-QSAR, molecular docking, and molecular dynamics (MD) methods. The ligand- and receptor-based 3D-QSAR models were obtained using the Comparative Molecular Similarity Index Analysis (CoMSIA) method. The optimum CoMSIA model with region focusing yielded satisfactory statistical results: the leave-one-out cross-validated correlation coefficient (q(2)) was 0.571 and the non-cross-validated correlation coefficient (r(2)) was 0.951. Furthermore, the results of internal validation such as bootstrapping, leave-many-out cross-validation, and progressive scrambling, as well as external validation, indicated the rationality and good predictive ability of the best model. In addition, molecular docking elucidated the conformations of the compounds and the key amino acid residues at the docking pocket, and MD simulation further characterized the binding process and validated the rationality of the docking results. Highlights: the thyroid hormone activities of HO-PBDEs were studied by 3D-QSAR; the binding modes between HO-PBDEs and TRβ were explored; 3D-QSAR, molecular docking, and molecular dynamics (MD) methods were performed.
NASA Astrophysics Data System (ADS)
Xie, Aihua; Odde, Srinivas; Prasanna, Sivaprakasam; Doerksen, Robert J.
2009-07-01
One of the most promising anticancer and recent antimalarial targets is the heterodimeric zinc-containing protein farnesyltransferase (FT). In this work, we studied a highly diverse series of 192 Abbott-initiated imidazole-containing compounds and their FT inhibitory activities using 3D-QSAR and docking, in order to gain understanding of the interaction of these inhibitors with FT to aid development of a rational strategy for further lead optimization. We report several highly significant and predictive CoMFA and CoMSIA models. The best model, composed of CoMFA steric and electrostatic fields combined with CoMSIA hydrophobic and H-bond acceptor fields, had r(2) = 0.878, q(2) = 0.630, and r(2)pred = 0.614. Docking studies on the statistical outliers revealed that some of them had a different binding mode in the FT active site based on steric bulk and available active-site space, explaining why the predicted activities differed from the experimental activities.
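The external-test-set statistic r(2)pred quoted here (and in several of the other abstracts) is conventionally defined as (SD - PRESS)/SD, where SD is the sum of squared deviations of the test-set activities from the training-set mean and PRESS is the sum of squared prediction errors. A small sketch with illustrative numbers only:

```python
import numpy as np

def r2_pred(y_test, y_hat, y_train_mean):
    """Predictive r^2 for an external test set: 1 - PRESS/SD."""
    y_test, y_hat = np.asarray(y_test, float), np.asarray(y_hat, float)
    sd = ((y_test - y_train_mean) ** 2).sum()      # spread around TRAINING mean
    press = ((y_test - y_hat) ** 2).sum()          # squared prediction errors
    return 1 - press / sd

# Illustrative activities (not from any study above):
print(round(r2_pred([6.1, 7.3, 5.2], [6.0, 7.0, 5.5], y_train_mean=6.0), 3))
```

Note the deviations are taken from the training-set mean, not the test-set mean, so a model that merely predicts the training average for everything scores r(2)pred = 0.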
Gan, Xiuhai; Hu, Deyu; Li, Pei; Wu, Jian; Chen, Xuewen; Xue, Wei; Song, Baoan
2016-03-01
1,4-Pentadien-3-one and 1,3,4-oxadiazole derivatives possess good antiviral activities, and their substructure units are often used in antiviral agent design. In order to discover novel molecules with high antiviral activities, a series of 1,4-pentadien-3-one derivatives containing the 1,3,4-oxadiazole moiety were designed and synthesised. Bioassays showed that most of the title compounds exhibited good inhibitory activities against tobacco mosaic virus (TMV) in vivo. Compound 8f, possessing the best protective activity against TMV, had an EC50 value of 135.56 mg L(-1), which was superior to that of ribavirin (435.99 mg L(-1)). Comparative molecular field analysis (CoMFA) and comparative molecular similarity index analysis (CoMSIA) techniques were used in three-dimensional quantitative structure-activity relationship (3D-QSAR) studies of the protective activities, giving q(2) values of 0.751 and 0.775 and r(2) values of 0.936 and 0.925 for the CoMFA and CoMSIA models, respectively. Compound 8k, with higher protective activity (EC50 = 123.53 mg L(-1)) according to the bioassay, was designed and synthesised on the basis of the 3D-QSAR models. Some of the title compounds displayed good antiviral activities. The 3D-QSAR models revealed that an appropriately compact electron-withdrawing and hydrophobic group at the benzene ring could enhance antiviral activity. These results could provide important structural insights for the design of highly active 1,4-pentadien-3-one derivatives. © 2015 Society of Chemical Industry.
Caballero, Julio; Fernández, Michael; Coll, Deysma
2010-12-01
Three-dimensional quantitative structure-activity relationship studies were carried out on a series of 28 organosulphur compounds as 15-lipoxygenase inhibitors using comparative molecular field analysis and comparative molecular similarity indices analysis. Quantitative information on structure-activity relationships is provided for further rational development and direction of selective synthesis. All models were built over a training set of 22 compounds. The best comparative molecular field analysis model included only the steric field and had a good Q² = 0.789. The comparative molecular similarity indices analysis results surpassed those of comparative molecular field analysis: the best model likewise included only the steric field and had a Q² = 0.894. In addition, this model adequately predicted the compounds in the test set. Furthermore, plots of the steric comparative molecular similarity indices analysis field allowed conclusions to be drawn for the choice of suitable inhibitors. In this sense, our model should prove useful in future 15-lipoxygenase inhibitor design studies. © 2010 John Wiley & Sons A/S.
NASA Astrophysics Data System (ADS)
Li, Peizhen; Tian, Yueli; Zhai, Honglin; Deng, Fangfang; Xie, Meihong; Zhang, Xiaoyun
2013-11-01
Non-purine derivatives have been shown to be promising novel drug candidates as xanthine oxidase inhibitors. Based on three-dimensional quantitative structure-activity relationship (3D-QSAR) methods, including comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA), two 3D-QSAR models for a series of non-purine xanthine oxidase (XO) inhibitors were established, and their reliability was supported by statistical parameters. By combining the 3D-QSAR models with the results of molecular docking between the inhibitors and XO, the main factors that influence inhibitor activity were investigated, and the obtained results could explain known experimental facts. Furthermore, several new potential inhibitors with higher predicted activity were designed based on our analyses and supported by molecular docking simulations. This study provides useful information for the development of non-purine xanthine oxidase inhibitors with novel structures.
NASA Astrophysics Data System (ADS)
Wang, Zhenya; Chang, Yiqun; Han, Yushui; Liu, Kangjia; Hou, Jinsong; Dai, Chengli; Zhai, Yuanhao; Guo, Jialiang; Sun, Pinghua; Lin, Jing; Chen, Weimin
2016-11-01
Mutation of isocitrate dehydrogenase 1 (IDH1), which is frequently found in certain cancers such as glioma, sarcoma and acute myeloid leukemia, has been proven to be a potent drug target for cancer therapy. In silico methodologies such as 3D-QSAR and molecular docking were applied to explore compounds with better mutant isocitrate dehydrogenase 1 (MIDH1) inhibitory activity, using a series of 40 newly reported 1-hydroxypyridin-2-one compounds as MIDH1 inhibitors. The satisfactory CoMFA and CoMSIA models obtained after internal and external cross-validation gave q2 values of 0.691 and 0.535 and r2 values of 0.984 and 0.936, respectively. The 3D contour maps generated from CoMFA and CoMSIA, along with the docking results, provided information about the structural requirements for better MIDH1 inhibitory activity. Based on the structure-activity relationship, 17 new potent molecules with better predicted activity than the most active compound in the literature were designed.
NASA Astrophysics Data System (ADS)
Lan, Ping; Xie, Mei-Qi; Yao, Yue-Mei; Chen, Wan-Na; Chen, Wei-Min
2010-12-01
Fructose-1,6-bisphosphatase has been regarded as a novel therapeutic target for the treatment of type 2 diabetes mellitus (T2DM). 3D-QSAR and docking studies were performed on a series of [5-(4-amino-1H-benzoimidazol-2-yl)-furan-2-yl]-phosphonic acid derivatives as fructose-1,6-bisphosphatase inhibitors. The CoMFA and CoMSIA models, built using thirty-seven molecules in the training set, gave r(2)cv values of 0.614 and 0.598 and r(2) values of 0.950 and 0.928, respectively. The external validation indicated that our CoMFA and CoMSIA models possessed high predictive power, with r(2)0 values of 0.994 and 0.994 and r(2)m values of 0.751 and 0.690, respectively. Molecular docking studies revealed that a phosphonic group is essential for binding to the receptor, and some other key features were also identified. A set of forty new analogues was designed by utilizing the results of the present study and was predicted to have significantly improved potencies in the developed models. These findings can aid the design of new fructose-1,6-bisphosphatase inhibitors with improved biological response.
Jagiello, Karolina; Grzonkowska, Monika; Swirog, Marta; ...
2016-08-29
In this contribution, the advantages and limitations of two computational techniques that can be used for investigating nanoparticle activity and toxicity are briefly summarized: classic nano-QSAR (Quantitative Structure-Activity Relationships employed for nanomaterials) and 3D nano-QSAR (three-dimensional Quantitative Structure-Activity Relationships, such as Comparative Molecular Field Analysis, CoMFA, and Comparative Molecular Similarity Indices Analysis, CoMSIA, employed for nanomaterials). Both approaches were compared according to selected criteria, including efficiency, type of experimental data, class of nanomaterials, time required for calculations, computational cost, and difficulties in interpretation. Taking into account the advantages and limitations of each method, we provide recommendations for nano-QSAR modellers and QSAR model users to determine a proper and efficient methodology for investigating the biological activity of nanoparticles, in order to describe the underlying interactions in the most reliable and useful manner.
Molecular Determinants of Juvenile Hormone Action as Revealed by 3D QSAR Analysis in Drosophila
Beňo, Milan; Farkaš, Robert
2009-01-01
Background: Postembryonic development, including metamorphosis, of many animals is under the control of hormones. In Drosophila and other insects these developmental transitions are regulated by the coordinate action of two principal hormones, the steroid ecdysone and the sesquiterpenoid juvenile hormone (JH). While the mode of ecdysone action is relatively well understood, the molecular mode of JH action remains elusive. Methodology/Principal Findings: To gain more insight into the molecular mechanism of JH action, we tested the biological activity of 86 structurally diverse JH agonists in Drosophila melanogaster. The results were evaluated using 3D QSAR analyses involving CoMFA and CoMSIA procedures. Using this approach we generated both computer-aided and species-specific pharmacophore fingerprints of JH and its agonists, which revealed that the most active compounds must possess an electronegative atom (oxygen or nitrogen) at both ends of the molecule. When either of these electronegative atoms is replaced by carbon, or the distance between them is shorter than 11.5 Å or longer than 13.5 Å, biological activity decreases dramatically. The presence of an electron-deficient moiety in the middle of the JH agonist is also essential for high activity. Conclusions/Significance: The information from 3D QSAR provides guidelines and mechanistic scope for identification of steric and electrostatic properties, as well as donor and acceptor hydrogen bonding, that are important features of the ligand-binding cavity of a JH target protein. To refine the pharmacophore analysis and evaluate the outcomes of the CoMFA and CoMSIA study, we used the pseudoreceptor modeling software PrGen to generate a putative binding-site surrogate composed of eight amino acid residues corresponding to the defined molecular interactions.
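The 11.5-13.5 Å window described above amounts to a simple geometric filter on the distance between the two terminal electronegative atoms. A minimal sketch, with made-up atom coordinates purely for illustration:

```python
import numpy as np

def passes_distance_filter(atom1_xyz, atom2_xyz, lo=11.5, hi=13.5):
    """True if the two (terminal O/N) atoms are lo..hi angstroms apart."""
    d = float(np.linalg.norm(np.asarray(atom1_xyz) - np.asarray(atom2_xyz)))
    return lo <= d <= hi

# Hypothetical coordinates in angstroms:
print(passes_distance_filter((0.0, 0.0, 0.0), (12.4, 0.0, 0.0)))  # True: inside the window
print(passes_distance_filter((0.0, 0.0, 0.0), (9.0, 0.0, 0.0)))   # False: too short
```

In practice such a check would be applied to each low-energy conformer of a candidate, since the O-O/O-N distance varies with conformation.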
NASA Astrophysics Data System (ADS)
Li, Wenlian; Si, Hongzong; Li, Yang; Ge, Cuizhu; Song, Fucheng; Ma, Xiuting; Duan, Yunbo; Zhai, Honglin
2016-08-01
Viral hepatitis C infection is one of the main causes of hepatitis after blood transfusion, and hepatitis C virus (HCV) infection is a global health threat. The HCV NS5B polymerase, an RNA-dependent RNA polymerase (RdRp) that plays an essential role in the replication of the virus, has no functional equivalent in mammalian cells, so the development of efficient NS5B polymerase inhibitors is an attractive strategy for antiviral therapy against HCV. A three-dimensional quantitative structure-activity relationship (QSAR) modeling study was carried out to better understand the structure-activity correlation of a series of indole-based inhibitors of the HCV NS5B polymerase. A comparative molecular similarity indices analysis (CoMSIA) model based on maximum-common-substructure alignment was developed. The optimum model exhibited statistically significant results: the cross-validated correlation coefficient q2 was 0.627 and the non-cross-validated r2 value was 0.943. In addition, internal validation by bootstrapping and Y-randomization, as well as external validation (external predictive correlation coefficient rext2 = 0.629), confirmed the rationality and good predictive ability of the model. The information obtained from the CoMSIA contour maps enables interpretation of the structure-activity relationship. Furthermore, a molecular docking study of the compounds against the protein target 3TYV revealed important interactions between active compounds and amino acids, and several new potential inhibitors with higher predicted activity were designed on the basis of our analyses and supported by molecular docking simulations. The OSIRIS Property Explorer was also used to help select the more promising compounds. These results may lay a reliable theoretical base for drug development of hepatitis C virus NS5B polymerase inhibitors.
DAT/SERT Selectivity of Flexible GBR 12909 Analogs Modeled Using 3D-QSAR Methods
Gilbert, Kathleen M.; Boos, Terrence L.; Dersch, Christina M.; Greiner, Elisabeth; Jacobson, Arthur E.; Lewis, David; Matecka, Dorota; Prisinzano, Thomas E.; Zhang, Ying; Rothman, Richard B.; Rice, Kenner C.; Venanzi, Carol A.
2007-01-01
The dopamine reuptake inhibitor GBR 12909 (1-{2-[bis(4-fluorophenyl)methoxy]ethyl}-4-(3-phenylpropyl)piperazine, 1) and its analogs have been developed as tools to test the hypothesis that selective dopamine transporter (DAT) inhibitors will be useful therapeutics for cocaine addiction. This 3D-QSAR study focuses on the effect of substitutions in the phenylpropyl region of 1. CoMFA and CoMSIA techniques were used to determine a predictive and stable model for the DAT/serotonin transporter (SERT) selectivity (represented by pKi (DAT/SERT)) of a set of flexible analogs of 1, most of which have eight rotatable bonds. In the absence of a rigid analog to use as a 3D-QSAR template, six conformational families of analogs were constructed from six pairs of piperazine and piperidine template conformers identified by hierarchical clustering as representative molecular conformations. Three models stable to y-value scrambling were identified after a comprehensive CoMFA and CoMSIA survey with Region Focusing. Test set correlation validation led to an acceptable model, with q2 = 0.508, standard error of prediction = 0.601, two components, r2 = 0.685, standard error of estimate = 0.481, F value = 39, percent steric contribution = 65, and percent electrostatic contribution = 35. A CoMFA contour map identified areas of the molecule that affect pKi (DAT/SERT). This work outlines a protocol for deriving a stable and predictive model of the biological activity of a set of very flexible molecules. PMID:17127069
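The conformer-selection step described above (hierarchical clustering of flexible analogs into representative conformational families) can be sketched as follows. The torsion-angle vectors here are synthetic stand-ins for real conformer geometries; actual work would cluster on heavy-atom RMSD:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Hypothetical stand-in: each conformer described by its eight
# rotatable-bond torsion angles (degrees), as for the GBR 12909 analogs.
rng = np.random.default_rng(1)
conformers = np.vstack([
    rng.normal(loc=60, scale=5, size=(5, 8)),    # family near gauche+
    rng.normal(loc=180, scale=5, size=(5, 8)),   # family near anti
])

# Average-linkage hierarchical clustering, then cut into 2 families;
# one representative per family would serve as a 3D-QSAR template.
Z = linkage(pdist(conformers), method="average")
labels = fcluster(Z, t=2, criterion="maxclust")
print(labels)  # two groups of five conformers each
```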
Mao, Yating; Li, Yan; Hao, Ming; Zhang, Shuwei; Ai, Chunzhi
2012-05-01
As a key component of combination therapy for acquired immunodeficiency syndrome (AIDS), non-nucleoside reverse transcriptase inhibitors (NNRTIs) have proven essential in stopping HIV-1 replication. In the present work, in silico studies were conducted on a series of 119 NNRTIs, including 1-(2-hydroxyethoxymethyl)-6-(phenylthio)thymine (HEPT) and dihydroalkoxybenzyloxopyrimidine (DABO) derivatives, using comparative molecular field analysis (CoMFA), comparative molecular similarity indices analysis (CoMSIA), docking simulations and molecular dynamics (MD). The statistical results of the optimal model, the ligand-based CoMSIA one (Q(2) = 0.48, R(ncv)(2) = 0.847, R(pre)(2) = 0.745), validate its satisfactory predictive capacity both internally and externally. The contour maps, docking and MD results correlate well with each other, supporting the following conclusions: 1) compounds with bulky substituents at position 6 of ring A and hydrophobic groups around positions 1, 2 and 6 favor biological activity; 2) two hydrogen bonds between the RT inhibitor and the Tyr318 and Lys101 residues, respectively, and a π-π interaction between the inhibitor and Trp188 are formed and are crucial to the orientation of the active conformation of the molecules; 3) the binding pocket, defined by residues such as Trp229, Tyr318, Val179, Tyr188 and Val108, is essentially hydrophobic, so hydrophobic substituents may improve biological activity; 4) DABO and HEPT derivatives have different structures but inhibit RT by a similar mechanism. The potency difference between two isomers in the HEPT series can be explained by the distinct locations of the 6-naphthylmethyl substituent, and the reasons are discussed in detail. All these results could be employed to alter the structural scaffold in order to develop new HIV-1 RT inhibitors with improved biological properties.
To the best of our knowledge, this is the first report on 3D-QSAR modeling of this series of HEPT and DABO NNRTIs. The QSAR models and the information derived from them, we hope, will be of great help in providing clear guidelines and accurate activity predictions for newly designed HIV-1 reverse transcriptase (RT) inhibitors.
Punkvang, Auradee; Hannongbua, Supa; Saparpakorn, Patchreenart; Pungpo, Pornpan
2016-05-01
The Mycobacterium tuberculosis protein kinase B (PknB) is critical for growth and survival of M. tuberculosis within the host. A series of aminopyrimidine derivatives shows impressive activity against PknB (IC50 < 0.5 μM); however, most of them show weak or no cellular activity against M. tuberculosis (MIC > 63 μM). Consequently, the key structural features related to activity against both PknB and M. tuberculosis need to be investigated. Here, two- and three-dimensional quantitative structure-activity relationship (2D and 3D QSAR) analyses combined with molecular dynamics (MD) simulations were employed to evaluate these key structural features of aminopyrimidine derivatives. Hologram quantitative structure-activity relationship (HQSAR) and CoMSIA models constructed from the IC50 and MIC values of the aminopyrimidine compounds could establish the structural requirements for better activity against both PknB and M. tuberculosis. The NH linker and the R1 substituent of the template compound are crucial not only for biological activity against PknB but also for biological activity against M. tuberculosis. Moreover, the results obtained from MD simulations show that these moieties are the key fragments for binding of aminopyrimidine compounds in PknB. The combination of QSAR analysis and MD simulations provides a structural concept that could guide future design of PknB inhibitors with improved potency against both the purified enzyme and whole M. tuberculosis cells.
Molecular design of new aggrecanase-2 inhibitors.
Shan, Zhi Jie; Zhai, Hong Lin; Huang, Xiao Yan; Li, Li Na; Zhang, Xiao Yun
2013-10-01
Aggrecanase-2 is a very important potential drug target for the treatment of osteoarthritis. In this study, a series of known aggrecanase-2 inhibitors was analyzed by three-dimensional quantitative structure-activity relationship (3D-QSAR) and molecular docking techniques. Two 3D-QSAR models, based on the comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA) methods, were established. Molecular docking was employed to explore the details of the interaction between the inhibitors and the aggrecanase-2 protein. On the basis of these models, several new potential inhibitors with higher predicted activity were designed and supported by molecular docking simulations. This work proposes a fast and effective approach to the design and prediction of new potential inhibitors, and the study of the interaction mechanism provides a better understanding of how the inhibitors bind to the target protein, which will be useful for structure-based drug design and modification. Copyright © 2013 Elsevier Ltd. All rights reserved.
Zhang, Meng-Qi; Zhang, Xiao-Le; Li, Yan; Fan, Wen-Jia; Wang, Yong-Hua; Hao, Ming; Zhang, Shu-Wei; Ai, Chun-Zhi
2011-01-01
mGluR2 is a G protein-coupled receptor targeted for diseases such as anxiety, depression, Parkinson's disease and schizophrenia. Herein, we report three-dimensional quantitative structure-activity relationship (3D-QSAR) studies of a series of 1,3-dihydrobenzo[b][1,4]diazepin-2-one derivatives as mGluR2 antagonists. Two series of models, using two different activities of the antagonists against rat mGluR2, which has been shown to be very similar to human mGluR2 (activity I: inhibition of [3H]-LY354740 binding; activity II: antagonism of (1S,3R)-ACPD inhibition of forskolin-stimulated cAMP), were derived from datasets of 137 and 69 molecules, respectively. For the activity I study, the best predictive model obtained from the CoMFA analysis yielded Q2 = 0.513, R2ncv = 0.868 and R2pred = 0.876, while the CoMSIA model yielded Q2 = 0.450, R2ncv = 0.899 and R2pred = 0.735. For the activity II study, the CoMFA model yielded Q2 = 0.500, R2ncv = 0.715 and R2pred = 0.723. These results demonstrate the high predictability of the models. Furthermore, a combined analysis of the CoMFA and CoMSIA contour maps shows that: (1) bulky substituents at R7, R3 and position A benefit activity I of the antagonists, but decrease it when projected at R8 and position B; (2) hydrophilic groups at positions A and B increase both antagonistic activities I and II; (3) the electrostatic field plays an essential role in the variance of activity II. In the search for more potent mGluR2 antagonists, two pharmacophore models were developed separately for the two activities. The first model comprises six pharmacophoric features, namely an aromatic center, two hydrophobic centers, an H-donor atom, an H-acceptor atom and an H-donor site. The second model shares all features of the first and has an additional acceptor site, a positive nitrogen and an aromatic center. These models can be used as guidance for the development of new mGluR2 antagonists with high activity and selectivity.
This work is the first report on 3D-QSAR modeling of these mGluR2 antagonists. The conclusions may lead to a better understanding of the mechanism of antagonism and be helpful in the design of new potent mGluR2 antagonists. PMID:22016641
Da, Chenxiao; Mooberry, Susan L.; Gupton, John T.; Kellogg, Glen E.
2013-01-01
αβ-tubulin colchicine site inhibitors (CSIs) from four scaffolds that we previously tested for antiproliferative activity were modeled to better understand their effect on microtubules. Docking models, constructed by exploiting the SAR of a pyrrole subset and HINT scoring, guided ensemble docking of all 59 compounds. This conformation set and two variants having progressively less structure knowledge were subjected to CoMFA, CoMFA+HINT, and CoMSIA 3D-QSAR analyses. The CoMFA+HINT model (docked alignment) showed the best statistics: leave-one-out q2 of 0.616, r2 of 0.949 and r2pred (internal test set) of 0.755. An external (tested in other laboratories) collection of 24 CSIs from eight scaffolds were evaluated with the 3D-QSAR models, which correctly ranked their activity trends in 7/8 scaffolds for CoMFA+HINT (8/8 for CoMFA). The combination of SAR, ensemble docking, hydropathic analysis and 3D-QSAR provides an atomic-scale colchicine site model more consistent with a target structure resolution much higher than the ~3.6 Å available for αβ-tubulin. PMID:23961916
Liu, Hong; Ji, Ming; Luo, Xiaomin; Shen, Jianhua; Huang, Xiaoqin; Hua, Weiyi; Jiang, Hualiang; Chen, Kaixian
2002-07-04
Class III antiarrhythmic agents selectively delay the effective refractory period (ERP) and increase the transmembrane action potential duration (APD). Using dofetilide (2) as a template of class III antiarrhythmic agents, we designed and synthesized 16 methylsulfonamido phenylethylamine analogues (4a-d and 5a-l). Pharmacological assay indicated that all of these compounds showed activity for increasing the ERP in isolated animal atrium; among them, the effective concentration of compound 4a is 1.6 x 10(-8) mol/L in increasing the ERP by 10 ms, slightly less potent than that of 2, 1.1 x 10(-8) mol/L. Compound 4a also produced a slightly lower change in ERP at 10(-5) M, DeltaERP% = 17.5% (DeltaERP% = 24.0% for dofetilide). On the basis of this bioassay result, these 16 compounds together with dofetilide were investigated by the three-dimensional quantitative structure-activity relationship (3D-QSAR) techniques of comparative molecular field analysis (CoMFA), comparative molecular similarity index analysis (CoMSIA), and hologram QSAR (HQSAR). The 3D-QSAR models were tested with another 11 compounds (4e-h and 5m-s) that we synthesized later. Results revealed that the CoMFA, CoMSIA, and HQSAR predicted activities for the 11 newly synthesized compounds correlate well with their experimental values, r(2) = 0.943, 0.891, and 0.809 for the three QSAR models, respectively. This indicates that the 3D-QSAR models have good predictive ability and can describe the steric, electrostatic, and hydrophobic requirements for the recognition forces of the receptor site. On the basis of these results, we designed and synthesized another eight new analogues of methanesulfonamido phenylethylamine (6a-h) according to the clues provided by the 3D-QSAR analyses. Pharmacological assay indicated that the effective concentrations of these newly designed compounds for delaying the ERP by 10 ms correlated well with the 3D-QSAR predicted values.
It is remarkable that the percent change in ERP at 10(-5) M for compound 6c is much higher than that of dofetilide; the effective concentration of compound 6c is 5.0 x 10(-8) mol/L in increasing the ERP by 10 ms, which is slightly lower than that of 2. The results showed that the 3D-QSAR models are reliable and can be extended to design new antiarrhythmic agents.
Zhou, Shengfu; Fang, Danqing; Tan, Shepei; Lin, Weicong; Wu, Wenjuan; Zheng, Kangcheng
2017-10-01
The P2Y12 receptor is an attractive target for anti-platelet therapies treating various thrombotic diseases. In this work, a total of 107 6-aminonicotinate-based compounds as potent P2Y12 antagonists were studied by a molecular modeling approach combining three-dimensional quantitative structure-activity relationship (3D-QSAR) analysis, molecular docking and molecular dynamics (MD) simulations, to explore the decisive binding conformations of these antagonists with P2Y12 and the structural features underlying the activity. The optimal CoMFA and CoMSIA models showed satisfactory robustness and good predictive ability, with R2 = 0.983, q2 = 0.805 and r2pred = 0.881 for the CoMFA model, and R2 = 0.935, q2 = 0.762 and r2pred = 0.690 for the CoMSIA model, respectively. The probable binding modes of the compounds and the key amino acid residues were revealed by molecular docking. MD simulations and MM/GBSA free-energy calculations were further performed to validate the rationality of the docking results and to compare the binding modes of several compound pairs with different activities, and the key residues for higher activity (Val102, Tyr105, Tyr109, His187, Val190, Asn191, Phe252, His253, Arg256, Tyr259, Thr260, Val279, and Lys280) were identified. The binding-energy decomposition indicated that hydrophobic and hydrogen-bond interactions play important roles in the binding of the compounds to P2Y12. We hope these results will be helpful in the design of potent and selective P2Y12 antagonists.
Liu, H; Ji, M; Jiang, H; Liu, L; Hua, W; Chen, K; Ji, R
2000-10-02
Class III antiarrhythmic agents selectively delay the effective refractory period (ERP) and increase the transmembrane action potential duration (APD). Based on our previous studies, a set of 17 methylsulfonamido phenylethylamine analogues was investigated by the 3D-QSAR techniques of CoMFA and CoMSIA. The 3D-QSAR models showed good predictive ability and could describe the steric, electrostatic and hydrophobic requirements for the recognition forces of the receptor site. According to the clues provided by this 3D-QSAR analysis, we designed and synthesized a series of new analogues of methanesulfonamido phenylethylamine (VIa-i). Pharmacological assay indicated that the effective concentrations of these new compounds for delaying the functional refractory period (FRP) by 10 ms correlate well with the 3D-QSAR predicted values. It is remarkable that the maximal percent change in FRP at micromolar concentration for compound VIc is much higher than that of dofetilide. The results showed that the 3D-QSAR models are reliable.
NASA Astrophysics Data System (ADS)
Cao, Shandong
2012-08-01
The purpose of the present study was to develop in silico models allowing reliable prediction of polo-like kinase inhibitors based on a large, diverse dataset of 136 compounds. Quantitative structure-activity relationship (QSAR) modeling was applied using comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA). The proposed QSAR models showed reasonable predictivity for thiophene analogs (Rcv2 = 0.533, Rpred2 = 0.845) and included four molecular descriptors, namely IC3, RDF075m, Mor02m and R4e+. The optimal model for imidazopyridine derivatives (Rcv2 = 0.776, Rpred2 = 0.876) showed good prediction accuracy using the GATS2m and BEHe1 descriptors. Analysis of the contour maps helped to identify the structural requirements for the inhibitors and served as a basis for the design of the next generation of inhibitor analogues. Docking studies were also employed to position the inhibitors into the polo-like kinase active site and determine the most probable binding mode. These studies may help in understanding the factors influencing the binding affinity of these chemicals and in developing alternative methods for prescreening and designing polo-like kinase inhibitors.
NASA Astrophysics Data System (ADS)
Mella, Jaime; Villegas, Francisco; Morales-Verdejo, César; Lagos, Carlos F.; Recabarren-Gajardo, Gonzalo
2017-07-01
We recently reported a series of 39 weakly basic N-arylsulfonylindoles as novel 5-HT6 antagonists. Eight of the compounds exhibited moderate to high binding affinities, with 2-(4-(2-Methoxyphenyl)piperazin-1-yl)-1-(1-tosyl-1H-indol-3-yl)ethanol 16 showing the highest binding affinity (pKi = 7.87). Given these encouraging results and as a continuation of our research, we performed an extensive step-by-step search for the best 3D-QSAR model that allows us to rationally propose novel molecules with improved 5-HT6 affinity based on our previously reported series. A comparative molecular similarity indices analysis (CoMSIA) model built on a docking-based alignment was developed, wherein steric, electrostatic, hydrophobic and hydrogen bond properties are correlated with biological activity. The model was validated internally and externally (q2 = 0.721; r2pred = 0.938), and identified the sulfonyl and hydroxyl groups and the piperazine ring among the main regions of the molecules that can be modified to create new 5-HT6 antagonists.
Ballu, Srilata; Itteboina, Ramesh; Sivan, Sree Kanth; Manga, Vijjulatha
2018-02-01
Filamentous temperature-sensitive protein Z (FtsZ) is a protein encoded by the ftsZ gene that assembles into a Z-ring at the future site of the septum during bacterial cell division. Structurally, FtsZ is a homolog of eukaryotic tubulin but has low sequence similarity; this makes it possible to obtain FtsZ inhibitors that do not affect eukaryotic cell division. Computational studies were performed on a series of substituted 3-arylalkoxybenzamide derivatives reported as inhibitors of FtsZ activity in Staphylococcus aureus. The quantitative structure-activity relationship (QSAR) models generated showed good statistical reliability, evident from the r2ncv and r2loo values, and acceptable predictive correlation (r2pred) values were obtained. Finally, we performed molecular dynamics simulations to examine the stability of the protein-ligand interactions, which enabled us to compare the free binding energies of the cocrystal ligand and the newly designed molecule B1. The good concordance between the docking results and the comparative molecular field analysis (CoMFA)/comparative molecular similarity indices analysis (CoMSIA) contour maps afforded useful clues for the rational modification of molecules to design more potent FtsZ inhibitors.
Quantitative structure activity relationship studies of mushroom tyrosinase inhibitors
NASA Astrophysics Data System (ADS)
Xue, Chao-Bin; Luo, Wan-Chun; Ding, Qi; Liu, Shou-Zhu; Gao, Xing-Xiang
2008-05-01
Here, we report our results from quantitative structure-activity relationship studies on tyrosinase inhibitors. Tyrosinase, also known as phenoloxidase, is a key enzyme in animals, plants and insects that catalyzes the hydroxylation of tyrosine into o-diphenols and the oxidation of o-diphenols into o-quinones. In the present study, the bioactivities of 48 derivatives of benzaldehyde, benzoic acid, and cinnamic acid compounds were used to construct three-dimensional quantitative structure-activity relationship (3D-QSAR) models using comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA). After superimposition using common-substructure-based alignments, robust and predictive 3D-QSAR models were obtained from CoMFA (q2 = 0.855, r2 = 0.978) and CoMSIA (q2 = 0.841, r2 = 0.946), with 6 optimum components. Chemical descriptors, including electronic (Hammett σ), hydrophobic (π), and steric (MR) parameters, a hydrogen-bond acceptor term (H-acc), and an indicator variable (I), were used to construct a 2D-QSAR model. The results of this QSAR indicated that π, MR, and H-acc account for 34.9, 31.6, and 26.7% of the calculated biological variance, respectively. The molecular interactions between ligand and target were studied using a flexible docking method (FlexX). The best-scored candidates were docked flexibly, and the interaction between the benzoic acid derivatives and the tyrosinase active site was elucidated in detail. These studies indicated that one possible mechanism for this interaction is the formation of a hydrogen bond between the hydroxyl group and the carbonyl oxygen atom of Tyr98, which stabilizes the position of Tyr98 and prevents it from participating in the interaction between tyrosinase and ORF378.
We believe that the QSAR models built here provide important information necessary for the design of novel tyrosinase inhibitors.
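The per-descriptor variance contributions reported above can be approximated from standardized regression coefficients. This is a simplified sketch with synthetic data standing in for the (π, MR, H-acc) descriptor columns, not the authors' exact procedure:

```python
import numpy as np

def descriptor_contributions(X, y):
    """Percent contribution of each descriptor to the explained variance,
    estimated from standardized multiple-linear-regression coefficients
    (a common, simplified way to read a 2D-QSAR equation)."""
    Xs = (X - X.mean(0)) / X.std(0)   # z-score each descriptor column
    ys = (y - y.mean()) / y.std()     # z-score the activities
    beta, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
    weights = np.abs(beta)
    return 100.0 * weights / weights.sum()

# Synthetic stand-in for 48 compounds with three descriptor columns:
rng = np.random.default_rng(2)
X = rng.normal(size=(48, 3))
y = 0.35 * X[:, 0] + 0.30 * X[:, 1] + 0.25 * X[:, 2] + rng.normal(scale=0.1, size=48)
print(descriptor_contributions(X, y).round(1))  # sums to 100%
```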
Pagadala, Nataraj S; Perez-Pineiro, Rolando; Wishart, David S; Tuszynski, Jack A
2015-02-16
To understand the pharmacophore properties of 2-aminothiazoles and design novel inhibitors against the prion protein, a highly predictive 3D quantitative structure-activity relationship (QSAR) model was developed by performing comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA). Both CoMFA and CoMSIA maps reveal that the presence of the methoxy groups at the meta and para positions on the phenyl ring of compound 17 (N-[4-(3,4-dimethoxyphenyl)-1,3-thiazol-2-yl]quinolin-2-amine) is necessary for activity, while the electronegative nitrogen of the quinoline is highly favorable for enhancing activity. Blind docking results for these compounds show that the compound bearing quinoline binds with higher affinity than those with isoquinoline or naphthalene groups. Out of 150 novel compounds retrieved by fingerprint analysis using a pharmacophore model built from five test sets of compounds, five compounds with diverse scaffolds were selected for biological evaluation as possible PrP inhibitors. Molecular docking combined with fluorescence quenching studies shows that these compounds bind to pocket-D of SHaPrP near Trp145. The new antiprion compounds 3 and 6, which bind with interaction energies of -12.1 and -13.2 kcal/mol, respectively, show fluorescence quenching with binding constant (Kd) values of 15.5 and 44.14 μM, respectively. Further fluorescence binding assays with compound 5, a 2-aminothiazole analog used as a positive control, also show that the molecule binds to pocket-D with a binding constant (Kd) of 84.7 μM. Finally, both molecular docking and a fluorescence binding assay of noscapine as a negative control reveal the same binding site on the surface of pocket-A, near a rigid loop between β2 and α2, interacting with Arg164.
This high level of correlation between the molecular docking and fluorescence quenching studies confirms that these five compounds are likely to act as inhibitors of prion propagation, while noscapine might act as an accelerator of the conversion from PrP(C) to PrP(Sc). Copyright © 2014 Elsevier Masson SAS. All rights reserved.
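The binding constants above come from fitting fluorescence quenching data. A minimal sketch of such a fit with a single-site binding isotherm follows; the data points are synthetic, generated around the reported Kd of 15.5 μM, and the concentration grid is hypothetical:

```python
import numpy as np
from scipy.optimize import curve_fit

def quench(conc, dF_max, kd):
    """Single-site binding isotherm: fractional fluorescence quench
    as a function of ligand concentration (same units as kd)."""
    return dF_max * conc / (kd + conc)

# Synthetic quenching data generated with Kd = 15.5 uM (the value
# reported for compound 3), plus a little measurement noise:
rng = np.random.default_rng(3)
conc = np.array([2.0, 5.0, 10.0, 20.0, 40.0, 80.0, 160.0])  # uM
obs = quench(conc, 0.9, 15.5) + rng.normal(scale=0.01, size=conc.size)

(dF_max, kd), _ = curve_fit(quench, conc, obs, p0=(1.0, 10.0))
print(round(kd, 1))  # recovered Kd close to the input 15.5 uM
```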
Wang, Jinghui; Yang, Yinfeng; Li, Yan; Wang, Yonghua
2016-07-27
Bovine viral diarrhea virus (BVDV) infections are prevalent in cattle populations on a worldwide scale. The BVDV RNA-dependent RNA polymerase (RdRp), a promising target for new anti-BVDV drug development, has attracted increasing attention. To explore the interaction mechanism of 65 benzimidazole scaffold-based derivatives as BVDV inhibitors, a computational study was performed based on a combination of 3D-QSAR, molecular docking, and molecular dynamics (MD) simulations. The resultant optimum CoMFA and CoMSIA models show proper reliability and strong predictive ability (with Q(2) = 0.64, R(2)ncv = 0.93, R(2)pred = 0.80 and Q(2) = 0.65, R(2)ncv = 0.98, R(2)pred = 0.86, respectively). In addition, there was good concordance between these models and the molecular docking and MD results. Moreover, MM-PBSA energy analysis reveals that the major driving force for ligand binding is the polar solvation contribution term. Hopefully, these models and findings will offer a better understanding of the interaction mechanism of BVDV inhibitors and benefit the discovery of more potent BVDV inhibitors.
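The external R(2)pred statistics quoted here follow the standard definition 1 - PRESS/SD, where SD is measured against the training-set mean activity. It is straightforward to compute directly; the pIC50 values below are hypothetical, purely to illustrate the formula:

```python
import numpy as np

def r2_pred(y_test, y_pred, y_train_mean):
    """External predictive r^2: 1 - PRESS / SD, where SD is the sum of
    squared deviations of the test activities from the training-set mean."""
    press = float(((y_test - y_pred) ** 2).sum())
    sd = float(((y_test - y_train_mean) ** 2).sum())
    return 1.0 - press / sd

# Hypothetical pIC50 values for a small external test set:
y_test = np.array([6.2, 7.1, 5.8, 6.9, 7.4])
y_pred = np.array([6.0, 7.3, 5.9, 6.6, 7.2])
print(round(r2_pred(y_test, y_pred, y_train_mean=6.5), 3))  # → 0.885
```

Note that a model can have a high non-cross-validated r2 yet a poor r2pred; this is why the abstracts report both internal (q2) and external (R2pred) statistics.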
Yan, Yulian; Li, Yan; Zhang, Shuwei; Ai, Chunzhi
2011-02-01
The development of injectable integrin α(v)β(3)/α(IIb)β(3) dual antagonists has attracted much research attention for treating acute ischemic diseases in recent years. In this work, based on a dataset of 102 tricyclic piperazine/piperidine-furnished dual α(v)β(3) and α(IIb)β(3) antagonists, a variety of in silico modeling approaches, including comparative molecular field analysis (CoMFA), comparative molecular similarity indices analysis (CoMSIA), and molecular docking, were applied to reveal the requisite 3D structural features impacting the biological activities. Our statistical results show that the ligand-based 3D-QSAR models for both the α(v)β(3) and α(IIb)β(3) studies exhibited satisfactory internal and external predictability: for the CoMFA models, Q(2)=0.48, R(ncv)(2)=0.87, R(pred)(2)=0.71 for α(v)β(3) and Q(2)=0.50, R(ncv)(2)=0.85, R(pred)(2)=0.72 for α(IIb)β(3) were obtained, and for the CoMSIA ones, Q(2)=0.55, R(ncv)(2)=0.90, R(pred)(2)=0.72 for α(v)β(3) and Q(2)=0.52, R(ncv)(2)=0.88, R(pred)(2)=0.74 for α(IIb)β(3) were achieved, respectively. In addition, a comparison between the 3D-QSAR contour maps and docking results reveals that the most crucial interactions between the tricyclic piperazine/piperidine derivatives and the α(v)β(3)/α(IIb)β(3) receptor ligand-binding pockets are H-bonds, and that the key amino acids impacting the interactions are Arg214, Asn215, Ser123, and Lys253 for α(v)β(3), but Arg214, Asn215, Ser123 and Tyr190 for α(IIb)β(3). Halogen-containing groups at positions 15 and 16, a benzenesulfonamide substituent at position 23, and replacement of the piperazine of ring B with 4-aminopiperidine may increase α(v)β(3)/α(IIb)β(3) antagonistic activity. The potencies of antagonists against isolated α(v)β(3) and α(IIb)β(3) are linearly correlated, indicating that similar interaction mechanisms may exist for this series of molecules.
To the best of our knowledge, this is the first report on 3D-QSAR modeling of these dual α(v)β(3)/α(IIb)β(3) antagonists. The results obtained should provide information for a better understanding of the mechanism of antagonism and thus be helpful in the design of novel potent dual α(v)β(3)/α(IIb)β(3) antagonists. Copyright © 2011 Elsevier Inc. All rights reserved.
Itteboina, Ramesh; Ballu, Srilata; Sivan, Sree Kanth; Manga, Vijjulatha
2017-10-01
Janus kinase 1 (JAK1) belongs to the JAK family of intracellular nonreceptor tyrosine kinases. The JAK-signal transducer and activator of transcription (JAK-STAT) pathway mediates signaling by cytokines, which control survival, proliferation and differentiation of a variety of cells. Three-dimensional quantitative structure-activity relationship (3D-QSAR), molecular docking and molecular dynamics (MD) methods were applied to a dataset of JAK1 inhibitors. Ligands were constructed and docked into the active site of the protein using GLIDE 5.6. The best docked poses were selected after analysis for further 3D-QSAR analysis using the comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA) methodologies. Employing 60 molecules in the training set, 3D-QSAR models were generated that showed good statistical reliability, as is clearly observed in the r2ncv and q2loo values. The predictive ability of these models was determined using a test set of 25 molecules, which gave acceptable predictive correlation (r2pred) values. The key amino acid residues were identified by means of molecular docking, and the stability and rationality of the derived molecular conformations were validated by MD simulation. The good consonance between the docking results and the CoMFA/CoMSIA contour maps provides helpful clues for the reasonable modification of molecules in order to design more efficient JAK1 inhibitors. The developed models are expected to provide some direction for the further synthesis of highly effective JAK1 inhibitors.
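The leave-one-out cross-validated q2 (q2loo) reported throughout these abstracts is defined as q2 = 1 - PRESS/SS, where PRESS is the sum of squared errors for each compound predicted by a model trained without it. A minimal sketch of the computation, using an ordinary least-squares model on synthetic data as a stand-in for the PLS models actually used in CoMFA/CoMSIA (all data and names here are illustrative, not from any cited study):

```python
import numpy as np

# Hypothetical descriptor matrix X (compounds x descriptors) and activity
# vector y; real 3D-QSAR studies use thousands of CoMFA/CoMSIA field values.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
y = X @ np.array([1.5, -0.7, 0.3]) + rng.normal(scale=0.1, size=20)

def q2_loo(X, y):
    """Leave-one-out cross-validated q2 = 1 - PRESS / SS_total."""
    n = len(y)
    press = 0.0
    for i in range(n):
        mask = np.arange(n) != i                     # leave compound i out
        coef, *_ = np.linalg.lstsq(X[mask], y[mask], rcond=None)
        press += (y[i] - X[i] @ coef) ** 2           # error on held-out compound
    ss_total = np.sum((y - y.mean()) ** 2)
    return 1.0 - press / ss_total

print(round(q2_loo(X, y), 3))
```

A q2 above 0.5 is the conventional threshold for internal predictivity cited in several of the abstracts above.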
Dong, Lili; Feng, Ruirui; Bi, Jiawei; Shen, Shengqiang; Lu, Huizhe; Zhang, Jianjun
2018-03-06
Human sodium-dependent glucose co-transporter 2 (hSGLT2) is a crucial therapeutic target in the treatment of type 2 diabetes. In this study, both comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA) were applied to generate three-dimensional quantitative structure-activity relationship (3D-QSAR) models. In the most accurate CoMFA-based and CoMSIA-based QSAR models, the cross-validated coefficients (r2cv) were 0.646 and 0.577, respectively, while the non-cross-validated coefficients (r2) were 0.997 and 0.991, respectively, indicating that both models were reliable. In addition, we constructed a homology model of hSGLT2 in the absence of a crystal structure. Molecular docking was performed to explore the binding mode of inhibitors in the active site of hSGLT2. Molecular dynamics (MD) simulations and binding free energy calculations using MM-PBSA and MM-GBSA were carried out to further elucidate the interaction mechanism. With regard to binding affinity, we found that the hydrogen-bond interactions of Asn51 and Glu75, located in the active site of hSGLT2, with compound 40 were critical. Hydrophobic and electrostatic interactions were shown to enhance activity, in agreement with the results obtained from the docking and 3D-QSAR analyses. Our study sheds light on the interaction mode between inhibitors and hSGLT2 and may aid in the development of C-aryl glucoside SGLT2 inhibitors.
Tong, Lidan; Guo, Lixin; Lv, Xiaojun; Li, Yu
2017-01-01
Three-dimensional quantitative structure-activity relationship (3D-QSAR) models were established by comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA). Experimental toxicity data in Poecilia reticulata (pLC50) and physico-chemical properties for 12 polychlorinated phenols were used as dependent and independent variables, respectively. Of the 12 polychlorinated phenols, nine were randomly selected as a training set to construct the 3D-QSAR models in the SYBYL-X software, and the other three were used as a test set to evaluate the models by predicting their pLC50 values (the training and test sets were arranged randomly and shuffled 60 times). Pentachlorophenol (PCP), the most toxic of the polychlorinated phenols used in this experiment, was selected as an example for modification using contour maps produced by the established 3D-QSAR models. The aim was to decrease its toxicity and bioconcentration, increase its biodegradability, and maintain or improve its effectiveness as a pesticide. The 3D-QSAR models were robust and had good predictive abilities, with cross-validation correlation coefficients (q2) of 0.858-0.992 (>0.5), correlation coefficients (r2) of 0.966-1.000 (>0.9), and standard errors of prediction (SEP) of 0.004-0.159. CoMFA showed that the toxicity of the polychlorinated phenols arose mainly from electrostatic (42.7-66.7%) and steric (33.3-57.3%) contributions. By comparison, CoMSIA showed that the toxicity was dominated by electrostatic (57.5-76.9%) and hydrophobic (19.8-25.7%) contributions, with lesser contributions from the steric (0.7-1.0%), hydrogen bond donor (0.1-20.3%), and hydrogen bond acceptor (0-0.9%) fields. 3D-QSAR electrostatic contour maps were used to modify PCP and design 11 new compounds with lower toxicity.
The effectiveness of each of these molecules as a pesticide was verified using a 3D-QSAR model for polychlorinated phenol toxicity against Tetrahymena pyriformis. Four of these compounds, with -Br, -I, -OH and -NH2 groups in place of the chlorine at the 3-position of PCP, were all at least as effective as PCP against T. pyriformis. The first-order rate constants (Kb) of these four compounds were predicted using a 3D-QSAR model for polychlorinated phenol degradation, which showed that they were more biodegradable than PCP. Furthermore, a 3D-QSAR model for polychlorinated phenol bioconcentration in fish (Poecilia reticulata, Oncorhynchus mykiss, Pimephales promelas and Oryzias latipes) showed no significant difference between the bioconcentration factors of the four new compounds and that of PCP. It is hoped that these results will provide a new route for lowering the POPs characteristics of the polychlorinated phenol homologues and derivatives in use. Copyright © 2016 Elsevier Inc. All rights reserved.
3D-QSAR studies on 1,2,4-triazolyl 5-azaspiro[2.4]heptanes as D3R antagonists
NASA Astrophysics Data System (ADS)
Zhang, Xin; Zhang, Hui
2018-07-01
The dopamine D3 receptor has become an attractive target in the treatment of drug abuse. 3D-QSAR studies were performed on a novel series of D3 receptor antagonists, 1,2,4-triazolyl 5-azaspiro[2.4]heptanes, using CoMFA and CoMSIA methods. Two predictive 3D-QSAR models were generated for the modified design of D3R antagonists. Based on the steric, electrostatic, hydrophobic and hydrogen-bond acceptor information of the contour maps, key structural factors affecting the bioactivity were explored. This work gives helpful suggestions for the design of novel D3R antagonists with increased activities.
Chemical Structure-Biological Activity Models for Pharmacophores’ 3D-Interactions
Putz, Mihai V.; Duda-Seiman, Corina; Duda-Seiman, Daniel; Putz, Ana-Maria; Alexandrescu, Iulia; Mernea, Maria; Avram, Speranta
2016-01-01
Within medicinal chemistry nowadays, so-called pharmacodynamics seeks qualitative (for understanding) and quantitative (for predicting) mechanisms/models by which a given chemical structure or series of congeners acts on biological sites, either by focused interaction/therapy or by diffuse/hazardous influence. To this aim, the present review exposes three fertile directions in approaching biological activity through chemical structural causes: the special computing trace of the algebraic structure-activity relationship (SPECTRAL-SAR), offering the full analytical counterpart for multivariate computational regression; the minimal topological difference (MTD) as a revived precursor; and comparative molecular field analysis (CoMFA) with comparative molecular similarity indices analysis (CoMSIA). All of these methods and algorithms are presented, discussed and exemplified on relevant medicinal chemical systems: proton pump inhibitors of the 4-indolyl,2-guanidinothiazole class of derivatives blocking acid secretion from parietal cells in the stomach, the antiviral activity of 1-[(2-hydroxyethoxy)methyl]-6-(phenylthio)thymine congeners (HEPT ligands) against Human Immunodeficiency Virus type 1 (HIV-1), and new pharmacophores for treating severe genetic disorders (like depression and psychosis), respectively, all involving 3D pharmacophore interactions. PMID:27399692
Zhang, Shuang; Xue, Xiwen; Zhang, Liangren; Zhang, Lihe; Liu, Zhenming
2015-12-01
In the past decade, the discovery, synthesis, and evaluation of hundreds of CD38 covalent and non-covalent inhibitors have been reported by our group and partners; however, systematic structure-based guidance for the rational design of CD38 inhibitors is still lacking. Here, we carried out a comparative analysis of pharmacophore features and quantitative structure-activity relationships for CD38 inhibitors. The results reveal that the essential interactions between key residues and covalent/non-covalent CD38 inhibitors include (i) hydrogen-bond and hydrophobic interactions with residues Glu226 and Trp125, (ii) electrostatic or hydrogen-bond interaction with the positively charged Arg127 region, and (iii) hydrophobic interaction with residue Trp189. For covalent inhibitors, besides the covalent bond with residue Glu226, the electrostatic interaction with residue Arg127 is also necessary, while further hydrogen-bonded/non-bonded interactions with residues Trp125 and Trp189 can also be detected. By means of the SYBYL multifit alignment function, the best CoMFA and CoMSIA models for the CD38 covalent inhibitors presented cross-validated correlation coefficients (q(2)) of 0.564 and 0.571, and non-cross-validated values (r(2)) of 0.967 and 0.971, respectively. The CD38 non-covalent inhibitors can be classified into five groups according to their chemical scaffolds; the residues Glu226, Trp189, and Trp125 are indispensable for these non-covalent inhibitors binding to CD38, while the residues Ser126, Arg127, Asp155, Thr221, and Phe222 are also important. The best CoMFA and CoMSIA models for the F12 analogues presented cross-validated correlation coefficients (q(2)) of 0.469 and 0.454, and non-cross-validated values (r(2)) of 0.814 and 0.819, respectively. © 2015 John Wiley & Sons A/S.
Application of 3D-QSAR in the rational design of receptor ligands and enzyme inhibitors.
Mor, Marco; Rivara, Silvia; Lodola, Alessio; Lorenzi, Simone; Bordi, Fabrizio; Plazzi, Pier Vincenzo; Spadoni, Gilberto; Bedini, Annalida; Duranti, Andrea; Tontini, Andrea; Tarzia, Giorgio
2005-11-01
Quantitative structure-activity relationships (QSARs) are frequently employed in medicinal chemistry projects, both to rationalize structure-activity relationships (SARs) for known series of compounds and to help in the design of innovative structures endowed with desired pharmacological actions. Unlike the so-called structure-based drug design tools, they do not require knowledge of the biological target structure but are based on the comparison of drug structural features, and are thus defined as ligand-based drug design tools. In the 3D-QSAR approach, structural descriptors are calculated from molecular models of the ligands as interaction fields within a three-dimensional (3D) lattice of points surrounding the ligand structure. These descriptors are collected in a large X matrix, which is submitted to multivariate analysis to look for correlations with biological activity. As with other QSARs, the reliability and usefulness of the correlation models depend on the validity of the assumptions and on the quality of the data. A careful selection of compounds and pharmacological data can improve the application of 3D-QSAR analysis in drug design. Some examples of the application of the CoMFA and CoMSIA approaches to the SAR study and design of receptor or enzyme ligands are described, with attention to the fields of melatonin receptor ligands and FAAH inhibitors.
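The multivariate analysis of the large X matrix described above is conventionally performed with partial least squares (PLS). A minimal NIPALS PLS1 sketch in plain NumPy, run on synthetic data, can illustrate the idea; this is not the SYBYL implementation, and all data and variable names are illustrative:

```python
import numpy as np

def pls1(X, y, n_components):
    """Minimal NIPALS PLS1: regress activity y on a field-descriptor
    matrix X. Returns the coefficient vector b in the centered X space.
    Sketch only; production 3D-QSAR uses dedicated packages."""
    X = X - X.mean(axis=0)
    y = y - y.mean()
    W, P, Q = [], [], []
    for _ in range(n_components):
        w = X.T @ y
        w /= np.linalg.norm(w)          # weight vector
        t = X @ w                       # score vector
        tt = t @ t
        p = X.T @ t / tt                # X loading
        q = (y @ t) / tt                # y loading
        X = X - np.outer(t, p)          # deflate X
        y = y - q * t                   # deflate y
        W.append(w); P.append(p); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    # coefficients in the original (centered) descriptor space
    return W @ np.linalg.solve(P.T @ W, Q)

# Synthetic demo: 30 "compounds", 10 "field descriptors", 2 true factors
rng = np.random.default_rng(1)
X = rng.normal(size=(30, 10))
y = X[:, 0] - 2 * X[:, 1] + rng.normal(scale=0.05, size=30)
b = pls1(X, y, n_components=3)
y_hat = (X - X.mean(axis=0)) @ b + y.mean()
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
print(round(r2, 3))
```

PLS is preferred over ordinary regression here because the X matrix has far more (highly collinear) field descriptors than compounds, which ordinary least squares cannot handle.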
NASA Astrophysics Data System (ADS)
Assefa, Haregewein; Kamath, Shantaram; Buolamwini, John K.
2003-08-01
The overexpression and/or mutation of the epidermal growth factor receptor (EGFR) tyrosine kinase has been observed in many human solid tumors and is under intense investigation as a novel anticancer molecular target. Comparative 3D-QSAR analyses using different alignments were undertaken employing comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA) for 122 anilinoquinazoline and 50 anilinoquinoline inhibitors of EGFR kinase. The SYBYL multifit alignment rule was applied to three different conformational templates, two obtained from a MacroModel Monte Carlo conformational search and one from the bound conformation of erlotinib in complex with EGFR in the X-ray crystal structure. In addition, a flexible ligand docking alignment obtained with the GOLD docking program and a novel flexible receptor-guided consensus dynamics alignment obtained with the DISCOVER program in the INSIGHTII modeling package were also investigated. 3D-QSAR models with q2 values up to 0.70 and r2 values up to 0.97 were obtained. Among the 4-anilinoquinazoline set, the q2 values were similar, but the ability of the different conformational models to predict the activities of an external test set varied considerably. In this regard, the model derived using the X-ray crystallographically determined bioactive conformation of erlotinib afforded the best predictions. Electrostatic, hydrophobic and H-bond donor descriptors contributed the most to the QSAR models of the 4-anilinoquinazolines, whereas electrostatic, hydrophobic and H-bond acceptor descriptors, particularly the H-bond acceptor descriptor, contributed the most to the 4-anilinoquinoline QSAR. A novel receptor-guided consensus dynamics alignment has thus been introduced for 3D-QSAR studies. This new alignment method may incorporate, to some extent, ligand-receptor induced-fit effects into 3D-QSAR models.
Zong, Guanghui; Yan, Xiaojing; Bi, Jiawei; Jiang, Rui; Qin, Yinan; Yuan, Huizhu; Lu, Huizhe; Dong, Yanhong; Jin, Shuhui; Zhang, Jianjun
2017-01-01
1,3,4-Thiadiazole and sugar-derived molecules have proven to be promising agrochemicals with growth-promoting, insecticidal and fungicidal activities. In the field of agricultural fungicides, applying the union of active groups, we synthesized a new set of 1,3,4-thiadiazole xylofuranose derivatives, and all of the compounds were characterized by 1H NMR and HRMS. In precise toxicity measurements, some of the compounds exhibited more potent fungicidal activities than the most widely used commercial fungicide chlorothalonil, prompting further research and development. Based on our experimental data, a 3D-QSAR (three-dimensional quantitative structure-activity relationship) model was established and investigated using comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA) techniques, helping to better understand the structural requirements of lead compounds with high fungicidal activity and environmental compatibility. PMID:28746366
Kapou, Agnes; Benetis, Nikolas P; Avlonitis, Nikos; Calogeropoulou, Theodora; Koufaki, Maria; Scoulica, Efi; Nikolaropoulos, Sotiris S; Mavromoustakos, Thomas
2007-02-01
The application of 2D-NMR spectroscopy and molecular modeling to determining the active conformation of flexible molecules in 3D-QSAR was demonstrated in the present study. In particular, a series of 33 flexible synthetic phospholipids, either 2-(4-alkylidene-cyclohexyloxy)ethyl- or omega-cycloalkylidene-substituted ether phospholipids, were systematically evaluated for their in vitro antileishmanial activity against the promastigote forms of Leishmania infantum and Leishmania donovani by CoMFA and CoMSIA 3D-QSAR studies. Steric and hydrophobic properties of the phospholipids under study appear to govern their antileishmanial activity against both strains, while the electrostatic properties make no significant contribution. Knowledge of these important properties of the pharmacophore will aid the rational design of new analogues with higher activity.
Zhang, Baidong; Li, Yan; Zhang, Huixiao; Ai, Chunzhi
2010-01-01
Development of anticancer drugs targeting Aurora B, an important member of the serine/threonine kinase family, has attracted extensive attention in recent years. In this work, by applying an integrated computational method including comparative molecular field analysis (CoMFA), comparative molecular similarity indices analysis (CoMSIA), homology modeling and molecular docking, we investigated the structural determinants of Aurora B inhibitors based on three different series of derivatives comprising 108 molecules. The resultant optimum 3D-QSAR models exhibited (q2 = 0.605, r2pred = 0.826), (q2 = 0.52, r2pred = 0.798) and (q2 = 0.582, r2pred = 0.971) for the MK-0457, GSK1070916 and SNS-314 classes, respectively, and the 3D contour maps generated from these models were analyzed individually. The contour map analysis for the MK-0457 model revealed the relative importance of steric and electrostatic effects for Aurora B inhibition, whereas electronegative groups with hydrogen-bond donating capacity showed a great impact on the inhibitory activity of the GSK1070916 derivatives. Additionally, the predictive model of the SNS-314 class revealed the great importance of a hydrophobic favorable contour, since hydrophobic substituents added to this region bind to a deep and narrow hydrophobic pocket composed of residues that are hydrophobic in nature and thus enhance the inhibitory activity. Moreover, based on the docking study, a further comparison of the binding modes was accomplished to identify a set of critical residues that play a key role in stabilizing drug-target interactions. Overall, the high level of consistency between the 3D contour maps and the topographical features of the binding sites led to the identification of several key structural requirements for more potent inhibitors. Taken together, the results will serve as a basis for the future development of inhibitors against Aurora B kinase for various tumors. PMID:21151441
Liu, Genyan; Wang, Wenjie; Wan, Youlan; Ju, Xiulian; Gu, Shuangxi
2018-05-11
Diarylpyrimidines (DAPYs), acting as HIV-1 nonnucleoside reverse transcriptase inhibitors (NNRTIs), have been considered to be one of the most potent drug families in the fight against acquired immunodeficiency syndrome (AIDS). To better understand the structural requirements of HIV-1 NNRTIs, three-dimensional quantitative structure-activity relationship (3D-QSAR), pharmacophore, and molecular docking studies were performed on 52 DAPY analogues that were synthesized in our previous studies. The internal and external validation parameters indicated that the generated 3D-QSAR models, including comparative molecular field analysis (CoMFA, q2 = 0.679, R2 = 0.983, and r2pred = 0.884) and comparative molecular similarity indices analysis (CoMSIA, q2 = 0.734, R2 = 0.985, and r2pred = 0.891), exhibited good predictive abilities and significant statistical reliability. The docking results demonstrated that the phenyl ring at the C4-position of the pyrimidine ring was better than the cycloalkanes for the activity, as the phenyl group was able to participate in π-π stacking interactions with the aromatic residues of the binding site, whereas the cycloalkanes were not. The pharmacophore model and 3D-QSAR contour maps provided significant insights into the key structural features of DAPYs that were responsible for the activity. On the basis of the obtained information, a series of novel DAPY analogues of HIV-1 NNRTIs with potentially higher predicted activity was designed. This work might provide useful information for guiding the rational design of potential HIV-1 NNRTI DAPYs.
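The predictive r2 (r2pred) quoted in this and neighboring abstracts is conventionally computed against the training-set mean rather than the test-set mean: r2pred = 1 - PRESS/SD, where SD is the sum of squared deviations of the test-set activities from the mean activity of the training set. A small sketch (the numbers are invented for illustration, not taken from any cited study):

```python
import numpy as np

def r2_pred(y_test, y_pred, y_train_mean):
    """Predictive r2 used in CoMFA/CoMSIA external validation:
    r2pred = 1 - PRESS / SD, with SD the sum of squared deviations
    of the test-set activities from the training-set mean."""
    press = np.sum((y_test - y_pred) ** 2)      # external prediction error
    sd = np.sum((y_test - y_train_mean) ** 2)   # baseline: training mean
    return 1.0 - press / sd

# Illustrative pIC50-style numbers only
y_test = np.array([6.2, 7.1, 5.8, 6.9])
y_pred = np.array([6.0, 7.3, 5.9, 6.6])
print(round(r2_pred(y_test, y_pred, y_train_mean=6.5), 3))  # → 0.836
```

Using the training-set mean as the baseline makes r2pred a stricter test than an ordinary test-set r2 when the test-set activities cluster away from the training mean.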
Revealing interaction mode between HIV-1 protease and mannitol analog inhibitor.
Yan, Guan-Wen; Chen, Yue; Li, Yixue; Chen, Hai-Feng
2012-06-01
HIV-1 protease is a key enzyme in the HIV-1 replication cycle, controlling the maturation of HIV viruses into infectious virions, and has therefore become an important target for anti-HIV-1 drug development. Here, we used molecular dynamics simulation to study the binding mode between mannitol derivatives and HIV-1 protease. The results suggest that the most active compound (M35) has more stable hydrogen bonds and more stable native contacts than the less active one (M17). These mannitol derivatives might share a similar interaction mode with HIV-1 protease. 3D-QSAR was then used to construct quantitative structure-activity models. The cross-validated q(2) values were 0.728 and 0.611 for CoMFA and CoMSIA, respectively, and the non-cross-validated r(2) values were 0.973 and 0.950. Nine test-set compounds validated the models. The results show that this model possesses better predictive ability than previous work. It can be used to design new chemical entities and to make quantitative predictions of the bioactivities of HIV-1 protease inhibitors before resorting to in vitro and in vivo experiments. © 2012 John Wiley & Sons A/S.
Cheng, Peng; Li, Jiaojiao; Wang, Juan; Zhang, Xiaoyun; Zhai, Honglin
2018-05-01
Focal adhesion kinase (FAK) is one kind of tyrosine kinases that modulates integrin and growth factor signaling pathways, which is a promising therapeutic target because of involving in cancer cell migration, proliferation, and survival. To investigate the mechanism between FAK and triazinic inhibitors and design high activity inhibitors, a molecular modeling integrated with 3D-QSAR, molecular docking, molecular dynamics simulations, and binding free energy calculations was performed. The optimum CoMFA and CoMSIA models showed good reliability and satisfactory predictability (with Q 2 = 0.663, R 2 = 0.987, [Formula: see text] = 0.921 and Q 2 = 0.670, R 2 = 0.981, [Formula: see text] = 0.953). Its contour maps could provide structural features to improve inhibitory activity. Furthermore, a good consistency between contour maps, docking, and molecular dynamics simulations strongly demonstrates that the molecular modeling is reliable. Based on it, we designed several new compounds and their inhibitory activities were validated by the molecular models. We expect our studies could bring new ideas to promote the development of novel inhibitors with higher inhibitory activity for FAK.
2018-05-01
Focal adhesion kinase (FAK) is a tyrosine kinase that modulates integrin and growth factor signaling pathways and is a promising therapeutic target because of its involvement in cancer cell migration, proliferation, and survival. To investigate the interaction mechanism between FAK and its triazinic inhibitors and to design highly active inhibitors, a molecular modeling study integrating 3D-QSAR, molecular docking, molecular dynamics simulations, and binding free energy calculations was performed. The optimum CoMFA and CoMSIA models showed good reliability and satisfactory predictability (with Q2 = 0.663, R2 = 0.987, R2pred = 0.921 and Q2 = 0.670, R2 = 0.981, R2pred = 0.953, respectively). Their contour maps could provide structural features to improve inhibitory activity. Furthermore, the good consistency between the contour maps, docking, and molecular dynamics simulations strongly demonstrates that the molecular modeling is reliable. On this basis, we designed several new compounds, and their inhibitory activities were validated by the molecular models. We expect our studies to bring new ideas for the development of novel FAK inhibitors with higher inhibitory activity.
Chaube, Udit; Chhatbar, Dhara; Bhatt, Hardik
2016-02-01
According to WHO statistics, lung cancer is one of the leading causes of death among all types of cancer. Many genes are mutated in lung cancer, but involvement of EGFR and KRAS is most common. Unavailability of drugs, or resistance to the available drugs, is the major problem in the treatment of lung cancer. In the present research, mTOR, which acts through the PI3K/AKT/mTOR pathway, was selected as an alternative target for the treatment of lung cancer. Twenty-eight synthetic mTOR inhibitors were selected from the literature. A ligand-based approach (CoMFA and CoMSIA) and a structure-based approach (molecular dynamics simulation-assisted molecular docking) were applied to identify the features of the benzoxazepine moiety responsible for mTOR inhibition. Three different alignments were tried to obtain the best QSAR model, of which distill was found to be the best method, as it gave good statistical results. In CoMFA, the leave-one-out (LOO) cross-validated coefficient (q(2)), conventional coefficient (r(2)) and predicted correlation coefficient (r(2)pred) values were found to be 0.615, 0.990 and 0.930, respectively. Similarly, in CoMSIA, the q(2), r(2)ncv and r(2)pred values were found to be 0.748, 0.986 and 0.933, respectively. The molecular dynamics simulation study revealed that the B-chain of the mTOR protein was stable at and above 500 fs with respect to temperature (at and above 298 K), potential energy (at and above 7669.72 kJ/mol) and kinetic energy (at and above 4009.77 kJ/mol). A molecular docking study was performed on the simulated mTOR protein, which helped to correlate the interactions of the amino acids surrounding the ligand with the contour maps generated by the QSAR method. Important features of the benzoxazepine moiety were identified by the contour maps and the molecular docking study, which would be useful in designing novel molecules as mTOR inhibitors for the treatment of lung cancer. Copyright © 2015 Elsevier Ltd. All rights reserved.
Paula, Stefan; Tabet, Michael R; Farr, Carol D; Norman, Andrew B; Ball, W James
2004-01-01
Human monoclonal antibodies (mAbs) designed for immunotherapy have a high potential for avoiding the complications that may result from human immune system responses to the introduction of nonhuman mAbs into patients. This study presents a characterization of cocaine/antibody interactions that determine the binding properties of the novel human sequence mAb 2E2 using three-dimensional quantitative structure-activity relationship (3D-QSAR) methodology. We have experimentally determined the binding affinities of mAb 2E2 for cocaine and 38 cocaine analogues. The K(d) of mAb 2E2 for cocaine was 4 nM, indicating a high affinity. Also, mAb 2E2 displayed good cocaine specificity, as reflected in its 10-, 1500-, and 25000-fold lower binding affinities for the three physiologically relevant cocaine metabolites benzoylecgonine, ecgonine methyl ester, and ecgonine, respectively. 3D-QSAR models of cocaine binding were developed by comparative molecular similarity index analysis (CoMSIA). A model of high statistical quality was generated showing that cocaine binds to mAb 2E2 in a sterically restricted binding site that leaves the methyl group attached to the ring nitrogen of cocaine solvent-exposed. The methyl ester group of cocaine appears to engage in attractive van der Waals interactions with mAb 2E2, whereas the phenyl group contributes to the binding primarily via hydrophobic interactions. The model further indicated that an increase in partial positive charge near the nitrogen proton and methyl ester carbonyl group enhances binding affinity and that the ester oxygen likely forms an intermolecular hydrogen bond with mAb 2E2. Overall, the cocaine binding properties of mAb 2E2 support its clinical potential for development as a treatment of cocaine overdose and addiction.
NASA Astrophysics Data System (ADS)
Li, Yan; He, Haoran; Wang, Jinghui; Han, Chunxiao; Feng, Jiaqi; Zhang, Shuwei; Yang, Ling
2014-09-01
Migraine afflicts countless individuals worldwide. CGRP (calcitonin gene-related peptide) is closely related to migraine, and olcegepant (BIBN4096) is effective in alleviating the pain. In our work, a combination of ligand- and receptor-based three-dimensional quantitative structure-activity relationship (3D-QSAR) studies, along with molecular docking, was applied to provide insights into how urethanamide, pyridine, aspartate and succinate derivatives (novel CGRP receptor antagonists) inhibit the activity of the CGRP receptor. The optimal CoMSIA model shows a Q2 of 0.505 and an R2ncv of 0.992, and its accurate predictive ability was confirmed on an independent test set, which gave an R2pred value of 0.885. Besides, the 3D contour maps help identify how different groups at key positions affect the antagonist activity. In addition, the docking analysis shows that the binding site emerges as a distorted "V" shape comprising two binding pockets: one of them is hydrophobic and fixes structural part 3 of compound 80, while the other anchors part 1 of compound 80. The docking analysis also shows that the interaction mechanism between compound 80 and the CGRP receptor is similar to that between olcegepant and the CGRP receptor. The findings derived from this work reveal the mechanism of the related antagonists and facilitate the future rational design of novel antagonists with higher potency.
Li, Yan; Wu, Wenzhao; Ren, Hong; Wang, Jinghui; Zhang, Shuwei; Li, Guohui; Yang, Ling
2012-09-01
Phosphodiesterase type 5 (PDE5) inhibitors are clinically indicated for the treatment of erectile dysfunction, pulmonary hypertension and various other diseases. In this work, both ligand- and receptor-based three-dimensional quantitative structure-activity relationship (3D-QSAR) studies were carried out using comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA) techniques on 122 pyrazinone derivatives as PDE inhibitors. The resultant optimum 3D-QSAR model exhibits proper predictive ability, as indicated by the statistical results: Q² of 0.584, R(ncv)² of 0.884 and R(pred)² of 0.817. In addition, docking analysis and molecular dynamics (MD) simulation were also applied to elucidate the probable binding modes of these inhibitors. Our main findings are: (1) Introduction of bulky, electropositive and hydrophobic substituents at the 12- and 19-positions can increase the biological activities. (2) An N atom at the 8-position is detrimental to inhibitory activity, and the effect of N atoms at the 5- and 6-positions on compound activity is co-determined by both the hydrophobic force and the π-π stacking interaction. (3) Bulky and hydrophilic substitutions are favored at the 27-position of ring D. (4) Electronegative and hydrophilic substitutions around the 5- and 6-positions increase the inhibitory activity. (5) Hydrophobic forces and π-π stacking interactions with Phe786 and Phe820 are crucial in determining the binding of pyrazinone derivatives to PDE5. (6) Bulky substitutions around ring C favor selectivity against PDE11, while bulky groups near the 21-position disfavor it. The information obtained from this work can be utilized to accurately predict the binding affinity of related analogues and also facilitate future rational design of novel PDE5 inhibitors with improved activity and selectivity. Copyright © 2012 Elsevier Inc. All rights reserved.
Pyridones as NNRTIs against HIV-1 mutants: 3D-QSAR and protein informatics
NASA Astrophysics Data System (ADS)
Debnath, Utsab; Verma, Saroj; Jain, Surabhi; Katti, Setu B.; Prabhakar, Yenamandra S.
2013-07-01
CoMFA- and CoMSIA-based 3D-QSAR models of the HIV-1 RT wild-type and mutant (K103N, Y181C, and Y188L) inhibitory activities of 4-benzyl/benzoyl pyridin-2-ones, followed by protein informatics of the corresponding non-nucleoside inhibitors' binding pockets from PDB entries 2BAN, 3MED, 1JKH, and 2YNF, were analysed to discover consensus features of the compounds for broad-spectrum activity. The CoMFA/CoMSIA models indicated that compounds with groups which lend steric-cum-electropositive fields in the vicinity of C5, a hydrophobic field in the vicinity of C3 of the pyridone region and a steric field in the aryl region produce broad-spectrum anti-HIV-1 RT activity. Also, a linker rendering an electronegative field between the pyridone and aryl moieties is a common requirement for the activities. The protein informatics showed considerable alteration in the characteristics of residues 181 and 188 on mutation. Also, the mutants' isoelectric points shifted in the acidic direction. The study offered fresh avenues for broad-spectrum anti-HIV-1 agents through the design of new molecules seeded with groups satisfying the common molecular fields and the concerns of mutating residues.
Hattotuwagama, Channa K; Doytchinova, Irini A; Flower, Darren R
2007-01-01
Quantitative structure-activity relationship (QSAR) analysis is a cornerstone of modern informatics. Predictive computational models of peptide-major histocompatibility complex (MHC)-binding affinity based on QSAR technology have now become important components of modern computational immunovaccinology. Historically, such approaches have been built around semiqualitative classification methods, but these are now giving way to quantitative regression methods. We review three methods that can identify the sequence dependence of peptide-binding specificity for various class I MHC alleles from the reported binding affinities (IC50) of peptide sets: a 2D-QSAR additive-partial least squares (PLS) method, a 3D-QSAR comparative molecular similarity index analysis (CoMSIA) method, and an iterative self-consistent (ISC) PLS-based additive method, a recently developed extension to the additive method for the affinity prediction of class II peptides. The QSAR methods presented here have established themselves as immunoinformatic techniques complementary to existing methodology, useful in the quantitative prediction of binding affinity: current methods for the in silico identification of T-cell epitopes (which form the basis of many vaccines, diagnostics, and reagents) rely on the accurate computational prediction of peptide-MHC affinity. We have reviewed various human and mouse class I and class II allele models. Studied alleles comprise HLA-A*0101, HLA-A*0201, HLA-A*0202, HLA-A*0203, HLA-A*0206, HLA-A*0301, HLA-A*1101, HLA-A*3101, HLA-A*6801, HLA-A*6802, HLA-B*3501, H2-K(k), H2-K(b), H2-D(b), HLA-DRB1*0101, HLA-DRB1*0401, HLA-DRB1*0701, I-A(b), I-A(d), I-A(k), I-A(S), I-E(d), and I-E(k). In this chapter we give a step-by-step guide to building these predictive models and assessing their reliability; the resulting models represent an advance on existing methods. The peptides used in this study are available from the AntiJen database (http://www.jenner.ac.uk/AntiJen).
The PLS method is available commercially in the SYBYL molecular modeling software package. The resulting models, which can be used for accurate T-cell epitope prediction, are freely available online at http://www.jenner.ac.uk/MHCPred.
NASA Astrophysics Data System (ADS)
Zhang, Zhenshan; Zheng, Mingyue; Du, Li; Shen, Jianhua; Luo, Xiaomin; Zhu, Weiliang; Jiang, Hualiang
2006-05-01
To find useful information for discovering dual-functional inhibitors against both wild-type (WT) and K103N mutant reverse transcriptases (RTs) of HIV-1, molecular docking and 3D-QSAR approaches were applied to a set of twenty-five 4,1-benzoxazepinone analogues of efavirenz (SUSTIVA®), some of which are active against both RTs. 3D-QSAR models were constructed, based on their binding conformations determined by molecular docking, with r(cv)² values ranging from 0.656 to 0.834 for CoMFA and CoMSIA, respectively. The models were then validated to be highly predictive and extrapolative by inhibitors in two test sets with different molecular skeletons. Furthermore, the CoMFA models were found to be well matched with the binding sites of both WT and K103N RTs. Finally, a reasonable pharmacophore model of 4,1-benzoxazepinones was established. The application of the model not only successfully differentiated the experimentally determined inhibitors from non-inhibitors, but also discovered two potent inhibitors from the compound database SPECS. On the basis of both the 3D-QSAR and pharmacophore models, new clues for discovering and designing potent dual-functional drug leads against HIV-1 were proposed: (i) adopting a positively charged aliphatic group at the cis-substituent of C3; (ii) reducing the electron density at position O4; (iii) positioning a small branched aliphatic group at position C5; (iv) using negatively charged bulky substituents at position C7.
Prasanna, Sivaprakasam; Daga, Pankaj R; Xie, Aihua; Doerksen, Robert J
2009-02-01
Glycogen synthase kinase-3, a serine/threonine kinase, has been implicated in a wide variety of pathological conditions such as diabetes, Alzheimer's disease, stroke, bipolar disorder, malaria and cancer. Herein we report 3D-QSAR analyses using CoMFA and CoMSIA and molecular docking studies on 3-anilino-4-phenylmaleimides as GSK-3alpha inhibitors, in order to better understand the mechanism of action and structure-activity relationship of these compounds. Comparison of the active site residues of GSK-3alpha and GSK-3beta isoforms shows that all the key amino acids involved in polar interactions with the maleimides for the beta isoform are the same in the alpha isoform, except that Asp133 in the beta isoform is replaced by Glu196 in the alpha isoform. We prepared a homology model for GSK-3alpha, and showed that the change from Asp to Glu should not affect maleimide binding significantly. Docking studies revealed the binding poses of three subclasses of these ligands, namely anilino, N-methylanilino and indoline derivatives, within the active site of the beta isoform, and helped to explain the difference in their inhibitory activity.
NASA Astrophysics Data System (ADS)
Prasanna, Sivaprakasam; Daga, Pankaj R.; Xie, Aihua; Doerksen, Robert J.
2009-02-01
Glycogen synthase kinase-3, a serine/threonine kinase, has been implicated in a wide variety of pathological conditions such as diabetes, Alzheimer's disease, stroke, bipolar disorder, malaria and cancer. Herein we report 3D-QSAR analyses using CoMFA and CoMSIA and molecular docking studies on 3-anilino-4-phenylmaleimides as GSK-3α inhibitors, in order to better understand the mechanism of action and structure-activity relationship of these compounds. Comparison of the active site residues of GSK-3α and GSK-3β isoforms shows that all the key amino acids involved in polar interactions with the maleimides for the β isoform are the same in the α isoform, except that Asp133 in the β isoform is replaced by Glu196 in the α isoform. We prepared a homology model for GSK-3α, and showed that the change from Asp to Glu should not affect maleimide binding significantly. Docking studies revealed the binding poses of three subclasses of these ligands, namely anilino, N-methylanilino and indoline derivatives, within the active site of the β isoform, and helped to explain the difference in their inhibitory activity.
Qian, Haiyan; Chen, Jiongjiong; Pan, Youlu; Chen, Jianzhong
2016-09-19
11β-Hydroxysteroid dehydrogenase type 1 (11β-HSD1) is a potential target for the treatment of numerous human disorders, such as diabetes, obesity, and metabolic syndrome. In this work, molecular modeling studies combining molecular docking, 3D-QSAR, MESP, MD simulations and free energy calculations were performed on pyridine amides and 1,2,4-triazolopyridines as 11β-HSD1 inhibitors to explore structure-activity relationships and the structural requirements for inhibitory activity. 3D-QSAR models, including CoMFA and CoMSIA, were developed from the conformations obtained by the docking strategy. The derived pharmacophoric features were further supported by MESP and Mulliken charge analyses using density functional theory. In addition, MD simulations and free energy calculations were employed to determine the detailed binding process and to compare the binding modes of inhibitors with different bioactivities. The binding free energies calculated by MM/PBSA showed a good correlation with the experimental biological activities. Free energy analyses and per-residue energy decomposition indicated that the van der Waals interaction is the major driving force for the interactions between an inhibitor and 11β-HSD1. Together, these unified results suggest that hydrogen-bond interactions with Ser170 and Tyr183 are favorable for enhancing activity, and that Thr124, Ser170, Tyr177, Tyr183, Val227, and Val231 are the key amino acid residues in the binding pocket. The obtained results are expected to be valuable for the rational design of novel potent 11β-HSD1 inhibitors.
Yu, Haijing; Fang, Yu; Lu, Xia; Liu, Yongjuan; Zhang, Huabei
2014-01-01
The NS5B RNA-dependent RNA polymerase (RdRP) is a promising therapeutic target for developing novel anti-hepatitis C virus (HCV) drugs. In this work, a combined molecular modeling study was performed on a series of 193 5-hydroxy-2H-pyridazin-3-one derivatives as inhibitors of HCV NS5B polymerase. The best 3D-QSAR models, both CoMFA and CoMSIA, were receptor-based (i.e., built on docked conformations). Furthermore, a 40-ns molecular dynamics (MD) simulation and binding free energy calculations, using docked structures of NS5B with ten compounds of diverse structures and pIC50 values, were employed to determine the detailed binding process and to compare the binding modes of inhibitors with different activities. On the one hand, the stability and rationality of the molecular docking and 3D-QSAR results were validated by the MD simulation. The binding free energies calculated by the MM-PBSA method gave a good correlation with the experimental biological activity. On the other hand, by analyzing some differences between the molecular docking and the MD simulation results, we found that the MD simulation could also remedy the defects of molecular docking. The analyses of the combined molecular modeling results identified Tyr448, Ser556, and Asp318 as the key amino acid residues in the NS5B binding pocket. The results from this study can provide some insights into the development of novel potent NS5B inhibitors. © 2013 John Wiley & Sons A/S.
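The "good correlation with the experimental biological activity" reported for MM-PBSA binding free energies in this and the 11β-HSD1 study above is typically quantified with a Pearson correlation coefficient between computed ΔG values and measured pIC50 values; a minimal sketch (the choice of metric is an assumption and the numbers in the test are invented, not from either paper):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

Since more negative binding free energies correspond to higher activity, a strong MM-PBSA/pIC50 correlation shows up as a Pearson r close to -1.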
Elnagar, Ahmed Y; Wali, Vikram B; Sylvester, Paul W; El Sayed, Khalid A
2010-01-15
Vitamin E (VE) is a generic term that represents a family of compounds composed of various tocopherol and tocotrienol isoforms. Tocotrienols display potent anti-angiogenic and antiproliferative activities. Redox-silent tocotrienol analogues also display potent anticancer activity. The ultimate objective of this study was to develop semisynthetically C-6-modified redox-silent tocotrienol analogues with enhanced antiproliferative and anti-invasive activities as compared to their parent compound. Examples of these are carbamate and ether analogues of alpha-, gamma-, and delta-tocotrienols (1-3). Various aliphatic, olefinic, and aromatic substituents were used. Steric limitation, electrostatic, hydrogen bond donor (HBD) and hydrogen bond acceptor (HBA) properties were varied at this position and the biological activities of these derivatives were tested. Three-dimensional quantitative structure-activity relationship (3D QSAR) studies were performed using Comparative Molecular Field (CoMFA) and Comparative Molecular Similarity Indices Analyses (CoMSIA) to better understand the structural basis for biological activity and guide the future design of more potent VE analogues. Copyright 2009 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Pandit, Amit; Sengupta, Sagnik; Krishnan, Mena Asha; Reddy, Ramesh B.; Sharma, Rajesh; Venkatesh, Chelvam
2018-05-01
Prostate-Specific Membrane Antigen (PSMA), or Glutamate carboxypeptidase II (GCPII), has been identified as an important target in the diagnosis and therapy of prostate cancer. Among several types of inhibitors, urea-based inhibitors are the most common and the most widely employed in preclinical and clinical studies. Computational studies have been carried out to uncover the active sites and the interactions of PSMA inhibitors with the protein by modifying the core structure of the ligand. Analysis of the literature, however, shows a lack of 3D quantitative structure-activity relationship (QSAR) and molecular-dynamics-based molecular docking studies to identify structural modifications responsible for better GCPII inhibitory activity. The present study aims to fill this gap by analysing well-known PSMA inhibitors reported in the literature with known experimental PSMA inhibition constants. Also, in order to validate the in silico study, a new GCPII inhibitor 7 was designed and synthesized, and its experimental PSMA enzyme inhibition was evaluated using PSMA protein freshly isolated from LNCaP, a human prostate cancer cell line derived from lymph node. 3D-QSAR CoMFA models on 58 urea-based GCPII inhibitors were generated, and the best correlation was obtained with the Gasteiger-Hückel charge-assignment method, with q², r² and predictive r² values of 0.592, 0.995 and 0.842, respectively. Moreover, steric, electrostatic, and hydrogen-bond donor field contribution analysis provided the best statistical values for the CoMSIA model (q², r² and predictive r² of 0.527, 0.981 and 0.713, respectively). A contour map study revealed that the electrostatic field contribution is the major factor for discovering ligands with better binding affinity. Further, molecular-dynamics-assisted molecular docking was also performed on the GCPII receptor (PDB ID 4NGM) and the most active GCPII inhibitor, DCIBzL. The 4NGM co-crystallised ligand, JB7, was used to validate the docking procedure, and the amino acid interactions of JB7 were compared with those of DCIBzL.
The results suggest that the Arg210, Asn257, Gly518, Tyr552, Lys699, and Tyr700 amino acid residues may play a crucial role in GCPII inhibition. The molecular dynamics simulation provides information about the stability of the docked pose of DCIBzL. By combining CoMFA-CoMSIA field analysis and docking interaction analysis, a conclusive SAR was generated for the urea-based derivatives, on the basis of which GCPII inhibitor 7 was designed and chemically synthesized in our laboratory. Evaluation of the GCPII inhibitory activity of 7 by the NAALADase assay provided an IC50 value of 113 nM, which is in close agreement with the in silico predicted value (119 nM). Thus, we have successfully validated our 3D-QSAR and molecular-docking-based methodology for designing GCPII inhibitors through biological experiments. This conclusive SAR should help to generate novel and more potent GCPII inhibitors for drug delivery applications.
Güssregen, Stefan; Matter, Hans; Hessler, Gerhard; Müller, Marco; Schmidt, Friedemann; Clark, Timothy
2012-09-24
Current 3D-QSAR methods such as CoMFA or CoMSIA make use of classical force-field approaches for calculating molecular fields. Thus, they cannot adequately account for noncovalent interactions involving halogen atoms, such as halogen bonds or halogen-π interactions. These deficiencies in the underlying force fields result from the lack of treatment of the anisotropy of the electron density distribution of those atoms, known as the "σ-hole", although recent developments have begun to take specific interactions such as halogen bonding into account. We have now replaced classical force field derived molecular fields by local properties such as the local ionization energy, local electron affinity, or local polarizability, calculated using quantum-mechanical (QM) techniques that do not suffer from the above limitation for 3D-QSAR. We first investigate the characteristics of QM-based local property fields to show that they are suitable for statistical analyses after suitable pretreatment. We then analyze these property fields with partial least-squares (PLS) regression to predict biological affinities of two data sets comprising factor Xa and GABA-A/benzodiazepine receptor ligands. While the resulting models perform equally well or even slightly better in terms of consistency and predictivity than the classical CoMFA fields, the most important aspect of these augmented field-types is that the chemical interpretation of the resulting QM-based property field models reveals unique SAR trends driven by electrostatic and polarizability effects, which cannot be extracted directly from CoMFA electrostatic maps. Within the factor Xa set, the interactions of chlorine and bromine atoms with a tyrosine side chain in the protease S1 pocket are correctly predicted.
Within the GABA-A/benzodiazepine ligand data set, PLS models of high predictivity resulted for our QM-based property fields, providing novel insights into key features of the SAR for two receptor subtypes and cross-receptor selectivity of the ligands. The detailed interpretation of regression models derived using improved QM-derived property fields thus provides a significant advantage by revealing chemically meaningful correlations with biological activity and helps in understanding novel structure-activity relationship features. This will allow such knowledge to be used to design novel molecules on the basis of interactions additional to steric and hydrogen-bonding features.
Prado-Prado, Francisco; García-Mera, Xerardo; Escobar, Manuel; Alonso, Nerea; Caamaño, Olga; Yañez, Matilde; González-Díaz, Humberto
2012-01-01
The number of neurodegenerative diseases has been increasing in recent years. Many of the drug candidates to be used in the treatment of neurodegenerative diseases present specific 3D structural features. An important protein in this sense is acetylcholinesterase (AChE), which is the target of many Alzheimer's dementia drugs. Consequently, the prediction of Drug-Protein Interactions (DPIs/nDPIs) between new drug candidates and specific 3D structures and targets is of major importance. To this end, we can use Quantitative Structure-Activity Relationship (QSAR) models to carry out a rational DPI prediction. Unfortunately, many previous QSAR models developed to predict DPIs take into consideration only 2D structural information and codify the activity against only one target. To solve this problem we can develop 3D multi-target QSAR (3D mt-QSAR) models. In this study, using the 3D MI-DRAGON technique, we have introduced a new predictor for DPIs based on two different well-known software packages. We have used the MARCH-INSIDE (MI) and DRAGON software to calculate 3D structural parameters for drugs and targets, respectively. Both classes of 3D parameters were used as input to train Artificial Neural Network (ANN) algorithms, using as the benchmark dataset the complex network (CN) made up of all DPIs between US FDA approved drugs and their targets. The entire dataset was downloaded from the DrugBank database. The best 3D mt-QSAR predictor found was an ANN of Multi-Layer Perceptron type (MLP) with profile MLP 37:37-24-1:1. This MLP correctly classifies 274 out of 321 DPIs (Sensitivity = 85.35%) and 1041 out of 1190 nDPIs (Specificity = 87.48%), corresponding to a training Accuracy = 87.03%. We have validated the model with external prediction series, with Sensitivity = 84.16% (542/644 DPIs), Specificity = 87.51% (2039/2330 nDPIs) and Accuracy = 86.78%.
The new CNs of DPIs reconstructed from US FDA data can be used to explore large DPI databases in order to discover both new drugs and/or targets. We have carried out some theoretical-experimental studies to illustrate the practical use of 3D MI-DRAGON. First, we have reported the prediction and pharmacological assay of 22 different rasagiline derivatives with possible AChE inhibitory activity. In this work, we have reviewed different computational studies on drug-protein models. First, we have reviewed 10 studies on DP computational models. Next, we have reviewed 2D QSAR, 3D QSAR, CoMFA, CoMSIA and docking studies with different compounds to find drug-protein QSAR models. Last, we have developed a 3D multi-target QSAR (3D mt-QSAR) model for the prediction of the activity of new compounds against different targets and the discovery of new targets.
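The MLP performance figures quoted above follow directly from the standard confusion-matrix definitions; a small sketch using the training counts given in the abstract (274/321 DPIs correctly classified, 1041/1190 nDPIs):

```python
def classification_stats(tp, fn, tn, fp):
    """Sensitivity, specificity and accuracy from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    return sensitivity, specificity, accuracy

# training figures from the abstract: 274/321 DPIs, 1041/1190 nDPIs
sens, spec, acc = classification_stats(tp=274, fn=321 - 274, tn=1041, fp=1190 - 1041)
```

Evaluated on these counts, the function reproduces the reported specificity (87.48%) and accuracy (87.03%) to two decimal places.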
Grid-based Continual Analysis of Molecular Interior for Drug Discovery, QSAR and QSPR.
Potemkin, Andrey V; Grishina, Maria A; Potemkin, Vladimir A
2017-01-01
In 1979, R.D. Cramer and M. Milne made the first realization of 3D comparison of molecules by aligning them in space and mapping their molecular fields to a 3D grid. This approach was further developed as DYLOMMS (Dynamic Lattice-Oriented Molecular Modelling System). In 1984, H. Wold and S. Wold proposed the use of partial least squares (PLS) analysis, instead of principal component analysis, to correlate the field values with biological activities. Then, in 1988, the method called CoMFA (Comparative Molecular Field Analysis) was introduced and the appropriate software became commercially available. Since 1988, many 3D QSAR methods, algorithms and their modifications have been introduced for solving virtual drug discovery problems (e.g., CoMSIA, CoMMA, HINT, HASL, GOLPE, GRID, PARM, Raptor, BiS, CiS, ConGO). All the methods can be divided into two groups (classes): (1) methods studying the exterior of molecules and (2) methods studying the interior of molecules. A series of grid-based computational technologies for Continual Molecular Interior analysis (CoMIn) are invented in the current paper. The grid-based analysis is fulfilled by means of a lattice construction, analogously to many other grid-based methods. The further continual elucidation of molecular structure is performed in various ways: (i) in terms of intermolecular interaction potentials, which can be represented as a superposition of Coulomb interactions, van der Waals interactions and hydrogen bonds; all these potentials are well-known continual functions whose values can be determined at all lattice points for a molecule; (ii) in terms of quantum functions such as the electron density distribution, the Laplacian and Hamiltonian of the electron density distribution, the potential energy distribution, the highest occupied and lowest unoccupied molecular orbital distributions, and their superposition.
To reduce the time of calculations using quantum methods based on first principles, an original quantum free-orbital approach, AlteQ, is proposed. All the functions can be calculated using a quantum approach at a sufficient level of theory and their values can be determined at all lattice points for a molecule. Then, the molecules of a dataset can be superimposed in the lattice for maximal coincidence (or minimal deviation) of the potentials (i) or the quantum functions (ii). The methods and criteria of the superimposition are discussed. After that, a functional relationship between biological activity or property and the characteristics of the potentials (i) or functions (ii) is created. The methods of constructing the quantitative relationship are discussed. New approaches for rational virtual drug design based on the intermolecular potentials and quantum functions are invented. All the invented methods are realized at the www.chemosophia.com web page. Therefore, a set of 3D QSAR approaches for continual molecular interior study is invented, offering many opportunities for virtual drug discovery, virtual screening and ligand-based drug design.
Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
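The lattice construction that CoMIn (like CoMFA before it) relies on is conceptually simple: evaluate a continual function at every node of a 3D grid surrounding the molecule. A hedged sketch of the Coulomb electrostatic case follows; the atom coordinates, partial charges, grid extent and units are all invented for illustration and are not taken from the paper:

```python
import numpy as np

def coulomb_grid(coords, charges, lo, hi, n=16, eps=1e-6):
    """Electrostatic potential sum_i q_i / r_i on an n x n x n lattice.
    coords: (m, 3) atom positions; charges: (m,) partial charges.
    eps clips distances so lattice points on top of atoms stay finite."""
    axes = [np.linspace(lo, hi, n)] * 3
    gx, gy, gz = np.meshgrid(*axes, indexing="ij")
    grid = np.stack([gx, gy, gz], axis=-1)                  # (n, n, n, 3)
    # distance of every lattice point to every atom, via broadcasting
    d = np.linalg.norm(grid[..., None, :] - coords, axis=-1)  # (n, n, n, m)
    return (charges / np.maximum(d, eps)).sum(axis=-1)        # (n, n, n)
```

The van der Waals and hydrogen-bond terms mentioned in the abstract would be further continual functions accumulated on the same lattice, and the PLS step then regresses activity against the flattened grid values.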
Amini, Ata; Shrimpton, Paul J; Muggleton, Stephen H; Sternberg, Michael J E
2007-12-01
Despite the increased recent use of protein-ligand and protein-protein docking in the drug discovery process, due to increases in computational power, the difficulty of accurately ranking the binding affinities of a series of ligands or a series of proteins docked to a protein receptor remains largely unsolved. This problem is of major concern in lead optimization procedures and has led to the development of scoring functions tailored to rank the binding affinities of a series of ligands to a specific system. However, such methods can take a long time to develop and their transferability to other systems remains open to question. Here we demonstrate that, given a suitable amount of background information, a new approach using support vector inductive logic programming (SVILP) can be used to produce system-specific scoring functions. Inductive logic programming (ILP) learns logic-based rules for a given dataset that can be used to describe properties of each member of the set in a qualitative manner. By combining ILP with support vector machine regression, a quantitative set of rules can be obtained. SVILP has previously been used in a biological context to examine datasets containing a series of singular molecular structures and properties. Here we describe the use of SVILP to produce binding affinity predictions of a series of ligands to a particular protein. We also for the first time examine the applicability of SVILP techniques to datasets consisting of protein-ligand complexes. Our results show that SVILP performs comparably with other state-of-the-art methods on five protein-ligand systems, as judged by similar cross-validated squared correlation coefficients. A McNemar test comparing SVILP to CoMFA and CoMSIA across the five systems indicates our method to be significantly better on one occasion.
The ability to graphically display and understand the SVILP-produced rules is demonstrated, and this feature of ILP can be used to derive hypotheses for future ligand design in lead optimization procedures. The approach can readily be extended to evaluate the binding affinities of a series of protein-protein complexes. (c) 2007 Wiley-Liss, Inc.
A catalog of automated analysis methods for enterprise models.
Florez, Hector; Sánchez, Mario; Villalobos, Jorge
2016-01-01
Enterprise models are created for documenting and communicating the structure and state of the business and information technology elements of an enterprise. After models are completed, they are mainly used to support analysis. Model analysis is an activity typically based on human skills, and due to the size and complexity of the models this process can be complicated, making omissions or miscalculations very likely. This situation has fostered research on automated analysis methods to support analysts in enterprise analysis processes. By reviewing the literature, we found several analysis methods; nevertheless, they are based on specific situations and different metamodels, so some analysis methods might not be applicable to all enterprise models. This paper presents a compilation (literature review), classification, structuring, and characterization of automated analysis methods for enterprise models, expressing them in a standardized modeling language. In addition, we have implemented the analysis methods in our modeling tool.
40 CFR 52.1490 - Original identification of plan.
Code of Federal Regulations, 2012 CFR
2012-07-01
... measures. (ii) A modeling analysis indicating 1982 attainment. (iii) Documentation of the modeling analysis... agencies, (ii) Additional supporting documentation for the 1982 attainment modeling analysis which included... factors for the model. (iii) A revised 1982 attainment modeling analysis and supporting documentation...
40 CFR 52.1490 - Original identification of plan.
Code of Federal Regulations, 2013 CFR
2013-07-01
... measures. (ii) A modeling analysis indicating 1982 attainment. (iii) Documentation of the modeling analysis... agencies, (ii) Additional supporting documentation for the 1982 attainment modeling analysis which included... factors for the model. (iii) A revised 1982 attainment modeling analysis and supporting documentation...
40 CFR 52.1490 - Original identification of plan.
Code of Federal Regulations, 2014 CFR
2014-07-01
... measures. (ii) A modeling analysis indicating 1982 attainment. (iii) Documentation of the modeling analysis... agencies, (ii) Additional supporting documentation for the 1982 attainment modeling analysis which included... factors for the model. (iii) A revised 1982 attainment modeling analysis and supporting documentation...
2009-06-01
simulation is the campaign-level Peace Support Operations Model (PSOM). This thesis provides a quantitative analysis of PSOM. The results are based on multiple potential outcomes; further development and analysis is required before the model is used for large-scale analysis.
An Integrated Analysis of the Physiological Effects of Space Flight: Executive Summary
NASA Technical Reports Server (NTRS)
Leonard, J. I.
1985-01-01
A large array of models was applied in a unified manner to solve problems in space flight physiology. Mathematical simulation was used as an alternative way of looking at physiological systems and maximizing the yield from previous space flight experiments. A medical data analysis system was created which consists of an automated data base, a computerized biostatistical and data analysis system, and a set of simulation models of physiological systems. Five basic models were employed: (1) a pulsatile cardiovascular model; (2) a respiratory model; (3) a thermoregulatory model; (4) a circulatory, fluid, and electrolyte balance model; and (5) an erythropoiesis regulatory model. Algorithms were provided to perform routine statistical tests, multivariate analysis, nonlinear regression analysis, and autocorrelation analysis. Special purpose programs were prepared for rank correlation, factor analysis, and the integration of the metabolic balance data.
Load Model Verification, Validation and Calibration Framework by Statistical Analysis on Field Data
NASA Astrophysics Data System (ADS)
Jiao, Xiangqing; Liao, Yuan; Nguyen, Thai
2017-11-01
Accurate load models are critical for power system analysis and operation. A large amount of research work has been done on load modeling. Most of the existing research focuses on developing load models, while little has been done on developing formal load model verification and validation (V&V) methodologies or procedures. Most of the existing load model validation is based on qualitative rather than quantitative analysis. In addition, not all aspects of model V&V problem have been addressed by the existing approaches. To complement the existing methods, this paper proposes a novel load model verification and validation framework that can systematically and more comprehensively examine load model's effectiveness and accuracy. Statistical analysis, instead of visual check, quantifies the load model's accuracy, and provides a confidence level of the developed load model for model users. The analysis results can also be used to calibrate load models. The proposed framework can be used as a guidance to systematically examine load models for utility engineers and researchers. The proposed method is demonstrated through analysis of field measurements collected from a utility system.
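The quantitative validation step this framework calls for (statistical scoring of model output against field measurements, rather than visual checks) can be as simple as an error statistic over the measurement window; a minimal sketch, with invented data and RMSE chosen as an assumed example metric since the abstract does not specify which statistics the framework uses:

```python
import math

def validation_stats(measured, simulated):
    """RMSE and range-normalized RMSE (%) between field data and model output."""
    n = len(measured)
    rmse = math.sqrt(sum((m - s) ** 2 for m, s in zip(measured, simulated)) / n)
    span = max(measured) - min(measured)
    return rmse, 100.0 * rmse / span
```

A normalized RMSE lets the analyst attach a pass/fail threshold (and hence a confidence statement) to the load model regardless of the feeder's absolute load level.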
On Two-Dimensional ARMA Models for Image Analysis.
1980-03-24
2-D ARMA models for image analysis ... Particular emphasis is placed on restoration of noisy images using 2-D ARMA models. Computer results are ... It is concluded that the models are very effective linear models for image analysis. (Author)
NASA Technical Reports Server (NTRS)
LaValley, Brian W.; Little, Phillip D.; Walter, Chris J.
2011-01-01
This report documents the capabilities of the EDICT tools for error modeling and error propagation analysis when operating with models defined in the Architecture Analysis & Design Language (AADL). We discuss our experience using the EDICT error analysis capabilities on a model of the Scalable Processor-Independent Design for Enhanced Reliability (SPIDER) architecture that uses the Reliable Optical Bus (ROBUS). Based on these experiences we draw some initial conclusions about model based design techniques for error modeling and analysis of highly reliable computing architectures.
NASA Astrophysics Data System (ADS)
Yue, Songshan; Chen, Min; Wen, Yongning; Lu, Guonian
2016-04-01
Earth environment is extremely complicated and constantly changing; thus, it is widely accepted that the use of a single geo-analysis model cannot accurately represent all details when solving complex geo-problems. Over several years of research, numerous geo-analysis models have been developed. However, a collaborative barrier between model providers and model users still exists. The development of cloud computing has provided a new and promising approach for sharing and integrating geo-analysis models across an open web environment. To share and integrate these heterogeneous models, encapsulation studies should be conducted that are aimed at shielding original execution differences to create services which can be reused in the web environment. Although some model service standards (such as Web Processing Service (WPS) and Geo Processing Workflow (GPW)) have been designed and developed to help researchers construct model services, various problems regarding model encapsulation remain. (1) The descriptions of geo-analysis models are complicated and typically require rich-text descriptions and case-study illustrations, which are difficult to fully represent within a single web request (such as the GetCapabilities and DescribeProcess operations in the WPS standard). (2) Although Web Service technologies can be used to publish model services, model users who want to use a geo-analysis model and copy the model service into another computer still encounter problems (e.g., they cannot access the model deployment dependencies information). This study presents a strategy for encapsulating geo-analysis models to reduce problems encountered when sharing models between model providers and model users and supports the tasks with different web service standards (e.g., the WPS standard). A description method for heterogeneous geo-analysis models is studied. 
Based on the model description information, the methods for encapsulating the model-execution program to model services and for describing model-service deployment information are also included in the proposed strategy. Hence, the model-description interface, model-execution interface and model-deployment interface are studied to help model providers and model users more easily share, reuse and integrate geo-analysis models in an open web environment. Finally, a prototype system is established, and the WPS standard is employed as an example to verify the capability and practicability of the model-encapsulation strategy. The results show that it is more convenient for modellers to share and integrate heterogeneous geo-analysis models in cloud computing platforms.
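The encapsulation strategy above can be illustrated with a small sketch that pairs a model description with its deployment-dependency information and flattens it into a WPS-style DescribeProcess response. All field names and the example model are hypothetical, not taken from the paper or from the actual WPS schema:

```python
# Hypothetical description of a geo-analysis model service.
# The deployment section carries the information a model user needs
# to copy the service onto another computer.
MODEL_DESCRIPTION = {
    "identifier": "flow-accumulation-v1",   # assumed example model
    "title": "Flow accumulation geo-analysis model",
    "abstract": "Rich-text description and case-study links would go here.",
    "inputs": [{"name": "dem", "type": "GeoTIFF"}],
    "outputs": [{"name": "accumulation", "type": "GeoTIFF"}],
    "deployment": {
        "runtime": "python>=3.9",
        "dependencies": ["gdal", "numpy"],
        "entrypoint": "run_model.py",
    },
}

def describe_process(desc):
    """Flatten the rich model description into a minimal
    DescribeProcess-style response (a sketch, not the WPS XML schema)."""
    return {
        "Identifier": desc["identifier"],
        "Title": desc["title"],
        "Inputs": [i["name"] for i in desc["inputs"]],
        "Outputs": [o["name"] for o in desc["outputs"]],
    }

response = describe_process(MODEL_DESCRIPTION)
```

The point of the split is that the single-request DescribeProcess view stays small, while the full description and deployment metadata remain available through separate interfaces.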
Aircraft/Air Traffic Management Functional Analysis Model. Version 2.0; User's Guide
NASA Technical Reports Server (NTRS)
Etheridge, Melvin; Plugge, Joana; Retina, Nusrat
1998-01-01
The Aircraft/Air Traffic Management Functional Analysis Model, Version 2.0 (FAM 2.0), is a discrete event simulation model designed to support analysis of alternative concepts in air traffic management and control. FAM 2.0 was developed by the Logistics Management Institute (LMI) under a National Aeronautics and Space Administration (NASA) contract. This document provides a guide for using the model in analysis. Those interested in making enhancements or modifications to the model should consult the companion document, Aircraft/Air Traffic Management Functional Analysis Model, Version 2.0 Technical Description.
Illustrating a Model-Game-Model Paradigm for Using Human Wargames in Analysis
2017-02-01
Working Paper: Illustrating a Model-Game-Model Paradigm for Using Human Wargames in Analysis. Paul K. Davis, RAND National Security Research ... paper proposes and illustrates an analysis-centric paradigm (model-game-model, or what might be better called model-exercise-model in some cases) for ... to involve stakeholders in model development from the outset. The model-game-model paradigm was illustrated in an application to crisis planning
Muñoz-Tamayo, R; Puillet, L; Daniel, J B; Sauvant, D; Martin, O; Taghipoor, M; Blavy, P
2018-04-01
What is a good (useful) mathematical model in animal science? For models constructed for prediction purposes, the question of model adequacy (usefulness) has been traditionally tackled by statistical analysis applied to observed experimental data relative to model-predicted variables. However, little attention has been paid to analytic tools that exploit the mathematical properties of the model equations. For example, in the context of model calibration, before attempting a numerical estimation of the model parameters, we might want to know if we have any chance of success in estimating a unique best value of the model parameters from available measurements. This question of uniqueness is referred to as structural identifiability; a mathematical property that is defined on the sole basis of the model structure within a hypothetical ideal experiment determined by a setting of model inputs (stimuli) and observable variables (measurements). Structural identifiability analysis applied to dynamic models described by ordinary differential equations (ODEs) is a common practice in control engineering and system identification. This analysis demands mathematical technicalities that are beyond the academic background of animal science, which might explain the lack of pervasiveness of identifiability analysis in animal science modelling. To fill this gap, in this paper we address the analysis of structural identifiability from a practitioner perspective by capitalizing on the use of dedicated software tools. Our objectives are (i) to provide a comprehensive explanation of the structural identifiability notion for the community of animal science modelling, (ii) to assess the relevance of identifiability analysis in animal science modelling and (iii) to motivate the community to use identifiability analysis in the modelling practice (when the identifiability question is relevant). We focus our study on ODE models. 
By using illustrative examples that include published mathematical models describing lactation in cattle, we show how structural identifiability analysis can contribute to advancing mathematical modelling in animal science towards the production of useful models and, moreover, highly informative experiments via optimal experiment design. Rather than attempting to impose a systematic identifiability analysis to the modelling community during model developments, we wish to open a window towards the discovery of a powerful tool for model construction and experiment design.
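The structural-identifiability idea can be seen in a toy ODE, not one of the paper's lactation models: when two rate parameters enter the equations only as a product, observations of the output cannot distinguish them, which a simple forward-Euler simulation makes concrete:

```python
def simulate(a, b, y0=1.0, dt=0.01, steps=200):
    """Forward-Euler simulation of dy/dt = -(a*b)*y.
    In this toy model only the product a*b is structurally
    identifiable from observations of y."""
    y, traj = y0, []
    for _ in range(steps):
        y += dt * (-(a * b) * y)
        traj.append(y)
    return traj

# Two distinct parameter pairs with the same product a*b = 0.6
traj1 = simulate(a=0.2, b=3.0)
traj2 = simulate(a=0.6, b=1.0)
# Identical trajectories: no ideal experiment on y can separate a from b
identical = max(abs(u - v) for u, v in zip(traj1, traj2)) < 1e-12
```

Dedicated software tools automate this kind of check symbolically for realistic ODE systems; the remedy is typically to reparameterise (here, estimate k = a*b) or to measure an additional variable.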
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.; Nemeth, Michael P.; Hilburger, Mark W.
2004-01-01
A technology review and assessment of modeling and analysis efforts underway in support of a safe return to flight of the thermal protection system (TPS) for the Space Shuttle external tank (ET) are summarized. This review and assessment effort focuses on the structural modeling and analysis practices employed for ET TPS foam design and analysis and on identifying analysis capabilities needed in the short term and long term. The current understanding of the relationship between complex flight environments and ET TPS foam failure modes is reviewed as it relates to modeling and analysis. A literature review on modeling and analysis of TPS foam material systems is also presented. Finally, a review of modeling and analysis tools employed in the Space Shuttle Program is presented for the ET TPS acreage and close-out foam regions. This review includes existing simplified engineering analysis tools as well as finite element analysis procedures.
NASA Technical Reports Server (NTRS)
Noor, A. K.
1983-01-01
Advances in continuum modeling, progress in reduction methods, and analysis and modeling needs for large space structures are covered with specific attention given to repetitive lattice trusses. As far as continuum modeling is concerned, an effective and verified analysis capability exists for linear thermoelastic stress, bifurcation buckling, and free vibration problems of repetitive lattices. However, application of continuum modeling to nonlinear analysis needs more development. Reduction methods are very effective for bifurcation buckling and static (steady-state) nonlinear analysis. However, more work is needed to realize their full potential for nonlinear dynamic and time-dependent problems. As far as analysis and modeling needs are concerned, three areas are identified: loads determination, modeling and nonclassical behavior characteristics, and computational algorithms. The impact of new advances in computer hardware, software, integrated analysis, CAD/CAM systems, and materials technology is also discussed.
On the Relations among Regular, Equal Unique Variances, and Image Factor Analysis Models.
ERIC Educational Resources Information Center
Hayashi, Kentaro; Bentler, Peter M.
2000-01-01
Investigated the conditions under which the matrix of factor loadings from the factor analysis model with equal unique variances will give a good approximation to the matrix of factor loadings from the regular factor analysis model. Extends the results to the image factor analysis model. Discusses implications for practice. (SLD)
ON IDENTIFIABILITY OF NONLINEAR ODE MODELS AND APPLICATIONS IN VIRAL DYNAMICS
MIAO, HONGYU; XIA, XIAOHUA; PERELSON, ALAN S.; WU, HULIN
2011-01-01
Ordinary differential equations (ODE) are a powerful tool for modeling dynamic processes with wide applications in a variety of scientific fields. Over the last 2 decades, ODEs have also emerged as a prevailing tool in various biomedical research fields, especially in infectious disease modeling. In practice, it is important and necessary to determine unknown parameters in ODE models based on experimental data. Identifiability analysis is the first step in determining unknown parameters in ODE models, and such analysis techniques for nonlinear ODE models are still under development. In this article, we review identifiability analysis methodologies for nonlinear ODE models developed in the past one to two decades, including structural identifiability analysis, practical identifiability analysis and sensitivity-based identifiability analysis. Some advanced topics and ongoing research are also briefly reviewed. Finally, some examples from modeling viral dynamics of HIV, influenza and hepatitis viruses are given to illustrate how to apply these identifiability analysis methods in practice. PMID:21785515
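The sensitivity-based flavour of identifiability analysis can be sketched with a minimal viral-decay curve (an assumed toy, not a model from the article): parameters are practically identifiable when their output-sensitivity vectors are not collinear, which the Gram determinant detects:

```python
import math

def sensitivities(V0, c, times):
    """Analytic sensitivities of V(t) = V0*exp(-c*t) with respect
    to the initial load V0 and the clearance rate c."""
    s_V0 = [math.exp(-c * t) for t in times]            # dV/dV0
    s_c = [-V0 * t * math.exp(-c * t) for t in times]   # dV/dc
    return s_V0, s_c

def gram_det(u, v):
    """Determinant of the 2x2 Gram matrix of two sensitivity vectors;
    a value near zero means collinear sensitivities, i.e. a parameter
    pair that cannot be separately estimated from these observations."""
    uu = sum(x * x for x in u)
    vv = sum(x * x for x in v)
    uv = sum(x * y for x, y in zip(u, v))
    return uu * vv - uv * uv

times = [0.5, 1.0, 2.0, 4.0]        # assumed sampling times
s1, s2 = sensitivities(V0=1000.0, c=0.5, times=times)
```

For realistic nonlinear models the sensitivities are obtained numerically and the same rank/collinearity check is applied to the full sensitivity matrix.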
The Use of Object-Oriented Analysis Methods in Surety Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Craft, Richard L.; Funkhouser, Donald R.; Wyss, Gregory D.
1999-05-01
Object-oriented analysis methods have been used in the computer science arena for a number of years to model the behavior of computer-based systems. This report documents how such methods can be applied to surety analysis. By embodying the causality and behavior of a system in a common object-oriented analysis model, surety analysts can make the assumptions that underlie their models explicit and thus better communicate with system designers. Furthermore, given minor extensions to traditional object-oriented analysis methods, it is possible to automatically derive a wide variety of traditional risk and reliability analysis methods from a single common object model. Automatic model extraction helps ensure consistency among analyses and enables the surety analyst to examine a system from a wider variety of viewpoints in a shorter period of time. Thus it provides a deeper understanding of a system's behaviors and surety requirements. This report documents the underlying philosophy behind the common object model representation, the methods by which such common object models can be constructed, and the rules required to interrogate the common object model for derivation of traditional risk and reliability analysis models. The methodology is demonstrated in an extensive example problem.
NASA Technical Reports Server (NTRS)
Johnston, John D.; Parrish, Keith; Howard, Joseph M.; Mosier, Gary E.; McGinnis, Mark; Bluth, Marcel; Kim, Kevin; Ha, Hong Q.
2004-01-01
This is a continuation of a series of papers on modeling activities for JWST. The structural-thermal-optical, often referred to as "STOP", analysis process is used to predict the effect of thermal distortion on optical performance. The benchmark STOP analysis for JWST assesses the effect of an observatory slew on wavefront error. The paper begins with an overview of multi-disciplinary engineering analysis, or integrated modeling, which is a critical element of the JWST mission. The STOP analysis process is then described. This process consists of the following steps: thermal analysis, structural analysis, and optical analysis. Temperatures predicted using geometric and thermal math models are mapped to the structural finite element model in order to predict thermally-induced deformations. Motions and deformations at optical surfaces are input to optical models and optical performance is predicted using either an optical ray trace or WFE estimation techniques based on prior ray traces or first order optics. Following the discussion of the analysis process, results are presented based on models representing the design at the time of the System Requirements Review. In addition to baseline performance predictions, sensitivity studies are performed to assess modeling uncertainties. Of particular interest is the sensitivity of optical performance to uncertainties in temperature predictions and variations in metal properties. The paper concludes with a discussion of modeling uncertainty as it pertains to STOP analysis.
Underwood, Peter; Waterson, Patrick
2014-07-01
The Swiss Cheese Model (SCM) is the most popular accident causation model and is widely used throughout various industries. A debate exists in the research literature over whether the SCM remains a viable tool for accident analysis. Critics of the model suggest that it provides a sequential, oversimplified view of accidents. Conversely, proponents suggest that it embodies the concepts of systems theory, as per the contemporary systemic analysis techniques. The aim of this paper was to consider whether the SCM can provide a systems thinking approach and remain a viable option for accident analysis. To achieve this, the train derailment at Grayrigg was analysed with an SCM-based model (the ATSB accident investigation model) and two systemic accident analysis methods (AcciMap and STAMP). The analysis outputs and usage of the techniques were compared. The findings of the study showed that each model applied the systems thinking approach. However, the ATSB model and AcciMap graphically presented their findings in a more succinct manner, whereas STAMP more clearly embodied the concepts of systems theory. The study suggests that, whilst the selection of an analysis method is subject to trade-offs that practitioners and researchers must make, the SCM remains a viable model for accident analysis. Copyright © 2013 Elsevier Ltd. All rights reserved.
Using Structural Equation Modeling To Fit Models Incorporating Principal Components.
ERIC Educational Resources Information Center
Dolan, Conor; Bechger, Timo; Molenaar, Peter
1999-01-01
Considers models incorporating principal components from the perspectives of structural-equation modeling. These models include the following: (1) the principal-component analysis of patterned matrices; (2) multiple analysis of variance based on principal components; and (3) multigroup principal-components analysis. Discusses fitting these models…
NASA Technical Reports Server (NTRS)
Towner, Robert L.; Band, Jonathan L.
2012-01-01
An analysis technique was developed to compare and track mode shapes for different Finite Element Models. The technique may be applied to a variety of structural dynamics analyses, including model reduction validation (comparing unreduced and reduced models), mode tracking for various parametric analyses (e.g., launch vehicle model dispersion analysis to identify sensitivities to modal gain for Guidance, Navigation, and Control), comparing models of different mesh fidelity (e.g., a coarse model for a preliminary analysis compared to a higher-fidelity model for a detailed analysis) and mode tracking for a structure with properties that change over time (e.g., a launch vehicle from liftoff through end-of-burn, with propellant being expended during the flight). Mode shapes for different models are compared and tracked using several numerical indicators, including traditional Cross-Orthogonality and Modal Assurance Criteria approaches, as well as numerical indicators obtained by comparing modal strain energy and kinetic energy distributions. This analysis technique has been used to reliably identify correlated mode shapes for complex Finite Element Models that would otherwise be difficult to compare using traditional techniques. This improved approach also utilizes an adaptive mode tracking algorithm that allows for automated tracking when working with complex models and/or comparing a large group of models.
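The traditional Modal Assurance Criterion step of the technique above can be sketched directly; the mode-shape vectors are invented for illustration, and the strain/kinetic-energy indicators and the adaptive tracking algorithm are omitted:

```python
def mac(phi_a, phi_b):
    """Modal Assurance Criterion between two real mode-shape vectors:
    MAC = |phi_a . phi_b|^2 / ((phi_a . phi_a)(phi_b . phi_b)).
    Values near 1 indicate correlated (matching) mode shapes."""
    num = sum(a * b for a, b in zip(phi_a, phi_b)) ** 2
    den = sum(a * a for a in phi_a) * sum(b * b for b in phi_b)
    return num / den

def track_modes(modes_ref, modes_new):
    """Pair each reference mode with the new-model mode of highest MAC,
    a minimal sketch of the mode-tracking step."""
    pairs = []
    for i, ref in enumerate(modes_ref):
        j_best = max(range(len(modes_new)), key=lambda j: mac(ref, modes_new[j]))
        pairs.append((i, j_best, mac(ref, modes_new[j_best])))
    return pairs

# Reference modes and a new model whose mode order has swapped
ref = [[1.0, 2.0, 1.0], [1.0, 0.0, -1.0]]
new = [[1.1, 0.1, -0.9], [0.9, 2.1, 1.0]]
pairs = track_modes(ref, new)
```

Here the tracker correctly detects that the modes crossed (reference mode 0 matches new mode 1 and vice versa), which is exactly the situation that defeats comparison by mode number alone.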
Chappell, Michael A; Woolrich, Mark W; Petersen, Esben T; Golay, Xavier; Payne, Stephen J
2013-05-01
Amongst the various implementations of arterial spin labeling MRI methods for quantifying cerebral perfusion, the QUASAR method is unique. By using a combination of labeling with and without flow suppression gradients, the QUASAR method offers the separation of macrovascular and tissue signals. This permits local arterial input functions to be defined and "model-free" analysis, using numerical deconvolution, to be used. However, it remains unclear whether arterial spin labeling data are best treated using model-free or model-based analysis. This work provides a critical comparison of these two approaches for QUASAR arterial spin labeling in the healthy brain. An existing two-component (arterial and tissue) model was extended to the mixed flow suppression scheme of QUASAR to provide an optimal model-based analysis. The model-based analysis was extended to incorporate dispersion of the labeled bolus, generally regarded as the major source of discrepancy between the two analysis approaches. Model-free and model-based analyses were compared for perfusion quantification including absolute measurements, uncertainty estimation, and spatial variation in cerebral blood flow estimates. Major sources of discrepancies between model-free and model-based analysis were attributed to the effects of dispersion and the degree to which the two methods can separate macrovascular and tissue signal. Copyright © 2012 Wiley Periodicals, Inc.
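The "model-free" deconvolution at the heart of the comparison can be sketched in miniature. This is an assumed illustration, not the QUASAR pipeline: it solves the discrete convolution for the residue function by forward substitution, with no SVD regularisation, so it is suitable only for noise-free toy data:

```python
def deconvolve(tissue, aif, dt):
    """Solve tissue = dt * conv(aif, residue) for the residue function
    by forward substitution on the lower-triangular Toeplitz system.
    Perfusion is then estimated as max(residue), as in standard
    deconvolution analyses. Requires aif[0] != 0."""
    n = len(tissue)
    residue = [0.0] * n
    for i in range(n):
        acc = sum(aif[i - j] * residue[j] for j in range(i))
        residue[i] = (tissue[i] / dt - acc) / aif[0]
    return residue

dt = 1.0
aif = [1.0, 0.5, 0.25, 0.125, 0.0625]            # toy arterial input function
true_res = [0.6, 0.3, 0.15, 0.075, 0.0375]       # flow-scaled residue; peak = flow
# Synthesize the tissue curve by discrete convolution, then invert it
tissue = [dt * sum(aif[i - j] * true_res[j] for j in range(i + 1)) for i in range(5)]
est = deconvolve(tissue, aif, dt)
```

A model-based analysis would instead fit a kinetic model (with dispersion terms, as in the paper's extension) to the same curves, which is where the two approaches diverge in the presence of noise and bolus dispersion.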
Aircraft/Air Traffic Management Functional Analysis Model: Technical Description. 2.0
NASA Technical Reports Server (NTRS)
Etheridge, Melvin; Plugge, Joana; Retina, Nusrat
1998-01-01
The Aircraft/Air Traffic Management Functional Analysis Model, Version 2.0 (FAM 2.0), is a discrete event simulation model designed to support analysis of alternative concepts in air traffic management and control. FAM 2.0 was developed by the Logistics Management Institute (LMI) under a National Aeronautics and Space Administration (NASA) contract. This document provides a technical description of FAM 2.0 and its computer files to enable the modeler and programmer to make enhancements or modifications to the model. Those interested in a guide for using the model in analysis should consult the companion document, Aircraft/Air Traffic Management Functional Analysis Model, Version 2.0 Users Manual.
National Centers for Environmental Prediction
Within EMC's marine modeling and analysis teams, the Ice group works on sea ice analysis from satellite data, sea ice modeling, and ice-atmosphere-ocean modeling for the polar regions and the Great Lakes.
Practical Use of Computationally Frugal Model Analysis Methods
Hill, Mary C.; Kavetski, Dmitri; Clark, Martyn; ...
2015-03-21
Computationally frugal methods of model analysis can provide substantial benefits when developing models of groundwater and other environmental systems. Model analysis includes ways to evaluate model adequacy and to perform sensitivity and uncertainty analysis. Frugal methods typically require 10s of parallelizable model runs; their convenience allows for other uses of the computational effort. We suggest that model analysis be posed as a set of questions used to organize methods that range from frugal to expensive (requiring 10,000 model runs or more). This encourages focus on method utility, even when methods have starkly different theoretical backgrounds. We note that many frugal methods are more useful when unrealistic process-model nonlinearities are reduced. Inexpensive diagnostics are identified for determining when frugal methods are advantageous. Examples from the literature are used to demonstrate local methods and the diagnostics. We suggest that the greater use of computationally frugal model analysis methods would allow questions such as those posed in this work to be addressed more routinely, allowing the environmental sciences community to obtain greater scientific insight from the many ongoing and future modeling efforts.
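A frugal local method can be sketched as one-at-a-time perturbation: n+1 model runs for n parameters, yielding dimensionless sensitivities. The toy drawdown model and its parameters are assumptions for illustration, not an example from the paper:

```python
import math

def local_sensitivity(model, params, rel_step=0.01):
    """Computationally frugal one-at-a-time sensitivity analysis via
    scaled finite differences: n+1 model runs for n parameters."""
    base = model(params)
    sens = {}
    for name, value in params.items():
        perturbed = dict(params)
        perturbed[name] = value * (1.0 + rel_step)
        # dimensionless sensitivity: % change in output per % change in parameter
        sens[name] = (model(perturbed) - base) / base / rel_step
    return sens

def drawdown(p):
    """Toy groundwater-style model (hypothetical): drawdown ~ Q / (4*pi*T)."""
    return p["Q"] / (4.0 * math.pi * p["T"])

s = local_sensitivity(drawdown, {"Q": 100.0, "T": 50.0})
```

Because the output is linear in Q and inversely proportional to T, the scaled sensitivities come out near +1 and -1, the kind of quick diagnostic that indicates whether a parameter even matters before any expensive global analysis is attempted.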
Using multiple group modeling to test moderators in meta-analysis.
Schoemann, Alexander M
2016-12-01
Meta-analysis is a popular and flexible analysis that can be fit in many modeling frameworks. Two methods of fitting meta-analyses that are growing in popularity are structural equation modeling (SEM) and multilevel modeling (MLM). By using SEM or MLM to fit a meta-analysis, researchers have access to powerful techniques associated with SEM and MLM. This paper details how to use one such technique, multiple group analysis, to test categorical moderators in meta-analysis. In a multiple group meta-analysis a model is fit to each level of the moderator simultaneously. By constraining parameters across groups any model parameter can be tested for equality. Using multiple groups to test for moderators is especially relevant in random-effects meta-analysis where both the mean and the between-studies variance of the effect size may be compared across groups. A simulation study and the analysis of a real data set are used to illustrate multiple group modeling with both SEM and MLM. Issues related to multiple group meta-analysis and future directions for research are discussed. Copyright © 2016 John Wiley & Sons, Ltd.
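The multiple-group idea can be shown in miniature: pool each moderator level separately, then test whether the pooled effects differ. This sketch uses simple fixed-effect inverse-variance pooling with invented effect sizes (a random-effects version would add a between-studies variance estimate, e.g. DerSimonian-Laird), and is not the paper's SEM/MLM implementation:

```python
import math

def pooled_effect(effects, variances):
    """Inverse-variance (fixed-effect) pooled effect and its variance."""
    weights = [1.0 / v for v in variances]
    est = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    return est, 1.0 / sum(weights)

def moderator_z_test(group_a, group_b):
    """Fit each moderator level separately, then z-test the difference
    of the pooled effects -- the multiple-group comparison in miniature."""
    ea, va = pooled_effect(*group_a)
    eb, vb = pooled_effect(*group_b)
    return (ea - eb) / math.sqrt(va + vb)

# Hypothetical effect sizes (d) and sampling variances per study
group_a = ([0.30, 0.25, 0.35], [0.01, 0.02, 0.015])
group_b = ([0.10, 0.05, 0.12], [0.01, 0.02, 0.012])
z = moderator_z_test(group_a, group_b)
```

In the SEM/MLM framing, the same comparison is made by fitting the model to both groups simultaneously and constraining the mean (or variance) parameter to equality, then comparing model fit.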
NASA Technical Reports Server (NTRS)
Amundsen, R. M.; Feldhaus, W. S.; Little, A. D.; Mitchum, M. V.
1995-01-01
Electronic integration of design and analysis processes was achieved and refined at Langley Research Center (LaRC) during the development of an optical bench for a laser-based aerospace experiment. Mechanical design has been integrated with thermal, structural and optical analyses. Electronic import of the model geometry eliminates the repetitive steps of geometry input to develop each analysis model, leading to faster and more accurate analyses. Guidelines for integrated model development are given. This integrated analysis process has been built around software that was already in use by designers and analysts at LaRC. The process as currently implemented uses Pro/Engineer for design, Pro/Manufacturing for fabrication, PATRAN for solid modeling, NASTRAN for structural analysis, SINDA-85 and P/Thermal for thermal analysis, and Code V for optical analysis. Currently, the only analysis model to be built manually is the Code V model; all others can be imported from the Pro/E geometry. The translator from PATRAN results to Code V optical analysis (PATCOD) was developed and tested at LaRC. Directions for use of the translator and other models are given.
NASA Astrophysics Data System (ADS)
Ye, M.; Chen, Z.; Shi, L.; Zhu, Y.; Yang, J.
2017-12-01
Nitrogen reactive transport modeling is subject to uncertainty in model parameters, structures, and scenarios. While global sensitivity analysis is a vital tool for identifying the parameters important to nitrogen reactive transport, conventional global sensitivity analysis only considers parametric uncertainty. This may result in inaccurate selection of important parameters, because parameter importance may vary under different models and modeling scenarios. By using a recently developed variance-based global sensitivity analysis method, this paper identifies important parameters with simultaneous consideration of parametric uncertainty, model uncertainty, and scenario uncertainty. In a numerical example of nitrogen reactive transport modeling, a combination of three scenarios of soil temperature and two scenarios of soil moisture leads to a total of six scenarios. Four alternative models are used to evaluate reduction functions used for calculating actual rates of nitrification and denitrification. The model uncertainty is tangled with scenario uncertainty, as the reduction functions depend on soil temperature and moisture content. The results of sensitivity analysis show that parameter importance varies substantially between different models and modeling scenarios, which may lead to inaccurate selection of important parameters if model and scenario uncertainties are not considered. This problem is avoided by using the new method of sensitivity analysis in the context of model averaging and scenario averaging. The new method of sensitivity analysis can be applied to other problems of contaminant transport modeling when model uncertainty and/or scenario uncertainty are present.
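The scenario-averaging idea can be made concrete with a linear toy stand-in for the nitrogen model (an assumption for illustration; for linear additive models the first-order variance-based indices are analytic rather than sampled):

```python
def first_order_index(coeffs, variances, i):
    """First-order variance-based sensitivity index for a linear additive
    model y = sum_k a_k * x_k with independent inputs:
    S_i = a_i^2 Var(x_i) / sum_k a_k^2 Var(x_k)."""
    contrib = [a * a * v for a, v in zip(coeffs, variances)]
    return contrib[i] / sum(contrib)

def scenario_averaged_index(scenarios, weights, variances, i):
    """Average the index over scenarios (e.g. soil temperature/moisture
    settings), weighted by scenario probability."""
    return sum(w * first_order_index(c, variances, i)
               for w, c in zip(weights, scenarios))

var_x = [1.0, 1.0]        # input variances
warm = [2.0, 1.0]         # model coefficients under a 'warm' scenario
cold = [1.0, 2.0]         # model coefficients under a 'cold' scenario
s0_warm = first_order_index(warm, var_x, 0)
s0_avg = scenario_averaged_index([warm, cold], [0.5, 0.5], var_x, 0)
```

Parameter 0 is dominant in one scenario (index 0.8) and minor in the other (0.2); averaging over scenarios (here to 0.5) is exactly the guard against picking "important" parameters from a single model or scenario.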
Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars
2016-04-12
A building energy analysis system includes a building component library configured to store a plurality of building components, a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and using selected building components of the plurality of building components stored in the building component library, a building analysis engine configured to operate the building model and generate a baseline energy model of the building under analysis and further configured to apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models, and a recommendation tool configured to assess the one or more optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.
ModelMate - A graphical user interface for model analysis
Banta, Edward R.
2011-01-01
ModelMate is a graphical user interface designed to facilitate use of model-analysis programs with models. This initial version of ModelMate supports one model-analysis program, UCODE_2005, and one model software program, MODFLOW-2005. ModelMate can be used to prepare input files for UCODE_2005, run UCODE_2005, and display analysis results. A link to the GW_Chart graphing program facilitates visual interpretation of results. ModelMate includes capabilities for organizing directories used with the parallel-processing capabilities of UCODE_2005 and for maintaining files in those directories to be identical to a set of files in a master directory. ModelMate can be used on its own or in conjunction with ModelMuse, a graphical user interface for MODFLOW-2005 and PHAST.
[Model-based biofuels system analysis: a review].
Chang, Shiyan; Zhang, Xiliang; Zhao, Lili; Ou, Xunmin
2011-03-01
Model-based system analysis is an important tool for evaluating the potential and impacts of biofuels, and for drafting biofuels technology roadmaps and targets. The broad reach of the biofuels supply chain requires that biofuels system analyses span a range of disciplines, including agriculture/forestry, energy, economics, and the environment. Here we reviewed various models developed for or applied to modeling biofuels, and presented a critical analysis of Agriculture/Forestry System Models, Energy System Models, Integrated Assessment Models, Micro-level Cost, Energy and Emission Calculation Models, and Specific Macro-level Biofuel Models. We focused on the models' strengths, weaknesses, and applicability, facilitating the selection of a suitable type of model for specific issues. Such an analysis was a prerequisite for future biofuels system modeling, and represented a valuable resource for researchers and policy makers.
Model Based Analysis and Test Generation for Flight Software
NASA Technical Reports Server (NTRS)
Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep
2009-01-01
We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.
The application of sensitivity analysis to models of large scale physiological systems
NASA Technical Reports Server (NTRS)
Leonard, J. I.
1974-01-01
A survey of the literature of sensitivity analysis as it applies to biological systems is reported, along with a brief development of sensitivity theory. A simple population model and a more complex thermoregulatory model illustrate the investigatory techniques and interpretation of parameter sensitivity analysis. The role of sensitivity analysis in validating and verifying models, in identifying relative parameter influence, and in estimating errors in model behavior due to uncertainty in input data is presented. This analysis is valuable to the simulationist and the experimentalist in allocating resources for data collection. A method is presented for reducing highly complex, nonlinear models to simple linear algebraic models that could be useful for making rapid, first-order calculations of system behavior.
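The model-reduction idea can be sketched with a simple population model (a logistic toy, assumed for illustration): linearizing about the equilibrium gives a linear model that tracks the full nonlinear dynamics closely near the equilibrium, the kind of first-order surrogate the survey describes:

```python
def logistic_step(y, r, K, dt):
    """One forward-Euler step of the full nonlinear logistic model
    dy/dt = r*y*(1 - y/K)."""
    return y + dt * r * y * (1.0 - y / K)

def linearized_step(y, r, K, dt):
    """One step of the linearization about the equilibrium y = K:
    dy/dt ~ -r*(y - K), a simple linear surrogate for rapid,
    first-order calculations of system behavior."""
    return y + dt * (-r * (y - K))

r, K, dt = 0.5, 100.0, 0.01
y_full, y_lin = 90.0, 90.0    # start near the carrying capacity
for _ in range(500):
    y_full = logistic_step(y_full, r, K, dt)
    y_lin = linearized_step(y_lin, r, K, dt)
```

Both trajectories relax toward the carrying capacity and stay within a small fraction of each other, which is why such linear reductions are adequate for quick back-of-the-envelope estimates while the full model is reserved for behavior far from equilibrium.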
NASA Technical Reports Server (NTRS)
Parker, K. C.; Torian, J. G.
1980-01-01
A sample environmental control and life support model performance analysis using the environmental analysis routines library is presented. An example of a complete model set up and execution is provided. The particular model was synthesized to utilize all of the component performance routines and most of the program options.
Airflow and Particle Transport Through Human Airways: A Systematic Review
NASA Astrophysics Data System (ADS)
Kharat, S. B.; Deoghare, A. B.; Pandey, K. M.
2017-08-01
This paper reviews the relevant literature on two-phase analysis of air and particle flow through human airways. Emphasis is placed on elaborating the two steps involved in such an analysis: geometric modelling methods and mathematical models. The first part describes the various approaches followed in constructing an airway model upon which analyses are conducted. Two broad categories of geometric modelling, simplified modelling and accurate modelling using medical scans, are discussed briefly, covering the ease and limitations of simplified models and examples of CT-based models. The later part of the review summarizes the different mathematical models implemented by researchers, with the models used for the air and particle phases elaborated separately.
NASA Technical Reports Server (NTRS)
Rasmussen, Robert; Bennett, Matthew
2006-01-01
The State Analysis Database Tool software establishes a productive environment for collaboration among software and system engineers engaged in the development of complex interacting systems. The tool embodies State Analysis, a model-based system engineering methodology founded on a state-based control architecture (see figure). A state represents a momentary condition of an evolving system, and a model may describe how a state evolves and is affected by other states. The State Analysis methodology is a process for capturing system and software requirements in the form of explicit models and states, and defining goal-based operational plans consistent with the models. Requirements, models, and operational concerns have traditionally been documented in a variety of system engineering artifacts that address different aspects of a mission's lifecycle. In State Analysis, requirements, models, and operations information are State Analysis artifacts that are consistent and stored in a State Analysis Database. The tool includes a back-end database, a multi-platform front-end client, and Web-based administrative functions. The tool is structured to prompt an engineer to follow the State Analysis methodology, to encourage state discovery and model description, and to make software requirements and operations plans consistent with model descriptions.
Crash Certification by Analysis - Are We There Yet?
NASA Technical Reports Server (NTRS)
Jackson, Karen E.; Fasanella, Edwin L.; Lyle, Karen H.
2006-01-01
This paper addresses the issue of crash certification by analysis. This broad topic encompasses many ancillary issues including model validation procedures, uncertainty in test data and analysis models, probabilistic techniques for test-analysis correlation, verification of the mathematical formulation, and establishment of appropriate qualification requirements. This paper will focus on certification requirements for crashworthiness of military helicopters; capabilities of the current analysis codes used for crash modeling and simulation, including some examples of simulations from the literature to illustrate the current approach to model validation; and future directions needed to achieve "crash certification by analysis."
Guikema, Seth
2012-07-01
Intelligent adversary modeling has become increasingly important for risk analysis, and a number of different approaches have been proposed for incorporating intelligent adversaries in risk analysis models. However, these approaches are based on a range of often-implicit assumptions about the desirable properties of intelligent adversary models. This "Perspective" paper aims to further risk analysis for situations involving intelligent adversaries by fostering a discussion of the desirable properties for these models. A set of four basic necessary conditions for intelligent adversary models is proposed and discussed. These are: (1) behavioral accuracy to the degree possible, (2) computational tractability to support decision making, (3) explicit consideration of uncertainty, and (4) ability to gain confidence in the model. It is hoped that these suggested necessary conditions foster discussion about the goals and assumptions underlying intelligent adversary modeling in risk analysis. © 2011 Society for Risk Analysis.
Modeling, Analysis, and Optimization Issues for Large Space Structures
NASA Technical Reports Server (NTRS)
Pinson, L. D. (Compiler); Amos, A. K. (Compiler); Venkayya, V. B. (Compiler)
1983-01-01
Topics concerning the modeling, analysis, and optimization of large space structures are discussed including structure-control interaction, structural and structural dynamics modeling, thermal analysis, testing, and design.
Rasmussen's legacy: A paradigm change in engineering for safety.
Leveson, Nancy G
2017-03-01
This paper describes three applications of Rasmussen's ideas to systems engineering practice. The first is the application of the abstraction hierarchy to engineering specifications, particularly requirements specification. The second is the use of Rasmussen's ideas in safety modeling and analysis to create a new, more powerful type of accident causation model that extends traditional models to better handle human-operated, software-intensive, sociotechnical systems. Because this new model has a formal, mathematical foundation built on systems theory (as was Rasmussen's original model), new modeling and analysis tools become possible. The third application is to engineering hazard analysis. Engineers have traditionally either omitted humans from consideration in system hazard analysis or have treated them rather superficially, for example, by assuming that they behave randomly. Applying Rasmussen's model of human error to a powerful new hazard analysis technique allows human behavior to be included in engineering hazard analysis. Copyright © 2016 Elsevier Ltd. All rights reserved.
COBRA ATD minefield detection model initial performance analysis
NASA Astrophysics Data System (ADS)
Holmes, V. Todd; Kenton, Arthur C.; Hilton, Russell J.; Witherspoon, Ned H.; Holloway, John H., Jr.
2000-08-01
A statistical performance analysis of the USMC Coastal Battlefield Reconnaissance and Analysis (COBRA) Minefield Detection (MFD) Model has been performed in support of the COBRA ATD Program under execution by the Naval Surface Warfare Center/Dahlgren Division/Coastal Systems Station. This analysis uses the Veridian ERIM International MFD model from the COBRA Sensor Performance Evaluation and Computational Tools for Research Analysis modeling toolbox and a collection of multispectral mine detection algorithm response distributions for mines and minelike clutter objects. These mine detection response distributions were generated from actual COBRA ATD test missions over littoral zone minefields. This analysis serves to validate both the utility and effectiveness of the COBRA MFD Model as a predictive MFD performance tool. COBRA ATD minefield detection model algorithm performance results based on a simulated baseline minefield detection scenario are presented, as well as results of an MFD model algorithm parametric sensitivity study.
UNCERTAINTY ANALYSIS IN WATER QUALITY MODELING USING QUAL2E
A strategy for incorporating uncertainty analysis techniques (sensitivity analysis, first order error analysis, and Monte Carlo simulation) into the mathematical water quality model QUAL2E is described. The model, named QUAL2E-UNCAS, automatically selects the input variables or p...
Developing a Learning Progression for Number Sense Based on the Rule Space Model in China
ERIC Educational Resources Information Center
Chen, Fu; Yan, Yue; Xin, Tao
2017-01-01
The current study focuses on developing the learning progression of number sense for primary school students, and it applies a cognitive diagnostic model, the rule space model, to data analysis. The rule space model analysis firstly extracted nine cognitive attributes and their hierarchy model from the analysis of previous research and the…
NASA Technical Reports Server (NTRS)
Joshi, Anjali; Heimdahl, Mats P. E.; Miller, Steven P.; Whalen, Mike W.
2006-01-01
System safety analysis techniques are well established and are used extensively during the design of safety-critical systems. Despite this, most of the techniques are highly subjective and dependent on the skill of the practitioner. Since these analyses are usually based on an informal system model, it is unlikely that they will be complete, consistent, and error free. In fact, the lack of precise models of the system architecture and its failure modes often forces the safety analysts to devote much of their effort to gathering architectural details about the system behavior from several sources and embedding this information in the safety artifacts such as the fault trees. This report describes Model-Based Safety Analysis, an approach in which the system and safety engineers share a common system model created using a model-based development process. By extending the system model with a fault model as well as relevant portions of the physical system to be controlled, automated support can be provided for much of the safety analysis. We believe that by using a common model for both system and safety engineering and automating parts of the safety analysis, we can both reduce the cost and improve the quality of the safety analysis. Here we present our vision of model-based safety analysis and discuss the advantages and challenges in making this approach practical.
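Part of what "automated support can be provided for much of the safety analysis" means can be shown with a toy fault-tree evaluation. This is a minimal sketch under our own assumptions, not the report's tool: a static fault tree of AND/OR gates over independent basic-event failure probabilities, with hypothetical component names and made-up probabilities.

```python
def or_gate(*probs):
    """P(A or B or ...) for independent events: 1 - prod(1 - p)."""
    q = 1.0
    for p in probs:
        q *= (1.0 - p)
    return 1.0 - q

def and_gate(*probs):
    """P(A and B and ...) for independent events: prod(p)."""
    q = 1.0
    for p in probs:
        q *= p
    return q

# Hypothetical tree: the top event occurs if the sensor fails OR
# both redundant actuators fail.
p_sensor, p_act1, p_act2 = 1e-3, 1e-2, 1e-2
p_top = or_gate(p_sensor, and_gate(p_act1, p_act2))
```

Once the fault model lives alongside the system model, this kind of quantification can be regenerated automatically whenever the architecture changes, which is the cost and quality benefit the report argues for.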
Combining Static Analysis and Model Checking for Software Analysis
NASA Technical Reports Server (NTRS)
Brat, Guillaume; Visser, Willem; Clancy, Daniel (Technical Monitor)
2003-01-01
We present an iterative technique in which model checking and static analysis are combined to verify large software systems. The role of the static analysis is to compute partial order information which the model checker uses to reduce the state space. During exploration, the model checker also computes aliasing information that it gives to the static analyzer, which can then refine its analysis. The result of this refined analysis is then fed back to the model checker, which updates its partial order reduction. At each step of this iterative process, the static analysis computes optimistic information which results in an unsafe reduction of the state space. However, we show that the process converges to a fixed point at which time the partial order information is safe and the whole state space is explored.
Modeling and Analysis of Wrinkled Membranes: An Overview
NASA Technical Reports Server (NTRS)
Yang, B.; Ding, H.; Lou, M.; Fang, H.; Broduer, Steve (Technical Monitor)
2001-01-01
Thin-film membranes are basic elements of a variety of space inflatable/deployable structures. Wrinkling degrades the performance and reliability of these membrane structures, and hence has been a topic of continued interest. Wrinkling analysis of membranes for general geometry and arbitrary boundary conditions is quite challenging. The objective of this presentation is two-fold. First, the existing models of wrinkled membranes and related numerical solution methods are reviewed. The important issues discussed are the capability of a membrane model to characterize taut, wrinkled, and slack states of membranes in a consistent and physically reasonable manner; the ability of a wrinkling analysis method to predict the formation and growth of wrinkled regions, and to determine out-of-plane deformation and wrinkled waves; the convergence of a numerical solution method for wrinkling analysis; and the compatibility of a wrinkling analysis with general-purpose finite element codes. Based on this review, several open issues in the modeling and analysis of wrinkled membranes to be addressed in future research are summarized. The second objective is to introduce a newly developed membrane model of two viable parameters (2-VP model) and an associated parametric finite element method (PFEM) for wrinkling analysis. The innovations and advantages of the proposed membrane model and PFEM-based wrinkling analysis are: (1) via a unified stress-strain relation, the 2-VP model treats the taut, wrinkled, and slack states of membranes consistently; (2) the PFEM-based wrinkling analysis has guaranteed convergence; (3) the 2-VP model along with the PFEM is capable of predicting membrane out-of-plane deformations; and (4) the PFEM can be integrated into any existing finite element code. Preliminary numerical examples are also included to demonstrate the 2-VP model and the PFEM-based wrinkling analysis approach.
NASA Astrophysics Data System (ADS)
Tian, F.; Sivapalan, M.; Li, H.; Hu, H.
2007-12-01
The importance of diagnostic analysis of hydrological models is increasingly recognized by the scientific community (M. Sivapalan, et al., 2003; H. V. Gupta, et al., 2007). Model diagnosis refers to model structures and parameters being identified not only by statistical comparison of system state variables and outputs but also by process understanding in a specific watershed. Process understanding can be gained by the analysis of observational data and model results at the specific watershed as well as through regionalization. Although remote sensing technology can provide valuable data about the inputs, state variables, and outputs of the hydrological system, observational rainfall-runoff data still constitute the most accurate, reliable, and direct component of hydrology-related databases. One critical question in model diagnostic analysis is, therefore, what signature characteristics can be extracted from rainfall and runoff data. To date only a few studies have focused on this question, such as Merz et al. (2006) and Lana-Renault et al. (2007), yet none of them has related event analysis to model diagnosis in an explicit, rigorous, and systematic manner. Our work focuses on the identification of the dominant runoff generation mechanisms from event analysis of rainfall-runoff data, including correlation analysis and analysis of timing patterns. The correlation analysis involves the identification of the complex relationship among rainfall depth, intensity, runoff coefficient, and antecedent conditions, and the timing pattern analysis aims to identify the clustering pattern of runoff events in relation to the patterns of rainfall events. Our diagnostic analysis illustrates the changing pattern of runoff generation mechanisms in the DMIP2 test watersheds located in the Oklahoma region, which is also well reproduced by numerical simulations based on the TsingHua Representative Elementary Watershed (THREW) model.
The result suggests the usefulness of rainfall-runoff event analysis for model development as well as model diagnostics.
[How to fit and interpret multilevel models using SPSS].
Pardo, Antonio; Ruiz, Miguel A; San Martín, Rafael
2007-05-01
Hierarchical or multilevel models are used to analyse data when cases belong to known groups and sample units are selected both from the individual level and from the group level. In this work, the multilevel models most commonly discussed in the statistical literature are described, explaining how to fit these models using the SPSS program (any version from 11 onward) and how to interpret the outcomes of the analysis. Five particular models are described, fitted, and interpreted: (1) one-way analysis of variance with random effects, (2) regression analysis with means-as-outcomes, (3) one-way analysis of covariance with random effects, (4) regression analysis with random coefficients, and (5) regression analysis with means- and slopes-as-outcomes. All models are explained, trying to make them understandable to researchers in health and behaviour sciences.
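Model (1), one-way analysis of variance with random effects, can also be fitted outside SPSS. As an illustrative sketch in plain Python (the classical ANOVA estimator for balanced data, not SPSS's mixed-model machinery), the variance components and the intraclass correlation are:

```python
def anova_random_effects(groups):
    """groups: list of equal-sized lists of scores (balanced design).
    Returns (between-group variance, within-group variance, ICC)."""
    j = len(groups)          # number of groups
    n = len(groups[0])       # cases per group
    means = [sum(g) / n for g in groups]
    grand = sum(means) / j
    # Mean squares between and within groups.
    msb = n * sum((m - grand) ** 2 for m in means) / (j - 1)
    msw = sum((x - m) ** 2 for g, m in zip(groups, means) for x in g) / (j * (n - 1))
    # ANOVA estimator of the group-level variance component.
    var_between = max((msb - msw) / n, 0.0)
    icc = var_between / (var_between + msw)
    return var_between, msw, icc
```

The ICC is the share of total variance attributable to group membership; a non-negligible value is the usual justification for moving from ordinary regression to a multilevel model.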
Karabatsos, George
2017-02-01
Most of applied statistics involves regression analysis of data. In practice, it is important to specify a regression model that has minimal assumptions which are not violated by data, to ensure that statistical inferences from the model are informative and not misleading. This paper presents a stand-alone and menu-driven software package, Bayesian Regression: Nonparametric and Parametric Models, constructed from MATLAB Compiler. Currently, this package gives the user a choice from 83 Bayesian models for data analysis. They include 47 Bayesian nonparametric (BNP) infinite-mixture regression models; 5 BNP infinite-mixture models for density estimation; and 31 normal random effects models (HLMs), including normal linear models. Each of the 78 regression models handles either a continuous, binary, or ordinal dependent variable, and can handle multi-level (grouped) data. All 83 Bayesian models can handle the analysis of weighted observations (e.g., for meta-analysis), and the analysis of left-censored, right-censored, and/or interval-censored data. Each BNP infinite-mixture model has a mixture distribution assigned one of various BNP prior distributions, including priors defined by either the Dirichlet process, Pitman-Yor process (including the normalized stable process), beta (two-parameter) process, normalized inverse-Gaussian process, geometric weights prior, dependent Dirichlet process, or the dependent infinite-probits prior. The software user can mouse-click to select a Bayesian model and perform data analysis via Markov chain Monte Carlo (MCMC) sampling. After the sampling completes, the software automatically opens text output that reports MCMC-based estimates of the model's posterior distribution and model predictive fit to the data. Additional text and/or graphical output can be generated by mouse-clicking other menu options. 
This includes output of MCMC convergence analyses, and estimates of the model's posterior predictive distribution, for selected functionals and values of covariates. The software is illustrated through the BNP regression analysis of real data.
Integrating Cognitive Task Analysis into Instructional Systems Development.
ERIC Educational Resources Information Center
Ryder, Joan M.; Redding, Richard E.
1993-01-01
Discussion of instructional systems development (ISD) focuses on recent developments in cognitive task analysis and describes the Integrated Task Analysis Model, a framework for integrating cognitive and behavioral task analysis methods within the ISD model. Three components of expertise are analyzed: skills, knowledge, and mental models. (96…
SMP: A solid modeling program version 2.0
NASA Technical Reports Server (NTRS)
Randall, D. P.; Jones, K. H.; Vonofenheim, W. H.; Gates, R. L.; Matthews, C. G.
1986-01-01
The Solid Modeling Program (SMP) provides the capability to model complex solid objects through the composition of primitive geometric entities. In addition to the construction of solid models, SMP has extensive facilities for model editing, display, and analysis. The geometric model produced by the software system can be output in a format compatible with existing analysis programs such as PATRAN-G. The present version of the SMP software supports six primitives: boxes, cones, spheres, paraboloids, tori, and trusses. The details for creating each of the major primitive types are presented. The analysis capabilities of SMP, including interfaces to existing analysis programs, are discussed.
Analytic uncertainty and sensitivity analysis of models with input correlations
NASA Astrophysics Data System (ADS)
Zhu, Yueying; Wang, Qiuping A.; Li, Wei; Cai, Xu
2018-03-01
Probabilistic uncertainty analysis is a common means of evaluating mathematical models. In mathematical modeling, the uncertainty in input variables is specified through distribution laws. Its contribution to the uncertainty in the model response is usually analyzed by assuming that the input variables are independent of each other. However, correlated parameters often occur in practical applications. In the present paper, an analytic method is developed for the uncertainty and sensitivity analysis of models in the presence of input correlations. With the method, it is straightforward to identify the importance of the independence and correlations of input variables in determining the model response. This allows one to decide whether or not the input correlations should be considered in practice. Numerical examples demonstrate the effectiveness and validity of our analytic method in the analysis of general models. A practical application of the method to the uncertainty and sensitivity analysis of a deterministic HIV model is also presented.
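The paper's central point, that ignoring input correlations misstates output uncertainty, is easy to check numerically. The sketch below uses our own toy model, not the authors' analytic method: for Y = X1 + X2 with unit-variance inputs of correlation rho, Var(Y) = 2 + 2*rho, so an independence assumption (rho = 0) underestimates the output variance whenever rho > 0.

```python
import random

def output_variance(rho, n=100_000, seed=1):
    """Monte Carlo estimate of Var(X1 + X2) when corr(X1, X2) = rho,
    with correlated standard normals built from independent draws."""
    rng = random.Random(seed)
    scale = (1.0 - rho * rho) ** 0.5
    total = total_sq = 0.0
    for _ in range(n):
        z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
        x1 = z1
        x2 = rho * z1 + scale * z2   # unit variance, corr(x1, x2) = rho
        y = x1 + x2
        total += y
        total_sq += y * y
    mean = total / n
    return total_sq / n - mean * mean
```

With rho = 0.5 the estimate is near the analytic value 3.0, half again as large as the variance of 2.0 obtained under an independence assumption.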
USDA-ARS?s Scientific Manuscript database
This paper provides an overview of the Model Optimization, Uncertainty, and SEnsitivity Analysis (MOUSE) software application, an open-source, Java-based toolbox of visual and numerical analysis components for the evaluation of environmental models. MOUSE is based on the OPTAS model calibration syst...
NASA Technical Reports Server (NTRS)
Lovejoy, Andrew E.; Hilburger, Mark W.
2013-01-01
This document outlines a Modeling and Analysis Plan (MAP) to be followed by the SBKF analysts. It includes instructions on modeling and analysis formulation and execution, model verification and validation, identifying sources of error and uncertainty, and documentation. The goal of this MAP is to provide a standardized procedure that ensures uniformity and quality of the results produced by the project and corresponding documentation.
Requirements analysis, domain knowledge, and design
NASA Technical Reports Server (NTRS)
Potts, Colin
1988-01-01
Two improvements to current requirements analysis practices are suggested: domain modeling, and the systematic application of analysis heuristics. Domain modeling is the representation of relevant application knowledge prior to requirements specification. Artificial intelligence techniques may eventually be applicable for domain modeling. In the short term, however, restricted domain modeling techniques, such as that in JSD, will still be of practical benefit. Analysis heuristics are standard patterns of reasoning about the requirements. They usually generate questions of clarification or issues relating to completeness. Analysis heuristics can be represented and therefore systematically applied in an issue-based framework. This is illustrated by an issue-based analysis of JSD's domain modeling and functional specification heuristics. They are discussed in the context of the preliminary design of simple embedded systems.
Model Performance Evaluation and Scenario Analysis (MPESA) Tutorial
This tool consists of two parts: model performance evaluation and scenario analysis (MPESA). The model performance evaluation consists of two components: model performance evaluation metrics and model diagnostics. These metrics provides modelers with statistical goodness-of-fit m...
Performance analysis and dynamic modeling of a single-spool turbojet engine
NASA Astrophysics Data System (ADS)
Andrei, Irina-Carmen; Toader, Adrian; Stroe, Gabriela; Frunzulica, Florin
2017-01-01
The purposes of modeling and simulation of a turbojet engine are steady state analysis and transient analysis. The steady state analysis consists of investigating the operating (equilibrium) regimes and is based on appropriate modeling of the operation of a turbojet engine at design and off-design regimes; it yields the performance analysis, summarized by the engine's operational maps (i.e. the altitude map, velocity map and speed map) and the engine's universal map. The mathematical model that allows the calculation of the design and off-design performances, in the case of a single-spool turbojet, is detailed. An in-house code was developed, and its calibration was done for the J85 turbojet engine as the test case. The dynamic model of the turbojet engine is obtained from the energy balance equations for the compressor, combustor and turbine, as the engine's main parts. The transient analysis, which is based on appropriate modeling of the engine and its main parts, expresses the dynamic behavior of the turbojet engine and, further, provides details regarding the engine's control. The aim of the dynamic analysis is to determine a control program for the turbojet, based on the results provided by the performance analysis. In the case of the single-spool turbojet engine, with fixed nozzle geometry, the thrust is controlled by one parameter, which is the fuel flow rate. The design and management of the aircraft engine controls are based on the results of the transient analysis. The construction of the design model is complex, since it is based on both steady-state and transient analysis, further allowing flight path cycle analysis and optimizations. This paper presents numerical simulations for a single-spool turbojet engine (J85 as the test case), with appropriate modeling for steady-state and dynamic analysis.
NASA Technical Reports Server (NTRS)
Simpson, Timothy W.
1998-01-01
The use of response surface models and kriging models is compared for approximating non-random, deterministic computer analyses. After discussing the traditional response surface approach for constructing polynomial models for approximation, kriging is presented as an alternative statistics-based approximation method for the design and analysis of computer experiments. Both approximation methods are applied to the multidisciplinary design and analysis of an aerospike nozzle which consists of a computational fluid dynamics model and a finite element analysis model. Error analysis of the response surface and kriging models is performed along with a graphical comparison of the approximations. Four optimization problems are formulated and solved using both approximation models. While neither approximation technique consistently outperforms the other in this example, the kriging models using only a constant for the underlying global model and a Gaussian correlation function perform as well as the second order polynomial response surface models.
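The response-surface half of the comparison can be sketched compactly. The following generic example (not the paper's aerospike problem; the sample points are made up) fits a quadratic polynomial surrogate y ≈ b0 + b1*x + b2*x² to a handful of deterministic analysis runs by solving the least-squares normal equations:

```python
def solve(a, b):
    """Gaussian elimination with partial pivoting for a small system."""
    n = len(b)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for c in range(n):
        piv = max(range(c, n), key=lambda r: abs(m[r][c]))
        m[c], m[piv] = m[piv], m[c]
        for r in range(c + 1, n):
            f = m[r][c] / m[c][c]
            for k in range(c, n + 1):
                m[r][k] -= f * m[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][k] * x[k] for k in range(r + 1, n))) / m[r][r]
    return x

def fit_quadratic(xs, ys):
    """Least squares for basis [1, x, x^2]: solve (X^T X) b = X^T y."""
    rows = [[1.0, x, x * x] for x in xs]
    xtx = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    xty = [sum(r[i] * y for r, y in zip(rows, ys)) for i in range(3)]
    return solve(xtx, xty)

# Made-up "analysis runs" sampled from y = 2 + 3x + x^2.
coeffs = fit_quadratic([0.0, 1.0, 2.0, 3.0], [2.0, 6.0, 12.0, 20.0])
```

Once fitted, the polynomial replaces the expensive analysis inside the optimization loop; kriging differs in interpolating the sample points exactly rather than smoothing through them.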
Finite element modeling and analysis of tires
NASA Technical Reports Server (NTRS)
Noor, A. K.; Andersen, C. M.
1983-01-01
Predicting the response of tires under various loading conditions using finite element technology is addressed. Some of the recent advances in finite element technology with high potential for application to tire modeling problems are reviewed, and the analysis and modeling needs for tires are identified. Topics include reduction methods for large-scale nonlinear analysis, with particular emphasis on the treatment of combined loads and displacement-dependent and nonconservative loadings; the development of simple and efficient mixed finite element models for shell analysis, the identification of equivalent mixed and purely displacement models, and the determination of the advantages of using mixed models; and effective computational models for large-rotation nonlinear problems, based on a total Lagrangian description of the deformation.
Design and Analysis Tools for Supersonic Inlets
NASA Technical Reports Server (NTRS)
Slater, John W.; Folk, Thomas C.
2009-01-01
Computational tools are being developed for the design and analysis of supersonic inlets. The objective is to update existing tools and provide design and low-order aerodynamic analysis capability for advanced inlet concepts. The Inlet Tools effort includes aspects of creating an electronic database of inlet design information, a document describing inlet design and analysis methods, a geometry model for describing the shape of inlets, and computer tools that implement the geometry model and methods. The geometry model has a set of basic inlet shapes that include pitot, two-dimensional, axisymmetric, and stream-traced inlet shapes. The inlet model divides the inlet flow field into parts that facilitate the design and analysis methods. The inlet geometry model constructs the inlet surfaces through the generation and transformation of planar entities based on key inlet design factors. Future efforts will focus on developing the inlet geometry model, the inlet design and analysis methods, and a Fortran 95 code to implement the model and methods. Other computational platforms, such as Java, will also be explored.
Global sensitivity analysis of DRAINMOD-FOREST, an integrated forest ecosystem model
Shiying Tian; Mohamed A. Youssef; Devendra M. Amatya; Eric D. Vance
2014-01-01
Global sensitivity analysis is a useful tool to understand process-based ecosystem models by identifying key parameters and processes controlling model predictions. This study reported a comprehensive global sensitivity analysis for DRAINMOD-FOREST, an integrated model for simulating water, carbon (C), and nitrogen (N) cycles and plant growth in lowland forests. The...
Information Flow in an Atmospheric Model and Data Assimilation
ERIC Educational Resources Information Center
Yoon, Young-noh
2011-01-01
Weather forecasting consists of two processes, model integration and analysis (data assimilation). During the model integration, the state estimate produced by the analysis evolves to the next cycle time according to the atmospheric model to become the background estimate. The analysis then produces a new state estimate by combining the background…
Meta-analysis of Gaussian individual patient data: Two-stage or not two-stage?
Morris, Tim P; Fisher, David J; Kenward, Michael G; Carpenter, James R
2018-04-30
Quantitative evidence synthesis through meta-analysis is central to evidence-based medicine. For well-documented reasons, the meta-analysis of individual patient data is held in higher regard than aggregate data. With access to individual patient data, the analysis is not restricted to a "two-stage" approach (combining estimates and standard errors) but can estimate parameters of interest by fitting a single model to all of the data, a so-called "one-stage" analysis. There has been debate about the merits of one- and two-stage analysis. Arguments for one-stage analysis have typically noted that a wider range of models can be fitted and overall estimates may be more precise. The two-stage side has emphasised that the models that can be fitted in two stages are sufficient to answer the relevant questions, with less scope for mistakes because there are fewer modelling choices to be made in the two-stage approach. For Gaussian data, we consider the statistical arguments for flexibility and precision in small-sample settings. Regarding flexibility, several of the models that can be fitted only in one stage may not be of serious interest to most meta-analysis practitioners. Regarding precision, we consider fixed- and random-effects meta-analysis and see that, for a model making certain assumptions, the number of stages used to fit this model is irrelevant; the precision will be approximately equal. Meta-analysts should choose modelling assumptions carefully. Sometimes relevant models can only be fitted in one stage. Otherwise, meta-analysts are free to use whichever procedure is most convenient to fit the identified model. © 2018 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
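The second stage of the two-stage approach is compact enough to state directly. A minimal sketch of fixed-effect (common-effect) inverse-variance pooling, assuming stage 1 has already reduced each study's individual patient data to an estimate and a standard error:

```python
def pool_fixed_effect(estimates, std_errors):
    """Inverse-variance pooling under a fixed-effect model.
    Returns (pooled estimate, pooled standard error)."""
    weights = [1.0 / se ** 2 for se in std_errors]
    w_sum = sum(weights)
    pooled = sum(w * est for w, est in zip(weights, estimates)) / w_sum
    return pooled, w_sum ** -0.5
```

For example, pool_fixed_effect([1.0, 2.0], [1.0, 1.0]) pools two equally precise studies to 1.5 with standard error 1/sqrt(2), which is the sense in which, under matching assumptions, the one-stage and two-stage routes reach approximately the same precision.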
ADAM: analysis of discrete models of biological systems using computer algebra.
Hinkelmann, Franziska; Brandon, Madison; Guang, Bonny; McNeill, Rustin; Blekherman, Grigoriy; Veliz-Cuba, Alan; Laubenbacher, Reinhard
2011-07-20
Many biological systems are modeled qualitatively with discrete models, such as probabilistic Boolean networks, logical models, Petri nets, and agent-based models, to gain a better understanding of them. The computational complexity of analyzing the complete dynamics of these models grows exponentially in the number of variables, which impedes working with complex models. There exist software tools to analyze discrete models, but they either lack the algorithmic functionality to analyze complex models deterministically or they are inaccessible to many users, as they require understanding of the underlying algorithm and implementation, do not have a graphical user interface, or are hard to install. Efficient analysis methods that are accessible to modelers and easy to use are needed. We propose a method for efficiently identifying attractors and introduce the web-based tool Analysis of Dynamic Algebraic Models (ADAM), which provides this and other analysis methods for discrete models. ADAM converts several discrete model types automatically into polynomial dynamical systems and analyzes their dynamics using tools from computer algebra. Specifically, we propose a method to identify attractors of a discrete model that is equivalent to solving a system of polynomial equations, a long-studied problem in computer algebra. Based on extensive experimentation with both discrete models arising in systems biology and randomly generated networks, we found that the algebraic algorithms presented in this manuscript are fast for systems with the structure maintained by most biological systems, namely sparseness and robustness. For a large set of published complex discrete models, ADAM identified the attractors in less than one second. Discrete modeling techniques are a useful tool for analyzing complex biological systems, and there is a need in the biological community for accessible, efficient analysis tools. ADAM provides analysis methods based on mathematical algorithms as a web-based tool for several different input formats, and it makes analysis of complex models accessible to a larger community, as it is platform-independent as a web service and does not require understanding of the underlying mathematics.
Rapid State Space Modeling Tool for Rectangular Wing Aeroservoelastic Studies
NASA Technical Reports Server (NTRS)
Suh, Peter M.; Conyers, Howard J.; Mavris, Dimitri N.
2014-01-01
This paper introduces a modeling and simulation tool for aeroservoelastic analysis of rectangular wings with trailing-edge control surfaces. The inputs to the code are planform design parameters such as wing span, aspect ratio, and number of control surfaces. A doublet-lattice approach is taken to compute generalized forces. A rational function approximation is computed. The output, computed in a few seconds, is a state space aeroservoelastic model which can be used for analysis and control design. The tool is fully parameterized with default information so there is little required interaction with the model developer, although all parameters can be easily modified if desired. The focus of this paper is on tool presentation, verification, and validation. This process is carried out in stages throughout the paper. The rational function approximation is verified against computed generalized forces for a plate model. A model composed of finite element plates is compared to a modal analysis from commercial software and an independently conducted experimental ground vibration test analysis. Aeroservoelastic analysis is the ultimate goal of this tool; therefore, the flutter speed and frequency for a clamped plate are computed using V-g and V-f analysis. The computational results are compared to a previously published computational analysis and wind tunnel results for the same structure. Finally, a case study of a generic wing model with a single control surface is presented. Verification of the state space model is presented in comparison to V-g and V-f analysis. This also includes the analysis of the model in response to a 1-cos gust.
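The 1-cos gust mentioned in the closing case study is a standard discrete-gust input. The following sketch generates such a profile; the gust length, peak velocity, and airspeed are illustrative values, not values from the paper.

```python
# Sketch of a "1-cos" discrete gust profile: the vertical gust velocity ramps
# up and back down as (U_ds/2)*(1 - cos(2*pi*V*t/L_g)) while the vehicle is
# inside the gust. All parameter values below are invented for illustration.
import math

def one_minus_cos_gust(t, V=50.0, L_g=20.0, U_ds=3.0):
    """Gust vertical velocity (m/s) at time t (s); zero outside the gust.
    V: airspeed (m/s), L_g: gust length (m), U_ds: peak gust velocity (m/s)."""
    if 0.0 <= t <= L_g / V:
        return 0.5 * U_ds * (1.0 - math.cos(2.0 * math.pi * V * t / L_g))
    return 0.0

# Sample the profile; the peak U_ds occurs halfway through the gust.
samples = [one_minus_cos_gust(0.02 * i) for i in range(25)]
print(max(samples))
```

Such a time series would be fed as the disturbance input to the state space model's gust channel.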
Rapid State Space Modeling Tool for Rectangular Wing Aeroservoelastic Studies
NASA Technical Reports Server (NTRS)
Suh, Peter M.; Conyers, Howard J.; Mavris, Dimitri N.
2015-01-01
This paper introduces a modeling and simulation tool for aeroservoelastic analysis of rectangular wings with trailing-edge control surfaces. The inputs to the code are planform design parameters such as wing span, aspect ratio, and number of control surfaces. Using this information, the generalized forces are computed using the doublet-lattice method. Using Roger's approximation, a rational function approximation is computed. The output, computed in a few seconds, is a state space aeroservoelastic model which can be used for analysis and control design. The tool is fully parameterized with default information so there is little required interaction with the model developer. All parameters can be easily modified if desired. The focus of this paper is on tool presentation, verification, and validation. These processes are carried out in stages throughout the paper. The rational function approximation is verified against computed generalized forces for a plate model. A model composed of finite element plates is compared to a modal analysis from commercial software and an independently conducted experimental ground vibration test analysis. Aeroservoelastic analysis is the ultimate goal of this tool, therefore, the flutter speed and frequency for a clamped plate are computed using damping-versus-velocity and frequency-versus-velocity analysis. The computational results are compared to a previously published computational analysis and wind-tunnel results for the same structure. A case study of a generic wing model with a single control surface is presented. Verification of the state space model is presented in comparison to damping-versus-velocity and frequency-versus-velocity analysis, including the analysis of the model in response to a 1-cos gust.
Image analysis and modeling in medical image computing. Recent developments and advances.
Handels, H; Deserno, T M; Meinzer, H-P; Tolxdorff, T
2012-01-01
Medical image computing is of growing importance in medical diagnostics and image-guided therapy. Nowadays, image analysis systems integrating advanced image computing methods are used in practice, e.g. to extract quantitative image parameters or to support the surgeon during a navigated intervention. However, the degree of automation, accuracy, reproducibility, and robustness of medical image computing methods has to be increased to meet the requirements of clinical routine. In this focus theme, recent developments and advances in the field of modeling and model-based image analysis are described. The introduction of models into the image analysis process enables improvements of image analysis algorithms in terms of automation, accuracy, reproducibility, and robustness. Furthermore, model-based image computing techniques open up new perspectives for the prediction of organ changes and risk analysis of patients. Selected contributions are assembled to present the latest advances in the field. The authors were invited to present their recent work and results based on their outstanding contributions to the Conference on Medical Image Computing BVM 2011 held at the University of Lübeck, Germany. All manuscripts had to pass a comprehensive peer review. Modeling approaches and model-based image analysis methods showing new trends and perspectives in model-based medical image computing are described. Complex models are used in different medical applications, and medical images like radiographic images, dual-energy CT images, MR images, diffusion tensor images, as well as microscopic images are analyzed. The applications emphasize the high potential and the wide application range of these methods. The use of model-based image analysis methods can improve segmentation quality as well as the accuracy and reproducibility of quantitative image analysis. Furthermore, image-based models enable new insights and can lead to a deeper understanding of complex dynamic mechanisms in the human body. Hence, model-based image computing methods are important tools to improve medical diagnostics and patient treatment in the future.
Using structural equation modeling for network meta-analysis.
Tu, Yu-Kang; Wu, Yun-Chun
2017-07-14
Network meta-analysis overcomes the limitations of traditional pair-wise meta-analysis by incorporating all available evidence into a general statistical framework for simultaneous comparisons of several treatments. Currently, network meta-analyses are undertaken either within Bayesian hierarchical linear models or within frequentist generalized linear mixed models. Structural equation modeling (SEM) is a statistical method originally developed for modeling causal relations among observed and latent variables. As the random effect is explicitly modeled as a latent variable in SEM, analysts have great flexibility to specify complex random effect structures and to place linear and nonlinear constraints on parameters. The aim of this article is to show how to undertake a network meta-analysis within the statistical framework of SEM. We used an example dataset to demonstrate that the standard fixed- and random-effect network meta-analysis models can be easily implemented in SEM. It contains results of 26 studies that directly compared three treatment groups A, B, and C for prevention of first bleeding in patients with liver cirrhosis. We also showed that a new approach to network meta-analysis based on the unrestricted weighted least squares (UWLS) technique can likewise be undertaken using SEM. For both the fixed- and random-effect network meta-analyses, SEM yielded coefficients and confidence intervals similar to those reported in the previous literature. The point estimates of the two UWLS models were identical to those of the fixed-effect model, but the confidence intervals were wider; this is consistent with results from the traditional pairwise meta-analyses. Compared with the UWLS model with a common variance adjustment factor, the UWLS model with unique variance adjustment factors yields wider confidence intervals when heterogeneity is larger in the pairwise comparison, and thus reflects the difference in heterogeneity within each comparison. SEM provides a very flexible framework for univariate and multivariate meta-analysis, and its potential as a powerful tool for advanced meta-analysis is still to be explored.
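Underlying any network meta-analysis, SEM-based or otherwise, is the consistency assumption that indirect and direct evidence agree. A tiny sketch with invented numbers: an indirect B-versus-C contrast is built from the two direct contrasts against A, and its variance is the sum of their variances.

```python
# Illustrative fixed-effect consistency relation in a three-treatment network:
# d_BC = d_AC - d_AB, with the variances of independent direct estimates adding.
# All estimates and variances below are invented.
d_AB, var_AB = -0.50, 0.04   # direct contrast A vs B (e.g. log odds ratio)
d_AC, var_AC = -0.20, 0.05   # direct contrast A vs C

d_BC = d_AC - d_AB           # indirect estimate of B vs C
se_BC = (var_AB + var_AC) ** 0.5
print(f"indirect d_BC = {d_BC:.2f}, SE = {se_BC:.3f}")
```

A full network meta-analysis generalizes this to many treatments by regressing all contrasts on a design matrix of basic parameters, which is exactly what the SEM specification encodes.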
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
Volume VII of the documentation for the Phase I Data Analysis Task performed in support of the current Regional Flow Model, Transport Model, and Risk Assessment for the Nevada Test Site Underground Test Area Subproject contains the tritium transport model documentation. Because of the size and complexity of the model area, a considerable quantity of data was collected and analyzed in support of the modeling efforts. The data analysis task was consequently broken into eight subtasks, and descriptions of each subtask's activities are contained in one of the eight volumes that comprise the Phase I Data Analysis Documentation.
Box truss analysis and technology development. Task 1: Mesh analysis and control
NASA Technical Reports Server (NTRS)
Bachtell, E. E.; Bettadapur, S. S.; Coyner, J. V.
1985-01-01
An analytical tool was developed to model, analyze, and predict the RF performance of box truss antennas with reflective mesh surfaces. The analysis system is unique in that it integrates custom-written programs for cord-tied mesh surfaces, thereby drastically reducing the cost of analysis. The analysis system is capable of determining the RF performance of antennas under any type of manufacturing or operating environment by integrating the various disciplines of design, finite element analysis, surface best-fit analysis, and RF analysis. The Integrated Mesh Analysis System consists of six separate programs: the Mesh Tie System Model Generator, the Loadcase Generator, the Model Optimizer, the Model Solver, the Surface Topography Solver, and the RF Performance Solver. Additionally, a study using the mesh analysis system was performed to determine the effect of on-orbit calibration, i.e., surface adjustment, on a typical box truss antenna.
Theoretical and software considerations for nonlinear dynamic analysis
NASA Technical Reports Server (NTRS)
Schmidt, R. J.; Dodds, R. H., Jr.
1983-01-01
In the finite element method for structural analysis, it is generally necessary to discretize the structural model into a very large number of elements to accurately evaluate displacements, strains, and stresses. As the complexity of the model increases, the number of degrees of freedom can easily exceed the capacity of present-day software systems. Improvements to structural analysis software, including more efficient use of existing hardware and improved structural modeling techniques, are discussed. One modeling technique that is used successfully in static linear and nonlinear analysis is multilevel substructuring. This research extends the use of multilevel substructure modeling to include dynamic analysis and defines the requirements for a general-purpose software system capable of efficient nonlinear dynamic analysis. The multilevel substructuring technique is presented, the analytical formulations and computational procedures for dynamic analysis and nonlinear mechanics are reviewed, and an approach to the design and implementation of a general-purpose structural software system is presented.
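As a concrete, minimal instance of the substructuring idea, the sketch below performs static condensation (Guyan reduction) on a three-spring chain with invented stiffnesses: the internal degree of freedom is eliminated, and the condensed system reproduces the full static solution exactly. For dynamics, which is this research's focus, condensing mass the same way is only approximate.

```python
# Static condensation sketch: K_cond = K_rr - K_ri * K_ii^{-1} * K_ir,
# eliminating internal DOFs so a substructure is represented only by its
# boundary DOFs. Spring stiffnesses are illustrative.
k1 = k2 = k3 = 1.0   # three springs in series, left end grounded
# Full 3-DOF stiffness matrix; DOFs: u1 (boundary), u2 (internal), u3 (boundary)
K = [[k1 + k2, -k2,      0.0],
     [-k2,      k2 + k3, -k3],
     [0.0,     -k3,       k3]]

r = [0, 2]   # retained (boundary) DOFs
i = 1        # single condensed (internal) DOF, so K_ii is a scalar

K_cond = [[K[a][b] - K[a][i] * K[i][b] / K[i][i] for b in r] for a in r]

# Exactness check for statics: unit force on u3; the series chain gives
# u1 = 1, u3 = 3. Solve the 2x2 condensed system by Cramer's rule.
f = [0.0, 1.0]
det = K_cond[0][0] * K_cond[1][1] - K_cond[0][1] * K_cond[1][0]
u1 = (f[0] * K_cond[1][1] - K_cond[0][1] * f[1]) / det
u3 = (K_cond[0][0] * f[1] - f[0] * K_cond[1][0]) / det
print(u1, u3)
```

Multilevel substructuring applies this reduction recursively: each substructure's condensed stiffness becomes a "super-element" in its parent's assembly.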
NASA Astrophysics Data System (ADS)
Hill, M. C.; Jakeman, J.; Razavi, S.; Tolson, B.
2015-12-01
For many environmental systems, model runtimes have remained very long as more capable computers have been used to add more processes and finer time and space discretization. Scientists have also added more parameters and kinds of observations, and many model runs are needed to explore the models. Computational demand equals run time multiplied by the number of model runs, divided by parallelization opportunities. Model exploration is conducted using sensitivity analysis, optimization, and uncertainty quantification. Sensitivity analysis is used to reveal the consequences of what may be very complex simulated relations, optimization is used to identify parameter values that fit the data best, or at least better, and uncertainty quantification is used to evaluate the precision of simulated results. The long execution times make such analyses a challenge. Methods for addressing this challenge include computationally frugal analysis of the demanding original model and a number of ingenious surrogate modeling methods; both commonly use about 50-100 runs of the demanding original model. In this talk we consider the tradeoffs between (1) original model development decisions, (2) computationally frugal analysis of the original model, and (3) using many model runs of the fast surrogate model. Some questions of interest are as follows. If the added processes and discretization invested in (1) are compared with the restrictions and approximations in model analysis produced by long model execution times, is there a net benefit relative to the goals of the model? Are there changes to the numerical methods that could reduce the computational demands while giving up less fidelity than is compromised by using computationally frugal methods or surrogate models for model analysis? Both the computationally frugal methods and the surrogate models require that the solution of interest be a smooth function of the parameters of interest. How does the information obtained from the local methods typical of (2) compare with that from the globally averaged methods typical of (3) for typical systems? The discussion will use examples of the response of the Greenland glacier to global warming and of surface water and groundwater modeling.
Chen, Hongming; Carlsson, Lars; Eriksson, Mats; Varkonyi, Peter; Norinder, Ulf; Nilsson, Ingemar
2013-06-24
A novel methodology was developed to build Free-Wilson-like local QSAR models by combining R-group signatures and the SVM algorithm. Unlike Free-Wilson analysis, this method is able to make predictions for compounds with R-groups not present in the training set. Eleven public data sets were chosen as test cases for comparing the performance of the new method with several traditional modeling strategies, including Free-Wilson analysis. Our results show that the R-group signature SVM models generally achieve better prediction accuracy than Free-Wilson analysis. Moreover, the predictions of R-group signature models are comparable to those of models using ECFP6 fingerprints and signatures for the whole compound. Most importantly, R-group contributions to the SVM model can be obtained by calculating the gradient for R-group signatures. For most of the studied data sets, these contributions correlate significantly with those from a corresponding Free-Wilson analysis. These results suggest that the R-group contribution can be used to interpret bioactivity data and highlight that the R-group signature-based SVM modeling method is as interpretable as Free-Wilson analysis. Hence, the signature SVM model can be a useful modeling tool for any drug discovery project.
Educational and Scientific Applications of Climate Model Diagnostic Analyzer
NASA Astrophysics Data System (ADS)
Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Kubar, T. L.; Zhang, J.; Bao, Q.
2016-12-01
Climate Model Diagnostic Analyzer (CMDA) is a web-based information system designed for the climate modeling and model analysis community to analyze climate data from models and observations. CMDA provides tools to diagnostically analyze climate data for model validation and improvement, and to systematically manage analysis provenance for sharing results with other investigators. CMDA utilizes cloud computing resources, multi-threading computing, machine-learning algorithms, web service technologies, and provenance-supporting technologies to address technical challenges that the Earth science modeling and model analysis community faces in evaluating and diagnosing climate models. As CMDA infrastructure and technology have matured, we have developed educational and scientific applications of CMDA. Educationally, CMDA has supported the summer school of the JPL Center for Climate Sciences for three years since 2014. In the summer school, the students work on group research projects for which CMDA provides datasets and analysis tools. Each student is assigned a virtual machine with CMDA installed in Amazon Web Services. A provenance management system for CMDA was developed to keep track of students' usage of CMDA and to recommend datasets and analysis tools for their research topics. The provenance system also allows students to revisit their analysis results and share them with their group. Scientifically, we have developed several science use cases of CMDA covering various topics, datasets, and analysis types. Each use case is described and listed in terms of a scientific goal, the datasets used, the analysis tools used, the scientific results discovered, an analysis result such as output plots and data files, and a link to the exact analysis service call with all the input arguments filled in. For example, one science use case is the evaluation of the NCAR CAM5 model with MODIS total cloud fraction. The analysis service used is the Difference Plot Service of Two Variables, and the datasets used are NCAR CAM total cloud fraction and MODIS total cloud fraction. The scientific highlight of the use case is that the CAM5 model overall does a fairly decent job of simulating total cloud cover, though it simulates too few clouds, especially near and offshore of the eastern ocean basins where low clouds are dominant.
An NCME Instructional Module on Latent DIF Analysis Using Mixture Item Response Models
ERIC Educational Resources Information Center
Cho, Sun-Joo; Suh, Youngsuk; Lee, Woo-yeol
2016-01-01
The purpose of this ITEMS module is to provide an introduction to differential item functioning (DIF) analysis using mixture item response models. The mixture item response models for DIF analysis involve comparing item profiles across latent groups, instead of manifest groups. First, an overview of DIF analysis based on latent groups, called…
Latent Transition Analysis of Pre-Service Teachers' Efficacy in Mathematics and Science
ERIC Educational Resources Information Center
Ward, Elizabeth Kennedy
2009-01-01
This study modeled changes in pre-service teacher efficacy in mathematics and science over the course of the final year of teacher preparation using latent transition analysis (LTA), a longitudinal form of analysis that builds on two modeling traditions (latent class analysis (LCA) and auto-regressive modeling). Data were collected using the…
ERIC Educational Resources Information Center
McCabe, Declan J.; Knight, Evelyn J.
2016-01-01
Since being introduced by Connor and Simberloff in response to Diamond's assembly rules, null model analysis has been a controversial tool in community ecology. Despite being commonly used in the primary literature, null model analysis has not featured prominently in general textbooks. Complexity of approaches along with difficulty in interpreting…
ERIC Educational Resources Information Center
Feingold, Alan
2009-01-01
The use of growth-modeling analysis (GMA)--including hierarchical linear models, latent growth models, and general estimating equations--to evaluate interventions in psychology, psychiatry, and prevention science has grown rapidly over the last decade. However, an effect size associated with the difference between the trajectories of the…
Updating the Behavior Engineering Model.
ERIC Educational Resources Information Center
Chevalier, Roger
2003-01-01
Considers Thomas Gilbert's Behavior Engineering Model as a tool for systematically identifying barriers to individual and organizational performance. Includes a detailed case study and a performance aid that incorporates gap analysis, cause analysis, and force field analysis to update the original model. (Author/LRW)
Cullinane Thomas, Catherine; Huber, Christopher C.; Koontz, Lynne
2014-01-01
This 2012 analysis marks a major revision to the NPS visitor spending effects analyses, with the development of a new visitor spending effects model (VSE model) that replaces the former Money Generation Model (MGM2). Many of the hallmarks and processes of the MGM2 model are preserved in the new VSE model, but the new model makes significant strides in improving the accuracy and transparency of the analysis. Because of this change from the MGM2 model to the VSE model, estimates from this year’s analysis are not directly comparable to previous analyses.
Model-Based Linkage Analysis of a Quantitative Trait.
Song, Yeunjoo E; Song, Sunah; Schnell, Audrey H
2017-01-01
Linkage analysis is a family-based method of analysis that examines whether any typed genetic markers cosegregate with a given trait, in this case a quantitative trait. If linkage exists, this is taken as evidence in support of a genetic basis for the trait. Historically, linkage analysis was performed using a binary disease trait, but it has been extended to include quantitative disease measures. Quantitative traits are desirable as they provide more information than binary traits. Linkage analysis can be performed using single-marker methods (one marker at a time) or multipoint methods (using multiple markers simultaneously). In model-based linkage analysis, the genetic model for the trait of interest is specified. There are many software options for performing linkage analysis. Here, we use the program package Statistical Analysis for Genetic Epidemiology (S.A.G.E.). S.A.G.E. was chosen because, in addition to performing linkage analysis, it also includes programs to perform data-cleaning procedures and to generate and test genetic models for a quantitative trait. We demonstrate in detail the process of running the program LODLINK to perform single-marker analysis, and MLOD to perform multipoint analysis using output from SEGREG, where SEGREG was used to determine the best-fitting statistical model for the trait.
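As a hedged illustration of the statistic that model-based linkage programs such as LODLINK maximize, the following computes a two-point LOD score over a grid of recombination fractions. The counts and the simplifying phase-known assumption are invented for illustration, not taken from the S.A.G.E. documentation.

```python
# Two-point LOD score sketch for phase-known meioses:
# LOD(theta) = log10[ theta^r * (1-theta)^(n-r) / 0.5^n ]
# where r of n informative meioses are recombinant. Counts are invented.
import math

def lod(theta, n_recomb, n_total):
    """log10 likelihood ratio of recombination fraction theta vs. no linkage (0.5)."""
    n_non = n_total - n_recomb
    like_theta = (theta ** n_recomb) * ((1 - theta) ** n_non)
    like_null = 0.5 ** n_total
    return math.log10(like_theta / like_null)

# 2 recombinants out of 20 meioses; evaluate on a grid and take the maximum.
grid = [i / 100 for i in range(1, 50)]
theta_hat = max(grid, key=lambda t: lod(t, 2, 20))
print(theta_hat, round(lod(theta_hat, 2, 20), 2))
```

The maximizing theta is the recombinant proportion (0.10 here), and the resulting LOD exceeds the traditional evidence-for-linkage threshold of 3.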
Modeling and simulation of satellite subsystems for end-to-end spacecraft modeling
NASA Astrophysics Data System (ADS)
Schum, William K.; Doolittle, Christina M.; Boyarko, George A.
2006-05-01
During the past ten years, the Air Force Research Laboratory (AFRL) has been simultaneously developing high-fidelity spacecraft payload models as well as a robust distributed simulation environment for modeling spacecraft subsystems. Much of this research has occurred in the Distributed Architecture Simulation Laboratory (DASL). AFRL developers working in the DASL have effectively combined satellite power, attitude pointing, and communication link analysis subsystem models with robust satellite sensor models to create a first-order end-to-end satellite simulation capability. The merging of these two simulation areas has advanced the field of spacecraft simulation, design, and analysis, and enabled more in-depth mission and satellite utility analyses. A core capability of the DASL is the support of a variety of modeling and analysis efforts, ranging from physics- and engineering-level modeling to mission- and campaign-level analysis. The flexibility and agility of this simulation architecture will be used to support space mission analysis, military utility analysis, and various integrated exercises with other military and space organizations via direct integration, or through DOD standards such as Distributed Interactive Simulation. This paper discusses the results and lessons learned in modeling satellite communication link analysis, power, and attitude control subsystems for an end-to-end satellite simulation. It also discusses how these spacecraft subsystem simulations feed into and support military utility and space mission analyses.
Combining multiple imputation and meta-analysis with individual participant data
Burgess, Stephen; White, Ian R; Resche-Rigon, Matthieu; Wood, Angela M
2013-01-01
Multiple imputation is a strategy for the analysis of incomplete data such that the impact of the missingness on the power and bias of estimates is mitigated. When data from multiple studies are collated, we can propose both within-study and multilevel imputation models to impute missing data on covariates. It is not clear how to choose between imputation models or how to combine imputation and inverse-variance weighted meta-analysis methods. This is especially important as often different studies measure data on different variables, meaning that we may need to impute data on a variable which is systematically missing in a particular study. In this paper, we consider a simulation analysis of sporadically missing data in a single covariate with a linear analysis model and discuss how the results would be applicable to the case of systematically missing data. We find in this context that ensuring the congeniality of the imputation and analysis models is important to give correct standard errors and confidence intervals. For example, if the analysis model allows between-study heterogeneity of a parameter, then we should incorporate this heterogeneity into the imputation model to maintain the congeniality of the two models. In an inverse-variance weighted meta-analysis, we should impute missing data and apply Rubin's rules at the study level prior to meta-analysis, rather than meta-analyzing each of the multiple imputations and then combining the meta-analysis estimates using Rubin's rules. We illustrate the results using data from the Emerging Risk Factors Collaboration. PMID:23703895
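The ordering this abstract recommends, Rubin's rules within each study first, then inverse-variance meta-analysis of the per-study pooled results, can be sketched as follows; all estimates and variances are invented for illustration.

```python
# Sketch: apply Rubin's rules at the study level, then meta-analyze.
def rubin(estimates, variances):
    """Combine M imputation-specific estimates from one study (Rubin's rules)."""
    M = len(estimates)
    qbar = sum(estimates) / M                                  # pooled estimate
    ubar = sum(variances) / M                                  # within-imputation var
    b = sum((q - qbar) ** 2 for q in estimates) / (M - 1)      # between-imputation var
    total_var = ubar + (1 + 1 / M) * b
    return qbar, total_var

# Two studies, each analysed on three imputed datasets: (estimates, variances).
studies = [([0.9, 1.1, 1.0], [0.20, 0.22, 0.21]),
           ([1.4, 1.3, 1.6], [0.30, 0.28, 0.33])]

pooled = [rubin(e, v) for e, v in studies]

# Inverse-variance weighted meta-analysis of the study-level pooled results.
w = [1 / v for _, v in pooled]
meta_est = sum(wi * q for (q, _), wi in zip(pooled, w)) / sum(w)
print(round(meta_est, 3))
```

Reversing the order (meta-analyzing each imputation, then applying Rubin's rules to the meta-analysis estimates) is the approach the paper advises against.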
Bayesian structural equation modeling: a more flexible representation of substantive theory.
Muthén, Bengt; Asparouhov, Tihomir
2012-09-01
This article proposes a new approach to factor analysis and structural equation modeling using Bayesian analysis. The new approach replaces parameter specifications of exact zeros with approximate zeros based on informative, small-variance priors. It is argued that this produces an analysis that better reflects substantive theories. The proposed Bayesian approach is particularly beneficial in applications where parameters are added to a conventional model such that a nonidentified model is obtained if maximum-likelihood estimation is applied. This approach is useful for measurement aspects of latent variable modeling, such as with confirmatory factor analysis, and the measurement part of structural equation modeling. Two application areas are studied, cross-loadings and residual correlations in confirmatory factor analysis. An example using a full structural equation model is also presented, showing an efficient way to find model misspecification. The approach encompasses 3 elements: model testing using posterior predictive checking, model estimation, and model modification. Monte Carlo simulations and real data are analyzed using Mplus. The real-data analyses use data from Holzinger and Swineford's (1939) classic mental abilities study, Big Five personality factor data from a British survey, and science achievement data from the National Educational Longitudinal Study of 1988.
SensA: web-based sensitivity analysis of SBML models.
Floettmann, Max; Uhlendorf, Jannis; Scharp, Till; Klipp, Edda; Spiesser, Thomas W
2014-10-01
SensA is a web-based application for sensitivity analysis of mathematical models. The sensitivity analysis is based on metabolic control analysis, computing the local, global and time-dependent properties of model components. Interactive visualization facilitates interpretation of usually complex results. SensA can contribute to the analysis, adjustment and understanding of mathematical models for dynamic systems. SensA is available at http://gofid.biologie.hu-berlin.de/ and can be used with any modern browser. The source code can be found at https://bitbucket.org/floettma/sensa/ (MIT license) © The Author 2014. Published by Oxford University Press.
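As a minimal sketch of the kind of local, scaled sensitivity coefficient metabolic control analysis works with (the rate law and all parameter values here are illustrative and have nothing to do with SensA's implementation or API), a central finite difference suffices:

```python
# Scaled local sensitivity C = (p / y) * dy/dp, illustrated on a
# Michaelis-Menten rate with respect to Km. All values are invented.
def rate(s, vmax, km):
    """Michaelis-Menten rate law v = vmax * s / (km + s)."""
    return vmax * s / (km + s)

def scaled_sensitivity(f, p, h=1e-6):
    """(p / f(p)) * df/dp, estimated by a central finite difference."""
    dfdp = (f(p * (1 + h)) - f(p * (1 - h))) / (2 * p * h)
    return p / f(p) * dfdp

s, vmax, km = 2.0, 1.0, 0.5
c_km = scaled_sensitivity(lambda k: rate(s, vmax, k), km)
print(round(c_km, 4))
```

The analytic value is -km/(km + s) = -0.2: a 1% increase in Km lowers the rate by about 0.2%, the dimensionless reading that makes such coefficients comparable across parameters.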
Fraysse, Bodvaël; Barthélémy, Inès; Qannari, El Mostafa; Rouger, Karl; Thorin, Chantal; Blot, Stéphane; Le Guiner, Caroline; Chérel, Yan; Hogrel, Jean-Yves
2017-04-12
Accelerometric analysis of gait abnormalities in golden retriever muscular dystrophy (GRMD) dogs is of limited sensitivity, and produces highly complex data. The use of discriminant analysis may enable simpler and more sensitive evaluation of treatment benefits in this important preclinical model. Accelerometry was performed twice monthly between the ages of 2 and 12 months on 8 healthy and 20 GRMD dogs. Seven accelerometric parameters were analysed using linear discriminant analysis (LDA). Manipulation of the dependent and independent variables produced three distinct models. The ability of each model to detect gait alterations and their pattern change with age was tested using a leave-one-out cross-validation approach. Selecting genotype (healthy or GRMD) as the dependent variable resulted in a model (Model 1) allowing good discrimination between the gait phenotypes of GRMD and healthy dogs. However, this model was not sufficiently representative of the disease progression. In Model 2, age in months was added as a supplementary dependent variable (GRMD_2 to GRMD_12 and Healthy_2 to Healthy_9.5), resulting in a high overall misclassification rate (83.2%). To improve accuracy, a third model (Model 3) was created in which age was also included as an explanatory variable. This resulted in an overall misclassification rate lower than 12%. Model 3 was evaluated using blinded data pertaining to 81 healthy and GRMD dogs. In all but one case, the model correctly matched gait phenotype to the actual genotype. Finally, we used Model 3 to reanalyse data from a previous study regarding the effects of immunosuppressive treatments on muscular dystrophy in GRMD dogs. Our model identified a significant effect of immunosuppressive treatments on gait quality, corroborating the original findings, with the added advantages of direct statistical analysis, greater sensitivity and more comprehensible data representation.
Gait analysis using LDA allows for improved analysis of accelerometry data by applying a decision-making analysis approach to the evaluation of preclinical treatment benefits in GRMD dogs.
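The workflow described above (genotype as the dependent variable, leave-one-out cross-validation) can be sketched with scikit-learn; the seven "accelerometric parameters", the class sizes and the separation used below are invented stand-ins, not the study's measurements:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-ins for the 7 accelerometric parameters (30 subjects per class);
# healthy and dystrophic gaits are separated by a shift in feature space.
n_per_class = 30
healthy = rng.normal(loc=0.0, scale=1.0, size=(n_per_class, 7))
grmd = rng.normal(loc=1.5, scale=1.0, size=(n_per_class, 7))
X = np.vstack([healthy, grmd])
y = np.array([0] * n_per_class + [1] * n_per_class)  # 0 = healthy, 1 = GRMD

# Model 1 analogue: genotype as the dependent variable, LOO cross-validation
lda = LinearDiscriminantAnalysis()
scores = cross_val_score(lda, X, y, cv=LeaveOneOut())
print(f"leave-one-out accuracy: {scores.mean():.2f}")
```

With well-separated synthetic classes the leave-one-out accuracy is near 1; on real accelerometric data the same scaffold yields the misclassification rates the paper reports.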
Factor analysis and multiple regression between topography and precipitation on Jeju Island, Korea
NASA Astrophysics Data System (ADS)
Um, Myoung-Jin; Yun, Hyeseon; Jeong, Chang-Sam; Heo, Jun-Haeng
2011-11-01
In this study, new factors that influence precipitation were extracted from geographic variables using factor analysis, which allow for an accurate estimation of orographic precipitation. Correlation analysis was also used to examine the relationship between nine topographic variables from digital elevation models (DEMs) and the precipitation in Jeju Island. In addition, a spatial analysis was performed in order to verify the validity of the regression model. From the results of the correlation analysis, it was found that all of the topographic variables had a positive correlation with the precipitation. The relations between the variables also changed in accordance with a change in the precipitation duration. However, upon examining the correlation matrix, no significant relationship between the latitude and the aspect was found. According to the factor analysis, eight topographic variables (latitude being the exception) were found to have a direct influence on the precipitation. Three factors were then extracted from the eight topographic variables. By directly comparing the multiple regression model with the factors (model 1) to the multiple regression model with the topographic variables (model 3), it was found that model 1 did not violate the limits of statistical significance and multicollinearity. As such, model 1 was considered to be appropriate for estimating the precipitation when taking into account the topography. In the study of model 1, the multiple regression model using factor analysis was found to be the best method for estimating the orographic precipitation on Jeju Island.
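The factor-then-regress approach (the paper's model 1) can be sketched with scikit-learn on synthetic data; the eight "topographic" variables, three latent factors and all coefficients below are invented stand-ins, not the Jeju data:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)

# Eight correlated 'topographic' variables generated from three latent factors
n = 200
latent = rng.normal(size=(n, 3))
loadings = rng.normal(size=(3, 8))
topo = latent @ loadings + 0.1 * rng.normal(size=(n, 8))
# 'Precipitation' driven by the same latent factors plus noise
precip = latent @ np.array([5.0, -2.0, 1.0]) + rng.normal(size=n)

# Extract three factors from the eight variables, then regress precipitation
# on the factor scores rather than on the raw (collinear) variables
fa = FactorAnalysis(n_components=3, random_state=0)
factors = fa.fit_transform(topo)
reg = LinearRegression().fit(factors, precip)
print(f"R^2 of the factor-based regression: {reg.score(factors, precip):.3f}")
```

Regressing on factor scores rather than on the raw variables is exactly what sidesteps the multicollinearity problem the abstract mentions, since the scores are (approximately) uncorrelated by construction.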
A Hierarchical Visualization Analysis Model of Power Big Data
NASA Astrophysics Data System (ADS)
Li, Yongjie; Wang, Zheng; Hao, Yang
2018-01-01
Based on the concept of integrating VR scenes with power big data analysis, a hierarchical visualization analysis model of power big data is proposed, in which levels are designed targeting different abstraction modules such as transaction, engine, computation, control and storage. The previously separate modules of power data storage, data mining and analysis, and data visualization are integrated into one platform by this model. It provides a visual analysis solution for power big data.
NASA Technical Reports Server (NTRS)
Johnston, John; Mosier, Mark; Howard, Joe; Hyde, Tupper; Parrish, Keith; Ha, Kong; Liu, Frank; McGinnis, Mark
2004-01-01
This paper presents viewgraphs about structural analysis activities and integrated modeling for the James Webb Space Telescope (JWST). The topics include: 1) JWST Overview; 2) Observatory Structural Models; 3) Integrated Performance Analysis; and 4) Future Work and Challenges.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wnek, W.J.; Ramshaw, J.D.; Trapp, J.A.
1975-11-01
A mathematical model and a numerical solution scheme for thermal-hydraulic analysis of fuel rod arrays are given. The model alleviates the two major deficiencies of existing rod array analysis models: the lack of a correct transverse momentum equation and the inability to handle reversing and circulatory flows. Possible applications of the model include steady state and transient subchannel calculations as well as analysis of flows in heat exchangers, other engineering equipment, and porous media. (auth)
Computer-aided-engineering system for modeling and analysis of ECLSS integration testing
NASA Technical Reports Server (NTRS)
Sepahban, Sonbol
1987-01-01
The accurate modeling and analysis of two-phase fluid networks found in environmental control and life support systems is presently undertaken by computer-aided engineering (CAE) techniques whose generalized fluid dynamics package can solve arbitrary flow networks. The CAE system for integrated test bed modeling and analysis will also furnish interfaces and subsystem/test-article mathematical models. Three-dimensional diagrams of the test bed are generated by the system after performing the requisite simulation and analysis.
Model Performance Evaluation and Scenario Analysis (MPESA)
Model Performance Evaluation and Scenario Analysis (MPESA) assesses the performance with which models predict time series data. The tool was developed for the Hydrological Simulation Program-Fortran (HSPF) and the Stormwater Management Model (SWMM).
A quantitative analysis of the F18 flight control system
NASA Technical Reports Server (NTRS)
Doyle, Stacy A.; Dugan, Joanne B.; Patterson-Hine, Ann
1993-01-01
This paper presents an informal quantitative analysis of the F18 flight control system (FCS). The analysis technique combines a coverage model with a fault tree model. To demonstrate the method's extensive capabilities, we replace the fault tree with a digraph model of the F18 FCS, the only model available to us. The substitution shows that while digraphs have primarily been used for qualitative analysis, they can also be used for quantitative analysis. Based on our assumptions and the particular failure rates assigned to the F18 FCS components, we show that coverage does have a significant effect on the system's reliability and thus it is important to include coverage in the reliability analysis.
A comprehensive probabilistic analysis model of oil pipelines network based on Bayesian network
NASA Astrophysics Data System (ADS)
Zhang, C.; Qin, T. X.; Jiang, B.; Huang, C.
2018-02-01
An oil pipelines network is one of the most important facilities of energy transportation, but an oil pipelines network accident may result in serious disasters. Analysis models for these accidents have been established mainly on the basis of three methods: event trees, accident simulation and Bayesian networks. Among these methods, the Bayesian network is suitable for probabilistic analysis, but existing models do not consider all of the important influencing factors, and no deployment rule for the factors has been established. This paper proposes a probabilistic analysis model of oil pipelines networks based on a Bayesian network. Most of the important influencing factors, including the key environmental conditions and the emergency response, are considered in this model. Moreover, the paper also introduces a deployment rule for these factors. The model can be used for probabilistic analysis and sensitivity analysis of oil pipelines network accidents.
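The kind of probabilistic and sensitivity analysis a Bayesian network supports can be illustrated by exact enumeration on a toy network; the structure (corrosion, emergency response, leak, disaster) and every conditional probability below are hypothetical, not taken from the paper:

```python
from itertools import product

# Toy Bayesian network: corrosion -> leak -> disaster, with the quality of the
# emergency response mitigating the disaster node. All probabilities invented.
def p_corrosion(c): return 0.1 if c else 0.9
def p_response(r):  return 0.8 if r else 0.2   # good emergency response
def p_leak(l, c):
    pl = 0.3 if c else 0.01
    return pl if l else 1 - pl
def p_disaster(d, l, r):
    pd = (0.05 if r else 0.4) if l else 0.001
    return pd if d else 1 - pd

def joint(c, r, l, d):
    return p_corrosion(c) * p_response(r) * p_leak(l, c) * p_disaster(d, l, r)

# Marginal disaster probability, by full enumeration over the other variables
p_d = sum(joint(c, r, l, True) for c, r, l in product([True, False], repeat=3))

# Sensitivity analysis: disaster probability conditioned on corrosion being present
p_d_given_c = sum(joint(True, r, l, True)
                  for r, l in product([True, False], repeat=2)) / p_corrosion(True)
print(f"P(disaster) = {p_d:.4f}, P(disaster | corrosion) = {p_d_given_c:.4f}")
```

Full enumeration is only feasible for tiny networks like this one; real pipeline models use dedicated inference algorithms, but the conditioning logic is the same.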
Simulation skill of APCC set of global climate models for Asian summer monsoon rainfall variability
NASA Astrophysics Data System (ADS)
Singh, U. K.; Singh, G. P.; Singh, Vikas
2015-04-01
The performance of 11 Asia-Pacific Economic Cooperation Climate Center (APCC) global climate models (both coupled and uncoupled) in simulating the seasonal summer (June-August) monsoon rainfall variability over Asia (especially over India and East Asia) has been evaluated in detail using hind-cast data (3 months in advance) generated from APCC, which provides regional climate information product services based on multi-model ensemble dynamical seasonal prediction systems. The skill of each global climate model over Asia was tested separately in detail for the period of 21 years (1983-2003), and the simulated Asian summer monsoon rainfall (ASMR) was verified using various statistical measures for the Indian and East Asian land masses separately. The analysis found a large variation in the spatial ASMR simulated by the uncoupled models compared to the coupled models (like the Predictive Ocean Atmosphere Model for Australia, the National Centers for Environmental Prediction and the Japan Meteorological Agency). The simulated ASMR in the coupled models was closer to the Climate Prediction Centre Merged Analysis of Precipitation (CMAP) than in the uncoupled models, although both underestimated the amount of ASMR. The analysis also found a high spread in simulated ASMR among the ensemble members, suggesting that a model's performance is highly dependent on its initial conditions. The correlation analysis between sea surface temperature (SST) and ASMR shows that the coupled models are more strongly associated with ASMR than the uncoupled models, suggesting that air-sea interaction is well captured in the coupled models. The analysis of rainfall using various statistical measures suggests that the multi-model ensemble (MME) performed better than the individual models, and that separate analyses over the Indian and East Asian land masses are more useful than analysis of Asian monsoon rainfall as a whole.
The results of various statistical measures, such as the skill of the multi-model ensemble, the large spread among the ensemble members of individual models, the strong teleconnection (correlation) with SST, the coefficient of variation, the inter-annual variability, and the Taylor diagram analysis, suggest that coupled models rather than uncoupled models should be improved for the development of a better dynamical seasonal forecast system.
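The finding that a multi-model ensemble (MME) mean outperforms individual models is easy to illustrate on synthetic data: averaging models with independent errors shrinks the RMSE by roughly the square root of the ensemble size. All numbers below are invented, not APCC hind-casts:

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic 'observed' seasonal rainfall (21 years) and 11 models whose errors
# are independent, mimicking an APCC-style multi-model ensemble.
years = 21
obs = rng.normal(8.0, 1.0, years)                      # mm/day, synthetic truth
models = obs + rng.normal(0.0, 1.2, size=(11, years))  # independent model errors

rmse = np.sqrt(((models - obs) ** 2).mean(axis=1))     # skill of each model
mme = models.mean(axis=0)                              # multi-model ensemble mean
rmse_mme = np.sqrt(((mme - obs) ** 2).mean())
print(f"mean single-model RMSE: {rmse.mean():.2f}  MME RMSE: {rmse_mme:.2f}")
```

In practice model errors are correlated, so the real gain is smaller than the idealized 1/sqrt(11) factor, but the direction of the result matches the abstract's conclusion.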
Separate-channel analysis of two-channel microarrays: recovering inter-spot information.
Smyth, Gordon K; Altman, Naomi S
2013-05-26
Two-channel (or two-color) microarrays are cost-effective platforms for comparative analysis of gene expression. They are traditionally analysed in terms of the log-ratios (M-values) of the two channel intensities at each spot, but this analysis does not use all the information available in the separate channel observations. Mixed models have been proposed to analyse intensities from the two channels as separate observations, but such models can be complex to use and the gain in efficiency over the log-ratio analysis is difficult to quantify. Mixed models yield test statistics for which the null distributions can be specified only approximately, and some approaches do not borrow strength between genes. This article reformulates the mixed model to clarify the relationship with the traditional log-ratio analysis, to facilitate information borrowing between genes, and to obtain an exact distributional theory for the resulting test statistics. The mixed model is transformed to operate on the M-values and A-values (average log-expression for each spot) instead of on the log-expression values. The log-ratio analysis is shown to ignore information contained in the A-values. The relative efficiency of the log-ratio analysis is shown to depend on the size of the intraspot correlation. A new separate channel analysis method is proposed that assumes a constant intra-spot correlation coefficient across all genes. This approach permits the mixed model to be transformed into an ordinary linear model, allowing the data analysis to use a well-understood empirical Bayes analysis pipeline for linear modeling of microarray data. This yields statistically powerful test statistics that have an exact distributional theory. The log-ratio, mixed model and common correlation methods are compared using three case studies. The results show that separate channel analyses that borrow strength between genes are more powerful than log-ratio analyses.
The common correlation analysis is the most powerful of all. The common correlation method proposed in this article for separate-channel analysis of two-channel microarray data is no more difficult to apply in practice than the traditional log-ratio analysis. It provides an intuitive and powerful means to conduct analyses and make comparisons that might otherwise not be possible.
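The M-value/A-value transformation at the heart of the reformulated mixed model is a simple invertible change of variables on the two channel intensities; a sketch with simulated intensities (the lognormal parameters are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated red (Cy5) and green (Cy3) channel intensities for 1000 spots
R = rng.lognormal(mean=8.0, sigma=1.0, size=1000)
G = rng.lognormal(mean=8.0, sigma=1.0, size=1000)

M = np.log2(R) - np.log2(G)          # within-spot log-ratio
A = 0.5 * (np.log2(R) + np.log2(G))  # average log-expression per spot

# The map (log2 R, log2 G) -> (M, A) is invertible, so the separate-channel
# information is preserved; the classical log-ratio analysis discards the A-values.
logR_back = A + M / 2
logG_back = A - M / 2
print(np.allclose(logR_back, np.log2(R)), np.allclose(logG_back, np.log2(G)))
```

Because the transformation is invertible, analysing (M, A) pairs is informationally equivalent to analysing the two log-intensities, which is exactly the point the article exploits.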
Determination of Uncertainties for the New SSME Model
NASA Technical Reports Server (NTRS)
Coleman, Hugh W.; Hawk, Clark W.
1996-01-01
This report discusses the uncertainty analysis performed in support of a new test analysis and performance prediction model for the Space Shuttle Main Engine. The new model utilizes uncertainty estimates for experimental data and for the analytical model to obtain the most plausible operating condition for the engine system. This report discusses the development of the data sets and uncertainty estimates to be used in the development of the new model. It also presents the application of uncertainty analysis to analytical models, including the uncertainty analysis for the conservation of mass and energy balance relations. A new methodology for the assessment of the uncertainty associated with linear regressions is presented.
Multiscale hidden Markov models for photon-limited imaging
NASA Astrophysics Data System (ADS)
Nowak, Robert D.
1999-06-01
Photon-limited image analysis is often hindered by low signal-to-noise ratios. A novel Bayesian multiscale modeling and analysis method is developed in this paper to assist in these challenging situations. In addition to providing a very natural and useful framework for modeling and processing images, Bayesian multiscale analysis is often much less computationally demanding compared to classical Markov random field models. This paper focuses on a probabilistic graph model called the multiscale hidden Markov model (MHMM), which captures the key inter-scale dependencies present in natural image intensities. The MHMM framework presented here is specifically designed for photon-limited imaging applications involving Poisson statistics, and applications to image intensity analysis are examined.
Precursor Analysis for Flight- and Ground-Based Anomaly Risk Significance Determination
NASA Technical Reports Server (NTRS)
Groen, Frank
2010-01-01
This slide presentation reviews the precursor analysis for flight and ground based anomaly risk significance. It includes information on accident precursor analysis, real models vs. models, and probabilistic analysis.
Practical Application of Model-based Programming and State-based Architecture to Space Missions
NASA Technical Reports Server (NTRS)
Horvath, Gregory; Ingham, Michel; Chung, Seung; Martin, Oliver; Williams, Brian
2006-01-01
A viewgraph presentation to develop models from systems engineers that accomplish mission objectives and manage the health of the system is shown. The topics include: 1) Overview; 2) Motivation; 3) Objective/Vision; 4) Approach; 5) Background: The Mission Data System; 6) Background: State-based Control Architecture System; 7) Background: State Analysis; 8) Overview of State Analysis; 9) Background: MDS Software Frameworks; 10) Background: Model-based Programming; 11) Background: Titan Model-based Executive; 12) Model-based Execution Architecture; 13) Compatibility Analysis of MDS and Titan Architectures; 14) Integrating Model-based Programming and Execution into the Architecture; 15) State Analysis and Modeling; 16) IMU Subsystem State Effects Diagram; 17) Titan Subsystem Model: IMU Health; 18) Integrating Model-based Programming and Execution into the Software IMU; 19) Testing Program; 20) Computationally Tractable State Estimation & Fault Diagnosis; 21) Diagnostic Algorithm Performance; 22) Integration and Test Issues; 23) Demonstrated Benefits; and 24) Next Steps.
The HIV Cure Research Agenda: The Role of Mathematical Modelling and Cost-Effectiveness Analysis.
Freedberg, Kenneth A; Possas, Cristina; Deeks, Steven; Ross, Anna Laura; Rosettie, Katherine L; Di Mascio, Michele; Collins, Chris; Walensky, Rochelle P; Yazdanpanah, Yazdan
The research agenda towards an HIV cure is building rapidly. In this article, we discuss the reasons for and methodological approach to using mathematical modeling and cost-effectiveness analysis in this agenda. We provide a brief description of the proof of concept for cure and the current directions of cure research. We then review the types of clinical economic evaluations, including cost analysis, cost-benefit analysis, and cost-effectiveness analysis. We describe the use of mathematical modeling and cost-effectiveness analysis early in the HIV epidemic as well as in the era of combination antiretroviral therapy. We then highlight the novel methodology of Value of Information analysis and its potential role in the planning of clinical trials. We close with recommendations for modeling and cost-effectiveness analysis in the HIV cure agenda.
Problem Solving Model for Science Learning
NASA Astrophysics Data System (ADS)
Alberida, H.; Lufri; Festiyed; Barlian, E.
2018-04-01
This research aims to develop a problem solving model for science learning in junior high school. The learning model was developed using the ADDIE model. The analysis phase includes curriculum analysis, analysis of students of SMP Kota Padang, analysis of SMP science teachers, learning analysis, and a literature review. The design phase includes planning the product, a science-learning problem-solving model, which consists of syntax, reaction principle, social system, support system, and instructional and support impact. The problem-solving model was implemented in science learning to improve students' science process skills. The development stage consists of three steps: a) designing a prototype, b) performing a formative evaluation and c) revising the prototype. The implementation stage was carried out through a limited trial, conducted on 24 and 26 August 2015 in Class VII 2 of SMPN 12 Padang. The evaluation phase was conducted in the form of experiments at SMPN 1 Padang, SMPN 12 Padang and SMP National Padang. Based on the development research done, the syntax of the problem solving model for science learning at junior high school consists of introduction, observation, initial problems, data collection, data organization, data analysis/generalization, and communicating.
Evaluation of RCAS Inflow Models for Wind Turbine Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tangler, J.; Bir, G.
The finite element structural modeling in the Rotorcraft Comprehensive Analysis System (RCAS) provides a state-of-the-art approach to aeroelastic analysis. This, coupled with its ability to model all turbine components, results in a methodology that can simulate the complex system interactions characteristic of large wind turbines. In addition, RCAS is uniquely capable of modeling advanced control algorithms and the resulting dynamic responses.
Applying Model Analysis to a Resource-Based Analysis of the Force and Motion Conceptual Evaluation
ERIC Educational Resources Information Center
Smith, Trevor I.; Wittmann, Michael C.; Carter, Tom
2014-01-01
Previously, we analyzed the Force and Motion Conceptual Evaluation in terms of a resources-based model that allows for clustering of questions so as to provide useful information on how students correctly or incorrectly reason about physics. In this paper, we apply model analysis to show that the associated model plots provide more information…
A LISREL Model for the Analysis of Repeated Measures with a Patterned Covariance Matrix.
ERIC Educational Resources Information Center
Rovine, Michael J.; Molenaar, Peter C. M.
1998-01-01
Presents a LISREL model for the estimation of the repeated measures analysis of variance (ANOVA) with a patterned covariance matrix. The model is demonstrated for a 5 x 2 (Time x Group) ANOVA in which the data are assumed to be serially correlated. Similarities with the Statistical Analysis System PROC MIXED model are discussed. (SLD)
Organization Domain Modeling. Volume 1. Conceptual Foundations, Process and Workproduct Description
1993-07-31
J.A. Hess, W.E. Novak, and A.S. Peterson. Feature-Oriented Domain Analysis (FODA) Feasibility Study. Technical Report CMU/SEI-90-TR-21. Addresses domain analysis (DA) and modeling, including a structured set of workproducts, a tailorable process model and a set of modeling techniques and guidelines.
Methodology for object-oriented real-time systems analysis and design: Software engineering
NASA Technical Reports Server (NTRS)
Schoeffler, James D.
1991-01-01
Successful application of software engineering methodologies requires an integrated analysis and design life-cycle in which the various phases flow smoothly ('seamlessly') from analysis through design to implementation. Furthermore, different analysis methodologies often lead to different structuring of the system so that the transition from analysis to design may be awkward depending on the design methodology to be used. This is especially important when object-oriented programming is to be used for implementation when the original specification and perhaps high-level design is non-object oriented. Two approaches to real-time systems analysis which can lead to an object-oriented design are contrasted: (1) modeling the system using structured analysis with real-time extensions which emphasizes data and control flows followed by the abstraction of objects where the operations or methods of the objects correspond to processes in the data flow diagrams and then design in terms of these objects; and (2) modeling the system from the beginning as a set of naturally occurring concurrent entities (objects) each having its own time-behavior defined by a set of states and state-transition rules and seamlessly transforming the analysis models into high-level design models. A new concept of a 'real-time systems-analysis object' is introduced and becomes the basic building block of a series of seamlessly-connected models which progress from the object-oriented real-time systems analysis logical models through the physical architectural models and the high-level design stages. The methodology is appropriate to the overall specification including hardware and software modules. In software modules, the systems analysis objects are transformed into software objects.
Modeling time-to-event (survival) data using classification tree analysis.
Linden, Ariel; Yarnold, Paul R
2017-12-01
Time to the occurrence of an event is often studied in health research. Survival analysis differs from other designs in that follow-up times for individuals who do not experience the event by the end of the study (called censored) are accounted for in the analysis. Cox regression is the standard method for analysing censored data, but the assumptions required of these models are easily violated. In this paper, we introduce classification tree analysis (CTA) as a flexible alternative for modelling censored data. Classification tree analysis is a "decision-tree"-like classification model that provides parsimonious, transparent (ie, easy to visually display and interpret) decision rules that maximize predictive accuracy, derives exact P values via permutation tests, and evaluates model cross-generalizability. Using empirical data, we identify all statistically valid, reproducible, longitudinally consistent, and cross-generalizable CTA survival models and then compare their predictive accuracy to estimates derived via Cox regression and an unadjusted naïve model. Model performance is assessed using integrated Brier scores and a comparison between estimated survival curves. The Cox regression model best predicts average incidence of the outcome over time, whereas CTA survival models best predict either relatively high, or low, incidence of the outcome over time. Classification tree analysis survival models offer many advantages over Cox regression, such as explicit maximization of predictive accuracy, parsimony, statistical robustness, and transparency. Therefore, researchers interested in accurate prognoses and clear decision rules should consider developing models using the CTA-survival framework. © 2017 John Wiley & Sons, Ltd.
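The defining feature of survival data, that censored follow-up times still contribute to the risk set, can be illustrated with a minimal Kaplan-Meier estimator (this is standard survival machinery, not the CTA algorithm itself; the follow-up times below are toy values):

```python
import numpy as np

# Toy censored follow-up data: times in months; event=1 observed, event=0 censored
times  = np.array([2, 3, 3, 5, 6, 8, 9, 9, 12, 15])
events = np.array([1, 1, 0, 1, 0, 1, 1, 0,  1,  0])

# Kaplan-Meier: at each event time the survival curve drops by (1 - d/n), where
# n counts everyone still under observation, censored subjects included until
# the moment they drop out.
surv = 1.0
km = {}
for t in np.unique(times[events == 1]):
    at_risk = int(np.sum(times >= t))
    d = int(np.sum((times == t) & (events == 1)))
    surv *= 1 - d / at_risk
    km[int(t)] = surv
print(km)
```

Both Cox regression and tree-based survival methods build on this same risk-set bookkeeping; they differ in how covariates are allowed to shape the hazard.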
A global sensitivity analysis approach for morphogenesis models.
Boas, Sonja E M; Navarro Jimenez, Maria I; Merks, Roeland M H; Blom, Joke G
2015-11-21
Morphogenesis is a developmental process in which cells organize into shapes and patterns. Complex, non-linear and multi-factorial models with images as output are commonly used to study morphogenesis. It is difficult to understand the relation between the uncertainty in the input and the output of such 'black-box' models, giving rise to the need for sensitivity analysis tools. In this paper, we introduce a workflow for a global sensitivity analysis approach to study the impact of single parameters and the interactions between them on the output of morphogenesis models. To demonstrate the workflow, we used a published, well-studied model of vascular morphogenesis. The parameters of this cellular Potts model (CPM) represent cell properties and behaviors that drive the mechanisms of angiogenic sprouting. The global sensitivity analysis correctly identified the dominant parameters in the model, consistent with previous studies. Additionally, the analysis provided information on the relative impact of single parameters and of interactions between them. This is very relevant because interactions of parameters impede the experimental verification of the predicted effect of single parameters. The parameter interactions, although of low impact, also provided new insights into the mechanisms of in silico sprouting. Finally, the analysis indicated that the model could be reduced by one parameter. We propose global sensitivity analysis as an alternative approach to study the mechanisms of morphogenesis. Comparison of the ranking of the impact of the model parameters to knowledge derived from experimental data and from manipulation experiments can help to falsify models and to find the operative mechanisms in morphogenesis. The workflow is applicable to all 'black-box' models, including high-throughput in vitro models in which output measures are affected by a set of experimental perturbations.
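A crude variance-based global sensitivity analysis can be sketched without any special library: estimate the first-order index S_i = Var(E[Y|X_i]) / Var(Y) by binning each input of a toy 'black-box' model. The model below is invented for illustration, not the cellular Potts model:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy 'black-box' model: Y depends strongly on X0, weakly on X1, not at all on X2
n = 100_000
X = rng.uniform(-1.0, 1.0, size=(n, 3))
Y = 5.0 * X[:, 0] + X[:, 1] ** 2 + 0.1 * rng.normal(size=n)

# First-order index S_i = Var(E[Y | X_i]) / Var(Y), estimated by binning X_i
# and taking the variance of the within-bin conditional means
var_y = Y.var()
s = []
for i in range(3):
    bins = np.digitize(X[:, i], np.linspace(-1, 1, 21))
    uniq = np.unique(bins)
    cond_means = np.array([Y[bins == b].mean() for b in uniq])
    counts = np.array([(bins == b).sum() for b in uniq])
    s.append(float(np.average((cond_means - Y.mean()) ** 2, weights=counts) / var_y))
print("first-order indices:", [round(v, 3) for v in s])
```

The binning estimator correctly ranks X0 as dominant and X2 as inert; production workflows usually use Sobol/Saltelli sampling instead, which also yields the interaction (higher-order) indices the paper discusses.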
Hierarchical model analysis of the Atlantic Flyway Breeding Waterfowl Survey
Sauer, John R.; Zimmerman, Guthrie S.; Klimstra, Jon D.; Link, William A.
2014-01-01
We used log-linear hierarchical models to analyze data from the Atlantic Flyway Breeding Waterfowl Survey. The survey has been conducted by state biologists each year since 1989 in the northeastern United States from Virginia north to New Hampshire and Vermont. Although yearly population estimates from the survey are used by the United States Fish and Wildlife Service for estimating regional waterfowl population status for mallards (Anas platyrhynchos), black ducks (Anas rubripes), wood ducks (Aix sponsa), and Canada geese (Branta canadensis), they are not routinely adjusted to control for time of day effects and other survey design issues. The hierarchical model analysis permits estimation of year effects and population change while accommodating the repeated sampling of plots and controlling for time of day effects in counting. We compared population estimates from the current stratified random sample analysis to population estimates from hierarchical models with alternative model structures that describe year to year changes as random year effects, a trend with random year effects, or year effects modeled as 1-year differences. Patterns of population change from the hierarchical model results generally were similar to the patterns described by stratified random sample estimates, but significant visibility differences occurred between twilight and midday counts in all species. Controlling for the effects of time of day resulted in larger population estimates for all species in the hierarchical model analysis relative to the stratified random sample analysis. The hierarchical models also provided a convenient means of estimating population trend as derived statistics from the analysis. We detected significant declines in mallards and American black ducks and significant increases in wood ducks and Canada geese, a trend that had not been significant for 3 of these 4 species in the prior analysis.
We recommend using hierarchical models for analysis of the Atlantic Flyway Breeding Waterfowl Survey.
Entrance and exit region friction factor models for annular seal analysis. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Elrod, David Alan
1988-01-01
The Mach number definition and boundary conditions in Nelson's nominally-centered, annular gas seal analysis are revised. A method is described for determining the wall shear stress characteristics of an annular gas seal experimentally. Two friction factor models are developed for annular seal analysis; one model is based on flat-plate flow theory; the other uses empirical entrance and exit region friction factors. The friction factor predictions of the models are compared to experimental results. Each friction model is used in an annular gas seal analysis. The seal characteristics predicted by the two seal analyses are compared to experimental results and to the predictions of Nelson's analysis. The comparisons are for smooth-rotor seals with smooth and honeycomb stators. The comparisons show that the analysis which uses empirical entrance and exit region shear stress models predicts the static and stability characteristics of annular gas seals better than the other analyses. The analyses predict direct stiffness poorly.
Regression Model Optimization for the Analysis of Experimental Data
NASA Technical Reports Server (NTRS)
Ulbrich, N.
2009-01-01
A candidate math model search algorithm was developed at Ames Research Center that determines a recommended math model for the multivariate regression analysis of experimental data. The search algorithm is applicable to classical regression analysis problems as well as wind tunnel strain gage balance calibration analysis applications. The algorithm compares the predictive capability of different regression models using the standard deviation of the PRESS residuals of the responses as a search metric. This search metric is minimized during the search. Singular value decomposition is used during the search to reject math models that lead to a singular solution of the regression analysis problem. Two threshold dependent constraints are also applied. The first constraint rejects math models with insignificant terms. The second constraint rejects math models with near-linear dependencies between terms. The math term hierarchy rule may also be applied as an optional constraint during or after the candidate math model search. The final term selection of the recommended math model depends on the regressor and response values of the data set, the user's function class combination choice, the user's constraint selections, and the result of the search metric minimization. A frequently used regression analysis example from the literature is used to illustrate the application of the search algorithm to experimental data.
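The PRESS residuals used as the search metric can be computed without refitting the regression once per observation, via the hat-matrix identity: the leave-one-out residual of observation i equals e_i / (1 - h_ii). A sketch on synthetic data (the quadratic candidate model and all numbers are arbitrary examples, not the wind tunnel data):

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic calibration-style data and an arbitrary quadratic candidate model
n = 50
x = rng.uniform(0.0, 1.0, n)
X = np.column_stack([np.ones(n), x, x ** 2])
y = 1.0 + 2.0 * x + rng.normal(scale=0.1, size=n)

# Hat matrix H = X (X'X)^{-1} X'; the PRESS (leave-one-out) residual of
# observation i is e_i / (1 - h_ii), so no n-fold refitting is needed.
H = X @ np.linalg.inv(X.T @ X) @ X.T
resid = y - H @ y
press_resid = resid / (1.0 - np.diag(H))
search_metric = press_resid.std(ddof=1)   # the quantity the search minimizes
print(f"std of PRESS residuals: {search_metric:.4f}")
```

A model search then evaluates this metric for each candidate term combination and keeps the combination that minimizes it, subject to the singularity and threshold constraints the abstract describes.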
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ostermann, Lars; Seidel, Christian
2015-03-10
The numerical analysis of hydro power stations is an important method of hydraulic design and is used for the development and optimisation of hydro power stations in addition to experiments with physical submodels and full models in the hydraulic laboratory. For the numerical analysis, 2D and 3D models are appropriate and commonly used. The 2D models refer mainly to the shallow water equations (SWE), since for this flow model a large body of experience already exists across a wide field of applications in hydraulic engineering. Often, the flow model is verified by in situ measurements. In order to consider 3D flow phenomena close to singularities like weirs, hydro power stations etc., the development of a hybrid fluid model is advantageous to improve the quality and significance of the global model. Here, an extended hybrid flow model based on the principle of the SWE is presented. The hybrid flow model directly links the numerical model with the experimental data, which may originate from physical full models, physical submodels and in-situ measurements. Hence a wide field of application of the hybrid model emerges, including the improvement of numerical models and the strong coupling of numerical and experimental analysis.
Uncertainty Analysis and Parameter Estimation For Nearshore Hydrodynamic Models
NASA Astrophysics Data System (ADS)
Ardani, S.; Kaihatu, J. M.
2012-12-01
Numerical models represent deterministic approaches used to simulate the relevant physical processes in the nearshore. The complexity of the model physics and the uncertainty involved in the model inputs compel us to apply a stochastic approach to analyze the robustness of the model. The Bayesian inverse problem is one powerful way to estimate the important input model parameters (determined by a priori sensitivity analysis) and can be used for uncertainty analysis of the outputs. Bayesian techniques can be used to find the range of most probable parameters based on the probability of the observed data and the residual errors. In this study, the effect of input data involving lateral (Neumann) boundary conditions, bathymetry and offshore wave conditions on nearshore numerical models is considered. Monte Carlo simulation is applied to a deterministic numerical model (the Delft3D modeling suite for coupled waves and flow) for the resulting uncertainty analysis of the outputs (wave height, flow velocity, mean sea level, etc.). Uncertainty analysis of the outputs is performed by random sampling from the input probability distribution functions and running the model repeatedly until convergence to consistent results is achieved. The case study used in this analysis is the Duck94 experiment, which was conducted at the U.S. Army Field Research Facility at Duck, North Carolina, USA in the fall of 1994. The joint probability of model parameters relevant for the Duck94 experiments will be found using the Bayesian approach. We will further show that, by using Bayesian techniques to estimate the optimized model parameters as inputs and applying them for uncertainty analysis, we can obtain more consistent results than by using the prior information for the input data: the variation of the uncertain parameters is decreased and the probability of the observed data improves as well.
Keywords: Monte Carlo Simulation, Delft3D, uncertainty analysis, Bayesian techniques, MCMC
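The Monte Carlo step — sampling uncertain inputs from their distributions, running the model for each draw, and collecting output statistics — can be illustrated with a toy stand-in for the model. The depth-limited breaking rule and the input distributions below are assumptions for illustration only, not the Delft3D setup used in the study.

```python
import random
import statistics

def surf_model(offshore_height, depth):
    """Hypothetical stand-in for one nearshore model run (Delft3D in
    the study): a crude depth-limited wave-height estimate."""
    return min(offshore_height, 0.78 * depth)

def monte_carlo(n, seed=1):
    """Propagate input uncertainty (offshore wave height, bathymetry)
    through the model by random sampling from assumed input PDFs."""
    rng = random.Random(seed)
    outputs = []
    for _ in range(n):
        h0 = rng.gauss(2.0, 0.3)   # uncertain offshore wave height [m]
        d = rng.gauss(2.5, 0.2)    # uncertain local depth [m]
        outputs.append(surf_model(h0, d))
    return statistics.mean(outputs), statistics.stdev(outputs)

mean_h, std_h = monte_carlo(5000)
```

In practice one would increase the sample count until the output statistics stabilize, which is the convergence criterion the abstract refers to.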
2016-06-01
characteristics, experimental design techniques, and analysis methodologies that distinguish each phase of the MBSE MEASA. To ensure consistency... methodology. Experimental design selection, simulation analysis, and trade space analysis support the final two stages. Figure 27 segments the MBSE MEASA... rounding has the potential to increase the correlation between columns of the experimental design matrix. The design methodology presented in Vieira
The CICT Earth Science Systems Analysis Model
NASA Technical Reports Server (NTRS)
Pell, Barney; Coughlan, Joe; Biegel, Bryan; Stevens, Ken; Hansson, Othar; Hayes, Jordan
2004-01-01
Contents include the following: Computing Information and Communications Technology (CICT) Systems Analysis. Our modeling approach: a 3-part schematic investment model of technology change, impact assessment and prioritization. A whirlwind tour of our model. Lessons learned.
Advanced Modeling, Simulation and Analysis (AMSA) Capability Roadmap Progress Review
NASA Technical Reports Server (NTRS)
Antonsson, Erik; Gombosi, Tamas
2005-01-01
Contents include the following: NASA capability roadmap activity. Advanced modeling, simulation, and analysis overview. Scientific modeling and simulation. Operations modeling. Multi-spectral sensing (UV-gamma). System integration. M and S Environments and Infrastructure.
A brief review of models of DC-DC power electronic converters for analysis of their stability
NASA Astrophysics Data System (ADS)
Siewniak, Piotr; Grzesik, Bogusław
2014-10-01
A brief review of models of DC-DC power electronic converters (PECs) is presented in this paper. It contains the most popular, continuous-time and discrete-time models used for PEC simulation, design, stability analysis and other applications. Both large-signal and small-signal models are considered. Special attention is paid to models that are used in practice for the analysis of the global and local stability of PECs.
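As a small illustration of the continuous-time averaged (large-signal) models the review covers, and of how such a model feeds a local stability check, consider an ideal buck converter in state-space form. The component values are arbitrary examples, and the ideal-element model is a sketch, not any specific model from the paper.

```python
import cmath

def buck_averaged_matrix(L, C, R):
    """State matrix of the averaged model of an ideal buck converter,
    states x = [inductor current, capacitor voltage]."""
    return [[0.0, -1.0 / L],
            [1.0 / C, -1.0 / (R * C)]]

def eig2(A):
    """Eigenvalues of a 2x2 matrix via its characteristic polynomial."""
    tr = A[0][0] + A[1][1]
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    disc = cmath.sqrt(tr * tr - 4.0 * det)
    return (tr + disc) / 2.0, (tr - disc) / 2.0

# Example component values (assumed): L = 100 uH, C = 47 uF, R = 10 ohm
A = buck_averaged_matrix(L=100e-6, C=47e-6, R=10.0)
lam = eig2(A)
stable = all(l.real < 0 for l in lam)   # local (small-signal) stability
```

Both eigenvalues sit in the left half-plane (a lightly damped complex pair), so the averaged model predicts a locally stable operating point.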
Computational models for the nonlinear analysis of reinforced concrete plates
NASA Technical Reports Server (NTRS)
Hinton, E.; Rahman, H. H. A.; Huq, M. M.
1980-01-01
A finite element computational model for the nonlinear analysis of reinforced concrete solid, stiffened and cellular plates is briefly outlined. Typically, Mindlin elements are used to model the plates whereas eccentric Timoshenko elements are adopted to represent the beams. The layering technique, common in the analysis of reinforced concrete flexural systems, is incorporated in the model. The proposed model provides an inexpensive and reasonably accurate approach which can be extended for use with voided plates.
2001-10-25
Image Analysis aims to develop model-based computer analysis and visualization methods for showing focal and general abnormalities of lung ventilation and perfusion based on a sequence of digital chest fluoroscopy frames collected with the Dynamic Pulmonary Imaging technique [18,5,17,6]. We have proposed and evaluated a multiresolutional method with an explicit ventilation model based on pyramid images for ventilation analysis. We have further extended the method for ventilation analysis to pulmonary perfusion. This paper focuses on the clinical evaluation of our method for
DOT National Transportation Integrated Search
2012-09-01
A peer exchange on Modeling and Analysis Needs and Resources for Small Metropolitan Area Transportation Planning was convened on August 28 and 29, 2011, to explore the state of transportation modeling and analysis practice in communities with populat...
ADAM: Analysis of Discrete Models of Biological Systems Using Computer Algebra
2011-01-01
Background Many biological systems are modeled qualitatively with discrete models, such as probabilistic Boolean networks, logical models, Petri nets, and agent-based models, to gain a better understanding of them. The computational complexity to analyze the complete dynamics of these models grows exponentially in the number of variables, which impedes working with complex models. There exist software tools to analyze discrete models, but they either lack the algorithmic functionality to analyze complex models deterministically or they are inaccessible to many users as they require understanding the underlying algorithm and implementation, do not have a graphical user interface, or are hard to install. Efficient analysis methods that are accessible to modelers and easy to use are needed. Results We propose a method for efficiently identifying attractors and introduce the web-based tool Analysis of Dynamic Algebraic Models (ADAM), which provides this and other analysis methods for discrete models. ADAM converts several discrete model types automatically into polynomial dynamical systems and analyzes their dynamics using tools from computer algebra. Specifically, we propose a method to identify attractors of a discrete model that is equivalent to solving a system of polynomial equations, a long-studied problem in computer algebra. Based on extensive experimentation with both discrete models arising in systems biology and randomly generated networks, we found that the algebraic algorithms presented in this manuscript are fast for systems with the structure maintained by most biological systems, namely sparseness and robustness. For a large set of published complex discrete models, ADAM identified the attractors in less than one second. Conclusions Discrete modeling techniques are a useful tool for analyzing complex biological systems and there is a need in the biological community for accessible efficient analysis tools. 
ADAM provides analysis methods based on mathematical algorithms as a web-based tool for several different input formats, and it makes analysis of complex models accessible to a larger community, as it is platform independent as a web-service and does not require understanding of the underlying mathematics. PMID:21774817
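For intuition about the attractor problem ADAM solves, the dynamics of a small synchronous Boolean network can be enumerated by brute force. ADAM itself avoids this exponential enumeration by converting the model to a polynomial dynamical system and solving equations over finite fields; the three-node network below is invented purely for illustration.

```python
from itertools import product

def step(state):
    """Synchronous update of a toy 3-node Boolean network
    (not a published model; for illustration only)."""
    a, b, c = state
    return (b and not c, a, a or b)

def attractors(update, n):
    """Enumerate attractors by following every state into its cycle."""
    found = set()
    for start in product([0, 1], repeat=n):
        seen = {}
        s = tuple(bool(x) for x in start)
        while s not in seen:
            seen[s] = len(seen)
            s = update(s)
        # states visited at or after the re-entry point form the cycle
        cycle_start = seen[s]
        cycle = tuple(sorted(k for k, v in seen.items() if v >= cycle_start))
        found.add(cycle)
    return found

atts = attractors(step, 3)   # this toy network has a single fixed point
```

The brute-force cost grows as 2^n, which is exactly the complexity barrier motivating the algebraic approach for large published models.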
Structural mode significance using INCA. [Interactive Controls Analysis computer program
NASA Technical Reports Server (NTRS)
Bauer, Frank H.; Downing, John P.; Thorpe, Christopher J.
1990-01-01
Structural finite element models are often too large to be used in the design and analysis of control systems. Model reduction techniques must be applied to reduce the structural model to manageable size. In the past, engineers either performed the model order reduction by hand or used distinct computer programs to retrieve the data, to perform the significance analysis and to reduce the order of the model. To expedite this process, the latest version of INCA has been expanded to include an interactive graphical structural mode significance and model order reduction capability.
NASA Astrophysics Data System (ADS)
Cole, H. E.; Fuller, R. E.
1980-09-01
Four of the major models used by DOE for energy conservation analyses in the residential and commercial building sectors are reviewed and critically analyzed to determine how these models can serve as tools for DOE and its Conservation Policy Office in evaluating and quantifying their policy and program requirements. The most effective role for each model in addressing future issues of buildings energy conservation policy and analysis is assessed. The four models covered are: Oak Ridge Residential Energy Model; Micro Analysis of Transfers to Households/Comprehensive Human Resources Data System (MATH/CHRDS) Model; Oak Ridge Commercial Energy Model; and Brookhaven Buildings Energy Conservation Optimization Model (BECOM).
Automation for System Safety Analysis
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Fleming, Land; Throop, David; Thronesbery, Carroll; Flores, Joshua; Bennett, Ted; Wennberg, Paul
2009-01-01
This presentation describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.
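Step 3 above — finding possible paths from hazard sources to vulnerable entities over the derived architecture model — is at heart a graph-reachability search. A minimal sketch follows; the component names and connections are hypothetical (the actual models are derived from IIRDs and FMEA documents), and the breadth-first simple-path search stands in for the tools' graph analysis.

```python
from collections import deque

def hazard_paths(edges, source, target):
    """Enumerate simple propagation paths from a hazard source to a
    vulnerable entity in a component-connection graph."""
    graph = {}
    for u, v in edges:
        graph.setdefault(u, []).append(v)
    paths, queue = [], deque([[source]])
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            paths.append(path)
            continue
        for nxt in graph.get(path[-1], []):
            if nxt not in path:   # keep paths simple (no revisits)
                queue.append(path + [nxt])
    return paths

# Hypothetical component-connection graph
edges = [("leak", "valve"), ("valve", "controller"),
         ("leak", "sensor"), ("sensor", "controller"),
         ("controller", "crew_cabin")]
paths = hazard_paths(edges, "leak", "crew_cabin")
```

Each returned path is a candidate scenario of the kind step 4 would hand to software integration testing.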
SBML-SAT: a systems biology markup language (SBML) based sensitivity analysis tool
Zi, Zhike; Zheng, Yanan; Rundell, Ann E; Klipp, Edda
2008-01-01
Background It has long been recognized that sensitivity analysis plays a key role in modeling and analyzing cellular and biochemical processes. Systems biology markup language (SBML) has become a well-known platform for coding and sharing mathematical models of such processes. However, current SBML compatible software tools are limited in their ability to perform global sensitivity analyses of these models. Results This work introduces a freely downloadable software package, SBML-SAT, which implements algorithms for simulation, steady state analysis, robustness analysis and local and global sensitivity analysis for SBML models. This software tool extends current capabilities through its execution of global sensitivity analyses using multi-parametric sensitivity analysis, partial rank correlation coefficient, SOBOL's method, and weighted average of local sensitivity analyses, in addition to its ability to handle systems with discontinuous events and its intuitive graphical user interface. Conclusion SBML-SAT provides the community of systems biologists a new tool for the analysis of their SBML models of biochemical and cellular processes. PMID:18706080
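Of the global methods listed, the partial rank correlation coefficient (PRCC) is the simplest to sketch: rank-transform the sampled parameters and the output, then correlate the residuals after regressing out the other parameters. The toy two-parameter model below is an assumption for illustration, not an SBML model or SBML-SAT's implementation.

```python
import numpy as np

def ranks(x):
    """Rank-transform a 1-D sample (no tie handling; fine for
    continuous samples)."""
    order = np.argsort(x)
    r = np.empty_like(order, dtype=float)
    r[order] = np.arange(len(x))
    return r

def prcc(params, output, j):
    """Partial rank correlation of parameter column j with the output,
    controlling for the remaining parameters via linear regression
    on the rank-transformed data."""
    R = np.column_stack([ranks(c) for c in params.T])
    ry = ranks(output)
    Z = np.column_stack([np.ones(len(ry)), np.delete(R, j, axis=1)])
    # residuals after removing the influence of the other parameters
    res_x = R[:, j] - Z @ np.linalg.lstsq(Z, R[:, j], rcond=None)[0]
    res_y = ry - Z @ np.linalg.lstsq(Z, ry, rcond=None)[0]
    return float(np.corrcoef(res_x, res_y)[0, 1])

# Toy model: k1 drives the output, k2 is nearly irrelevant
rng = np.random.default_rng(2)
k1 = rng.uniform(0.5, 1.5, 200)
k2 = rng.uniform(0.5, 1.5, 200)
y = k1**2 + 0.01 * k2 + rng.normal(0.0, 0.01, 200)
P = np.column_stack([k1, k2])
```

As expected, the PRCC flags k1 as influential and k2 as unimportant.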
Dynamical System Analysis of Reynolds Stress Closure Equations
NASA Technical Reports Server (NTRS)
Girimaji, Sharath S.
1997-01-01
In this paper, we establish the causality between the model coefficients in the standard pressure-strain correlation model and the predicted equilibrium states for homogeneous turbulence. We accomplish this by performing a comprehensive fixed point analysis of the modeled Reynolds stress and dissipation rate equations. The results from this analysis will be very useful for developing improved pressure-strain correlation models to yield observed equilibrium behavior.
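The fixed-point procedure can be sketched generically: locate an equilibrium of the modeled equations, linearize there, and classify stability from the Jacobian. The two-equation system below is a toy stand-in, not the actual modeled Reynolds stress and dissipation rate equations.

```python
def rhs(x):
    """Toy 2-D autonomous system standing in for the modeled
    equations (illustration only)."""
    u, v = x
    return [u * (1.0 - u) - 0.5 * u * v,
            v * (u - 0.25)]

def jacobian(f, x, h=1e-6):
    """Forward finite-difference Jacobian of f at x."""
    n = len(x)
    f0 = f(x)
    J = [[0.0] * n for _ in range(n)]
    for j in range(n):
        xp = list(x)
        xp[j] += h
        fp = f(xp)
        for i in range(n):
            J[i][j] = (fp[i] - f0[i]) / h
    return J

def is_stable(J):
    """2x2 stability test: trace < 0 and determinant > 0."""
    tr = J[0][0] + J[1][1]
    det = J[0][0] * J[1][1] - J[0][1] * J[1][0]
    return tr < 0.0 and det > 0.0

eq = [0.25, 1.5]          # equilibrium of the toy system
J = jacobian(rhs, eq)
```

Varying the model coefficients and repeating this classification over all fixed points is, in miniature, the causality analysis the abstract describes.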
Fusing modeling techniques to support domain analysis for reuse opportunities identification
NASA Technical Reports Server (NTRS)
Hall, Susan Main; Mcguire, Eileen
1993-01-01
Functional modeling techniques or object-oriented graphical representations: which is more useful to someone trying to understand the general design or high-level requirements of a system? For a recent domain analysis effort, the answer was a fusion of popular modeling techniques of both types. By using both functional and object-oriented techniques, the analysts involved were able to lean on their experience in function-oriented software development, while taking advantage of the descriptive power available in object-oriented models. In addition, a base of familiar modeling methods permitted the group of mostly new domain analysts to learn the details of the domain analysis process while producing a quality product. This paper describes the background of this project and then provides a high level definition of domain analysis. The majority of this paper focuses on the modeling method developed and utilized during this analysis effort.
Factors Influencing Progressive Failure Analysis Predictions for Laminated Composite Structure
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.
2008-01-01
Progressive failure material modeling methods used for structural analysis including failure initiation and material degradation are presented. Different failure initiation criteria and material degradation models are described that define progressive failure formulations. These progressive failure formulations are implemented in a user-defined material model for use with a nonlinear finite element analysis tool. The failure initiation criteria include the maximum stress criteria, maximum strain criteria, the Tsai-Wu failure polynomial, and the Hashin criteria. The material degradation model is based on the ply-discounting approach where the local material constitutive coefficients are degraded. Applications and extensions of the progressive failure analysis material model address two-dimensional plate and shell finite elements and three-dimensional solid finite elements. Implementation details are described in the present paper. Parametric studies for laminated composite structures are discussed to illustrate the features of the progressive failure modeling methods that have been implemented and to demonstrate their influence on progressive failure analysis predictions.
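One step of the formulation — checking a failure initiation criterion and then degrading the failed ply — can be sketched for the maximum-stress criterion with ply discounting. The ply properties, load state, allowables, and knock-down factor below are invented example values, and the dictionary stand-in is far simpler than a user-defined material model in a finite element code.

```python
def max_stress_failed(stress, strength):
    """Maximum-stress initiation criterion for one ply: any stress
    component exceeding its allowable triggers failure."""
    return any(abs(s) > t for s, t in zip(stress, strength))

def degrade(ply, factor=1e-3):
    """Ply-discounting degradation: knock down the failed ply's
    stiffness coefficients (factor is an assumed typical value)."""
    return {k: v * factor for k, v in ply.items()}

# Hypothetical ply stiffnesses [MPa] and a load state that fails it
ply = {"E1": 140000.0, "E2": 10000.0, "G12": 5000.0}
stress = (1800.0, 40.0, 60.0)     # (sigma1, sigma2, tau12) [MPa]
strength = (1500.0, 50.0, 70.0)   # allowables [MPa]
if max_stress_failed(stress, strength):
    ply = degrade(ply)
```

In a nonlinear analysis this check-and-degrade step is repeated at each load increment, which is what produces the progressive character of the failure prediction.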
Barnes, Samuel R; Ng, Thomas S C; Santa-Maria, Naomi; Montagne, Axel; Zlokovic, Berislav V; Jacobs, Russell E
2015-06-16
Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is a promising technique to characterize pathology and evaluate treatment response. However, analysis of DCE-MRI data is complex and benefits from concurrent analysis of multiple kinetic models and parameters. Few software tools are currently available that specifically focus on DCE-MRI analysis with multiple kinetic models. Here, we developed ROCKETSHIP, an open-source, flexible and modular software package for DCE-MRI analysis. ROCKETSHIP incorporates analyses with multiple kinetic models, including data-driven nested model analysis. ROCKETSHIP was implemented using the MATLAB programming language. Robustness of the software to provide reliable fits using multiple kinetic models is demonstrated using simulated data. Simulations also demonstrate the utility of the data-driven nested model analysis. Applicability of ROCKETSHIP for both preclinical and clinical studies is shown using DCE-MRI studies of the human brain and a murine tumor model. A DCE-MRI software suite was implemented and tested using simulations. Its applicability to both preclinical and clinical datasets is shown. ROCKETSHIP was designed to be easily accessible for the beginner, but flexible enough for changes or additions to be made by the advanced user as well. The availability of a flexible analysis tool will aid future studies using DCE-MRI. A public release of ROCKETSHIP is available at https://github.com/petmri/ROCKETSHIP .
Sparse Group Penalized Integrative Analysis of Multiple Cancer Prognosis Datasets
Liu, Jin; Huang, Jian; Xie, Yang; Ma, Shuangge
2014-01-01
SUMMARY In cancer research, high-throughput profiling studies have been extensively conducted, searching for markers associated with prognosis. Because of the “large d, small n” characteristic, results generated from the analysis of a single dataset can be unsatisfactory. Recent studies have shown that integrative analysis, which simultaneously analyzes multiple datasets, can be more effective than single-dataset analysis and classic meta-analysis. In most existing integrative analyses, the homogeneity model has been assumed, which postulates that different datasets share the same set of markers. Several approaches have been designed to reinforce this assumption. In practice, different datasets may differ in terms of patient selection criteria, profiling techniques, and many other aspects. Such differences may make the homogeneity model too restricted. In this study, we assume the heterogeneity model, under which different datasets are allowed to have different sets of markers. With multiple cancer prognosis datasets, we adopt the AFT (accelerated failure time) model to describe survival. This model may have the lowest computational cost among popular semiparametric survival models. For marker selection, we adopt a sparse group MCP (minimax concave penalty) approach. This approach has an intuitive formulation and can be computed using an effective group coordinate descent algorithm. Simulation study shows that it outperforms the existing approaches under both the homogeneity and heterogeneity models. Data analysis further demonstrates the merit of the heterogeneity model and the proposed approach. PMID:23938111
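The MCP building block can be written down directly. The thresholding function below is the standard univariate coordinate-descent update for an orthonormal design, shown as a sketch of the penalty's behavior rather than the paper's full sparse group algorithm; the parameter values in the usage are arbitrary.

```python
def mcp(t, lam, gamma):
    """Minimax concave penalty: quadratic taper of the L1 slope up to
    gamma*lam, constant beyond (so large coefficients are unbiased)."""
    a = abs(t)
    if a <= gamma * lam:
        return lam * a - a * a / (2.0 * gamma)
    return 0.5 * gamma * lam * lam

def mcp_threshold(z, lam, gamma):
    """Minimizer of (1/2)(beta - z)^2 + mcp(beta, lam, gamma), for
    gamma > 1: the univariate coordinate-descent update."""
    a = abs(z)
    if a <= lam:
        return 0.0                      # small signals are zeroed out
    if a <= gamma * lam:
        s = 1.0 if z > 0 else -1.0
        return s * (a - lam) / (1.0 - 1.0 / gamma)   # shrunken region
    return z                            # large signals pass unshrunk
```

Unlike the lasso's soft-thresholding, the update returns z unchanged once |z| exceeds gamma*lam, which is the bias-reduction property that motivates MCP.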
Simplified and refined structural modeling for economical flutter analysis and design
NASA Technical Reports Server (NTRS)
Ricketts, R. H.; Sobieszczanski, J.
1977-01-01
A coordinated use of two finite-element models of different levels of refinement is presented to reduce the computer cost of the repetitive flutter analysis commonly encountered in structural resizing to meet flutter requirements. One model, termed a refined model (RM), represents a high degree of detail needed for strength-sizing and flutter analysis of an airframe. The other model, called a simplified model (SM), has a relatively much smaller number of elements and degrees-of-freedom. A systematic method of deriving an SM from a given RM is described. The method consists of judgmental and numerical operations to make the stiffness and mass of the SM elements equivalent to the corresponding substructures of RM. The structural data are automatically transferred between the two models. The bulk of analysis is performed on the SM with periodical verifications carried out by analysis of the RM. In a numerical example of a supersonic cruise aircraft with an arrow wing, this approach permitted substantial savings in computer costs and acceleration of the job turn-around.
Chen, Yong; Luo, Sheng; Chu, Haitao; Wei, Peng
2013-05-01
Multivariate meta-analysis is useful in combining evidence from independent studies which involve several comparisons among groups based on a single outcome. For binary outcomes, the commonly used statistical models for multivariate meta-analysis are multivariate generalized linear mixed effects models which assume risks, after some transformation, follow a multivariate normal distribution with possible correlations. In this article, we consider an alternative model for multivariate meta-analysis where the risks are modeled by the multivariate beta distribution proposed by Sarmanov (1966). This model has several attractive features compared to the conventional multivariate generalized linear mixed effects models, including simplicity of the likelihood function, no need to specify a link function, and a closed-form expression of the distribution functions for study-specific risk differences. We investigate the finite sample performance of this model by simulation studies and illustrate its use with an application to multivariate meta-analysis of adverse events of tricyclic antidepressant treatment in clinical trials.
Comprehensive analysis of transport aircraft flight performance
NASA Astrophysics Data System (ADS)
Filippone, Antonio
2008-04-01
This paper reviews the state of the art in comprehensive performance codes for fixed-wing aircraft. The importance of system analysis in flight performance is discussed. The paper highlights the role of aerodynamics, propulsion, flight mechanics, aeroacoustics, flight operation, numerical optimisation, stochastic methods and numerical analysis. The latter discipline is used to investigate the sensitivities of the sub-systems to uncertainties in critical state parameters or functional parameters. The paper discusses critically the data used for performance analysis, and the areas where progress is required. Comprehensive analysis codes can be used for mission fuel planning, envelope exploration, competition analysis, a wide variety of environmental studies, marketing analysis, aircraft certification and conceptual aircraft design. A comprehensive program that uses the multi-disciplinary approach for transport aircraft is presented. The model includes a geometry deck, a separate engine input deck with the main parameters, a database of engine performance from an independent simulation, and an operational deck. The comprehensive code has modules for deriving the geometry from bitmap files, an aerodynamics model for all flight conditions, a flight mechanics model for flight envelopes and mission analysis, an aircraft noise model and engine emissions. The model is validated at different levels. Validation of the aerodynamic model is done against the scale models DLR-F4 and F6. A general model analysis and flight envelope exploration are shown for the Boeing B-777-300 with GE-90 turbofan engines with intermediate passenger capacity (394 passengers in 2 classes). Validation of the flight model is done by sensitivity analysis on the wetted area (or profile drag), on the specific air range, the brake-release gross weight and the aircraft noise.
A variety of results is shown, including specific air range charts, take-off weight-altitude charts, payload-range performance, atmospheric effects, economic Mach number and noise trajectories at F.A.R. landing points.
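The specific air range used in those charts has a simple closed form for cruise: distance per unit fuel mass, SAR = V·(L/D)/(TSFC·W), since fuel flow equals TSFC times the thrust needed to balance drag W/(L/D). The numbers below are rough, assumed B777-class cruise values for an order-of-magnitude check, not the validated data of the paper.

```python
def specific_air_range(V, L_over_D, tsfc, weight):
    """Specific air range [m/kg] of a turbofan aircraft in cruise:
    SAR = V * (L/D) / (TSFC * W)."""
    return V * L_over_D / (tsfc * weight)

# Assumed, illustrative cruise values for a large twin-engine transport
V = 250.0         # true airspeed [m/s]
L_over_D = 19.0   # cruise lift-to-drag ratio
tsfc = 1.6e-5     # thrust-specific fuel consumption [kg/(N*s)]
W = 2.5e6         # aircraft weight [N]
sar = specific_air_range(V, L_over_D, tsfc, W)
```

The result, on the order of 0.1 km per kg of fuel, is consistent with the magnitude expected for this aircraft class, which is the kind of sanity check a sensitivity analysis on specific air range starts from.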
Gradient-based model calibration with proxy-model assistance
NASA Astrophysics Data System (ADS)
Burrows, Wesley; Doherty, John
2016-02-01
Use of a proxy model in gradient-based calibration and uncertainty analysis of a complex groundwater model with large run times and problematic numerical behaviour is described. The methodology is general, and can be used with models of all types. The proxy model is based on a series of analytical functions that link all model outputs used in the calibration process to all parameters requiring estimation. In enforcing history-matching constraints during the calibration and post-calibration uncertainty analysis processes, the proxy model is run for the purposes of populating the Jacobian matrix, while the original model is run when testing parameter upgrades; the latter process is readily parallelized. Use of a proxy model in this fashion dramatically reduces the computational burden of complex model calibration and uncertainty analysis. At the same time, the effect of model numerical misbehaviour on calculation of local gradients is mitigated, thus allowing access to the benefits of gradient-based analysis where lack of integrity in finite-difference derivative calculation would otherwise have impeded such access. Construction of a proxy model, and its subsequent use in calibration of a complex model, and in analysing the uncertainties of predictions made by that model, is implemented in the PEST suite.
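The division of labour — cheap proxy runs fill the Jacobian, expensive original-model runs only test parameter upgrades — can be sketched with a toy Gauss-Newton loop. Everything here is an assumption for illustration: the exponential "original" model, the deliberately mismatched surrogate, and the step-halving acceptance test stand in for the PEST machinery.

```python
import numpy as np

x = np.linspace(0.0, 1.0, 20)

def original_model(p):
    """Stand-in for the expensive model: exponential-decay outputs."""
    return p[0] * np.exp(-p[1] * x)

def proxy_model(p):
    """Cheap surrogate with a deliberate mild mismatch (0.95 factor),
    used only to fill the Jacobian."""
    return p[0] * np.exp(-0.95 * p[1] * x)

def proxy_jacobian(p, h=1e-6):
    """Finite-difference Jacobian computed entirely from proxy runs."""
    f0 = proxy_model(p)
    J = np.empty((x.size, p.size))
    for j in range(p.size):
        q = p.copy()
        q[j] += h
        J[:, j] = (proxy_model(q) - f0) / h
    return J

obs = original_model(np.array([2.0, 3.0]))   # synthetic observations
p = np.array([1.5, 2.5])                     # initial parameter guess
for _ in range(30):
    r = obs - original_model(p)              # expensive run: residuals
    dp = np.linalg.lstsq(proxy_jacobian(p), r, rcond=None)[0]
    for _ in range(10):                      # test upgrades on the original
        if np.sum((obs - original_model(p + dp))**2) < np.sum(r**2):
            p = p + dp
            break
        dp = dp / 2.0                        # halve a rejected upgrade
sse = float(np.sum((obs - original_model(p))**2))
```

Because upgrades are accepted only when the original model confirms an improvement, a mildly inaccurate proxy Jacobian slows convergence but does not corrupt the calibrated parameters.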
Atmospheric model development in support of SEASAT. Volume 1: Summary of findings
NASA Technical Reports Server (NTRS)
Kesel, P. G.
1977-01-01
Atmospheric analysis and prediction models of varying (grid) resolution were developed. The models were tested using real observational data for the purpose of assessing the impact of grid resolution on short-range numerical weather prediction. The discretionary model procedures were examined so that the computational viability of SEASAT data might be enhanced during the conduct of (future) sensitivity tests. The analysis effort covers: (1) examining the procedures for allowing data to influence the analysis; (2) examining the effects of varying the weights in the analysis procedure; (3) testing and implementing procedures for solving the minimization equation in an optimal way; (4) describing the impact of grid resolution on analysis; and (5) devising and implementing numerous practical solutions to analysis problems in general.
NASA Astrophysics Data System (ADS)
Khodachenko, Maxim; Miller, Steven; Stoeckler, Robert; Topf, Florian
2010-05-01
Computational modeling and observational data analysis are two major aspects of modern scientific research. Both are nowadays under extensive development and application. Many of the scientific goals of planetary space missions require robust models of planetary objects and environments as well as efficient data analysis algorithms, to predict conditions for mission planning and to interpret the experimental data. Europe has great strength in these areas, but it is insufficiently coordinated; individual groups, models, techniques and algorithms need to be coupled and integrated. The existing level of scientific cooperation, and the technical capabilities for operative communication, allow considerable progress in the development of a distributed international Research Infrastructure (RI) based on the computational modelling and data analysis centres existing in Europe, providing the scientific community with dedicated services in the fields of their computational and data analysis expertise. These services will appear as a product of the collaborative communication and joint research efforts of the numerical and data analysis experts together with planetary scientists. The major goal of the EUROPLANET-RI / EMDAF is to make computational models and data analysis algorithms associated with particular national RIs and teams, as well as their outputs, more readily available to their potential user community and more tailored to scientific user requirements, without compromising front-line specialized research on model and data analysis algorithm development and software implementation. This objective will be met through four key subdivisions/tasks of EMDAF: 1) an Interactive Catalogue of Planetary Models; 2) a Distributed Planetary Modelling Laboratory; 3) a Distributed Data Analysis Laboratory; and 4) enabling Models and Routines for High Performance Computing Grids.
Using the advantages of coordinated operation and efficient communication between the involved computational modelling, research and data analysis expert teams and their related research infrastructures, EMDAF will provide a 1) flexible, 2) scientific-user oriented, 3) continuously developing and fast-upgrading computational and data analysis service to support and intensify European planetary scientific research. At the beginning, EMDAF will create a set of demonstrators and operational tests of this service in key areas of European planetary science. This work will aim at the following objectives: (a) development and implementation of tools for distant interactive communication between the planetary scientists and computing experts (including related RIs); (b) development of standard routine packages and user-friendly interfaces for operation of the existing numerical codes and data analysis algorithms by specialized planetary scientists; (c) development of a prototype of numerical modelling services "on demand" for space missions and planetary researchers; (d) development of a prototype of data analysis services "on demand" for space missions and planetary researchers; (e) development of a prototype of coordinated interconnected simulations of planetary phenomena and objects (global multi-model simulators); (f) providing demonstrators of a coordinated use of high performance computing facilities (super-computer networks), in cooperation with the European HPC Grid DEISA.
Computational fluid dynamic modelling of cavitation
NASA Technical Reports Server (NTRS)
Deshpande, Manish; Feng, Jinzhang; Merkle, Charles L.
1993-01-01
Models of sheet cavitation in cryogenic fluids are developed for use in Euler and Navier-Stokes codes. The models are based upon earlier potential-flow models but enable the cavity inception point, length, and shape to be determined as part of the computation. In the present paper, numerical solutions are compared with experimental measurements for both pressure distribution and cavity length. Comparisons between models are also presented. The CFD model provides a relatively simple modification to an existing code to enable cavitation performance predictions to be included. The analysis also has the added ability to incorporate the thermodynamic effects of cryogenic fluids. Extensions of the current two-dimensional steady-state analysis to three dimensions and/or time-dependent flows are, in principle, straightforward, although geometrical issues become more complicated. Linearized models, however, offer promise of providing effective cavitation modeling in three dimensions. This analysis presents good potential for improved understanding of many phenomena associated with cavity flows.
Data-Flow Based Model Analysis
NASA Technical Reports Server (NTRS)
Saad, Christian; Bauer, Bernhard
2010-01-01
The concept of (meta) modeling combines an intuitive way of formalizing the structure of an application domain with a high expressiveness that makes it suitable for a wide variety of use cases, and it has therefore become an integral part of many areas in computer science. While the definition of modeling languages through the use of meta models, e.g. in the Unified Modeling Language (UML), is a well-understood process, their validation and the extraction of behavioral information remain a challenge. In this paper we present a novel approach for dynamic model analysis along with several fields of application. Examining the propagation of information along the edges and nodes of the model graph makes it possible to extend and simplify the definition of semantic constraints in comparison to the capabilities offered by, e.g., the Object Constraint Language. Performing a flow-based analysis also enables the simulation of dynamic behavior, thus providing an "abstract interpretation"-like analysis method for the modeling domain.
Efficiency Analysis of Public Universities in Thailand
ERIC Educational Resources Information Center
Kantabutra, Saranya; Tang, John C. S.
2010-01-01
This paper examines the performance of Thai public universities in terms of efficiency, using a non-parametric approach called data envelopment analysis. Two efficiency models, the teaching efficiency model and the research efficiency model, are developed and the analysis is conducted at the faculty level. Further statistical analyses are also…
Automated Loads Analysis System (ATLAS)
NASA Technical Reports Server (NTRS)
Gardner, Stephen; Frere, Scot; O’Reilly, Patrick
2013-01-01
ATLAS is a generalized solution that can be used for launch vehicles. ATLAS is used to produce modal transient analysis and quasi-static analysis results (i.e., accelerations, displacements, and forces) for the payload math models on a specific Shuttle Transport System (STS) flight using the shuttle math model and associated forcing functions. This innovation solves the problem of coupling payload math models into a shuttle math model. It performs a transient loads analysis simulating liftoff, landing, and all flight events between liftoff and landing. ATLAS utilizes efficient and numerically stable algorithms available in MSC/NASTRAN.
MetaboTools: A comprehensive toolbox for analysis of genome-scale metabolic models
Aurich, Maike K.; Fleming, Ronan M. T.; Thiele, Ines
2016-08-03
Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes, in a step-wise manner, the workflow of data integration and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and stratification thereof into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. In conclusion, this computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community.
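MetaboTools itself is Matlab code built on constraint-based modeling, whose core computation is flux balance analysis (FBA): maximize an objective flux subject to steady-state mass balance. As a language-neutral illustration only, here is a minimal Python sketch of FBA on an invented three-reaction network (the stoichiometry, bounds, and objective are hypothetical, not taken from the toolbox):

```python
import numpy as np
from scipy.optimize import linprog

# Toy stoichiometric matrix S (rows: metabolites A, B; columns: reactions
# uptake, conversion, biomass). Steady state requires S @ v = 0.
S = np.array([[1.0, -1.0,  0.0],    # A: produced by uptake, consumed by conversion
              [0.0,  1.0, -1.0]])   # B: produced by conversion, consumed by biomass

bounds = [(0, 10.0), (0, None), (0, None)]   # uptake flux capped at 10 units

# FBA: maximize the biomass flux v[2] (linprog minimizes, hence the -1)
res = linprog(c=[0, 0, -1.0], A_eq=S, b_eq=np.zeros(2), bounds=bounds,
              method="highs")
growth = -res.fun
```

With this toy network the optimum is set entirely by the uptake cap, so `growth` comes out at 10; real genome-scale models have thousands of reactions but the same linear-programming core.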
Simulation-based sensitivity analysis for non-ignorably missing data.
Yin, Peng; Shi, Jian Q
2017-01-01
Sensitivity analysis is popular in dealing with missing data problems, particularly for non-ignorable missingness, where the full-likelihood method cannot be adopted. It analyses how sensitively the conclusions (output) depend on assumptions or parameters (input) about the missing data, i.e. the missing data mechanism. We refer to models subject to this uncertainty as sensitivity models. To make conventional sensitivity analysis more useful in practice, we need to define simple and interpretable statistical quantities to assess the sensitivity models and make evidence-based analysis. In this paper we propose a novel approach to investigating the plausibility of each assumed missing data mechanism, by comparing simulated datasets from various MNAR models with the observed data non-parametrically, using K-nearest-neighbour distances. Some asymptotic theory is also provided. A key step of this method is to plug in a plausibility evaluation system for each sensitivity parameter, to select plausible values and reject unlikely ones, instead of considering all proposed values of the sensitivity parameters as in conventional sensitivity analysis. The method is generic and has been applied successfully to several specific models in this paper, including a meta-analysis model with publication bias, analysis of incomplete longitudinal data, and mean estimation with non-ignorable missing data.
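The comparison step can be sketched in Python as follows. This is only an illustration of the general idea, simulate observed data under candidate MNAR mechanisms and score each candidate against the real observed data with nearest-neighbour distances; the logistic missingness model, sample sizes, and the symmetrized distance below are illustrative assumptions, not the paper's actual procedure:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_observed(delta, n=2000, rng=rng):
    # Full data ~ N(0,1); a value y goes missing with probability sigmoid(delta*y).
    # delta is the (unidentifiable) sensitivity parameter of the MNAR mechanism.
    y = rng.normal(0.0, 1.0, n)
    p_miss = 1.0 / (1.0 + np.exp(-delta * y))
    return y[rng.random(n) > p_miss]

def knn_distance(a, b, k=3):
    # Mean distance from each point of `a` to its k-th nearest neighbour in `b`.
    d = np.abs(a[:, None] - b[None, :])
    d.sort(axis=1)
    return d[:, k - 1].mean()

def mnar_score(delta, obs):
    # Symmetrized nearest-neighbour discrepancy between simulated and observed data
    sim = simulate_observed(delta)
    return knn_distance(obs, sim) + knn_distance(sim, obs)

# "Observed" data generated with true sensitivity parameter delta = 1.5
obs = simulate_observed(1.5)

candidates = [0.0, 0.75, 1.5, 2.25, 3.0]
scores = {d: mnar_score(d, obs) for d in candidates}
best = min(scores, key=scores.get)
```

Candidates whose simulated data look unlike the observed sample get large scores and can be rejected, which mirrors the paper's idea of ruling out implausible sensitivity-parameter values rather than sweeping over all of them.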
NASA Astrophysics Data System (ADS)
Bhargava, K.; Kalnay, E.; Carton, J.; Yang, F.
2017-12-01
Systematic forecast errors, arising from model deficiencies, form a significant portion of the total forecast error in weather prediction models like the Global Forecast System (GFS). While much effort has been expended to improve models, substantial model error remains. The aim here is to (i) estimate the model deficiencies in the GFS that lead to systematic forecast errors, and (ii) implement an online correction scheme (i.e., within the model) to correct the GFS, following the methodology of Danforth et al. [2007] and Danforth and Kalnay [2008, GRL]. Analysis increments represent the corrections that new observations make on, in this case, the 6-hr forecast in the analysis cycle. Model bias corrections are estimated from the time average of the analysis increments divided by 6 hr, assuming that initial model errors grow linearly and initially ignoring the impact of observation bias. During 2012-2016, seasonal means of the 6-hr model bias are generally robust despite changes in model resolution and data assimilation systems, and their broad continental scales explain their insensitivity to model resolution. The daily bias dominates the sub-monthly analysis increments and consists primarily of diurnal and semidiurnal components, also requiring a low-dimensional correction. Analysis increments in 2015 and 2016 are reduced over the oceans, which is attributed to improvements in the specification of the SSTs. These results encourage application of online correction, as suggested by Danforth and Kalnay, for mean, seasonal, diurnal, and semidiurnal model biases in the GFS to reduce both systematic and random errors. As the short-term error growth is still linear, estimated model bias corrections can be added as a forcing term in the model tendency equation to correct online. Preliminary experiments with the GFS, correcting temperature and specific humidity online, show a reduction in model bias in the 6-hr forecast.
This approach can then be used to guide and optimize the design of sub-grid scale physical parameterizations, more accurate discretization of the model dynamics, boundary conditions, radiative transfer codes, and other potential model improvements which can then replace the empirical correction scheme. The analysis increments also provide guidance in testing new physical parameterizations.
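The core estimate, the time-averaged analysis increment divided by the 6-hr assimilation window, can be illustrated with a toy scalar "model" in Python. Everything here (the constant tendency bias, the persistence forecast, the observation-noise level) is an invented assumption for illustration, not the GFS methodology itself:

```python
import numpy as np

rng = np.random.default_rng(1)

dt = 6.0           # assimilation window (hours)
true_bias = -0.2   # hypothetical constant error in the model tendency (per hour)

def forecast(x, correction=0.0):
    # Toy 6-hr forecast: persistence plus a (biased) model tendency term
    return x + (true_bias + correction) * dt

truth = 10.0
x = truth
increments = []
for _ in range(100):                           # repeated 6-hr analysis cycles
    fc = forecast(x)
    analysis = truth + rng.normal(0.0, 0.1)    # analysis ~ truth + obs noise
    increments.append(analysis - fc)           # analysis increment
    x = analysis

# Bias correction: time-mean analysis increment divided by the 6-hr window,
# to be added as a forcing term in the model tendency equation
tendency_correction = np.mean(increments) / dt
```

In this toy setting the estimated correction cancels the imposed tendency bias, so the "online-corrected" forecast lands much closer to truth than the uncorrected one, which is the essence of the online scheme.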
[Analysis of the stability and adaptability of near infrared spectra qualitative analysis model].
Cao, Wu; Li, Wei-jun; Wang, Ping; Zhang, Li-ping
2014-06-01
The stability and adaptability of near-infrared spectra qualitative analysis models were studied. Separate modeling can significantly improve the stability and adaptability of a model, but its ability to improve adaptability is limited. Joint modeling improves not only the adaptability but also the stability of the model; at the same time, compared with separate modeling, it shortens the modeling time, reduces the modeling workload, extends the model's term of validity, and improves modeling efficiency. The model adaptability experiment shows that the correct recognition rate of the separate modeling method is relatively low and cannot meet application requirements, whereas the joint modeling method reaches a correct recognition rate of 90% and significantly enhances the recognition effect. The model stability experiment shows that the identification results of the jointly built model are better than those of the separately built model, and the approach has good application value.
NASA Astrophysics Data System (ADS)
Parumasur, N.; Willie, R.
2008-09-01
We consider a simple finite-dimensional HIV/AIDS mathematical model of the interactions of blood cells, the HIV/AIDS virus, and the immune system, and examine the consistency of the equations with the real biomedical situation they model. A better understanding of a cure for the illness modeled by the finite-dimensional equations is given. This is accomplished through rigorous mathematical analysis and is reinforced by numerical analysis of models developed for real-life cases.
Applying STAMP in Accident Analysis
NASA Technical Reports Server (NTRS)
Leveson, Nancy; Daouk, Mirna; Dulac, Nicolas; Marais, Karen
2003-01-01
Accident models play a critical role in accident investigation and analysis. Most traditional models are based on an underlying chain of events. These models, however, have serious limitations when used for complex, socio-technical systems. Previously, Leveson proposed a new accident model (STAMP) based on system theory. In STAMP, the basic concept is not an event but a constraint. This paper shows how STAMP can be applied to accident analysis using three different views or models of the accident process and proposes a notation for describing this process.
Uncertainty analysis of hydrological modeling in a tropical area using different algorithms
NASA Astrophysics Data System (ADS)
Rafiei Emam, Ammar; Kappas, Martin; Fassnacht, Steven; Linh, Nguyen Hoang Khanh
2018-01-01
Hydrological modeling outputs are subject to uncertainty resulting from different sources of error (e.g., errors in input data, model structure, and model parameters), making quantification of uncertainty in hydrological modeling imperative to improve the reliability of modeling results. The uncertainty analysis must address difficulties in the calibration of hydrological models, which increase further in areas with data scarcity. The purpose of this study is to apply four uncertainty analysis algorithms to a semi-distributed hydrological model, quantifying different sources of uncertainty (especially parameter uncertainty) and evaluating their performance. In this study, the Soil and Water Assessment Tool (SWAT) eco-hydrological model was implemented for a watershed in the center of Vietnam. The sensitivity of parameters was analyzed, and the model was calibrated. The uncertainty analysis for the hydrological model was conducted with four algorithms: Generalized Likelihood Uncertainty Estimation (GLUE), Sequential Uncertainty Fitting (SUFI), the Parameter Solution method (ParaSol) and Particle Swarm Optimization (PSO). The performance of the algorithms was compared using the P-factor and R-factor, the coefficient of determination (R²), the Nash-Sutcliffe efficiency (NSE) and Percent Bias (PBIAS). The results showed the high performance of SUFI and PSO, with P-factor>0.83, R-factor<0.56, R²>0.91, NSE>0.89, and 0.18
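Of the four algorithms, GLUE is the simplest to sketch: sample parameters, keep "behavioural" runs above a likelihood threshold, and read an uncertainty band off the behavioural ensemble. The Python toy below replaces SWAT with a hypothetical linear-reservoir rainfall-runoff model; the NSE threshold, parameter range, and synthetic data are illustrative assumptions, and the P-factor/R-factor diagnostics mirror the abstract's metrics in spirit only:

```python
import numpy as np

rng = np.random.default_rng(42)

def reservoir_model(k, rain):
    # Toy linear reservoir: storage gains rain, drains at rate S/k; discharge Q = S/k
    s, q = 0.0, []
    for r in rain:
        s += r - s / k
        q.append(s / k)
    return np.array(q)

def nse(sim, obs):
    # Nash-Sutcliffe efficiency
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

rain = rng.gamma(2.0, 2.0, 200)
obs = reservoir_model(5.0, rain) + rng.normal(0.0, 0.05, 200)  # "truth" k=5 + noise

# GLUE: Monte Carlo parameter sampling, keep behavioural runs (NSE > threshold)
samples = rng.uniform(1.0, 15.0, 500)
sims = np.array([reservoir_model(k, rain) for k in samples])
scores = np.array([nse(s, obs) for s in sims])
behavioural = sims[scores > 0.5]

# 95% uncertainty band from the behavioural ensemble; P-factor and R-factor
lo, hi = np.percentile(behavioural, [2.5, 97.5], axis=0)
p_factor = np.mean((obs >= lo) & (obs <= hi))   # fraction of obs inside the band
r_factor = (hi - lo).mean() / obs.std()         # band width relative to obs spread
```

A good calibration wants a high P-factor (observations bracketed by the band) at a low R-factor (narrow band), the same trade-off the study uses to rank GLUE, SUFI, ParaSol and PSO.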
DOT National Transportation Integrated Search
2017-02-01
As part of the Federal Highway Administration (FHWA) Traffic Analysis Toolbox (Volume XIII), this guide was designed to help corridor stakeholders implement the Integrated Corridor Management (ICM) Analysis, Modeling, and Simulation (AMS) methodology...
Traffic analysis toolbox volume XI : weather and traffic analysis, modeling and simulation.
DOT National Transportation Integrated Search
2010-12-01
This document presents a weather module for the traffic analysis tools program. It provides traffic engineers, transportation modelers and decisions makers with a guide that can incorporate weather impacts into transportation system analysis and mode...
Further evidence for the increased power of LOD scores compared with nonparametric methods.
Durner, M; Vieland, V J; Greenberg, D A
1999-01-01
In genetic analysis of diseases in which the underlying model is unknown, "model-free" methods, such as affected sib pair (ASP) tests, are often preferred over LOD-score methods, although LOD-score methods under the correct or even approximately correct model are more powerful than ASP tests. However, there might be circumstances in which nonparametric methods outperform LOD-score methods. Recently, Dizier et al. reported that, in some complex two-locus (2L) models, LOD-score methods with segregation-analysis-derived parameters had less power to detect linkage than ASP tests. We investigated whether these particular models do, in fact, represent a situation in which ASP tests are more powerful than LOD scores. We simulated data according to the parameters specified by Dizier et al. and analyzed the data using (a) single-locus (SL) LOD-score analysis performed twice, under a simple dominant and a recessive mode of inheritance (MOI), (b) ASP methods, and (c) nonparametric linkage (NPL) analysis. We show that SL analysis performed twice and corrected for the type I error increase due to multiple testing yields almost as much linkage information as an analysis under the correct 2L model and is more powerful than either the ASP method or the NPL method. We demonstrate that, even for complex genetic models, the most important condition for linkage analysis is that the assumed MOI at the disease locus being tested is approximately correct, not that the inheritance of the disease per se is correctly specified. In the analysis by Dizier et al., segregation analysis led to estimates of dominance parameters that were grossly misspecified for the locus tested in those models in which ASP tests appeared to be more powerful than LOD-score analyses.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seong W. Lee
During this reporting period, the literature survey, including the gasifier temperature measurement literature, the ultrasonic application and its background in cleaning applications, and the spray coating process, was completed. The gasifier simulator (cold model) testing has been successfully conducted. Four factors (blower voltage, ultrasonic application, injection time intervals, particle weight) were considered as significant factors that affect the temperature measurement. Analysis of Variance (ANOVA) was applied to analyze the test data. The analysis shows that all four factors are significant to the temperature measurements in the gasifier simulator (cold model). The regression analysis for the case with the normalized room temperature shows that a linear model fits the temperature data with 82% accuracy (18% error). The regression analysis for the case without the normalized room temperature shows 72.5% accuracy (27.5% error). The nonlinear regression analysis indicates a better fit than the linear regression: the nonlinear model's accuracy is 88.7% (11.3% error) for the normalized-room-temperature case, which is better than the linear regression analysis. The hot model thermocouple sleeve design and fabrication are completed. The gasifier simulator (hot model) design and fabrication are completed. System tests of the gasifier simulator (hot model) have been conducted and some modifications have been made. Based on the system tests and results analysis, the gasifier simulator (hot model) has met the proposed design requirements and is ready for system testing. The ultrasonic cleaning method is under evaluation and will be further studied for the gasifier simulator (hot model) application. The progress of this project has been on schedule.
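The linear-versus-nonlinear regression comparison reported above can be reproduced in miniature: fit both model forms to the same data and compare goodness of fit. The data below are synthetic stand-ins (not the project's measurements), generated with mild curvature so the nonlinear fit should win:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic response with mild curvature plus measurement noise
x = np.linspace(0.0, 10.0, 60)
y = 2.0 + 1.5 * x + 0.3 * x ** 2 + rng.normal(0.0, 1.0, 60)

def r_squared(y, y_hat):
    # Coefficient of determination: 1 - SSE / SST
    return 1.0 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)

lin = np.polyval(np.polyfit(x, y, 1), x)    # linear regression
quad = np.polyval(np.polyfit(x, y, 2), x)   # nonlinear (quadratic) regression

r2_lin, r2_quad = r_squared(y, lin), r_squared(y, quad)
```

Because the quadratic model nests the linear one, its least-squares R² can never be lower on the fitting data, mirroring the report's finding that the nonlinear fit (88.7%) beat the linear one (82%).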
Multiphysics Nuclear Thermal Rocket Thrust Chamber Analysis
NASA Technical Reports Server (NTRS)
Wang, Ten-See
2005-01-01
The objective of this effort is to develop an efficient and accurate thermo-fluid computational methodology to predict environments for hypothetical thrust chamber design and analysis. The current task scope is to perform multidimensional, multiphysics analysis of thrust performance and heat transfer for a hypothetical solid-core nuclear thermal engine, including the thrust chamber and nozzle. The multiphysics aspects of the model include real-fluid dynamics, chemical reactivity, turbulent flow, and conjugate heat transfer. The model will be designed to identify thermal, fluid, and hydrogen environments in all flow paths and materials. This model would then be used to perform non-nuclear reproduction of the flow-element failures demonstrated in the Rover/NERVA testing, investigate the performance of specific configurations, and assess potential issues and enhancements. A two-pronged approach will be employed in this effort: a detailed analysis of a multi-channel flow element, and global modeling of the entire thrust chamber assembly with a porosity modeling technique. It is expected that the detailed analysis of a single flow element will provide detailed fluid, thermal, and hydrogen environments for stress analysis, while the global thrust chamber assembly analysis will promote understanding of the effects of hydrogen dissociation and heat transfer on thrust performance. These modeling activities will be validated as much as possible against testing performed by other related efforts.
Brief Lags in Interrupted Sequential Performance: Evaluating a Model and Model Evaluation Method
2015-01-05
rehearsal mechanism in the model. To evaluate the model we developed a simple new goodness-of-fit test based on analysis of variance that offers an...repeated step). Sequential constraints are common in medicine, equipment maintenance, computer programming and technical support, data analysis, legal analysis, accounting, and many other home and workplace environments. Sequential constraints also play a role in such basic cognitive processes
Tularosa Basin Play Fairway Analysis Data and Models
Nash, Greg
2017-07-11
This submission includes raster datasets for each layer of evidence used for the weights-of-evidence analysis, as well as for the deterministic play fairway analysis (PFA). Heat, permeability, and groundwater data comprise some of the raster datasets. Additionally, the final deterministic PFA model is provided, along with a certainty model. All of these datasets are best used with an ArcGIS software package, specifically Spatial Data Modeler.
A general framework for the use of logistic regression models in meta-analysis.
Simmonds, Mark C; Higgins, Julian Pt
2016-12-01
Where individual participant data are available for every randomised trial in a meta-analysis of dichotomous event outcomes, "one-stage" random-effects logistic regression models have been proposed as a way to analyse these data. Such models can also be used even when individual participant data are not available and we have only summary contingency table data. One benefit of this one-stage regression model over conventional meta-analysis methods is that it maximises the correct binomial likelihood for the data and so does not require the common assumption that effect estimates are normally distributed. A second benefit of using this model is that it may be applied, with only minor modification, in a range of meta-analytic scenarios, including meta-regression, network meta-analyses and meta-analyses of diagnostic test accuracy. This single model can potentially replace the variety of often complex methods used in these areas. This paper considers, with a range of meta-analysis examples, how random-effects logistic regression models may be used in a number of different types of meta-analyses. This one-stage approach is compared with widely used meta-analysis methods including Bayesian network meta-analysis and the bivariate and hierarchical summary receiver operating characteristic (ROC) models for meta-analyses of diagnostic test accuracy. © The Author(s) 2014.
How Many Separable Sources? Model Selection In Independent Components Analysis
Woods, Roger P.; Hansen, Lars Kai; Strother, Stephen
2015-01-01
Unlike mixtures consisting solely of non-Gaussian sources, mixtures including two or more Gaussian components cannot be separated using standard independent components analysis methods that are based on higher order statistics and independent observations. The mixed Independent Components Analysis/Principal Components Analysis (mixed ICA/PCA) model described here accommodates one or more Gaussian components in the independent components analysis model and uses principal components analysis to characterize contributions from this inseparable Gaussian subspace. Information theory can then be used to select from among potential model categories with differing numbers of Gaussian components. Based on simulation studies, the assumptions and approximations underlying the Akaike Information Criterion do not hold in this setting, even with a very large number of observations. Cross-validation is a suitable, though computationally intensive alternative for model selection. Application of the algorithm is illustrated using Fisher's iris data set and Howells' craniometric data set. Mixed ICA/PCA is of potential interest in any field of scientific investigation where the authenticity of blindly separated non-Gaussian sources might otherwise be questionable. Failure of the Akaike Information Criterion in model selection also has relevance in traditional independent components analysis where all sources are assumed non-Gaussian. PMID:25811988
MPTinR: analysis of multinomial processing tree models in R.
Singmann, Henrik; Kellen, David
2013-06-01
We introduce MPTinR, a software package developed for the analysis of multinomial processing tree (MPT) models. MPT models represent a prominent class of cognitive measurement models for categorical data with applications in a wide variety of fields. MPTinR is the first software for the analysis of MPT models in the statistical programming language R, providing a modeling framework that is more flexible than standalone software packages. MPTinR also introduces important features such as (1) the ability to calculate the Fisher information approximation measure of model complexity for MPT models, (2) the ability to fit models for categorical data outside the MPT model class, such as signal detection models, (3) a function for model selection across a set of nested and nonnested candidate models (using several model selection indices), and (4) multicore fitting. MPTinR is available from the Comprehensive R Archive Network at http://cran.r-project.org/web/packages/MPTinR/ .
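MPTinR is an R package; as a minimal, language-neutral illustration of what fitting an MPT model involves, here is a Python sketch of maximum-likelihood estimation for the one-high-threshold model, one of the simplest MPT models for old/new recognition. The response counts are made up for illustration:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical counts: 100 old items (75 "old" responses), 100 new items (25 "old")
hits, old_n = 75, 100
fas, new_n = 25, 100

def neg_log_lik(params):
    # One-high-threshold MPT: detect an old item (prob r) or guess "old" (prob g)
    r, g = params
    p_hit = r + (1 - r) * g   # old items: detected, or undetected but guessed "old"
    p_fa = g                  # new items: "old" response only by guessing
    ll = (hits * np.log(p_hit) + (old_n - hits) * np.log(1 - p_hit)
          + fas * np.log(p_fa) + (new_n - fas) * np.log(1 - p_fa))
    return -ll

res = minimize(neg_log_lik, x0=[0.5, 0.5], bounds=[(1e-6, 1 - 1e-6)] * 2,
               method="L-BFGS-B")
r_hat, g_hat = res.x
```

For this model the MLE is also available in closed form (g equals the false-alarm rate, r = (H - FA)/(1 - FA)), which makes the numerical fit easy to check; MPTinR does the same kind of likelihood maximization for arbitrary MPT tree structures.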
Metrics for evaluating performance and uncertainty of Bayesian network models
Bruce G. Marcot
2012-01-01
This paper presents a selected set of existing and new metrics for gauging Bayesian network model performance and uncertainty. Selected existing and new metrics are discussed for conducting model sensitivity analysis (variance reduction, entropy reduction, case file simulation); evaluating scenarios (influence analysis); depicting model complexity (numbers of model...
Bayesian Model Averaging for Propensity Score Analysis
ERIC Educational Resources Information Center
Kaplan, David; Chen, Jianshen
2013-01-01
The purpose of this study is to explore Bayesian model averaging in the propensity score context. Previous research on Bayesian propensity score analysis does not take into account model uncertainty. In this regard, an internally consistent Bayesian framework for model building and estimation must also account for model uncertainty. The…
Demographics of reintroduced populations: estimation, modeling, and decision analysis
Converse, Sarah J.; Moore, Clinton T.; Armstrong, Doug P.
2013-01-01
Reintroduction can be necessary for recovering populations of threatened species. However, the success of reintroduction efforts has been poorer than many biologists and managers would hope. To increase the benefits gained from reintroduction, management decision making should be couched within formal decision-analytic frameworks. Decision analysis is a structured process for informing decision making that recognizes that all decisions have a set of components—objectives, alternative management actions, predictive models, and optimization methods—that can be decomposed, analyzed, and recomposed to facilitate optimal, transparent decisions. Because the outcome of interest in reintroduction efforts is typically population viability or related metrics, models used in decision analysis efforts for reintroductions will need to include population models. In this special section of the Journal of Wildlife Management, we highlight examples of the construction and use of models for informing management decisions in reintroduced populations. In this introductory contribution, we review concepts in decision analysis, population modeling for analysis of decisions in reintroduction settings, and future directions. Increased use of formal decision analysis, including adaptive management, has great potential to inform reintroduction efforts. Adopting these practices will require close collaboration among managers, decision analysts, population modelers, and field biologists.
Mathematical properties and parameter estimation for transit compartment pharmacodynamic models.
Yates, James W T
2008-07-03
One feature of recent research in pharmacodynamic modelling has been the move towards more mechanistically based model structures. However, all of these models contain common sub-systems, such as feedback loops and time-delays, whose properties and contribution to the model behaviour merit mathematical analysis. In this paper a common pharmacodynamic model sub-structure is considered: the linear transit compartment. These models have a number of interesting properties as the length of the cascade chain is increased. In the limiting case a pure time-delay is achieved [Milsum, J.H., 1966. Biological Control Systems Analysis. McGraw-Hill Book Company, New York], with the initial behaviour becoming increasingly sensitive to parameter-value perturbation. It is also shown that the modelled drug effect is attenuated, though the duration of action is longer. Through this analysis, the range of behaviours that such models are capable of reproducing is characterised. The properties of these models and the experimental requirements are discussed in order to highlight how mathematical analysis prior to experimentation can enhance the utility of mathematical modelling.
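The limiting behaviour described above, a transit-compartment cascade approaching a pure time delay as the chain lengthens at fixed mean transit time, can be checked numerically. The impulse response of a chain of n identical compartments with rate k = n/MTT is an Erlang(n, k) density whose peak, (n-1)/k = MTT·(n-1)/n, creeps toward the mean transit time. The sketch below is a generic illustration, not tied to any specific drug model in the paper:

```python
import numpy as np
from math import factorial

def transit_response(t, n, mtt):
    # Impulse response of an n-compartment transit chain with mean transit
    # time mtt (per-compartment rate k = n/mtt): the Erlang(n, k) density.
    k = n / mtt
    return k ** n * t ** (n - 1) * np.exp(-k * t) / factorial(n - 1)

t = np.linspace(0.01, 30.0, 3000)
mtt = 10.0   # total mean transit time, held fixed as the chain lengthens

# Peak (mode) of the response for increasing chain lengths
peaks = {n: t[np.argmax(transit_response(t, n, mtt))] for n in (2, 5, 20, 100)}
```

As n grows the response narrows and its peak approaches t = mtt from below, numerically reproducing the convergence to a pure time delay in the limit.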
The concept of shared mental models in healthcare collaboration.
McComb, Sara; Simpson, Vicki
2014-07-01
To report an analysis of the concept of shared mental models in health care. Shared mental models have been described as facilitators of effective teamwork. The complexity and criticality of the current healthcare system requires shared mental models to enhance safe and effective patient/client care. Yet, the current concept definition in the healthcare literature is vague and, therefore, difficult to apply consistently in research and practice. Concept analysis. Literature for this concept analysis was retrieved from several databases, including CINAHL, PubMed and MEDLINE (EBSCO Interface), for the years 1997-2013. Walker and Avant's approach to concept analysis was employed and, following Paley's guidance, embedded in extant theory from the team literature. Although teamwork and collaboration are discussed frequently in healthcare literature, the concept of shared mental models in that context is not as commonly found but is increasing in appearance. Our concept analysis defines shared mental models as individually held knowledge structures that help team members function collaboratively in their environments and are comprised of the attributes of content, similarity, accuracy and dynamics. This theoretically grounded concept analysis provides a foundation for a middle-range descriptive theory of shared mental models in nursing and health care. Further research concerning the impact of shared mental models in the healthcare setting can result in development and refinement of shared mental models to support effective teamwork and collaboration. © 2013 John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Uchidate, M.
2018-09-01
In this study, with the aim of establishing systematic knowledge of the impact of summit extraction methods and stochastic model selection in rough contact analysis, the contact area ratio (A_r/A_a) obtained by statistical contact models with different summit extraction methods was compared with a direct simulation using the boundary element method (BEM). Fifty areal topography datasets with different autocorrelation functions, in terms of the power index and correlation length, were used for the investigation. The non-causal 2D auto-regressive model, which can generate datasets with specified parameters, was employed in this research. Three summit extraction methods, Nayak's theory, 8-point analysis, and watershed segmentation, were examined. With regard to the stochastic model, Bhushan's model and the BGT (Bush-Gibson-Thomas) model were applied. The values of A_r/A_a from the stochastic models tended to be smaller than those from BEM. The discrepancy between Bhushan's model with the 8-point analysis and BEM was slightly smaller than with Nayak's theory. The results with watershed segmentation were similar to those with the 8-point analysis. The impact of Wolf pruning on the discrepancy between the stochastic analysis and BEM was not very clear. In the case of the BGT model, which employs surface gradients, good quantitative agreement with BEM was obtained when Nayak's bandwidth parameter was large.
NASA Astrophysics Data System (ADS)
Weatherill, Graeme; Burton, Paul W.
2010-09-01
The Aegean is the most seismically active and tectonically complex region in Europe. Damaging earthquakes have occurred here throughout recorded history, often resulting in considerable loss of life. The Monte Carlo method of probabilistic seismic hazard analysis (PSHA) is used to determine the level of ground motion likely to be exceeded in a given time period. Multiple random simulations of seismicity are generated to calculate, directly, the ground motion for a given site. Within the seismic hazard analysis we explore the impact of different seismic source models, incorporating both uniform zones and distributed seismicity. A new, simplified, seismic source model, derived from seismotectonic interpretation, is presented for the Aegean region. This is combined into the epistemic uncertainty analysis alongside existing source models for the region, and models derived by a K-means cluster analysis approach. Seismic source models derived using the K-means approach offer a degree of objectivity and reproducibility into the otherwise subjective approach of delineating seismic sources using expert judgment. Similar review and analysis is undertaken for the selection of peak ground acceleration (PGA) attenuation models, incorporating into the epistemic analysis Greek-specific models, European models and a Next Generation Attenuation model. Hazard maps for PGA on a "rock" site with a 10% probability of being exceeded in 50 years are produced and different source and attenuation models are compared. These indicate that Greek-specific attenuation models, with their smaller aleatory variability terms, produce lower PGA hazard, whilst recent European models and Next Generation Attenuation (NGA) model produce similar results. The Monte Carlo method is extended further to assimilate epistemic uncertainty into the hazard calculation, thus integrating across several appropriate source and PGA attenuation models. 
Site condition and fault-type are also integrated into the hazard mapping calculations. These hazard maps are in general agreement with previous maps for the Aegean, recognising the highest hazard in the Ionian Islands, Gulf of Corinth and Hellenic Arc. Peak Ground Accelerations for some sites in these regions reach as high as 500-600 cm s⁻² using European/NGA attenuation models, and 400-500 cm s⁻² using Greek attenuation models.
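The Monte Carlo approach to PSHA described above can be sketched in a few lines: simulate many years of seismicity from a magnitude-frequency law, attenuate each event to the site, and read the exceedance level off the simulated annual maxima. The Gutenberg-Richter bounds and attenuation coefficients below are illustrative placeholders, not the Greek or NGA models used in the study:

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_magnitudes(n, b=1.0, m_min=4.5, m_max=7.5):
    """Inverse-transform sampling of a truncated Gutenberg-Richter law."""
    u = rng.uniform(size=n)
    beta = b * np.log(10)
    c = 1 - np.exp(-beta * (m_max - m_min))
    return m_min - np.log(1 - u * c) / beta

def ln_pga(m, r_km, sigma=0.5):
    """Toy attenuation: ln PGA (cm/s^2) vs magnitude and distance.
    Coefficients and aleatory sigma are placeholders."""
    return 0.9 * m - 1.1 * np.log(r_km + 10.0) + 1.0 + rng.normal(0, sigma, size=np.shape(m))

years, rate = 50_000, 5.0                    # catalogue length, events/year
n = rng.poisson(rate * years)
mags = sample_magnitudes(n)
dists = rng.uniform(5, 150, size=n)          # epicentral distances to the site
pga = np.exp(ln_pga(mags, dists))
event_year = rng.integers(0, years, size=n)

annual_max = np.zeros(years)
np.maximum.at(annual_max, event_year, pga)   # max PGA observed in each simulated year

# PGA with 10% probability of exceedance in 50 years (~475-year return period)
p_annual = 1 - (1 - 0.10) ** (1 / 50)
pga_475 = np.quantile(annual_max, 1 - p_annual)
print(f"475-year PGA ~ {pga_475:.0f} cm/s^2")
```

Epistemic uncertainty enters this scheme naturally: each simulated catalogue can draw its source zonation and attenuation model from the weighted alternatives, as the abstract describes.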
Software Safety Analysis of a Flight Guidance System
NASA Technical Reports Server (NTRS)
Butler, Ricky W. (Technical Monitor); Tribble, Alan C.; Miller, Steven P.; Lempia, David L.
2004-01-01
This document summarizes the safety analysis performed on a Flight Guidance System (FGS) requirements model. In particular, the safety properties desired of the FGS model are identified and the presence of the safety properties in the model is formally verified. Chapter 1 provides an introduction to the entire project, while Chapter 2 gives a brief overview of the problem domain, the nature of accidents, model-based development, and the four-variable model. Chapter 3 outlines the approach. Chapter 4 presents the results of the traditional safety analysis techniques and illustrates how the hazardous conditions associated with the system trace into specific safety properties. Chapter 5 presents the results of model checking, the formal analysis technique that was used to verify the presence of the safety properties in the requirements model. Finally, Chapter 6 summarizes the main conclusions of the study, first and foremost that model checking is a very effective verification technique to use on discrete models with reasonable state spaces. Additional supporting details are provided in the appendices.
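Explicit-state model checking of a safety property, of the kind the study applies to discrete models, can be shown in miniature: enumerate the reachable states of a small mode-logic model and confirm no reachable state violates the invariant. The toy lateral-mode model below is purely illustrative, not the actual FGS requirements model:

```python
from collections import deque

# Toy mode logic: state = (roll_active, heading_active). The intended
# invariant is that at most one lateral mode is ever active.
def step(state, event):
    if event == "select_roll":
        return (True, False)       # selecting ROLL deactivates HDG
    if event == "select_heading":
        return (False, True)       # selecting HDG deactivates ROLL
    if event == "clear":
        return (False, False)
    return state

def check_safety(initial, events, invariant):
    """BFS over the reachable state space; return a violating state or None."""
    seen, frontier = {initial}, deque([initial])
    while frontier:
        s = frontier.popleft()
        if not invariant(s):
            return s
        for e in events:
            t = step(s, e)
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return None

violation = check_safety(
    initial=(False, False),
    events=["select_roll", "select_heading", "clear"],
    invariant=lambda s: not (s[0] and s[1]),  # never both lateral modes active
)
print("safe" if violation is None else f"violated in {violation}")
```

Industrial model checkers explore state spaces orders of magnitude larger with symbolic techniques, but the verification question is exactly this one.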
Integrating Reliability Analysis with a Performance Tool
NASA Technical Reports Server (NTRS)
Nicol, David M.; Palumbo, Daniel L.; Ulrey, Michael
1995-01-01
A large number of commercial simulation tools support performance-oriented studies of complex computer and communication systems. Reliability of these systems, when desired, must be obtained by remodeling the system in a different tool. This has obvious drawbacks: (1) substantial extra effort is required to create the reliability model; (2) through modeling error, the reliability model may not reflect precisely the same system as the performance model; (3) as the performance model evolves, one must continuously reevaluate the validity of assumptions made in that model. In this paper we describe an approach, and a tool that implements this approach, for integrating a reliability analysis engine into a production-quality, simulation-based performance modeling tool, and for modeling within such an integrated tool. The integrated tool allows one to use the same modeling formalisms to conduct both performance and reliability studies. We describe how the reliability analysis engine is integrated into the performance tool, describe the extensions made to the performance tool to support the reliability analysis, and consider the tool's performance.
Models for Analyzing Environmental Issues in the Classroom.
ERIC Educational Resources Information Center
Chiras, Daniel D.
1980-01-01
Presents several conceptual models dealing with issues in environmental education. Models described are: (1) Population/Resources/Pollution, (2) Cause-and-Effect Analysis, and (3) Ethical Analysis. (CS)
ERIC Educational Resources Information Center
Hall, John S.
This review analyzes the trend in educational decision making to replace hierarchical authority structures with more rational models for decision making drawn from management science. Emphasis is also placed on alternatives to a hierarchical decision-making model, including governing models, union models, and influence models. A 54-item…
ERIC Educational Resources Information Center
Leventhal, Brian C.; Stone, Clement A.
2018-01-01
Interest in Bayesian analysis of item response theory (IRT) models has grown tremendously due to the appeal of the paradigm among psychometricians, advantages of these methods when analyzing complex models, and availability of general-purpose software. Possible models include models which reflect multidimensionality due to designed test structure,…
Function modeling: improved raster analysis through delayed reading and function raster datasets
John S. Hogland; Nathaniel M. Anderson; J .Greg Jones
2013-01-01
Raster modeling is an integral component of spatial analysis. However, conventional raster modeling techniques can require a substantial amount of processing time and storage space, often limiting the types of analyses that can be performed. To address this issue, we have developed Function Modeling. Function Modeling is a new modeling framework that streamlines the...
Principal process analysis of biological models.
Casagranda, Stefano; Touzeau, Suzanne; Ropers, Delphine; Gouzé, Jean-Luc
2018-06-14
Understanding the dynamical behaviour of biological systems is challenged by their large number of components and interactions. While efforts have been made in this direction to reduce model complexity, they often prove insufficient to grasp which and when model processes play a crucial role. Answering these questions is fundamental to unravel the functioning of living organisms. We design a method for dealing with model complexity, based on the analysis of dynamical models by means of Principal Process Analysis. We apply the method to a well-known model of circadian rhythms in mammals. The knowledge of the system trajectories allows us to decompose the system dynamics into processes that are active or inactive with respect to a certain threshold value. Process activities are graphically represented by Boolean and Dynamical Process Maps. We detect model processes that are always inactive, or inactive on some time interval. Eliminating these processes reduces the complex dynamics of the original model to the much simpler dynamics of the core processes, in a succession of sub-models that are easier to analyse. We quantify by means of global relative errors the extent to which the simplified models reproduce the main features of the original system dynamics and apply global sensitivity analysis to test the influence of model parameters on the errors. The results obtained prove the robustness of the method. The analysis of the sub-model dynamics allows us to identify the source of circadian oscillations. We find that the negative feedback loop involving proteins PER, CRY, CLOCK-BMAL1 is the main oscillator, in agreement with previous modelling and experimental studies. In conclusion, Principal Process Analysis is a simple-to-use method, which constitutes an additional and useful tool for analysing the complex dynamical behaviour of biological systems.
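The core idea of Principal Process Analysis, declaring a model process inactive at times when its term's magnitude falls below a threshold fraction of the dominant term, can be sketched generically. The two-variable toy ODE below stands in for the circadian model, which is far larger; the threshold delta = 0.1 is an assumed value:

```python
import numpy as np

# Toy ODE dx/dt = f1 + f2 + f3: three "processes" contributing to one variable.
# PPA flags process i inactive at time t when |f_i(t)| < delta * max_j |f_j(t)|.

def processes(t, x):
    synthesis = 2.0 / (1.0 + x**4)     # switch-like production
    degradation = -0.8 * x             # first-order decay
    leak = -0.01                       # small constant loss
    return np.array([synthesis, degradation, leak])

# crude Euler integration, just to obtain a trajectory
dt, T = 0.01, 10.0
ts = np.arange(0.0, T, dt)
x = 0.1
activity = []
for t in ts:
    f = processes(t, x)
    activity.append(np.abs(f) >= 0.1 * np.abs(f).max())  # delta = 0.1
    x += dt * f.sum()

activity = np.array(activity)          # Boolean "process map": time x process
frac_active = activity.mean(axis=0)
print("fraction of time each process is active:", frac_active)
```

Here the leak term is never active, so it can be eliminated to give a simpler sub-model, exactly the kind of reduction the abstract describes before quantifying the global relative error.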
Windowed and Wavelet Analysis of Marine Stratocumulus Cloud Inhomogeneity
NASA Technical Reports Server (NTRS)
Gollmer, Steven M.; Harshvardhan; Cahalan, Robert F.; Snider, Jack B.
1995-01-01
To improve radiative transfer calculations for inhomogeneous clouds, a consistent means of modeling inhomogeneity is needed. One current method of modeling cloud inhomogeneity is through the use of fractal parameters. This method is based on the supposition that cloud inhomogeneity over a large range of scales is related. An analysis technique named wavelet analysis provides a means of studying the multiscale nature of cloud inhomogeneity. In this paper, the authors discuss the analysis and modeling of cloud inhomogeneity through the use of wavelet analysis. Wavelet analysis as well as other windowed analysis techniques are used to study liquid water path (LWP) measurements obtained during the marine stratocumulus phase of the First ISCCP (International Satellite Cloud Climatology Project) Regional Experiment. Statistics obtained using analysis windows, which are translated to span the LWP dataset, are used to study the local (small scale) properties of the cloud field as well as their time dependence. The LWP data are transformed onto an orthogonal wavelet basis that represents the data as a number of time series. Each of these time series lies within a frequency band and has a mean frequency that is half the frequency of the previous band. Wavelet analysis combined with translated analysis windows reveals that the local standard deviation of each frequency band is correlated with the local standard deviation of the other frequency bands. The ratio between the standard deviation of adjacent frequency bands is 0.9 and remains constant with respect to time. This ratio, defined as the variance coupling parameter, is applicable to all of the frequency bands studied and appears to be related to the slope of the data's power spectrum. Similar analyses are performed on two cloud inhomogeneity models, which use fractal-based concepts to introduce inhomogeneity into a uniform cloud field. 
The bounded cascade model does this by iteratively redistributing LWP at each scale using the value of the local mean. This model is reformulated into a wavelet multiresolution framework, thereby presenting a number of variants of the bounded cascade model. One variant introduced in this paper is the 'variance coupled model,' which redistributes LWP using the local standard deviation and the variance coupling parameter. While the bounded cascade model provides an elegant two-parameter model for generating cloud inhomogeneity, the multiresolution framework provides more flexibility at the expense of model complexity. Comparisons are made with the results from the LWP data analysis to demonstrate both the strengths and weaknesses of these models.
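The variance-coupling idea, comparing the standard deviation of adjacent wavelet frequency bands, can be sketched with a plain Haar decomposition (numpy only; a wavelet library such as PyWavelets would do the same job). The synthetic red-noise series is a stand-in for the FIRE LWP record, so the ratios below are not the paper's 0.9 value:

```python
import numpy as np

def haar_bands(x, levels):
    """Haar detail coefficients per level; each level halves the frequency band."""
    bands, approx = [], np.asarray(x, dtype=float)
    for _ in range(levels):
        a, b = approx[0::2], approx[1::2]
        bands.append((a - b) / np.sqrt(2.0))   # detail (high-pass) coefficients
        approx = (a + b) / np.sqrt(2.0)        # approximation (low-pass)
    return bands

# synthetic stand-in for an LWP record: integrated noise has a steep red spectrum
rng = np.random.default_rng(1)
lwp = np.cumsum(rng.standard_normal(2**14))

bands = haar_bands(lwp, levels=8)
stds = np.array([b.std() for b in bands])
coupling = stds[:-1] / stds[1:]   # ratio of adjacent bands' standard deviations
print("band stds:", np.round(stds, 2))
print("variance coupling ratios:", np.round(coupling, 2))
```

A near-constant ratio across levels, as the abstract reports, reflects a power-law spectrum; the steeper the spectrum, the smaller the ratio between a band and the next-coarser one.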
Appliance of Independent Component Analysis to System Intrusion Analysis
NASA Astrophysics Data System (ADS)
Ishii, Yoshikazu; Takagi, Tarou; Nakai, Kouji
In order to analyze the output of the intrusion detection system and the firewall, we evaluated the applicability of ICA (independent component analysis). We developed a simulator for the evaluation of intrusion analysis methods. The simulator consists of a network model of an information system, service and vulnerability models of each server, and action models of the client and the intruder. We applied ICA to the audit trail of the simulated information system and report the results of evaluating ICA for intrusion analysis. In the simulated case, ICA correctly separated two attacks, and related an attack to the anomalies that a normal application produced under the influence of that attack.
Computer-aided operations engineering with integrated models of systems and operations
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Ryan, Dan; Fleming, Land
1994-01-01
CONFIG 3 is a prototype software tool that supports integrated conceptual design evaluation from early in the product life cycle, by supporting isolated or integrated modeling, simulation, and analysis of the function, structure, behavior, failures and operation of system designs. Integration and reuse of models is supported in an object-oriented environment providing capabilities for graph analysis and discrete event simulation. Integration is supported among diverse modeling approaches (component view, configuration or flow path view, and procedure view) and diverse simulation and analysis approaches. Support is provided for integrated engineering in diverse design domains, including mechanical and electro-mechanical systems, distributed computer systems, and chemical processing and transport systems. CONFIG supports abstracted qualitative and symbolic modeling, for early conceptual design. System models are component structure models with operating modes, with embedded time-related behavior models. CONFIG supports failure modeling and modeling of state or configuration changes that result in dynamic changes in dependencies among components. Operations and procedure models are activity structure models that interact with system models. CONFIG is designed to support evaluation of system operability, diagnosability and fault tolerance, and analysis of the development of system effects of problems over time, including faults, failures, and procedural or environmental difficulties.
40 CFR 93.158 - Criteria for determining conformity of general Federal actions.
Code of Federal Regulations, 2013 CFR
2013-07-01
... requirements: (i) Specified in paragraph (b) of this section, based on areawide air quality modeling analysis and local air quality modeling analysis; or (ii) Meet the requirements of paragraph (a)(5) of this section and, for local air quality modeling analysis, the requirement of paragraph (b) of this section; (4...
40 CFR 93.158 - Criteria for determining conformity of general Federal actions.
Code of Federal Regulations, 2014 CFR
2014-07-01
... requirements: (i) Specified in paragraph (b) of this section, based on areawide air quality modeling analysis and local air quality modeling analysis; or (ii) Meet the requirements of paragraph (a)(5) of this section and, for local air quality modeling analysis, the requirement of paragraph (b) of this section; (4...
40 CFR 93.158 - Criteria for determining conformity of general Federal actions.
Code of Federal Regulations, 2012 CFR
2012-07-01
... requirements: (i) Specified in paragraph (b) of this section, based on areawide air quality modeling analysis and local air quality modeling analysis; or (ii) Meet the requirements of paragraph (a)(5) of this section and, for local air quality modeling analysis, the requirement of paragraph (b) of this section; (4...
Psychometric Properties on Lecturers' Beliefs on Teaching Function: Rasch Model Analysis
ERIC Educational Resources Information Center
Mofreh, Samah Ali Mohsen; Ghafar, Mohammed Najib Abdul; Omar, Abdul Hafiz Hj; Mosaku, Monsurat; Ma'ruf, Amar
2014-01-01
This paper focuses on the psychometric analysis of lecturers' beliefs on teaching function (LBTF) survey using Rasch Model analysis. The sample comprised 34 Community Colleges' lecturers. The Rasch Model is applied to produce specific measurements on the lecturers' beliefs on teaching function in order to generalize results and inferential…
Economic and Power System Modeling and Analysis | Water Power | NREL
Economic and power system modeling and analysis at NREL examines water power technologies, their possible deployment scenarios, and the economic impacts of this deployment, including research approaches used to estimate the direct and indirect economic impacts of offshore renewable energy projects.
Methods of Technological Forecasting,
1977-05-01
Contents include: Trend Extrapolation; Progress Curve; Analogy; Trend Correlation; Substitution Analysis or Substitution Growth Curves; Envelope Curve; Advances in the State of the Art; Technological Mapping; Contextual Mapping; Matrix Input-Output Analysis; Mathematical Models; Simulation Models; Dynamic Modelling. Chapter IV covers: …Generation; Interaction between Needs and Possibilities; Map of the Technological Future; Cross-Impact Matrix; Discovery Matrix; Morphological Analysis.
A 5-year (2002-2006) simulation of CMAQ covering the eastern United States is evaluated using principal component analysis in order to identify and characterize statistically significant patterns of model bias. Such analysis is useful in that it can identify areas of poor model ...
Integrating fire management analysis into land management planning
Thomas J. Mills
1983-01-01
The analysis of alternative fire management programs should be integrated into the land and resource management planning process, but a single fire management analysis model cannot meet all planning needs. Therefore, a set of simulation models that are analytically separate from integrated land management planning models are required. The design of four levels of fire...
NASA Technical Reports Server (NTRS)
Fegley, K. A.; Hayden, J. H.; Rehmann, D. W.
1974-01-01
The feasibility of formulating a methodology for the modeling and analysis of aerospace electrical power processing systems is investigated. It is shown that a digital computer may be used in an interactive mode for the design, modeling, analysis, and comparison of power processing systems.
Numerical bifurcation analysis of immunological models with time delays
NASA Astrophysics Data System (ADS)
Luzyanina, Tatyana; Roose, Dirk; Bocharov, Gennady
2005-12-01
In recent years, a large number of mathematical models that are described by delay differential equations (DDEs) have appeared in the life sciences. To analyze the models' dynamics, numerical methods are necessary, since analytical studies can only give limited results. In turn, the availability of efficient numerical methods and software packages encourages the use of time delays in mathematical modelling, which may lead to more realistic models. We outline recently developed numerical methods for bifurcation analysis of DDEs and illustrate the use of these methods in the analysis of a mathematical model of human hepatitis B virus infection.
2014-01-01
In adsorption studies, describing the sorption process and evaluating the best-fitting isotherm model is a key step in testing the theoretical hypothesis. Hence, numerous statistical analyses have been used to compare the experimental equilibrium adsorption values with the predicted equilibrium values. In the present study, several statistical error analyses were carried out to evaluate the fitness of the adsorption isotherm models: the Pearson correlation, the coefficient of determination and the Chi-square test. An ANOVA test was carried out to evaluate the significance of the various error functions, and the coefficient of dispersion was evaluated for the linearised and non-linearised models. The adsorption of phenol onto natural soil (local name: Kalathur soil) was carried out in batch mode at 30 ± 2 °C. To estimate the isotherm parameters and obtain a holistic view of the analysis, the linear and non-linear isotherm models were compared. The results revealed which of the above-mentioned error functions and statistical functions best determined the best-fitting isotherm. PMID:25018878
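The linear-versus-non-linear comparison described above can be sketched with the Langmuir model q = q_m·K·C/(1 + K·C): fit its linearised form C/q = C/q_m + 1/(q_m·K) by least squares, fit the non-linear form directly, and score both with R² and chi-square. The data points below are synthetic, not the phenol/Kalathur-soil measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(C, qm, K):
    return qm * K * C / (1.0 + K * C)

# synthetic equilibrium data from a "true" Langmuir isotherm plus 3% noise
rng = np.random.default_rng(7)
C = np.linspace(5, 200, 12)                   # equilibrium concentration
q = langmuir(C, qm=25.0, K=0.04) * (1 + 0.03 * rng.standard_normal(C.size))

# linearised Langmuir: C/q = (1/qm)*C + 1/(qm*K) -> straight line in C
slope, intercept = np.polyfit(C, C / q, 1)
qm_lin, K_lin = 1.0 / slope, slope / intercept

# non-linear least-squares fit of the same model
(qm_nl, K_nl), _ = curve_fit(langmuir, C, q, p0=(10.0, 0.01))

def r2(obs, pred):
    return 1.0 - np.sum((obs - pred) ** 2) / np.sum((obs - obs.mean()) ** 2)

def chi2(obs, pred):
    return np.sum((obs - pred) ** 2 / pred)

for name, (qm_f, K_f) in [("linearised", (qm_lin, K_lin)), ("non-linear", (qm_nl, K_nl))]:
    pred = langmuir(C, qm_f, K_f)
    print(f"{name}: qm={qm_f:.2f}  K={K_f:.4f}  R2={r2(q, pred):.4f}  chi2={chi2(q, pred):.3f}")
```

Because linearisation distorts the error structure, the two routes generally give different parameter estimates, which is why error functions such as R² and chi-square are compared on the original (non-linearised) scale.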
Development of Test-Analysis Models (TAM) for correlation of dynamic test and analysis results
NASA Technical Reports Server (NTRS)
Angelucci, Filippo; Javeed, Mehzad; Mcgowan, Paul
1992-01-01
The primary objective of structural analysis of aerospace applications is to obtain a verified finite element model (FEM). The verified FEM can be used for loads analysis, evaluate structural modifications, or design control systems. Verification of the FEM is generally obtained as the result of correlating test and FEM models. A test analysis model (TAM) is very useful in the correlation process. A TAM is essentially a FEM reduced to the size of the test model, which attempts to preserve the dynamic characteristics of the original FEM in the analysis range of interest. Numerous methods for generating TAMs have been developed in the literature. The major emphasis of this paper is a description of the procedures necessary for creation of the TAM and the correlation of the reduced models with the FEM or the test results. Herein, three methods are discussed, namely Guyan, Improved Reduced System (IRS), and Hybrid. Also included are the procedures for performing these analyses using MSC/NASTRAN. Finally, application of the TAM process is demonstrated with an experimental test configuration of a ten bay cantilevered truss structure.
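Guyan (static) reduction, the first of the three TAM methods named, condenses the stiffness and mass matrices to the retained (measured) DOFs via the transformation T = [I; -Kss⁻¹ Ksm]. A small spring-mass sketch (an illustrative chain model, not the ten-bay truss FEM):

```python
import numpy as np

def guyan_reduce(K, M, masters):
    """Guyan (static) reduction of K and M to the master DOFs."""
    n = K.shape[0]
    slaves = [i for i in range(n) if i not in masters]
    order = list(masters) + slaves
    Kr, Mr = K[np.ix_(order, order)], M[np.ix_(order, order)]
    m = len(masters)
    Ksm, Kss = Kr[m:, :m], Kr[m:, m:]
    T = np.vstack([np.eye(m), -np.linalg.solve(Kss, Ksm)])  # x_slave = -Kss^-1 Ksm x_master
    return T.T @ Kr @ T, T.T @ Mr @ T

def nat_freqs(K, M):
    """Natural frequencies (rad/s) of K x = w^2 M x via a Cholesky factor of M."""
    L = np.linalg.cholesky(M)
    A = np.linalg.solve(L, np.linalg.solve(L, K).T).T   # L^-1 K L^-T, symmetric
    return np.sqrt(np.clip(np.linalg.eigvalsh(A), 0.0, None))

# 6-DOF spring-mass chain fixed at one end
n, k, mass = 6, 1000.0, 1.0
K = np.zeros((n, n))
for i in range(n):
    K[i, i] = 2.0 * k if i < n - 1 else k
    if i > 0:
        K[i, i - 1] = K[i - 1, i] = -k
M = mass * np.eye(n)

Kt, Mt = guyan_reduce(K, M, masters=[1, 3, 5])   # keep the "instrumented" DOFs
full, tam = nat_freqs(K, M), nat_freqs(Kt, Mt)
print("full lowest freqs:", np.round(full[:3], 3))
print("TAM freqs:        ", np.round(tam, 3))
```

Because the reduction is a Ritz projection, the TAM frequencies bound the corresponding full-model frequencies from above; IRS and Hybrid methods refine this static transformation to improve accuracy at higher modes.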
Linear Instability Analysis of non-uniform Bubbly Mixing layer with Two-Fluid model
NASA Astrophysics Data System (ADS)
Sharma, Subash; Chetty, Krishna; Lopez de Bertodano, Martin
We examine the inviscid instability of a non-uniform adiabatic bubbly shear layer with a two-fluid model. The two-fluid model is made well-posed with closure relations for the interfacial forces. First, a characteristic analysis is carried out to study the well-posedness of the model over a range of void fractions, with interfacial forces for virtual mass, interfacial drag and interfacial pressure. A dispersion analysis then allows us to obtain the growth rate and wavelength. The well-posed two-fluid model is then solved using CFD to validate the results obtained with the linear stability analysis. The effect of the void fraction and of the distribution profile on stability is analyzed.
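A characteristic analysis of this kind reduces to checking the eigenvalues of the quasi-linear system matrix: real eigenvalues mean hyperbolic (well-posed) first-order equations, while complex ones signal ill-posedness. The 2x2 matrix below is a generic illustration with placeholder coefficients, not the paper's closed two-fluid model:

```python
import numpy as np

def characteristics(A):
    """Eigenvalues of A(u) in u_t + A(u) u_x = 0.
    All-real eigenvalues -> hyperbolic (well-posed initial-value problem)."""
    lam = np.linalg.eigvals(A)
    well_posed = np.all(np.abs(lam.imag) < 1e-12)
    return lam, well_posed

# Placeholder 2x2 system: an interfacial-pressure-like coupling term 'cp'
# stabilises the model, while cp = 0 leaves complex characteristics.
def system_matrix(u_l, u_g, cp):
    return np.array([[u_l, 1.0],
                     [-((u_g - u_l) ** 2) + cp, u_g]])

lam_bad, ok_bad = characteristics(system_matrix(1.0, 2.0, cp=0.0))
lam_good, ok_good = characteristics(system_matrix(1.0, 2.0, cp=2.0))
print("without closure:", np.round(lam_bad, 3), "well-posed:", ok_bad)
print("with closure:   ", np.round(lam_good, 3), "well-posed:", ok_good)
```

This mirrors the abstract's procedure: the interfacial-force closures (virtual mass, interfacial pressure) are what push the characteristics onto the real axis before the dispersion analysis is attempted.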
Is job a viable unit of analysis? A multilevel analysis of demand-control-support models.
Morrison, David; Payne, Roy L; Wall, Toby D
2003-07-01
The literature has ignored the fact that the demand-control (DC) and demand-control-support (DCS) models of stress are about jobs and not individuals' perceptions of their jobs. Using multilevel modeling, the authors report results of individual- and job-level analyses from a study of over 6,700 people in 81 different jobs. Support for additive versions of the models came when individuals were the unit of analysis. DC and DCS models are only helpful for understanding the effects of individual perceptions of jobs and their relationship to psychological states. When job perceptions are aggregated and their relationship to the collective experience of jobholders is assessed, the models prove of little value. Role set may be a better unit of analysis.
Application of Interface Technology in Progressive Failure Analysis of Composite Panels
NASA Technical Reports Server (NTRS)
Sleight, D. W.; Lotts, C. G.
2002-01-01
A progressive failure analysis capability using interface technology is presented. The capability has been implemented in the COMET-AR finite element analysis code developed at the NASA Langley Research Center and is demonstrated on composite panels. The composite panels are analyzed for damage initiation and propagation from initial loading to final failure using a progressive failure analysis capability that includes both geometric and material nonlinearities. Progressive failure analyses are performed on conventional models and interface technology models of the composite panels. Analytical results and the computational effort of the analyses are compared for the conventional models and interface technology models. The analytical results predicted with the interface technology models are in good correlation with the analytical results using the conventional models, while significantly reducing the computational effort.
Tian, Yuan; Hassmiller Lich, Kristen; Osgood, Nathaniel D; Eom, Kirsten; Matchar, David B
2016-11-01
As health services researchers and decision makers tackle more difficult problems using simulation models, the number of parameters and the corresponding degree of uncertainty have increased. This often results in reduced confidence in such complex models to guide decision making. To demonstrate a systematic approach of linked sensitivity analysis, calibration, and uncertainty analysis to improve confidence in complex models. Four techniques were integrated and applied to a System Dynamics stroke model of US veterans, which was developed to inform systemwide intervention and research planning: Morris method (sensitivity analysis), multistart Powell hill-climbing algorithm and generalized likelihood uncertainty estimation (calibration), and Monte Carlo simulation (uncertainty analysis). Of 60 uncertain parameters, sensitivity analysis identified 29 needing calibration, 7 that did not need calibration but significantly influenced key stroke outcomes, and 24 not influential to calibration or stroke outcomes that were fixed at their best guess values. One thousand alternative well-calibrated baselines were obtained to reflect calibration uncertainty and brought into uncertainty analysis. The initial stroke incidence rate among veterans was identified as the most influential uncertain parameter, for which further data should be collected. That said, accounting for current uncertainty, the analysis of 15 distinct prevention and treatment interventions provided a robust conclusion that hypertension control for all veterans would yield the largest gain in quality-adjusted life years. For complex health care models, a mixed approach was applied to examine the uncertainty surrounding key stroke outcomes and the robustness of conclusions. We demonstrate that this rigorous approach can be practical and advocate for such analysis to promote understanding of the limits of certainty in applying models to current decisions and to guide future data collection. © The Author(s) 2016.
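The Morris screening step in a pipeline like this ranks uncertain parameters by elementary effects: perturb one parameter at a time along random trajectories and average the absolute output changes (mu*). A numpy-only sketch on a toy outcome model (the System Dynamics stroke model itself is far larger):

```python
import numpy as np

def morris_screen(model, bounds, r=50, seed=0):
    """One-at-a-time elementary effects: mu* (mean |effect|) per parameter."""
    rng = np.random.default_rng(seed)
    k = len(bounds)
    lo, hi = np.array(bounds, dtype=float).T
    delta = 0.1                                   # step size in the unit hypercube
    effects = np.zeros((r, k))
    for t in range(r):
        x = rng.uniform(0, 1 - delta, size=k)     # random base point
        y0 = model(lo + x * (hi - lo))
        for j in rng.permutation(k):              # perturb parameters one at a time
            x2 = x.copy()
            x2[j] += delta
            y1 = model(lo + x2 * (hi - lo))
            effects[t, j] = (y1 - y0) / delta
            x, y0 = x2, y1                        # walk along the trajectory
    return np.abs(effects).mean(axis=0)           # mu*: screening importance

# toy "outcome model": strongly driven by p0, weakly by p1, not at all by p2
def toy_model(p):
    return 10.0 * p[0] + 0.5 * p[1] ** 2 + 0.0 * p[2]

mu_star = morris_screen(toy_model, bounds=[(0, 1), (0, 1), (0, 1)], r=50)
print("mu* per parameter:", np.round(mu_star, 3))
```

Parameters with negligible mu* can be fixed at their best-guess values, exactly the triage the abstract describes (29 calibrated, 7 influential, 24 fixed), before calibration and Monte Carlo uncertainty analysis are run on the survivors.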
PeTTSy: a computational tool for perturbation analysis of complex systems biology models.
Domijan, Mirela; Brown, Paul E; Shulgin, Boris V; Rand, David A
2016-03-10
Over the last decade sensitivity analysis techniques have been shown to be very useful to analyse complex and high dimensional Systems Biology models. However, many of the currently available toolboxes have either used parameter sampling, been focused on a restricted set of model observables of interest, studied optimisation of an objective function, or have not dealt with multiple simultaneous model parameter changes where the changes can be permanent or temporary. Here we introduce our new, freely downloadable toolbox, PeTTSy (Perturbation Theory Toolbox for Systems). PeTTSy is a package for MATLAB which implements a wide array of techniques for the perturbation theory and sensitivity analysis of large and complex ordinary differential equation (ODE) based models. PeTTSy is a comprehensive modelling framework that introduces a number of new approaches and that fully addresses analysis of oscillatory systems. It examines sensitivity analysis of the models to perturbations of parameters, where the perturbation timing, strength, length and overall shape can be controlled by the user. This can be done in a system-global setting, namely, the user can determine how many parameters to perturb, by how much and for how long. PeTTSy also offers the user the ability to explore the effect of the parameter perturbations on many different types of outputs: period, phase (timing of peak) and model solutions. PeTTSy can be employed on a wide range of mathematical models including free-running and forced oscillators and signalling systems. To enable experimental optimisation using the Fisher Information Matrix it efficiently allows one to combine multiple variants of a model (i.e. a model with multiple experimental conditions) in order to determine the value of new experiments. It is especially useful in the analysis of large and complex models involving many variables and parameters. 
PeTTSy is a comprehensive tool for analysing large and complex models of regulatory and signalling systems. It allows for simulation and analysis of models under a variety of environmental conditions and for experimental optimisation of complex combined experiments. With its unique set of tools it makes a valuable addition to the current library of sensitivity analysis toolboxes. We believe that this software will be of great use to the wider biological, systems biology and modelling communities.
Comparative analysis of zonal systems for macro-level crash modeling.
Cai, Qing; Abdel-Aty, Mohamed; Lee, Jaeyoung; Eluru, Naveen
2017-06-01
Macro-level traffic safety analysis has been undertaken at different spatial configurations. However, clear guidelines for the appropriate zonal system selection for safety analysis are unavailable. In this study, a comparative analysis was conducted to determine the optimal zonal system for macroscopic crash modeling considering census tracts (CTs), state-wide traffic analysis zones (STAZs), and a newly developed traffic-related zone system labeled traffic analysis districts (TADs). Poisson lognormal models for three crash types (i.e., total, severe, and non-motorized mode crashes) are developed based on the three zonal systems without and with consideration of spatial autocorrelation. The study proposes a method to compare the modeling performance of the three types of geographic units at different spatial configurations through a grid-based framework. Specifically, the study region is partitioned into grids of various sizes and the prediction accuracy of the various macro models is assessed within these grids. The model comparison results for all crash types indicated that the models based on TADs consistently offer better performance than the others. Moreover, the models considering spatial autocorrelation outperform the ones that do not consider it. Based on the modeling results and the motivation for developing the different zonal systems, it is recommended to use CTs for socio-demographic data collection, TAZs for transportation demand forecasting, and TADs for transportation safety planning. The findings from this study can help practitioners select appropriate zonal systems for traffic crash modeling, which can lead to more efficient policies to enhance transportation safety. Copyright © 2017 Elsevier Ltd and National Safety Council. All rights reserved.
NASA Technical Reports Server (NTRS)
Cole, Bjorn; Chung, Seung
2012-01-01
One of the challenges of systems engineering is in working multidisciplinary problems in a cohesive manner. When planning analysis of these problems, system engineers must trade between time and cost for analysis quality and quantity. The quality often correlates with greater run time in multidisciplinary models and the quantity is associated with the number of alternatives that can be analyzed. The trade-off is due to the resource intensive process of creating a cohesive multidisciplinary systems model and analysis. Furthermore, reuse or extension of the models used in one stage of a product life cycle for another is a major challenge. Recent developments have enabled a much less resource-intensive and more rigorous approach than hand-written translation scripts between multi-disciplinary models and their analyses. The key is to work from a core systems model defined in a MOF-based language such as SysML and in leveraging the emerging tool ecosystem, such as Query/View/Transformation (QVT), from the OMG community. SysML was designed to model multidisciplinary systems. The QVT standard was designed to transform SysML models into other models, including those leveraged by engineering analyses. The Europa Habitability Mission (EHM) team has begun to exploit these capabilities. In one case, a Matlab/Simulink model is generated on the fly from a system description for power analysis written in SysML. In a more general case, symbolic analysis (supported by Wolfram Mathematica) is coordinated by data objects transformed from the systems model, enabling extremely flexible and powerful design exploration and analytical investigations of expected system performance.
[Three dimensional mathematical model of tooth for finite element analysis].
Puskar, Tatjana; Vasiljević, Darko; Marković, Dubravka; Jevremović, Danimir; Pantelić, Dejan; Savić-Sević, Svetlana; Murić, Branka
2010-01-01
The mathematical model of the abutment tooth is the starting point of the finite element analysis of stress and deformation of dental structures. The simplest and easiest way is to form a model according to literature data on the dimensions and morphological characteristics of teeth. Our method is based on forming 3D models using standard geometrical forms (objects) in programs for solid modeling. The objective was to form the mathematical model of the abutment of the second upper premolar for finite element analysis of stress and deformation of dental structures. The abutment tooth has the form of a complex geometric object. It is suitable for modeling in the solid-modeling program SolidWorks. After analysing the literature data on the morphological characteristics of teeth, we started the modeling by dividing the tooth (a complex geometric body) into simple geometric bodies (cylinder, cone, pyramid, ...). By connecting simple geometric bodies together or subtracting bodies from the basic body, we formed the complex geometric body, the tooth. The model was then transferred into Abaqus, a computational program for finite element analysis. The data transfer was done using the standard file format for 3D models, ACIS SAT. Using the solid-modeling program SolidWorks, we developed three models of the abutment of the second maxillary premolar: a model of the intact abutment, a model of the endodontically treated tooth with two remaining cavity walls, and a model of the endodontically treated tooth with two remaining walls and an inserted post. Mathematical models of the abutment made according to literature data are very similar to the real abutment, and the simplifications are minimal. These models enable calculations of stress and deformation of the dental structures. The finite element analysis provides useful information in understanding biomechanical problems and gives guidance for clinical research.
Onisko, Agnieszka; Druzdzel, Marek J; Austin, R Marshall
2016-01-01
Classical statistics is a well-established approach in the analysis of medical data. While the medical community seems to be familiar with the concept of a statistical analysis and its interpretation, the Bayesian approach, argued by many of its proponents to be superior to the classical frequentist approach, is still not well recognized in the analysis of medical data. The goal of this study is to encourage data analysts to use the Bayesian approach, such as modeling with graphical probabilistic networks, as an insightful alternative to classical statistical analysis of medical data. This paper offers a comparison of two approaches to the analysis of medical time series data: (1) the classical statistical approach, such as the Kaplan-Meier estimator and the Cox proportional hazards regression model, and (2) dynamic Bayesian network modeling. Our comparison is based on time series cervical cancer screening data collected at Magee-Womens Hospital, University of Pittsburgh Medical Center over 10 years. The main outcomes of our comparison are the cervical cancer risk assessments produced by these approaches. However, our analysis also discusses several aspects of the comparison, such as modeling assumptions, model building, dealing with incomplete data, individualized risk assessment, interpretation of results, and model validation. Our study shows that the Bayesian approach (1) is much more flexible in terms of modeling effort and (2) offers an individualized risk assessment, which is more cumbersome to obtain with classical statistical approaches.
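As a point of reference for the classical side of this comparison, the Kaplan-Meier estimator can be implemented directly. This is a generic product-limit sketch, not the study's code, and the toy follow-up data are invented.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit survival estimate.

    times: follow-up times; events: 1 = event observed, 0 = censored.
    Returns a list of (event_time, survival_probability) pairs.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = at = 0
        # Group all subjects leaving the risk set at time t.
        while i < len(data) and data[i][0] == t:
            at += 1
            d += data[i][1]
            i += 1
        if d:  # survival drops only at observed event times
            surv *= 1 - d / n_at_risk
            curve.append((t, surv))
        n_at_risk -= at
    return curve

# Toy data: four subjects, one censored at time 2.
curve = kaplan_meier([1, 2, 2, 3], [1, 1, 0, 1])
```

The dynamic Bayesian network alternative discussed in the abstract has no comparably compact closed form, which is part of the modeling-effort trade-off the authors examine.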
Impact of Retirement Choices of Early Career Marines: A Choice Analysis Model
2013-03-01
Thesis by André G. La Taste and Aaron Masaitis, Naval Postgraduate School, March 2013 (thesis advisor: Michael Dixon). The research on the retirement choices of early career Marines is conducted using a discrete choice analysis methodology that is often used to differentiate factors that lead to...
Dynamic test/analysis correlation using reduced analytical models
NASA Technical Reports Server (NTRS)
Mcgowan, Paul E.; Angelucci, A. Filippo; Javeed, Mehzad
1992-01-01
Test/analysis correlation is an important aspect of the verification of analysis models which are used to predict on-orbit response characteristics of large space structures. This paper presents results of a study using reduced analysis models for performing dynamic test/analysis correlation. The reduced test-analysis model (TAM) has the same number and orientation of degrees of freedom (DOF) as the test measurements. Two reduction methods, static (Guyan) reduction and the Improved Reduced System (IRS) reduction, are applied to the test/analysis correlation of a laboratory truss structure. Simulated test results and modal test data are used to examine the performance of each method. It is shown that the selection of DOF to be retained in the TAM is critical when large structural masses are involved. In addition, the use of modal test results may present difficulties in TAM accuracy even if a large number of DOF are retained in the TAM.
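The static (Guyan) reduction named here has a compact closed form: partition the stiffness matrix into master (retained) and slave (omitted) DOF, then condense under the assumption that no forces act on the slave set. The two-DOF spring-chain example below is illustrative, not the paper's truss structure.

```python
import numpy as np

def guyan_reduce(K, M, keep):
    """Static (Guyan) reduction of stiffness K and mass M to the
    master DOF listed in `keep` (all other DOF are condensed out)."""
    all_dof = np.arange(K.shape[0])
    drop = np.setdiff1d(all_dof, keep)
    Ksm = K[np.ix_(drop, keep)]
    Kss = K[np.ix_(drop, drop)]
    # Transformation from master DOF to full DOF, assuming zero
    # applied force on the omitted (slave) set.
    T = np.vstack([np.eye(len(keep)), -np.linalg.solve(Kss, Ksm)])
    # Reorder full matrices to [master; slave] before projecting.
    order = np.concatenate([keep, drop])
    Kr = T.T @ K[np.ix_(order, order)] @ T
    Mr = T.T @ M[np.ix_(order, order)] @ T
    return Kr, Mr

# Two-DOF spring chain (unit springs, unit masses), retaining DOF 0.
K = np.array([[2.0, -1.0], [-1.0, 1.0]])
M = np.eye(2)
Kr, Mr = guyan_reduce(K, M, [0])
```

The reduction is exact for static response; the mass condensation is approximate, which is the source of the accuracy issues with large structural masses that the paper highlights (and what the IRS method improves on).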
Intelligent switching between different noise propagation algorithms: analysis and sensitivity
DOT National Transportation Integrated Search
2012-08-10
When modeling aircraft noise on a large scale (such as an analysis of annual aircraft operations at an airport), it is important that the noise propagation model used for the analysis be both efficient and accurate. In this analysis, three differ...
NASA Technical Reports Server (NTRS)
Johnson, Donald R.
1998-01-01
The goal of this research is the continued development and application of global isentropic modeling and analysis capabilities to describe hydrologic processes and energy exchange in the climate system, and discern regional climate change. This work involves a combination of modeling and analysis efforts involving 4DDA datasets and simulations from the University of Wisconsin (UW) hybrid isentropic-sigma (theta-sigma) coordinate model and the GEOS GCM.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-22
Confirmatory Thermal-Hydraulic Analysis to Support Specific Success Criteria in the Standardized Plant Analysis Risk Models--Surry and Peach Bottom, Draft...
Statistical analysis of life history calendar data.
Eerola, Mervi; Helske, Satu
2016-04-01
The life history calendar is a data-collection tool for obtaining reliable retrospective data about life events. To illustrate the analysis of such data, we compare model-based probabilistic event history analysis with a model-free data mining method, sequence analysis. In event history analysis, we estimate the cumulative prediction probabilities of life events over the entire trajectory rather than transition hazards. In sequence analysis, we compare several dissimilarity metrics and contrast data-driven and user-defined substitution costs. As an example, we study young adults' transition to adulthood as a sequence of events in three life domains. The events define the multistate event history model and the parallel life domains in multidimensional sequence analysis. The relationship between life trajectories and excess depressive symptoms in middle age is further studied by their joint prediction in the multistate model and by regressing the symptom scores on individual-specific cluster indices. The two approaches complement each other in life course analysis; sequence analysis can effectively find typical and atypical life patterns, while event history analysis is needed for causal inquiries. © The Author(s) 2012.
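The dissimilarity metrics compared in sequence analysis are typically optimal-matching (edit) distances with user-defined or data-driven substitution costs. A minimal sketch, with hypothetical two-state sequences and an invented substitution cost:

```python
def om_distance(seq_a, seq_b, sub_cost, indel=1.0):
    """Optimal-matching dissimilarity between two state sequences.

    sub_cost: dict mapping frozenset({state_a, state_b}) to the cost
    of substituting one state for the other (2.0 if unspecified).
    """
    n, m = len(seq_a), len(seq_b)
    d = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        d[i][0] = i * indel
    for j in range(1, m + 1):
        d[0][j] = j * indel
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            a, b = seq_a[i - 1], seq_b[j - 1]
            sub = 0.0 if a == b else sub_cost.get(frozenset((a, b)), 2.0)
            d[i][j] = min(d[i - 1][j] + indel,      # deletion
                          d[i][j - 1] + indel,      # insertion
                          d[i - 1][j - 1] + sub)    # substitution/match
    return d[n][m]

# Hypothetical monthly states: H = at home, E = employed.
dist = om_distance(list("HHE"), list("HEE"), {frozenset("HE"): 2.0})
```

Pairwise distances computed this way are what cluster algorithms consume to produce the typical/atypical life patterns mentioned in the abstract.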
2008-06-01
PESTEL Analysis examines the general environment surrounding the defense industry from a macro perspective; its focus is on six main... legislation (p. 575). Within the strategic analysis of the macro environment, Porter's Five-Forces Model is used to analyze...
Voulgarelis, Dimitrios; Velayudhan, Ajoy; Smith, Frank
2017-01-01
Agent-based models provide a formidable tool for exploring the complex and emergent behaviour of biological systems and can produce accurate results, but with the drawback of requiring substantial computational power and time for subsequent analysis. Equation-based models, on the other hand, can more easily be used for complex analysis on a much shorter timescale. This paper formulates ordinary differential equation and stochastic differential equation models to capture the behaviour of an existing agent-based model of tumour cell reprogramming and applies them to optimization of possible treatment as well as dosage sensitivity analysis. For certain values of the parameter space, a close match between the equation-based and agent-based models is achieved. The need for a division of labour between the two approaches is explored. Copyright © Bentham Science Publishers; for any queries, please email epub@benthamscience.org.
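A minimal sketch of the equation-based side of such a comparison: a one-equation ODE for tumour population growth with a death term standing in for treatment, integrated by forward Euler. The equation form and all parameter values are illustrative assumptions, not the paper's model.

```python
def simulate(r=0.5, K=1000.0, dose=0.0, x0=10.0, dt=0.01, t_end=20.0):
    """Forward-Euler integration of dx/dt = r*x*(1 - x/K) - dose*x.

    A toy ODE stand-in for the equation-based counterpart of an
    agent-based tumour model: logistic growth (rate r, carrying
    capacity K) minus a treatment-induced death term (dose).
    """
    x, t = x0, 0.0
    while t < t_end:
        x += dt * (r * x * (1 - x / K) - dose * x)
        t += dt
    return x
```

Because one run is just a loop over a few thousand steps, sweeping `dose` for a sensitivity analysis costs milliseconds, whereas re-running an agent-based model per parameter value can take hours, which is exactly the division of labour the abstract describes.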
Analysis/forecast experiments with a multivariate statistical analysis scheme using FGGE data
NASA Technical Reports Server (NTRS)
Baker, W. E.; Bloom, S. C.; Nestler, M. S.
1985-01-01
A three-dimensional, multivariate, statistical analysis method, optimal interpolation (OI), is described for modeling meteorological data from widely dispersed sites. The model was developed to analyze FGGE data at the NASA-Goddard Laboratory of Atmospherics. The model features a multivariate surface analysis over the oceans, including maintenance of the Ekman balance and a geographically dependent correlation function. Preliminary comparisons are made between the OI model and similar schemes employed at the European Centre for Medium-Range Weather Forecasts and the National Meteorological Center. The OI scheme is used to provide input to a GCM, and model error correlations are calculated for forecasts of 500 mb vertical water mixing ratios and the wind profiles. Comparisons are made between the predictions and measured data. The model is shown to be as accurate as a successive corrections model out to 4.5 days.
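The OI update has the standard form xa = xb + K(y - H xb) with gain K = B Hᵀ(H B Hᵀ + R)⁻¹, where B and R are the background- and observation-error covariances and H maps the state to observation space. A one-variable numerical sketch (all matrices invented for illustration):

```python
import numpy as np

def oi_analysis(xb, B, y, H, R):
    """Optimal interpolation update.

    xb: background state; B: background-error covariance;
    y: observations; H: observation operator; R: obs-error covariance.
    Returns the analysis xa = xb + K (y - H xb).
    """
    S = H @ B @ H.T + R                 # innovation covariance
    K = B @ H.T @ np.linalg.inv(S)      # gain matrix
    return xb + K @ (y - H @ xb)

# One state variable, observed directly, with equal background and
# observation error variances: the analysis splits the difference.
xa = oi_analysis(np.array([0.0]), np.array([[1.0]]),
                 np.array([1.0]), np.array([[1.0]]), np.array([[1.0]]))
```

The geographically dependent correlation function mentioned in the abstract enters through the off-diagonal structure of B, which spreads observational information to nearby grid points.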
Reusable Launch Vehicle (RLV) Market Analysis Model
NASA Technical Reports Server (NTRS)
Prince, Frank A.
1999-01-01
The RLV Market Analysis model is at best a rough-order approximation of actual market behavior. However, it does give a quick indication of whether enough flights exist to enable an economically viable RLV, and of the assumptions necessary for the vehicle to capture those flights. Additional analysis, market research, and updating with the latest information on payloads and launches would improve the model. Plans are to update the model as new information becomes available and new requirements are levied. This tool will continue to be a vital part of NASA's RLV business analysis capability for the foreseeable future.
Model prototype utilization in the analysis of fault tolerant control and data processing systems
NASA Astrophysics Data System (ADS)
Kovalev, I. V.; Tsarev, R. Yu; Gruzenkin, D. V.; Prokopenko, A. V.; Knyazkov, A. N.; Laptenok, V. D.
2016-04-01
A procedure for assessing the profit of implementing a control and data processing system is presented in the paper. The rationale for creating and analyzing a model prototype follows from implementing the approach of providing fault tolerance through the inclusion of structural and software redundancy. The developed procedure allows finding the best ratio between the cost of developing and analyzing the model prototype and the earnings from the results of its utilization and the information produced. The suggested approach is illustrated by a model example of profit assessment and analysis of a control and data processing system.
PSAMM: A Portable System for the Analysis of Metabolic Models
Steffensen, Jon Lund; Dufault-Thompson, Keith; Zhang, Ying
2016-01-01
The genome-scale models of metabolic networks have been broadly applied in phenotype prediction, evolutionary reconstruction, community functional analysis, and metabolic engineering. Despite the development of tools that support individual steps along the modeling procedure, it is still difficult to associate mathematical simulation results with the annotation and biological interpretation of metabolic models. In order to solve this problem, here we developed a Portable System for the Analysis of Metabolic Models (PSAMM), a new open-source software package that supports the integration of heterogeneous metadata in model annotations and provides a user-friendly interface for the analysis of metabolic models. PSAMM is independent of paid software environments like MATLAB, and all its dependencies are freely available for academic users. Compared to existing tools, PSAMM significantly reduced the running time of constraint-based analysis and enabled flexible settings of simulation parameters using simple one-line commands. The integration of heterogeneous, model-specific annotation information in PSAMM is achieved with a novel format of YAML-based model representation, which has several advantages, such as providing a modular organization of model components and simulation settings, enabling model version tracking, and permitting the integration of multiple simulation problems. PSAMM also includes a number of quality checking procedures to examine stoichiometric balance and to identify blocked reactions. Applying PSAMM to 57 models collected from current literature, we demonstrated how the software can be used for managing and simulating metabolic models. We identified a number of common inconsistencies in existing models and constructed an updated model repository to document the resolution of these inconsistencies. PMID:26828591
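The constraint-based analysis PSAMM performs reduces, at its core, to a linear program: maximize an objective flux subject to steady-state mass balance S v = 0 and flux bounds. A toy three-reaction network (invented for illustration, not from PSAMM's model collection) makes this concrete:

```python
import numpy as np
from scipy.optimize import linprog

# Toy flux balance analysis. Reactions: v1 = uptake of A (capped at
# 10), v2 = conversion A -> B, v3 = objective (consumes B).
S = np.array([[1, -1, 0],    # metabolite A: produced by v1, used by v2
              [0, 1, -1]])   # metabolite B: produced by v2, used by v3
bounds = [(0, 10), (0, None), (0, None)]

# linprog minimizes, so maximize v3 by minimizing -v3.
res = linprog(c=[0, 0, -1], A_eq=S, b_eq=[0, 0], bounds=bounds)
objective_flux = -res.fun
```

At steady state every unit of uptake must flow through to the objective, so the optimum equals the uptake cap of 10. Real genome-scale models differ only in scale (thousands of reactions) and in the annotation layer that PSAMM's YAML format organizes around this same mathematical core.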
Reliability of four models for clinical gait analysis.
Kainz, Hans; Graham, David; Edwards, Julie; Walsh, Henry P J; Maine, Sheanna; Boyd, Roslyn N; Lloyd, David G; Modenese, Luca; Carty, Christopher P
2017-05-01
Three-dimensional gait analysis (3DGA) has become a common clinical tool for treatment planning in children with cerebral palsy (CP). Many clinical gait laboratories use the conventional gait analysis model (e.g. the Plug-in-Gait model), which uses Direct Kinematics (DK) for joint kinematic calculations, whereas musculoskeletal models, mainly used for research, use Inverse Kinematics (IK). Musculoskeletal IK models have the advantage of enabling additional analyses which might improve clinical decision-making in children with CP. Before any new model can be used in a clinical setting, its reliability has to be evaluated and compared to a commonly used clinical gait model (e.g. the Plug-in-Gait model), which was the purpose of this study. Two testers performed 3DGA in eleven CP and seven typically developing participants on two occasions. Intra- and inter-tester standard deviations (SD) and standard error of measurement (SEM) were used to compare the reliability of two DK models (Plug-in-Gait and a six degrees-of-freedom model solved using Vicon software) and two IK models (two modifications of 'gait2392' solved using OpenSim). All models showed good reliability (mean SEM of 3.0° over all analysed models and joint angles). Variations in joint kinetics were smaller in typically developing than in CP participants. The modified 'gait2392' model, which included all the joint rotations commonly reported in clinical 3DGA, showed reasonably reliable joint kinematic and kinetic estimates and allows additional musculoskeletal analyses of surgically adjustable parameters, e.g. muscle-tendon lengths, and is therefore a suitable model for clinical gait analysis. Copyright © 2017. Published by Elsevier B.V.
UNCERTAINTY ANALYSIS OF TCE USING THE DOSE EXPOSURE ESTIMATING MODEL (DEEM) IN ACSL
The ACSL-based Dose Exposure Estimating Model (DEEM) under development by EPA is used to perform an uncertainty analysis of a physiologically based pharmacokinetic (PBPK) model of trichloroethylene (TCE). This model involves several circulating metabolites such as trichloroacet...
A Conceptual Model for Multidimensional Analysis of Documents
NASA Astrophysics Data System (ADS)
Ravat, Franck; Teste, Olivier; Tournier, Ronan; Zurlfluh, Gilles
Data warehousing and OLAP are mainly used for the analysis of transactional data. Nowadays, with the evolution of Internet, and the development of semi-structured data exchange format (such as XML), it is possible to consider entire fragments of data such as documents as analysis sources. As a consequence, an adapted multidimensional analysis framework needs to be provided. In this paper, we introduce an OLAP multidimensional conceptual model without facts. This model is based on the unique concept of dimensions and is adapted for multidimensional document analysis. We also provide a set of manipulation operations.
Parametric versus Cox's model: an illustrative analysis of divorce in Canada.
Balakrishnan, T R; Rao, K V; Krotki, K J; Lapierre-adamcyk, E
1988-06-01
Recent demographic literature clearly recognizes the importance of survival models in the analysis of cross-sectional event histories. Of the various survival models, Cox's (1972) semi-parametric model has been very popular due to its simplicity and readily available computer software for estimation, sometimes at the cost of precision and parsimony of the model. This paper focuses on parametric failure time models for event history analysis, such as the Weibull, lognormal, loglogistic, and exponential models. The authors also test the goodness of fit of these parametric models against Cox's proportional hazards model, taking the Kaplan-Meier estimate as the base. As an illustration, the authors reanalyze the Canadian Fertility Survey data on first-marriage dissolution with parametric models. Though the parametric model estimates were not very different from each other, the loglogistic model seemed to fit slightly better. When 8 covariates were used in the analysis, it was found that the coefficients were similar across the models, and the overall conclusions about the relative risks would not have been different. The findings reveal that in marriage dissolution, the differences according to demographic and socioeconomic characteristics may be far more important than is generally found in many studies. Therefore, one should not treat the population as homogeneous in analyzing survival probabilities of marriages, other than for a cursory analysis of overall trends.
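Fitting one of the parametric failure-time models compared here, the Weibull, amounts to maximizing a censored log-likelihood. The sketch below uses invented durations and censoring indicators, not the Canadian Fertility Survey data.

```python
import math
from scipy.optimize import minimize

def weibull_nll(params, times, events):
    """Negative log-likelihood of a Weibull survival model with
    right censoring (events: 1 = dissolution observed, 0 = censored).

    Shape k, scale lam; S(t) = exp(-(t/lam)**k),
    log f(t) = log(k/lam) + (k-1)*log(t/lam) - (t/lam)**k.
    """
    k, lam = params
    if k <= 0 or lam <= 0:
        return float("inf")
    nll = 0.0
    for t, e in zip(times, events):
        log_surv = -((t / lam) ** k)
        if e:  # observed event contributes the density
            nll -= math.log(k / lam) + (k - 1) * math.log(t / lam) + log_surv
        else:  # censored observation contributes the survival function
            nll -= log_surv
    return nll

# Invented durations (e.g. years to dissolution) with two censored cases.
times = [2.0, 3.0, 5.0, 7.0, 11.0, 13.0]
events = [1, 1, 1, 0, 1, 0]
fit = minimize(weibull_nll, x0=[1.0, 5.0], args=(times, events),
               method="Nelder-Mead")
k_hat, lam_hat = fit.x
```

The loglogistic and lognormal models the paper prefers differ only in the survival and density functions plugged into this same likelihood, so goodness-of-fit comparisons against the Kaplan-Meier baseline are straightforward to set up.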
National Centers for Environmental Prediction
A mixed-unit input-output model for environmental life-cycle assessment and material flow analysis.
Hawkins, Troy; Hendrickson, Chris; Higgins, Cortney; Matthews, H Scott; Suh, Sangwon
2007-02-01
Materials flow analysis models have traditionally been used to track the production, use, and consumption of materials. Economic input-output modeling has been used for environmental systems analysis, with a primary benefit being the capability to estimate direct and indirect economic and environmental impacts across the entire supply chain of production in an economy. We combine these two types of models to create a mixed-unit input-output model that is able to better track economic transactions and material flows throughout the economy associated with changes in production. A 13-by-13 economic input-output direct requirements matrix developed by the U.S. Bureau of Economic Analysis is augmented with material flow data derived from those published by the U.S. Geological Survey in the formulation of illustrative mixed-unit input-output models for lead and cadmium. The resulting model provides the capabilities of both material flow and input-output models, with detailed material tracking through entire supply chains in response to any monetary or material demand. Examples of these models are provided along with a discussion of uncertainty and extensions to these models.
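The input-output side of such a model rests on the Leontief relation x = (I - A)⁻¹ d, which maps final demand d to the total (direct plus indirect) output x required across all sectors. A two-sector numerical sketch with invented coefficients, not the BEA 13-by-13 matrix:

```python
import numpy as np

# A[i, j]: dollars of sector-i input needed per dollar of sector-j
# output (direct requirements). Coefficients are illustrative only.
A = np.array([[0.1, 0.2],
              [0.3, 0.1]])
d = np.array([100.0, 50.0])  # final demand by sector

# Total output: solve (I - A) x = d rather than inverting explicitly.
x = np.linalg.solve(np.eye(2) - A, d)
```

Total output exceeds final demand because each sector also supplies the other's intermediate needs; the mixed-unit extension described in the abstract attaches physical flows (e.g. tonnes of lead) to these monetary totals by augmenting A and d with rows and columns in material units.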
Information Extraction for System-Software Safety Analysis: Calendar Year 2008 Year-End Report
NASA Technical Reports Server (NTRS)
Malin, Jane T.
2009-01-01
This annual report describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.
MacLean, Adam L; Harrington, Heather A; Stumpf, Michael P H; Byrne, Helen M
2016-01-01
The last decade has seen an explosion in models that describe phenomena in systems medicine. Such models are especially useful for studying signaling pathways, such as the Wnt pathway. In this chapter we use the Wnt pathway to showcase current mathematical and statistical techniques that enable modelers to gain insight into (models of) gene regulation and generate testable predictions. We introduce a range of modeling frameworks, but focus on ordinary differential equation (ODE) models since they remain the most widely used approach in systems biology and medicine and continue to offer great potential. We present methods for the analysis of a single model, comprising applications of standard dynamical systems approaches such as nondimensionalization, steady state, asymptotic and sensitivity analysis, and more recent statistical and algebraic approaches to compare models with data. We present parameter estimation and model comparison techniques, focusing on Bayesian analysis and coplanarity via algebraic geometry. Our intention is that this (non-exhaustive) review may serve as a useful starting point for the analysis of models in systems medicine.
A UML Profile for State Analysis
NASA Technical Reports Server (NTRS)
Murray, Alex; Rasmussen, Robert
2010-01-01
State Analysis is a systems engineering methodology for the specification and design of control systems, developed at the Jet Propulsion Laboratory. The methodology emphasizes an analysis of the system under control in terms of States and their properties and behaviors and their effects on each other, a clear separation of the control system from the controlled system, cognizance in the control system of the controlled system's State, goal-based control built on constraining the controlled system's States, and disciplined techniques for State discovery and characterization. State Analysis (SA) introduces two key diagram types: State Effects and Goal Network diagrams. The team at JPL developed a tool for performing State Analysis. The tool includes a drawing capability, backed by a database that supports the diagram types and the organization of the elements of the SA models. But the tool does not support the usual activities of software engineering and design - a disadvantage, since systems to which State Analysis can be applied tend to be very software-intensive. This motivated the work described in this paper: the development of a preliminary Unified Modeling Language (UML) profile for State Analysis. Having this profile would enable systems engineers to specify a system using the methods and graphical language of State Analysis, which is easily linked with a larger system model in SysML (Systems Modeling Language), while also giving software engineers engaged in implementing the specified control system immediate access to and use of the SA model, in the same language, UML, used for other software design. That is, a State Analysis profile would serve as a shared modeling bridge between system and software models for the behavior aspects of the system. This paper begins with an overview of State Analysis and its underpinnings, followed by an overview of the mapping of SA constructs to the UML metamodel. 
It then delves into the details of these mappings and the constraints associated with them. Finally, we give an example of the use of the profile for expressing an example SA model.
Hierarchical Processing of Auditory Objects in Humans
Kumar, Sukhbinder; Stephan, Klaas E; Warren, Jason D; Friston, Karl J; Griffiths, Timothy D
2007-01-01
This work examines the computational architecture used by the brain during the analysis of the spectral envelope of sounds, an important acoustic feature for defining auditory objects. Dynamic causal modelling and Bayesian model selection were used to evaluate a family of 16 network models explaining functional magnetic resonance imaging responses in the right temporal lobe during spectral envelope analysis. The models encode different hypotheses about the effective connectivity between Heschl's Gyrus (HG), containing the primary auditory cortex, planum temporale (PT), and superior temporal sulcus (STS), and the modulation of that coupling during spectral envelope analysis. In particular, we aimed to determine whether information processing during spectral envelope analysis takes place in a serial or parallel fashion. The analysis provides strong support for a serial architecture with connections from HG to PT and from PT to STS and an increase of the HG to PT connection during spectral envelope analysis. The work supports a computational model of auditory object processing, based on the abstraction of spectro-temporal “templates” in the PT before further analysis of the abstracted form in anterior temporal lobe areas. PMID:17542641
Adam, Asrul; Ibrahim, Zuwairie; Mokhtar, Norrima; Shapiai, Mohd Ibrahim; Cumming, Paul; Mubin, Marizan
2016-01-01
Various peak models have been introduced to detect and analyze peaks in the time-domain analysis of electroencephalogram (EEG) signals. In general, a peak model in the time-domain analysis consists of a set of signal parameters, such as amplitude, width, and slope. Models including those proposed by Dumpala, Acir, Liu, and Dingle are routinely used to detect peaks in EEG signals acquired in clinical studies of epilepsy or eye blink. The optimal peak model is the one that gives the most reliable peak detection performance in a particular application. A fair measure of the performance of different models requires a common and unbiased platform. In this study, we evaluate the performance of the four different peak models using the extreme learning machine (ELM)-based peak detection algorithm. We found that the Dingle model gave the best performance, with 72% accuracy in the analysis of real EEG data. Statistical analysis confirmed that the Dingle model afforded significantly better mean testing accuracy than the Acir and Liu models, which were in the range of 37-52%. Meanwhile, the Dingle model showed no significant difference compared to the Dumpala model.
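The signal parameters such peak models share (amplitude, width, slope) can be extracted with simple valley-to-valley logic. The definitions below are one illustrative variant, since the exact parameter definitions differ between the Dumpala, Acir, Liu, and Dingle models.

```python
def peak_features(signal, i):
    """Illustrative time-domain peak parameters at sample index i.

    Walks outward to the nearest local minima (valleys) on each side,
    then returns (amplitude, width, rising slope). One of many
    possible definitions, not a specific published model.
    """
    left = i
    while left > 0 and signal[left - 1] < signal[left]:
        left -= 1
    right = i
    while right < len(signal) - 1 and signal[right + 1] < signal[right]:
        right += 1
    amp = signal[i] - min(signal[left], signal[right])
    width = right - left
    slope = (signal[i] - signal[left]) / max(i - left, 1)
    return amp, width, slope

# Toy five-sample waveform with a peak at index 2.
amp, width, slope = peak_features([0, 1, 3, 1, 0], 2)
```

Feature vectors built this way per candidate peak are what a classifier such as the ELM in this study consumes to label true versus false peaks.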
Analysis of rocket engine injection combustion processes
NASA Technical Reports Server (NTRS)
Salmon, J. W.
1976-01-01
A critique is given of the JANNAF sub-critical propellant injection/combustion process analysis computer models, and the models are applied to the correlation of well-documented hot-fire engine data bases. These programs are the distributed energy release (DER) model for conventional liquid propellant injectors and the coaxial injection combustion model (CICM) for gaseous annulus/liquid core coaxial injectors. The critique identifies model inconsistencies, while the computer analyses provide quantitative data on predictive accuracy. The program is comprised of three tasks: (1) computer program review and operations; (2) analysis and data correlations; and (3) documentation.
An analysis of the Petri net based model of the human body iron homeostasis process.
Sackmann, Andrea; Formanowicz, Dorota; Formanowicz, Piotr; Koch, Ina; Blazewicz, Jacek
2007-02-01
In this paper a Petri net based model of human body iron homeostasis is presented and analyzed. Body iron homeostasis is an important but not fully understood complex process. The modeling of the process presented in the paper is expressed in the language of Petri net theory. Applying this theory to the description of biological processes allows a very precise analysis of the resulting models. Here, such an analysis of the body iron homeostasis model from a mathematical point of view is given.
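The core Petri net semantics used in such an analysis, transition firing under the enabling rule, fits in a few lines. The two-place net below is a toy example, not the iron-homeostasis model itself.

```python
def fire(marking, pre, post, transition):
    """Fire one transition of a place/transition net, if enabled.

    marking: dict place -> token count
    pre/post: dict transition -> dict place -> arc weight
    Returns the new marking, or None if the transition is not enabled.
    """
    needed = pre[transition]
    if any(marking.get(p, 0) < w for p, w in needed.items()):
        return None  # not enough tokens on some input place
    new = dict(marking)
    for p, w in needed.items():
        new[p] -= w                      # consume input tokens
    for p, w in post[transition].items():
        new[p] = new.get(p, 0) + w       # produce output tokens
    return new

# Toy net: transition t1 moves one token from place p1 to place p2
# (in a biological net, places model species and transitions reactions).
pre = {"t1": {"p1": 1}}
post = {"t1": {"p2": 1}}
m = fire({"p1": 1, "p2": 0}, pre, post, "t1")
```

Structural analyses such as the invariant computations typical of biological Petri net studies operate on the same pre/post incidence data without simulating firings at all.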
NASA Astrophysics Data System (ADS)
Wray, Richard B.
1991-12-01
A hybrid requirements analysis methodology was developed, based on the practices actually used in developing a Space Generic Open Avionics Architecture. During the development of this avionics architecture, a method of analysis able to effectively define the requirements for this space avionics architecture was developed. In this methodology, external interfaces and relationships are defined, a static analysis resulting in a static avionics model was developed, operating concepts for simulating the requirements were put together, and a dynamic analysis of the execution needs for the dynamic model operation was planned. The systems engineering approach was used to perform a top down modified structured analysis of a generic space avionics system and to convert actual program results into generic requirements. CASE tools were used to model the analyzed system and automatically generate specifications describing the model's requirements. Lessons learned in the use of CASE tools, the architecture, and the design of the Space Generic Avionics model were established, and a methodology notebook was prepared for NASA. The weaknesses of standard real-time methodologies for practicing systems engineering, such as Structured Analysis and Object Oriented Analysis, were identified.
Development of the mathematical model for design and verification of acoustic modal analysis methods
NASA Astrophysics Data System (ADS)
Siner, Alexander; Startseva, Maria
2016-10-01
To reduce turbofan noise it is necessary to develop methods, collectively called modal analysis, for analyzing the sound field generated by the blade machinery. Because modal analysis methods are complex, and testing them against full-scale measurements is expensive and tedious, it is necessary to construct mathematical models that allow modal analysis algorithms to be tested quickly and cheaply. In this work, a model is presented that allows single modes to be set in the channel and the generated sound field to be analyzed. A modal analysis of the sound generated by a ring array of point sound sources is performed, and experimental and numerical modal analysis results are compared.
Gowd, Snigdha; Shankar, T; Dash, Samarendra; Sahoo, Nivedita; Chatterjee, Suravi; Mohanty, Pritam
2017-01-01
The aim of the study was to evaluate the reliability of cone beam computed tomography (CBCT)-derived images against plaster models for mixed dentition analysis. Thirty CBCT-derived images and thirty plaster models were retrieved from the dental archives, and Moyer's and Tanaka-Johnston analyses were performed. The data obtained were interpreted and analyzed statistically using SPSS 10.0/PC (SPSS Inc., Chicago, IL, USA). Descriptive and analytical analysis along with Student's t-test was performed to qualitatively evaluate the data, and P < 0.05 was considered statistically significant. Statistically significant results were obtained on comparing the data from CBCT-derived images and plaster models; the mean for Moyer's analysis in the left and right lower arch was 21.2 mm and 21.1 mm for CBCT versus 22.5 mm and 22.5 mm for the plaster model, respectively. CBCT-derived images were less reliable than data obtained directly from plaster models for mixed dentition analysis.
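The method comparison above can be sketched with a paired t-test, since the same cases are measured by both modalities. The measurements below are invented for illustration only and are not the study's data.

```python
# Hypothetical mesiodistal width sums (mm) for the same 10 cases measured
# on CBCT images and plaster models -- illustrative numbers, not study data.
import numpy as np
from scipy import stats

cbct = np.array([21.0, 21.4, 20.8, 21.6, 21.1, 21.3, 20.9, 21.5, 21.2, 21.0])
plaster = np.array([22.4, 22.6, 22.3, 22.7, 22.5, 22.6, 22.4, 22.8, 22.5, 22.4])

# Paired t-test: same cases, two measurement methods.
t_stat, p_value = stats.ttest_rel(cbct, plaster)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
print("significant at 0.05" if p_value < 0.05 else "not significant")
```

A systematic offset between the two methods, as reported in the abstract, would show up here as a large paired t-statistic with a small p-value.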
Verification and validation of a Work Domain Analysis with turing machine task analysis.
Rechard, J; Bignon, A; Berruet, P; Morineau, T
2015-03-01
While the use of Work Domain Analysis as a methodological framework in cognitive engineering is increasing rapidly, verification and validation of work domain models produced by this method are becoming a significant issue. In this article, we propose the use of a method based on Turing machine formalism named "Turing Machine Task Analysis" to verify and validate work domain models. The application of this method on two work domain analyses, one of car driving which is an "intentional" domain, and the other of a ship water system which is a "causal domain" showed the possibility of highlighting improvements needed by these models. More precisely, the step by step analysis of a degraded task scenario in each work domain model pointed out unsatisfactory aspects in the first modelling, like overspecification, underspecification, omission of work domain affordances, or unsuitable inclusion of objects in the work domain model. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.
[Lake eutrophication modeling in considering climatic factors change: a review].
Su, Jie-Qiong; Wang, Xuan; Yang, Zhi-Feng
2012-11-01
Climatic factors are considered key factors affecting the trophic status and its evolution in most lakes. Against the background of global climate change, incorporating the variations of climatic factors into lake eutrophication models could provide solid technical support for analyzing the trophic evolution trend of a lake and for decision-making in lake environment management. This paper analyzed the effects of climatic factors such as air temperature, precipitation, sunlight, and atmosphere on lake eutrophication, and summarized the research results on lake eutrophication modeling that takes climatic factor change into account, including modeling based on statistical analysis, ecological dynamic analysis, system analysis, and intelligent algorithms. Prospective approaches to improve the accuracy of lake eutrophication modeling under climatic factor change were put forward, including 1) strengthening the analysis of the mechanisms by which climatic factor change affects lake trophic status, 2) identifying appropriate simulation models to generate scenarios under proper temporal and spatial scales and resolutions, and 3) integrating climatic factor change simulation, hydrodynamic models, ecological simulation, and intelligent algorithms into a general modeling system to achieve an accurate prediction of lake eutrophication under climate change.
User's Guide To CHEAP0 II-Economic Analysis of Stand Prognosis Model Outputs
Joseph E. Horn; E. Lee Medema; Ervin G. Schuster
1986-01-01
CHEAP0 II provides supplemental economic analysis capability for users of version 5.1 of the Stand Prognosis Model, including recent regeneration and insect outbreak extensions. Although patterned after the old CHEAP0 model, CHEAP0 II has more features and analytic capabilities, especially for analysis of existing and uneven-aged stands....
DOT National Transportation Integrated Search
1979-12-01
An econometric model is developed which provides long-run policy analysis and forecasting of annual trends, for U.S. auto stock, new sales, and their composition by auto size-class. The concept of "desired" (equilibrium) stock is introduced. "Desired...
ERIC Educational Resources Information Center
Tsai, Tien-Lung; Shau, Wen-Yi; Hu, Fu-Chang
2006-01-01
This article generalizes linear path analysis (PA) and simultaneous equations models (SiEM) to deal with mixed responses of different types in a recursive or triangular system. An efficient instrumental variable (IV) method for estimating the structural coefficients of a 2-equation partially recursive generalized path analysis (GPA) model and…
ERIC Educational Resources Information Center
Eschenfelder, Kristin R.; Tsai, Tien-I; Zhu, Xiaohua; Stewart, Brenton
2013-01-01
This paper explored the degree to which use terms proposed by model licenses have become institutionalized across different publishers' licenses. It examined model license use terms in four areas: downloading, scholarly sharing, interlibrary loan, and electronic reserves. Data collection and analysis involved content analysis of 224 electronic…
Semiparametric mixed-effects analysis of PK/PD models using differential equations.
Wang, Yi; Eskridge, Kent M; Zhang, Shunpu
2008-08-01
Motivated by the use of semiparametric nonlinear mixed-effects modeling on longitudinal data, we develop a new semiparametric modeling approach to address potential structural model misspecification for population pharmacokinetic/pharmacodynamic (PK/PD) analysis. Specifically, we use a set of ordinary differential equations (ODEs) with form dx/dt = A(t)x + B(t) where B(t) is a nonparametric function that is estimated using penalized splines. The inclusion of a nonparametric function in the ODEs makes identification of structural model misspecification feasible by quantifying the model uncertainty and provides flexibility for accommodating possible structural model deficiencies. The resulting model will be implemented in a nonlinear mixed-effects modeling setup for population analysis. We illustrate the method with an application to cefamandole data and evaluate its performance through simulations.
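An ODE of the stated form dx/dt = A(t)x + B(t), with B(t) represented by a smooth function, can be sketched numerically as below. The value of A, the spline knots, and the coefficients are illustrative placeholders, not the paper's penalized-spline estimates.

```python
# Minimal sketch of integrating dx/dt = A(t) x + B(t), where B(t) is a
# smooth (here cubic-spline) function standing in for the paper's
# nonparametric input. All numeric values are hypothetical.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.interpolate import CubicSpline

A = -0.5                                           # scalar "elimination" rate
knots = np.linspace(0.0, 10.0, 6)
coefs = np.array([0.0, 1.2, 0.8, 0.4, 0.2, 0.1])   # hypothetical spline values
B = CubicSpline(knots, coefs)                      # nonparametric input function

def rhs(t, x):
    return A * x + B(t)

sol = solve_ivp(rhs, (0.0, 10.0), y0=[0.0], dense_output=True)
print("x(10) =", float(sol.y[0, -1]))
```

In the semiparametric setting, the spline coefficients themselves would be estimated from data with a roughness penalty rather than fixed as here.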
40 CFR 68.28 - Alternative release scenario analysis.
Code of Federal Regulations, 2012 CFR
2012-07-01
... Consequence Analysis Guidance or any commercially or publicly available air dispersion modeling techniques, provided the techniques account for the specified modeling conditions and are recognized by industry as applicable as part of current practices. Proprietary models that account for the modeling conditions may be...
40 CFR 68.28 - Alternative release scenario analysis.
Code of Federal Regulations, 2013 CFR
2013-07-01
... Consequence Analysis Guidance or any commercially or publicly available air dispersion modeling techniques, provided the techniques account for the specified modeling conditions and are recognized by industry as applicable as part of current practices. Proprietary models that account for the modeling conditions may be...
40 CFR 68.28 - Alternative release scenario analysis.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Consequence Analysis Guidance or any commercially or publicly available air dispersion modeling techniques, provided the techniques account for the specified modeling conditions and are recognized by industry as applicable as part of current practices. Proprietary models that account for the modeling conditions may be...
Evaluating Mixture Modeling for Clustering: Recommendations and Cautions
ERIC Educational Resources Information Center
Steinley, Douglas; Brusco, Michael J.
2011-01-01
This article provides a large-scale investigation into several of the properties of mixture-model clustering techniques (also referred to as latent class cluster analysis, latent profile analysis, model-based clustering, probabilistic clustering, Bayesian classification, unsupervised learning, and finite mixture models; see Vermunt & Magidson,…
Analysis of modeling cumulative noise from simultaneous flights volume 2 : supplemental analysis
DOT National Transportation Integrated Search
2012-12-31
This is the second of two volumes of the report on modeling cumulative noise from simultaneous flights. This volume examines the effect of several modeling input cases on Percent Time Audible results calculated by the Integrated Noise Model. The case...
A Feature Fusion Based Forecasting Model for Financial Time Series
Guo, Zhiqiang; Wang, Huaiqing; Liu, Quan; Yang, Jie
2014-01-01
Predicting the stock market has become an increasingly interesting research area for both researchers and investors, and many prediction models have been proposed. In these models, feature selection techniques are used to pre-process the raw data and remove noise. In this paper, a prediction model is constructed to forecast stock market behavior with the aid of independent component analysis, canonical correlation analysis, and a support vector machine. First, two types of features are extracted from the historical closing prices and 39 technical variables obtained by independent component analysis. Second, a canonical correlation analysis method is utilized to combine the two types of features and extract intrinsic features to improve the performance of the prediction model. Finally, a support vector machine is applied to forecast the next day's closing price. The proposed model is applied to the Shanghai stock market index and the Dow Jones index, and experimental results show that the proposed model performs better in prediction than the other two similar models. PMID:24971455
NASA Technical Reports Server (NTRS)
Patt, Frederick S.; Hoisington, Charles M.; Gregg, Watson W.; Coronado, Patrick L.; Hooker, Stanford B. (Editor); Firestone, Elaine R. (Editor); Indest, A. W. (Editor)
1993-01-01
An analysis of orbit propagation models was performed by the Mission Operations element of the Sea-viewing Wide Field-of-View Sensor (SeaWiFS) Project, which has overall responsibility for the instrument scheduling. The orbit propagators selected for this analysis are widely available general perturbations models. The analysis includes both absolute accuracy determination and comparisons of different versions of the models. The results show that all of the models tested meet accuracy requirements for scheduling and data acquisition purposes. For internal Project use the SGP4 propagator, developed by the North American Air Defense (NORAD) Command, has been selected. This model includes atmospheric drag effects and, therefore, provides better accuracy. For High Resolution Picture Transmission (HRPT) ground stations, which have less stringent accuracy requirements, the publicly available Brouwer-Lyddane models are recommended. The SeaWiFS Project will make available portable source code for a version of this model developed by the Data Capture Facility (DCF).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Billman, L.; Keyser, D.
The Jobs and Economic Development Impacts (JEDI) models, developed by the National Renewable Energy Laboratory (NREL) for the U.S. Department of Energy (DOE) Office of Energy Efficiency and Renewable Energy (EERE), use input-output methodology to estimate gross (not net) jobs and economic impacts of building and operating selected types of renewable electricity generation and fuel plants. This analysis provides the DOE with an assessment of the value, impact, and validity of the JEDI suite of models. While the models produce estimates of jobs, earnings, and economic output, this analysis focuses only on jobs estimates. This validation report includes an introduction to the JEDI models, an analysis of their value and impact, and an analysis of the validity of job estimates generated by the JEDI models through comparison to other modeled estimates and to empirical, observed jobs data as reported or estimated for a commercial project, a state, or a region.
NASA Astrophysics Data System (ADS)
Kurniati, Devi; Hoyyi, Abdul; Widiharih, Tatik
2018-05-01
Time series data are a series of observations taken or measured at the same time interval. Time series analysis is used to analyze such data while accounting for the effect of time; its purpose is to characterize the patterns of a data set and to predict future values from past data. One of the forecasting methods used for time series data is the state space model. This study discusses the modeling and forecasting of electric energy consumption using the state space model for univariate data. The modeling stage begins with optimal Autoregressive (AR) order selection, followed by determination of the state vector through canonical correlation analysis, parameter estimation, and forecasting. The results show that a state space model of order 4 forecasts electric energy consumption with a Mean Absolute Percentage Error (MAPE) of 3.655%, placing the model in the very good forecasting category.
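The last two steps of the workflow, fitting an order-4 autoregression and scoring forecasts with MAPE, can be sketched in plain NumPy. The series is simulated; the canonical-correlation state-space construction itself is not reproduced here.

```python
# Minimal sketch: fit an AR(4) model by least squares and score one-step
# forecasts with MAPE on a simulated series (not the study's data).
import numpy as np

rng = np.random.default_rng(1)
n, p = 120, 4
y = np.zeros(n)
for t in range(p, n):                      # simulate an AR(4) process
    y[t] = 0.5*y[t-1] + 0.2*y[t-2] + 0.1*y[t-3] + 0.05*y[t-4] + rng.normal() + 10

train, test = y[:100], y[100:]
# Design matrix of lagged values: column k holds lag k+1.
X = np.column_stack([train[p-k-1:len(train)-k-1] for k in range(p)])
X = np.hstack([np.ones((len(X), 1)), X])
beta, *_ = np.linalg.lstsq(X, train[p:], rcond=None)

# One-step-ahead forecasts over the hold-out period.
hist = list(train)
preds = []
for actual in test:
    lags = hist[-1:-p-1:-1]               # most recent p values, newest first
    preds.append(beta[0] + np.dot(beta[1:], lags))
    hist.append(actual)

mape = 100 * np.mean(np.abs((test - np.array(preds)) / test))
print(f"MAPE = {mape:.2f}%")
```

A MAPE in the low single digits, as the abstract reports, corresponds to forecasts that deviate from the observations by only a few percent on average.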
Hagger, Martin S; Chatzisarantis, Nikos L D
2016-06-01
The trans-contextual model outlines the processes by which autonomous motivation toward activities in a physical education context predicts autonomous motivation toward physical activity outside of school, and beliefs about, intentions toward, and actual engagement in, out-of-school physical activity. In the present article, we clarify the fundamental propositions of the model and resolve some outstanding conceptual issues, including its generalizability across multiple educational domains, criteria for its rejection or failed replication, the role of belief-based antecedents of intentions, and the causal ordering of its constructs. We also evaluate the consistency of model relationships in previous tests of the model using path-analytic meta-analysis. The analysis supported model hypotheses but identified substantial heterogeneity in the hypothesized relationships across studies unattributed to sampling and measurement error. Based on our meta-analysis, future research needs to provide further replications of the model in diverse educational settings beyond physical education and test model hypotheses using experimental methods.
Pastor, Dena A; Lazowski, Rory A
2018-01-01
The term "multilevel meta-analysis" is encountered not only in applied research studies, but in multilevel resources comparing traditional meta-analysis to multilevel meta-analysis. In this tutorial, we argue that the term "multilevel meta-analysis" is redundant since all meta-analysis can be formulated as a special kind of multilevel model. To clarify the multilevel nature of meta-analysis the four standard meta-analytic models are presented using multilevel equations and fit to an example data set using four software programs: two specific to meta-analysis (metafor in R and SPSS macros) and two specific to multilevel modeling (PROC MIXED in SAS and HLM). The same parameter estimates are obtained across programs underscoring that all meta-analyses are multilevel in nature. Despite the equivalent results, not all software programs are alike and differences are noted in the output provided and estimators available. This tutorial also recasts distinctions made in the literature between traditional and multilevel meta-analysis as differences between meta-analytic choices, not between meta-analytic models, and provides guidance to inform choices in estimators, significance tests, moderator analyses, and modeling sequence. The extent to which the software programs allow flexibility with respect to these decisions is noted, with metafor emerging as the most favorable program reviewed.
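The random-effects model underlying the tutorial's multilevel framing can be sketched with the DerSimonian-Laird moment estimator of the between-study variance. The effect sizes and within-study variances below are invented for illustration.

```python
# Sketch of a random-effects meta-analysis (a two-level model: effects
# within studies, studies within the population of studies) using the
# DerSimonian-Laird estimator of tau^2. Data are hypothetical.
import numpy as np

yi = np.array([0.1, 0.5, -0.2, 0.6, 0.3])   # hypothetical study effects
vi = np.array([0.02, 0.03, 0.02, 0.04, 0.02])  # within-study variances

w_fixed = 1.0 / vi
mu_fixed = np.sum(w_fixed * yi) / np.sum(w_fixed)
Q = np.sum(w_fixed * (yi - mu_fixed) ** 2)      # Cochran's heterogeneity statistic
df = len(yi) - 1
c = np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)
tau2 = max(0.0, (Q - df) / c)                   # between-study (level-2) variance

w_re = 1.0 / (vi + tau2)                        # random-effects weights
mu_re = np.sum(w_re * yi) / np.sum(w_re)
se_re = np.sqrt(1.0 / np.sum(w_re))
print(f"tau^2 = {tau2:.4f}, pooled effect = {mu_re:.3f} (SE {se_re:.3f})")
```

Packages such as metafor fit the same model (with a choice of estimators, restricted maximum likelihood among them); the moment estimator is shown only because it is short enough to write out.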
Wang, Gaoqi; Zhang, Song; Bian, Cuirong; Kong, Hui
2016-01-01
The purpose of the study was to verify a finite element analysis model of a three-unit fixed partial denture with in vitro electronic strain analysis and to analyze the clinical situation with the verified model. First, strain gauges were attached to the critical areas of a three-unit fixed partial denture. Strain values were measured under a 300 N load perpendicular to the occlusal plane. Second, a three-dimensional finite element model consistent with the electronic strain analysis experiment was constructed from the scanning data, and the strain values obtained by finite element analysis and by in vitro measurement were compared. Finally, the clinical failure of the fixed partial denture was evaluated with the verified finite element analysis model. There was good agreement and consistency between the finite element analysis results and the experimental data. The finite element analysis revealed that failure will occur in the veneer layer on the buccal surface of the connector under an occlusal force of 570 N. The results indicate that electronic strain analysis is an appropriate and cost-saving method to verify a finite element model. The veneer layer on the buccal surface of the connector is the weakest area of the fixed partial denture. Copyright © 2015 Japan Prosthodontic Society. Published by Elsevier Ltd. All rights reserved.
MOVES regional level sensitivity analysis
DOT National Transportation Integrated Search
2012-01-01
The MOVES Regional Level Sensitivity Analysis was conducted to increase understanding of the operations of the MOVES Model in regional emissions analysis and to highlight the following: : the relative sensitivity of selected MOVES Model input paramet...
Nonlinear multi-analysis of agent-based financial market dynamics by epidemic system
NASA Astrophysics Data System (ADS)
Lu, Yunfan; Wang, Jun; Niu, Hongli
2015-10-01
Based on the epidemic dynamical system, we construct a new agent-based financial time series model. To check and verify its rationality, we compare the statistical properties of the time series model with those of real stock market indices, the Shanghai Stock Exchange Composite Index and the Shenzhen Stock Exchange Component Index. For analyzing the statistical properties, we combine multi-parameter analysis with tail distribution analysis, modified rescaled range analysis, and multifractal detrended fluctuation analysis. For a better perspective, three-dimensional diagrams are used to present the analysis results. The empirical research in this paper indicates that the long-range dependence property and the multifractal phenomenon exist in both the real returns and the proposed model. Therefore, the new agent-based financial model can reproduce some important features of real stock markets.
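The classical rescaled-range (R/S) check for long-range dependence mentioned above can be sketched as follows. The input here is i.i.d. noise standing in for a return series, so the Hurst estimate should sit near 0.5; this is a simplified textbook version, not the modified R/S statistic of the paper.

```python
# Minimal rescaled-range (R/S) analysis: estimate the Hurst exponent as
# the slope of log(R/S) against log(window size). The "returns" are
# synthetic i.i.d. noise, not market data.
import numpy as np

def rs_statistic(x):
    z = np.cumsum(x - x.mean())            # cumulative deviate series
    r = z.max() - z.min()                  # range of the deviate series
    s = x.std()                            # standard deviation of the window
    return r / s

def hurst(returns, window_sizes=(16, 32, 64, 128, 256)):
    log_n, log_rs = [], []
    for n in window_sizes:
        chunks = [returns[i:i+n] for i in range(0, len(returns) - n + 1, n)]
        rs = np.mean([rs_statistic(np.asarray(c)) for c in chunks])
        log_n.append(np.log(n))
        log_rs.append(np.log(rs))
    # Hurst exponent = slope of the log-log regression.
    return np.polyfit(log_n, log_rs, 1)[0]

rng = np.random.default_rng(2)
h = hurst(rng.normal(size=4096))
print(f"Hurst estimate: {h:.2f}")   # near 0.5 (up to small-sample bias)
```

Persistent, long-range-dependent series give estimates noticeably above 0.5, which is the signature the paper looks for in both the real indices and the model output.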
NASA Technical Reports Server (NTRS)
Yeager, W. T., Jr.; Hamouda, M. N. H.; Mantay, W. R.
1983-01-01
A research effort of analysis and testing was conducted to investigate the ground resonance phenomenon of a soft in-plane hingeless rotor. Experimental data were obtained using a 9 ft. (2.74 m) diameter model rotor in hover and forward flight. Eight model rotor configurations were investigated. Configuration parameters included pitch flap coupling, blade sweep and droop, and precone of the blade feathering axis. An analysis based on a comprehensive analytical model of rotorcraft aerodynamics and dynamics was used. The moving-block method was used to experimentally determine the regressing lead lag mode damping. Good agreement was obtained between the analysis and test. Both analysis and experiment indicated ground resonance instability in hover. An outline of the analysis, a description of the experimental model and procedures, and comparison of the analytical and experimental data are presented.
NASA Astrophysics Data System (ADS)
Hameed, M.; Demirel, M. C.; Moradkhani, H.
2015-12-01
The Global Sensitivity Analysis (GSA) approach helps identify the influence of model parameters or inputs and thus provides essential information about model performance. In this study, the effects of the Sacramento Soil Moisture Accounting (SAC-SMA) model parameters, forcing data, and initial conditions are analysed by using two GSA methods: Sobol' and the Fourier Amplitude Sensitivity Test (FAST). The simulations are carried out over five sub-basins within the Columbia River Basin (CRB) for three different periods: one-year, four-year, and seven-year. Four factors are considered and evaluated by using the two sensitivity analysis methods: the simulation length, parameter range, model initial conditions, and the reliability of the global sensitivity analysis methods. The reliability of the sensitivity analysis results is compared based on 1) the agreement between the two sensitivity analysis methods (Sobol' and FAST) in highlighting the same parameters or inputs as the most influential, and 2) how coherently the methods rank these sensitive parameters under the same conditions (sub-basins and simulation length). The results show coherence between the Sobol' and FAST sensitivity analysis methods. Additionally, the FAST method is found to be sufficient for evaluating the main effects of the model parameters and inputs. Another conclusion of this study is that the smaller the parameter or initial-condition ranges, the more consistent and coherent the results of the two sensitivity analysis methods.
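The idea behind the Sobol' variance decomposition can be sketched with a hand-rolled pick-freeze estimator on the standard Ishigami test function. This only illustrates first-order indices; the study applied full Sobol' and FAST implementations to the SAC-SMA model itself.

```python
# Variance-based (Sobol'-style) first-order sensitivity indices via the
# pick-freeze Monte Carlo estimator, demonstrated on the Ishigami function.
import numpy as np

def ishigami(x, a=7.0, b=0.1):
    return (np.sin(x[:, 0]) + a * np.sin(x[:, 1])**2
            + b * x[:, 2]**4 * np.sin(x[:, 0]))

rng = np.random.default_rng(3)
N, d = 200_000, 3
A = rng.uniform(-np.pi, np.pi, size=(N, d))   # two independent sample blocks
B = rng.uniform(-np.pi, np.pi, size=(N, d))
fA, fB = ishigami(A), ishigami(B)
var = np.var(np.concatenate([fA, fB]))        # total output variance

S1 = []
for i in range(d):
    ABi = B.copy()
    ABi[:, i] = A[:, i]                       # "freeze" column i from A
    S1.append(np.mean(fA * (ishigami(ABi) - fB)) / var)
print("first-order indices:", np.round(S1, 3))  # approx [0.31, 0.44, 0.00]
```

A near-zero index, like that of the third input here, flags a factor whose variation alone contributes essentially nothing to output variance, which is exactly the screening information GSA provides for model parameters.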
NASA Astrophysics Data System (ADS)
Dai, H.; Chen, X.; Ye, M.; Song, X.; Zachara, J. M.
2016-12-01
Sensitivity analysis has been an important tool in groundwater modeling to identify the influential parameters. Among various sensitivity analysis methods, the variance-based global sensitivity analysis has gained popularity for its model independence characteristic and capability of providing accurate sensitivity measurements. However, the conventional variance-based method only considers uncertainty contribution of single model parameters. In this research, we extended the variance-based method to consider more uncertainty sources and developed a new framework to allow flexible combinations of different uncertainty components. We decompose the uncertainty sources into a hierarchical three-layer structure: scenario, model and parametric. Furthermore, each layer of uncertainty source is capable of containing multiple components. An uncertainty and sensitivity analysis framework was then constructed following this three-layer structure using Bayesian network. Different uncertainty components are represented as uncertain nodes in this network. Through the framework, variance-based sensitivity analysis can be implemented with great flexibility of using different grouping strategies for uncertainty components. The variance-based sensitivity analysis thus is improved to be able to investigate the importance of an extended range of uncertainty sources: scenario, model, and other different combinations of uncertainty components which can represent certain key model system processes (e.g., groundwater recharge process, flow reactive transport process). For test and demonstration purposes, the developed methodology was implemented into a test case of real-world groundwater reactive transport modeling with various uncertainty sources. The results demonstrate that the new sensitivity analysis method is able to estimate accurate importance measurements for any uncertainty sources which were formed by different combinations of uncertainty components. The new methodology can provide useful information for environmental management and decision-makers to formulate policies and strategies.
1986-01-01
...the information that has been determined experimentally. The Labyrinth Seal Analysis program was, therefore, directed to the development of an ... labyrinth seal performance, the program included the development of an improved empirical design model to provide the calculation of the flow ... program. Phase I was directed to the analytical development of both an "analysis" model and an improved empirical "design" model. Supporting rig tests...
ASTROP2 users manual: A program for aeroelastic stability analysis of propfans
NASA Technical Reports Server (NTRS)
Narayanan, G. V.; Kaza, K. R. V.
1991-01-01
A user's manual is presented for the aeroelastic stability and response of propulsion systems computer program called ASTROP2. The ASTROP2 code performs aeroelastic stability analysis of rotating propfan blades. This analysis uses a two-dimensional, unsteady cascade aerodynamics model and a three-dimensional, normal-mode structural model. Analytical stability results from this code are compared with published experimental results for a rotating composite advanced turboprop model and for a nonrotating metallic wing model.
NASA Technical Reports Server (NTRS)
Baron, S.; Levison, W. H.
1977-01-01
Application of the optimal control model of the human operator to problems in display analysis is discussed. Those aspects of the model pertaining to the operator-display interface and to operator information processing are reviewed and discussed. The techniques are then applied to the analysis of advanced display/control systems for a Terminal Configured Vehicle. Model results are compared with those obtained in a large, fixed-base simulation.
Exploratory Model Analysis of the Space Based Infrared System (SBIRS) Low Global Scheduler Problem
1999-12-01
The nonlinear least squares model is defined as Y = f(θ, t), where θ is an M-element parameter vector, Y is an N-element vector of all data, and t... (Naval Postgraduate School, Monterey, California, Master's thesis, December 1999.)
Research relative to automated multisensor image registration
NASA Technical Reports Server (NTRS)
Kanal, L. N.
1983-01-01
The basic approaches to image registration are surveyed. Three image models are presented as models of the subpixel problem. A variety of approaches to subpixel analysis using these models are presented.
Williams, Claire; Lewsey, James D; Briggs, Andrew H; Mackay, Daniel F
2017-05-01
This tutorial provides a step-by-step guide to performing cost-effectiveness analysis using a multi-state modeling approach. Alongside the tutorial, we provide easy-to-use functions in the statistics package R. We argue that this multi-state modeling approach using a package such as R has advantages over approaches where models are built in a spreadsheet package. In particular, using a syntax-based approach means there is a written record of what was done and the calculations are transparent. Reproducing the analysis is straightforward as the syntax just needs to be run again. The approach can be thought of as an alternative way to build a Markov decision-analytic model, which also has the option to use a state-arrival extended approach. In the state-arrival extended multi-state model, a covariate that represents patients' history is included, allowing the Markov property to be tested. We illustrate the building of multi-state survival models, making predictions from the models and assessing fits. We then proceed to perform a cost-effectiveness analysis, including deterministic and probabilistic sensitivity analyses. Finally, we show how to create 2 common methods of visualizing the results-namely, cost-effectiveness planes and cost-effectiveness acceptability curves. The analysis is implemented entirely within R. It is based on adaptions to functions in the existing R package mstate to accommodate parametric multi-state modeling that facilitates extrapolation of survival curves.
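The tutorial builds its models in R with mstate; the decision-analytic core can be sketched in a few lines as a discounted Markov cohort model. The states, transition probabilities, costs, and utilities below are invented solely to show how an ICER falls out of the calculation.

```python
# Toy 3-state Markov cohort model (Well / Sick / Dead) with discounting,
# of the kind used in cost-effectiveness analysis. All numbers invented.
import numpy as np

def run_cohort(P, costs, utilities, cycles=40, discount=0.03):
    state = np.array([1.0, 0.0, 0.0])     # whole cohort starts in Well
    total_cost = total_qaly = 0.0
    for t in range(cycles):
        disc = 1.0 / (1.0 + discount) ** t
        total_cost += disc * state @ costs
        total_qaly += disc * state @ utilities
        state = state @ P                 # one cycle of transitions
    return total_cost, total_qaly

P_standard = np.array([[0.85, 0.10, 0.05],
                       [0.00, 0.80, 0.20],
                       [0.00, 0.00, 1.00]])
P_new = np.array([[0.90, 0.07, 0.03],     # hypothetical better treatment
                  [0.00, 0.85, 0.15],
                  [0.00, 0.00, 1.00]])
costs = np.array([500.0, 3000.0, 0.0])    # per-cycle cost by state
utils = np.array([0.95, 0.60, 0.0])       # per-cycle QALYs by state

c0, q0 = run_cohort(P_standard, costs, utils)
c1, q1 = run_cohort(P_new, costs + np.array([1000.0, 1000.0, 0.0]), utils)
icer = (c1 - c0) / (q1 - q0)
print(f"ICER = {icer:,.0f} per QALY gained")
```

The multi-state survival approach of the tutorial replaces the fixed transition matrix with probabilities derived from fitted parametric transition hazards, which is what permits extrapolation and the state-arrival extension.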
NASA Astrophysics Data System (ADS)
Manfreda, G.; Bellina, F.
2016-12-01
The paper describes the new lumped thermal model recently implemented in THELMA code for the coupled electromagnetic-thermal analysis of superconducting cables. A new geometrical model is also presented, which describes the Rutherford cables used for the accelerator magnets. A first validation of these models has been given by the analysis of the quench longitudinal propagation velocity in the Nb3Sn prototype coil SMC3, built and tested in the frame of the EUCARD project for the development of high field magnets for LHC machine. This paper shows in detail the models, while their application to the quench propagation analysis is presented in a companion paper.
On 3D inelastic analysis methods for hot section components
NASA Technical Reports Server (NTRS)
Mcknight, R. L.; Chen, P. C.; Dame, L. T.; Holt, R. V.; Huang, H.; Hartle, M.; Gellin, S.; Allen, D. H.; Haisler, W. E.
1986-01-01
Accomplishments are described for the two-year program to develop advanced 3-D inelastic structural stress analysis methods and solution strategies for more accurate and cost-effective analysis of combustors, turbine blades, and vanes. The approach was to develop a matrix of formulation elements and constitutive models. Three constitutive models were developed in conjunction with optimized iterating techniques, accelerators, and convergence criteria within a framework of dynamic time incrementing. Three formulation models were developed: an eight-noded mid-surface shell element, a nine-noded mid-surface shell element, and a twenty-noded isoparametric solid element. A separate computer program was developed for each combination of constitutive model and formulation model. Each program provides a functional stand-alone capability for performing cyclic nonlinear structural analysis. In addition, the analysis capabilities incorporated into each program can be abstracted in subroutine form for incorporation into other codes or to form new combinations.
The 3D inelastic analysis methods for hot section components
NASA Technical Reports Server (NTRS)
Mcknight, R. L.; Maffeo, R. J.; Tipton, M. T.; Weber, G.
1992-01-01
A two-year program to develop advanced 3D inelastic structural stress analysis methods and solution strategies for more accurate and cost effective analysis of combustors, turbine blades, and vanes is described. The approach was to develop a matrix of formulation elements and constitutive models. Three constitutive models were developed in conjunction with optimized iterating techniques, accelerators, and convergence criteria within a framework of dynamic time incrementing. Three formulation models were developed: an eight-noded midsurface shell element; a nine-noded midsurface shell element; and a twenty-noded isoparametric solid element. A separate computer program has been developed for each combination of constitutive model-formulation model. Each program provides a functional stand alone capability for performing cyclic nonlinear structural analysis. In addition, the analysis capabilities incorporated into each program can be abstracted in subroutine form for incorporation into other codes or to form new combinations.
NASA Astrophysics Data System (ADS)
Ichii, K.; Suzuki, T.; Kato, T.; Ito, A.; Hajima, T.; Ueyama, M.; Sasai, T.; Hirata, R.; Saigusa, N.; Ohtani, Y.; Takagi, K.
2010-07-01
Terrestrial biosphere models show large differences when simulating carbon and water cycles, and reducing these differences is a priority for developing more accurate estimates of the condition of terrestrial ecosystems and of future climate change. To reduce uncertainties and improve the understanding of their carbon budgets, we investigated the utility of eddy flux datasets for improving model simulations and reducing the variability among multi-model outputs of terrestrial biosphere models in Japan. Using 9 terrestrial biosphere models (Support Vector Machine based regressions, TOPS, CASA, VISIT, Biome-BGC, DAYCENT, SEIB, LPJ, and TRIFFID), we conducted two simulations: (1) point simulations at four eddy flux sites in Japan and (2) spatial simulations for Japan with a default model (based on original settings) and a modified model (based on model parameter tuning using eddy flux data). Generally, models using default settings showed large deviations from observations, with large model-by-model variability. However, after we calibrated the model parameters using eddy flux data (GPP, RE and NEP), most models successfully simulated seasonal variations in the carbon cycle, with less variability among models. We also found that interannual variations in the carbon cycle are mostly consistent among models and observations. Spatial analysis also showed a large reduction in the variability among model outputs. This study demonstrates that careful validation and calibration of models with available eddy flux data reduces model-by-model differences. Nevertheless, site history, analysis of model structural differences, and a more objective model calibration procedure should be included in further analyses.
Studies in astronomical time series analysis. I - Modeling random processes in the time domain
NASA Technical Reports Server (NTRS)
Scargle, J. D.
1981-01-01
Several random process models in the time domain are defined and discussed. Attention is given to the moving average model, the autoregressive model, and relationships between and combinations of these models. Consideration is then given to methods for investigating pulse structure, procedures of model construction, computational methods, and numerical experiments. A FORTRAN algorithm of time series analysis has been developed which is relatively stable numerically. Results of test cases are given to study the effect of adding noise and of different distributions for the pulse amplitudes. A preliminary analysis of the light curve of the quasar 3C 272 is considered as an example.
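The moving-average and autoregressive models discussed above can be illustrated with a short simulation. This is a generic Python sketch (not the paper's FORTRAN algorithm), with all parameter values invented; the lag-1 autocorrelation separates the two model classes:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
eps = rng.standard_normal(n + 1)

# Moving average MA(1): x_t = eps_t + theta * eps_{t-1}
theta = 0.6
x_ma = eps[1:] + theta * eps[:-1]

# Autoregressive AR(1): y_t = phi * y_{t-1} + eps_t
phi = 0.8
y_ar = np.zeros(n)
for t in range(1, n):
    y_ar[t] = phi * y_ar[t - 1] + eps[t]

def acf1(x):
    """Sample lag-1 autocorrelation."""
    x = x - x.mean()
    return np.dot(x[:-1], x[1:]) / np.dot(x, x)

# Theory: MA(1) gives rho_1 = theta / (1 + theta^2) ~ 0.44,
#         AR(1) gives rho_1 = phi = 0.8
print(acf1(x_ma), acf1(y_ar))
```

An MA(1) process has zero autocorrelation beyond lag 1, while an AR(1) process decays geometrically; that difference is the basis for choosing between (or combining) the two model families.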
Design, analysis and verification of a knee joint oncological prosthesis finite element model.
Zach, Lukáš; Kunčická, Lenka; Růžička, Pavel; Kocich, Radim
2014-11-01
The aim of this paper was to design a finite element model for a hinged PROSPON oncological knee endoprosthesis and to verify the model by comparing its ankle flexion angle with knee-bending experimental data obtained previously. Visible Human Project CT scans were used to create a general lower extremity bone model and to compose a 3D CAD knee joint model, to which muscles and ligaments were added. The designed finite element PROSPON prosthesis model was integrated into this assembly, and an analysis focused on the stress state of the PEEK-OPTIMA hinge pin bushing was carried out. To confirm the stress state analysis results, contact pressure was investigated. The analysis was performed in the knee-bending position within the 15.4-69.4° hip joint flexion range. The results showed that the maximum stress achieved during the analysis (46.6 MPa) did not exceed the yield strength of the material (90 MPa); the condition of plastic stability was therefore met. The stress state analysis results were confirmed by the distribution of contact pressure during knee-bending. The applicability of the designed finite element model for predicting real implant behaviour was proven on the basis of the good correlation between the analytical and experimental ankle flexion angle data. Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Bouskill, N. J.; Riley, W. J.; Tang, J. Y.
2014-12-01
Accurate representation of ecosystem processes in land models is crucial for reducing predictive uncertainty in energy and greenhouse gas feedbacks with the climate. Here we describe an observational and modeling meta-analysis approach to benchmark land models, and apply the method to the land model CLM4.5 with two versions of belowground biogeochemistry. We focused our analysis on the aboveground and belowground responses to warming and nitrogen addition in high-latitude ecosystems, and identified absent or poorly parameterized mechanisms in CLM4.5. While the two model versions predicted similar soil carbon stock trajectories following both warming and nitrogen addition, other predicted variables (e.g., belowground respiration) differed from observations in both magnitude and direction, indicating that CLM4.5 has inadequate underlying mechanisms for representing high-latitude ecosystems. On the basis of observational synthesis, we attribute the model-observation differences to missing representations of microbial dynamics, aboveground and belowground coupling, and nutrient cycling, and we use the observational meta-analysis to discuss potential approaches to improving the current models. However, we also urge caution concerning the selection of data sets and experiments for meta-analysis. For example, the concentrations of nitrogen applied in the synthesized field experiments (average = 72 kg ha-1 yr-1) are many times higher than projected soil nitrogen concentrations (from nitrogen deposition and release during mineralization), which precludes a rigorous evaluation of the model responses to likely nitrogen perturbations. Overall, we demonstrate that elucidating ecological mechanisms via meta-analysis can identify deficiencies in ecosystem models and empirical experiments.
NASA Astrophysics Data System (ADS)
Keating, Elizabeth H.; Doherty, John; Vrugt, Jasper A.; Kang, Qinjun
2010-10-01
Highly parameterized and CPU-intensive groundwater models are increasingly being used to understand and predict flow and transport through aquifers. Despite their frequent use, these models pose significant challenges for parameter estimation and predictive uncertainty analysis algorithms, particularly global methods which usually require very large numbers of forward runs. Here we present a general methodology for parameter estimation and uncertainty analysis that can be utilized in these situations. Our proposed method includes extraction of a surrogate model that mimics key characteristics of a full process model, followed by testing and implementation of a pragmatic uncertainty analysis technique, called null-space Monte Carlo (NSMC), that merges the strengths of gradient-based search and parameter dimensionality reduction. As part of the surrogate model analysis, the results of NSMC are compared with a formal Bayesian approach using the DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm. Such a comparison has never been accomplished before, especially in the context of high parameter dimensionality. Despite the highly nonlinear nature of the inverse problem, the existence of multiple local minima, and the relatively large parameter dimensionality, both methods performed well and results compare favorably with each other. Experiences gained from the surrogate model analysis are then transferred to calibrate the full highly parameterized and CPU intensive groundwater model and to explore predictive uncertainty of predictions made by that model. The methodology presented here is generally applicable to any highly parameterized and CPU-intensive environmental model, where efficient methods such as NSMC provide the only practical means for conducting predictive uncertainty analysis.
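The null-space Monte Carlo (NSMC) idea can be sketched on a toy linear model: after calibration, parameter perturbations confined to the null space of the (linearized) model operator leave the fit to observations unchanged while still exploring parameter uncertainty. The matrix sizes and values below are invented for illustration; real NSMC works with a Jacobian linearized about the calibrated model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy linear "model": observations d = J @ p, more parameters than data
J = rng.standard_normal((5, 12))     # 5 observations, 12 parameters
p_cal = rng.standard_normal(12)      # calibrated parameter set
d_obs = J @ p_cal

# Null space of J via SVD: rows of Vt beyond rank(J) satisfy J @ v = 0
U, s, Vt = np.linalg.svd(J)
null_basis = Vt[5:].T                # 12 x 7 orthonormal null-space basis

# NSMC: perturb p_cal only within the null space -> calibration fit unchanged
samples = [p_cal + null_basis @ rng.standard_normal(7) for _ in range(100)]
misfits = [np.linalg.norm(J @ p - d_obs) for p in samples]
print(max(misfits))                  # ~0 up to floating point
```

Each sample reproduces the observations exactly yet differs in its parameters, so pushing the samples through a forecast model maps calibration-constrained parameter uncertainty into predictive uncertainty without additional forward runs for recalibration.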
NASA Astrophysics Data System (ADS)
Wang, Qi; Dong, Xufeng; Li, Luyu; Ou, Jinping
2018-06-01
Because constitutive models are overly complicated and existing mechanical models lack universality, neither is satisfactory for magnetorheological elastomer (MRE) devices. In this article, a novel universal method is proposed to build concise mechanical models. Constitutive modeling and electromagnetic analysis were applied in this method to ensure universality, while a series of derivations and simplifications were carried out to obtain a concise formulation. To illustrate the proposed modeling method, a conical MRE isolator was introduced. Its basic mechanical equations were built based on equilibrium, deformation compatibility, constitutive equations and electromagnetic analysis. An iteration model and a highly efficient differential-equation-editor-based model were then derived to solve the basic mechanical equations. The final simplified mechanical equations were obtained by re-fitting the simulations with a novel optimization algorithm. In the end, a verification test of the isolator confirmed the accuracy of the derived mechanical model and the modeling method.
Monir, Md. Mamun; Zhu, Jun
2017-01-01
Most genome-wide association studies (GWASs) of human complex diseases have ignored dominance, epistasis and ethnic interactions. We conducted comparative GWASs for total cholesterol using a full model and additive models, which illustrate the impact of ignoring these genetic effects on analysis results and demonstrate how the genetic effects of multiple loci can differ across ethnic groups. Fifteen quantitative trait loci, comprising 13 individual loci and 3 pairs of epistatic loci, were identified by the full model, whereas only 14 loci (9 common loci and 5 different loci) were identified by the multi-loci additive model; 4 loci detected by the full model were not detected by the multi-loci additive model. PLINK analysis identified two loci, and GCTA analysis detected only one locus with genome-wide significance. The full model identified three previously reported genes as well as several new genes. Bioinformatics analysis showed that some of the new genes are related to cholesterol-related chemicals and/or diseases. Analyses of the cholesterol data and simulation studies revealed that the full model performed better than the additive model in terms of detection power and unbiased estimation of the genetic variants of complex traits. PMID:28079101
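The difference between additive and full models can be illustrated with a toy regression: when a trait carries a dominance effect, coding genotypes only additively (0/1/2) misses variance that a full (additive + dominance) model recovers. All effect sizes and the sample size below are invented, and this sketch is ordinary least squares, not the paper's GWAS pipeline:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2000
g = rng.integers(0, 3, size=n)        # genotype coded as 0/1/2 allele copies

# Simulate a trait with both an additive and a dominance component
add_eff, dom_eff = 0.5, 0.8
y = add_eff * g + dom_eff * (g == 1) + rng.standard_normal(n)

def r2(X, y):
    """R^2 of an ordinary least-squares fit with intercept."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

r2_add  = r2(g, y)                               # additive coding only
r2_full = r2(np.column_stack([g, g == 1]), y)    # additive + dominance
print(r2_add, r2_full)
```

The heterozygote indicator is uncorrelated with the additive coding when genotype classes are balanced, so the additive-only fit cannot absorb the dominance variance at all; the gap in R^2 is exactly the signal the full model gains.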
Using multi-criteria analysis of simulation models to understand complex biological systems
Maureen C. Kennedy; E. David Ford
2011-01-01
Scientists frequently use computer-simulation models to help solve complex biological problems. Typically, such models are highly integrated, they produce multiple outputs, and standard methods of model analysis are ill suited for evaluating them. We show how multi-criteria optimization with Pareto optimality allows for model outputs to be compared to multiple system...
Latent Transition Analysis with a Mixture Item Response Theory Measurement Model
ERIC Educational Resources Information Center
Cho, Sun-Joo; Cohen, Allan S.; Kim, Seock-Ho; Bottge, Brian
2010-01-01
A latent transition analysis (LTA) model was described with a mixture Rasch model (MRM) as the measurement model. Unlike the LTA, which was developed with a latent class measurement model, the LTA-MRM permits within-class variability on the latent variable, making it more useful for measuring treatment effects within latent classes. A simulation…
Estimating the Regional Economic Significance of Airports
1992-09-01
Three options for estimating induced impacts are described: the economic base model, an econometric model, and a regional input-output model. Despite its limitations, the economic base model has been widely used for regional economic analysis. A second approach is to develop an econometric model of the region, with regression analysis as the principal statistical tool used to estimate the economic relationships; regional econometric models are capable of estimating a single...
Structural Analysis of the Redesigned Ice/Frost Ramp Bracket
NASA Technical Reports Server (NTRS)
Phillips, D. R.; Dawicke, D. S.; Gentz, S. J.; Roberts, P. W.; Raju, I. S.
2007-01-01
This paper describes the interim structural analysis of a redesigned Ice/Frost Ramp bracket for the Space Shuttle External Tank (ET). The proposed redesigned bracket consists of mounts for attachment to the ET wall, supports for the electronic/instrument cables and propellant repressurization lines that run along the ET, an upper plate, a lower plate, and complex bolted connections. The eight nominal bolted connections are considered critical in the summarized structural analysis. Each bolted connection contains a bolt, a nut, four washers, and a non-metallic spacer and block that are designed for thermal insulation. A three-dimensional (3D) finite element model of the bracket is developed using solid 10-node tetrahedral elements. The loading provided by the ET Project is used in the analysis. Because of the complexities associated with accurately modeling the bolted connections in the bracket, the analysis is performed using a global/local analysis procedure. The finite element analysis of the bracket identifies one of the eight bolted connections as having high stress concentrations. A local area of the bracket surrounding this bolted connection is extracted from the global model and used as a local model. Within the local model, the various components of the bolted connection are refined, and contact is introduced along the appropriate interfaces determined by the analysts. The deformations from the global model are applied as boundary conditions to the local model. The results from the global/local analysis show that while the stresses in the bolts are well within yield, the spacers fail due to compression. The primary objective of the interim structural analysis is to show concept viability for static thermal testing. The proposed design concept would undergo continued design optimization to address the identified analytical assumptions and concept shortcomings, assuming successful thermal testing.
NASA Astrophysics Data System (ADS)
Li, Y.; Kinzelbach, W.; Zhou, J.; Cheng, G. D.; Li, X.
2012-05-01
The hydrologic model HYDRUS-1-D and the crop growth model WOFOST are coupled to efficiently manage water resources in agriculture and improve the prediction of crop production. The results of the coupled model are validated by experimental studies of irrigated maize conducted in the middle reaches of northwest China's Heihe River, a semi-arid to arid region. Good agreement is achieved between the simulated evapotranspiration, soil moisture and crop production and their respective field measurements made under current maize irrigation and fertilization. Based on the calibrated model, scenario analysis reveals that the optimal amount of irrigation is 500-600 mm in this region. However, for regions without detailed observations, the results of numerical simulation can be unreliable for irrigation decision making owing to the shortage of calibrated model boundary conditions and parameters. We therefore develop a method combining model ensemble simulations with uncertainty/sensitivity analysis to estimate the probability distribution of crop production. In our studies, the uncertainty analysis is used to reveal the risk of a loss of crop production as irrigation decreases. The global sensitivity analysis is used to test the coupled model and to quantitatively analyse the impact of the uncertainty in coupled model parameters and environmental scenarios on crop production. This method can be used for estimation in regions with little or no data availability.
Forecast first: An argument for groundwater modeling in reverse
White, Jeremy
2017-01-01
Numerical groundwater models are important components of groundwater analyses that are used for making critical decisions related to the management of groundwater resources. In this support role, models are often constructed to serve a specific purpose, that is, to provide insights, through simulation, related to a specific function of a complex aquifer system that cannot be observed directly (Anderson et al. 2015). For any given modeling analysis, several model input datasets must be prepared. Herein, the datasets required to simulate the historical conditions are referred to as the calibration model, and the datasets required to simulate the model's purpose are referred to as the forecast model. Future groundwater conditions or other unobserved aspects of the groundwater system may be simulated by the forecast model; the outputs of interest from the forecast model represent the purpose of the modeling analysis. Unfortunately, the forecast model, needed to simulate the purpose of the modeling analysis, is seemingly an afterthought: calibration is where the majority of time and effort are expended, and calibration is usually completed before the forecast model is even constructed. Herein, I propose a new groundwater modeling workflow, referred to as the "forecast first" workflow, where the forecast model is constructed at an earlier stage in the modeling analysis and the outputs of interest from the forecast model are evaluated during subsequent tasks in the workflow.
Laboratory modeling and analysis of aircraft-lightning interactions
NASA Technical Reports Server (NTRS)
Turner, C. D.; Trost, T. F.
1982-01-01
Modeling studies of the interaction of a delta wing aircraft with direct lightning strikes were carried out using an approximate scale model of an F-106B. The model, which is three feet in length, is subjected to direct injection of fast current pulses supplied by wires, which simulate the lightning channel and are attached at various locations on the model. Measurements are made of the resulting transient electromagnetic fields using time derivative sensors. The sensor outputs are sampled and digitized by computer. The noise level is reduced by averaging the sensor output from ten input pulses at each sample time. Computer analysis of the measured fields includes Fourier transformation and the computation of transfer functions for the model. Prony analysis is also used to determine the natural frequencies of the model. Comparisons of model natural frequencies extracted by Prony analysis with those for in flight direct strike data usually show lower damping in the in flight case. This is indicative of either a lightning channel with a higher impedance than the wires on the model, only one attachment point, or short streamers instead of a long channel.
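Prony analysis, used above to extract the model's natural frequencies, can be sketched for a single damped sinusoid: fit a linear-prediction recurrence to the samples, then read the complex poles off the roots of its characteristic polynomial. The signal parameters here are invented for illustration, not taken from the F-106B measurements, and the sketch is noiseless:

```python
import numpy as np

# Synthetic transient: one damped sinusoid, like a resonance of the model
dt = 1e-3
t = np.arange(200) * dt
f_true, alpha_true = 50.0, 5.0          # frequency (Hz), damping rate (1/s)
x = np.exp(-alpha_true * t) * np.cos(2 * np.pi * f_true * t)

# Linear-prediction step: x[n] = c1 * x[n-1] + c2 * x[n-2]
A = np.column_stack([x[1:-1], x[:-2]])
rhs = x[2:]
c, *_ = np.linalg.lstsq(A, rhs, rcond=None)

# Roots of the characteristic polynomial z^2 - c1*z - c2 give the poles
roots = np.roots([1.0, -c[0], -c[1]])
s = np.log(roots) / dt                  # continuous-time poles s = -alpha ± i*omega
freq = abs(s.imag[0]) / (2 * np.pi)
damp = -s.real[0]
print(freq, damp)                       # recovers ~50 Hz and ~5 1/s
```

With noisy data, as in the transient-field measurements, the model order is raised above the number of expected modes and spurious low-energy poles are discarded; the recovered damping rates are what the abstract compares between wire-attached model tests and in-flight strikes.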
NASA Astrophysics Data System (ADS)
Noori, Roohollah; Safavi, Salman; Nateghi Shahrokni, Seyyed Afshin
2013-07-01
The five-day biochemical oxygen demand (BOD5) is one of the key parameters in water quality management. In this study, a novel approach, i.e., a reduced-order adaptive neuro-fuzzy inference system (ROANFIS) model, was developed for rapid estimation of BOD5. In addition, an uncertainty analysis of the adaptive neuro-fuzzy inference system (ANFIS) and ROANFIS models was carried out based on Monte-Carlo simulation. Accuracy analysis of the ANFIS and ROANFIS models based on both the developed discrepancy ratio and threshold statistics revealed that the selected ROANFIS model was superior. The Pearson correlation coefficient (R) and root mean square error for the best fitted ROANFIS model were 0.96 and 7.12, respectively. Furthermore, uncertainty analysis of the developed models indicated that the selected ROANFIS model had less uncertainty than the ANFIS model and accurately forecasted BOD5 in the Sefidrood River Basin. The uncertainty analysis also showed that, in the testing step, the percentage of predictions bracketed by the 95% confidence bound and the d-factor for the selected ROANFIS model were 94% and 0.83, respectively.
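The bracketing percentage and d-factor quoted above can be computed from a Monte-Carlo prediction ensemble as follows. This sketch uses invented synthetic numbers, not the Sefidrood River data, and assumes the common definitions: the fraction of observations falling inside the 2.5-97.5 percentile band, and the mean band width divided by the standard deviation of the observations:

```python
import numpy as np

rng = np.random.default_rng(3)

# Stand-in data: "observed" BOD5 series, model predictions with error,
# and 500 Monte-Carlo realizations around each prediction (all invented)
obs = rng.normal(10.0, 2.0, size=100)
pred = obs + rng.normal(0.0, 1.5, size=100)
ensemble = pred + rng.normal(0.0, 1.5, size=(500, 100))

lower = np.percentile(ensemble, 2.5, axis=0)
upper = np.percentile(ensemble, 97.5, axis=0)

# Fraction of observations bracketed by the 95% prediction band
bracketed = np.mean((obs >= lower) & (obs <= upper))

# d-factor: mean band width normalized by the std of the observations
d_factor = np.mean(upper - lower) / obs.std()
print(bracketed, d_factor)
```

A good uncertainty model brackets close to 95% of the observations with a d-factor well below the spread of the data; a d-factor near or above 1, as in the paper's 0.83, indicates a band comparable to the natural variability.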
Guo, Zhiqiang; Wang, Huaiqing; Yang, Jie; Miller, David J
2015-01-01
In this paper, we propose and implement a hybrid model combining two-directional two-dimensional principal component analysis ((2D)2PCA) and a Radial Basis Function Neural Network (RBFNN) to forecast stock market behavior. First, 36 stock market technical variables are selected as the input features, and a sliding window is used to obtain the input data of the model. Next, (2D)2PCA is utilized to reduce the dimension of the data and extract its intrinsic features. Finally, an RBFNN accepts the data processed by (2D)2PCA to forecast the next day's stock price or movement. The proposed model is used on the Shanghai stock market index, and the experiments show that the model achieves a good level of fitness. The proposed model is then compared with one that uses the traditional dimension reduction method principal component analysis (PCA) and independent component analysis (ICA). The empirical results show that the proposed model outperforms the PCA-based model, as well as alternative models based on ICA and on the multilayer perceptron.
PMID:25849483
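The (2D)2PCA step can be sketched with NumPy: unlike ordinary PCA, the sample matrices are not vectorized; separate projection matrices are learned for the row and column directions and applied on both sides of each sample. All dimensions below are invented for illustration (e.g., a 20-day sliding window over the 36 technical variables), and this is not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy data: 200 samples, each a 20x36 matrix (window x indicators)
X = rng.standard_normal((200, 20, 36))
Xc = X - X.mean(axis=0)

def projection(S, k):
    """Top-k eigenvectors of a symmetric scatter matrix S."""
    vals, vecs = np.linalg.eigh(S)
    return vecs[:, ::-1][:, :k]

# Row direction (classic 2DPCA): column scatter, projector applied on the right
G_right = np.einsum('nij,nik->jk', Xc, Xc) / len(X)   # 36 x 36
W = projection(G_right, 5)

# Column direction: row scatter, projector applied on the left
G_left = np.einsum('nij,nkj->ik', Xc, Xc) / len(X)    # 20 x 20
V = projection(G_left, 4)

# (2D)2PCA feature: V.T @ A @ W compresses each 20x36 sample to 4x5
features = np.einsum('ij,njk,kl->nil', V.T, Xc, W)
print(features.shape)   # (200, 4, 5)
```

The compressed 4x5 feature matrices (flattened) would then feed a downstream regressor such as the RBF neural network described in the abstract.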
Improve FREQ macroscopic freeway analysis model
DOT National Transportation Integrated Search
2008-07-01
The primary objectives of this project have been to provide technical assistance on district freeway analysis projects, enhance the FREQ model based on guidance and suggestions from Caltrans staff members, and offer three freeway analysis workshops f...
PESTAN: Pesticide Analytical Model Version 4.0 User's Guide
The principal objective of this User's Guide is to provide essential information on aspects such as model conceptualization, model theory, assumptions and limitations, determination of input parameters, analysis of results, and sensitivity analysis.
A brain-region-based meta-analysis method utilizing the Apriori algorithm.
Niu, Zhendong; Nie, Yaoxin; Zhou, Qian; Zhu, Linlin; Wei, Jieyao
2016-05-18
Brain network connectivity modeling is a crucial method for studying the brain's cognitive functions. Meta-analyses can unearth reliable results from individual studies. Meta-analytic connectivity modeling is a connectivity analysis method based on regions of interest (ROIs) which showed that meta-analyses could be used to discover brain network connectivity. In this paper, we propose a new meta-analysis method that can be used to find network connectivity models based on the Apriori algorithm, which has the potential to derive brain network connectivity models from activation information in the literature, without requiring ROIs. This method first extracts activation information from experimental studies that use cognitive tasks of the same category, and then maps the activation information to corresponding brain areas by using the automatic anatomical label atlas, after which the activation rate of these brain areas is calculated. Finally, using these brain areas, a potential brain network connectivity model is calculated based on the Apriori algorithm. The present study used this method to conduct a mining analysis on the citations in a language review article by Price (Neuroimage 62(2):816-847, 2012). The results showed that the obtained network connectivity model was consistent with that reported by Price. The proposed method is helpful to find brain network connectivity by mining the co-activation relationships among brain regions. Furthermore, results of the co-activation relationship analysis can be used as a priori knowledge for the corresponding dynamic causal modeling analysis, possibly achieving a significant dimension-reducing effect, thus increasing the efficiency of the dynamic causal modeling analysis.
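The pair-mining level of the Apriori step described above can be sketched in a few lines: each study contributes the set of brain regions it reports as activated, and pairs whose co-activation support across studies exceeds a threshold are kept as candidate connections. Region names, study sets, and the support threshold here are all illustrative, not taken from the cited review:

```python
from itertools import combinations
from collections import Counter

# Each study contributes the set of regions it reports as activated
# (illustrative labels: IFG, STG, MTG, SMA)
studies = [
    {"IFG", "STG", "MTG"},
    {"IFG", "STG", "SMA"},
    {"IFG", "STG"},
    {"STG", "MTG"},
    {"IFG", "SMA"},
]

min_support = 0.4   # a pair must co-occur in at least 40% of studies

# Apriori, level 2: count co-activated region pairs across studies
pair_counts = Counter()
for s in studies:
    for pair in combinations(sorted(s), 2):
        pair_counts[pair] += 1

n = len(studies)
frequent_pairs = {p: c / n for p, c in pair_counts.items() if c / n >= min_support}
print(frequent_pairs)
```

A full Apriori implementation also generates candidate triples only from frequent pairs (the anti-monotonicity pruning step); the surviving itemsets form the edges of the proposed connectivity model.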
2009-12-18
[Fragmented search-result snippets only: effects that cannot be detected with univariate techniques require multivariate analysis instead (Kamitani and Tong [2005]); machine learning for time series analysis; the historical record of DBNs can be traced back to Dean and Kanazawa [1988] and Dean and Wellman [1991]. Keywords: Hidden Process Models, probabilistic time series modeling, functional Magnetic Resonance Imaging]
Bruce G. Marcot; Peter H. Singleton; Nathan H. Schumaker
2015-01-01
Sensitivity analyses (determination of how prediction variables affect response variables) of individual-based models (IBMs) are few but important to the interpretation of model output. We present a sensitivity analysis of a spatially explicit IBM (HexSim) of a threatened species, the Northern Spotted Owl (NSO; Strix occidentalis caurina) in Washington...
Fusion of Hard and Soft Information in Nonparametric Density Estimation
2015-06-10
Density estimation is needed for generation of input densities to simulation and stochastic optimization models, in analysis of simulation output, and when instantiating probability models; the generation of probability densities for input random variables is an essential step in simulation analysis and stochastic optimization. We adopt a constrained maximum...
Teaching Tip: Using Activity Diagrams to Model Systems Analysis Techniques: Teaching What We Preach
ERIC Educational Resources Information Center
Lending, Diane; May, Jeffrey
2013-01-01
Activity diagrams are used in Systems Analysis and Design classes as a visual tool to model the business processes of "as-is" and "to-be" systems. This paper presents the idea of using these same activity diagrams in the classroom to model the actual processes (practices and techniques) of Systems Analysis and Design. This tip…
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
Volume VIII of the documentation for the Phase I Data Analysis Task performed in support of the current Regional Flow Model, Transport Model, and Risk Assessment for the Nevada Test Site Underground Test Area Subproject contains the risk assessment documentation. Because of the size and complexity of the model area, a considerable quantity of data was collected and analyzed in support of the modeling efforts. The data analysis task was consequently broken into eight subtasks, and descriptions of each subtask's activities are contained in one of the eight volumes that comprise the Phase I Data Analysis Documentation.
AgRISTARS. Supporting research: Algorithms for scene modelling
NASA Technical Reports Server (NTRS)
Rassbach, M. E. (Principal Investigator)
1982-01-01
The requirements for a comprehensive analysis of LANDSAT or other visual data scenes are defined. The development of a general model of a scene and a computer algorithm for finding the particular model for a given scene is discussed. The modelling system includes a boundary analysis subsystem, which detects all the boundaries and lines in the image and builds a boundary graph; a continuous variation analysis subsystem, which finds gradual variations not well approximated by a boundary structure; and a miscellaneous features analysis, which includes texture, line parallelism, etc. The noise reduction capabilities of this method and its use in image rectification and registration are discussed.
Scenario Analysis: An Integrative Study and Guide to Implementation in the United States Air Force
1994-09-01
[Table-of-contents fragments only: Environmental Analysis; Classifications of Environments; Characteristics of Environments; Components of the Environmental Analysis Process; Forecasting; Model of the Industry Environment; Model of the Macroenvironment]
An Extension of Dominance Analysis to Canonical Correlation Analysis
ERIC Educational Resources Information Center
Huo, Yan; Budescu, David V.
2009-01-01
Dominance analysis (Budescu, 1993) offers a general framework for determination of relative importance of predictors in univariate and multivariate multiple regression models. This approach relies on pairwise comparisons of the contribution of predictors in all relevant subset models. In this article we extend dominance analysis to canonical…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aurich, Maike K.; Fleming, Ronan M. T.; Thiele, Ines
Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes, in a step-wise manner, the workflow of data integration and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and stratification thereof into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. In conclusion, this computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community.
Cook, James P; Mahajan, Anubha; Morris, Andrew P
2017-02-01
Linear mixed models are increasingly used for the analysis of genome-wide association studies (GWAS) of binary phenotypes because they can efficiently and robustly account for population stratification and relatedness through inclusion of random effects for a genetic relationship matrix. However, the utility of linear (mixed) models in the context of meta-analysis of GWAS of binary phenotypes has not been previously explored. In this investigation, we present simulations to compare the performance of linear and logistic regression models under alternative weighting schemes in a fixed-effects meta-analysis framework, considering designs that incorporate variable case-control imbalance, confounding factors and population stratification. Our results demonstrate that linear models can be used for meta-analysis of GWAS of binary phenotypes, without loss of power, even in the presence of extreme case-control imbalance, provided that one of the following schemes is used: (i) effective sample size weighting of Z-scores or (ii) inverse-variance weighting of allelic effect sizes after conversion onto the log-odds scale. Our conclusions thus provide essential recommendations for the development of robust protocols for meta-analysis of binary phenotypes with linear models.
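The two recommended weighting schemes can be written out directly. The per-study summary statistics below are invented for illustration; scheme (i) is the usual sample-size-weighted Z-score combination, and scheme (ii) is fixed-effects inverse-variance meta-analysis of effects on the log-odds scale:

```python
import numpy as np

# Per-study summary statistics for one variant (illustrative numbers)
beta = np.array([0.12, 0.08, 0.20])          # effects on the log-odds scale
se   = np.array([0.05, 0.03, 0.10])          # standard errors
n_eff = np.array([4000.0, 9000.0, 1500.0])   # effective sample sizes

# Scheme (i): effective-sample-size weighting of Z-scores
z = beta / se
w = np.sqrt(n_eff)
z_meta = np.sum(w * z) / np.sqrt(np.sum(n_eff))

# Scheme (ii): inverse-variance weighting of allelic effect sizes
w_iv = 1.0 / se**2
beta_meta = np.sum(w_iv * beta) / np.sum(w_iv)
se_meta = np.sqrt(1.0 / np.sum(w_iv))
z_iv = beta_meta / se_meta

print(z_meta, beta_meta, z_iv)
```

For binary traits analyzed with linear models, the effective sample size is typically taken as a function of the case-control split (e.g., proportional to 4 / (1/n_cases + 1/n_controls)); the abstract's point is that either scheme preserves power even under extreme case-control imbalance.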
An introduction to Space Weather Integrated Modeling
NASA Astrophysics Data System (ADS)
Zhong, D.; Feng, X.
2012-12-01
The need for a software toolkit that integrates space weather models and data is one of many challenges we face when applying the models to space weather forecasting. To meet this challenge, we have developed Space Weather Integrated Modeling (SWIM), which is capable of analyzing and visualizing the results from a diverse set of space weather models. SWIM has a modular design and is written in Python, using NumPy, matplotlib, and the Visualization ToolKit (VTK). SWIM provides a data management module to read a variety of spacecraft data products and the specific data format of the Solar-Interplanetary Conservation Element/Solution Element MHD model (SIP-CESE MHD model) for the study of solar-terrestrial phenomena. Data analysis, visualization, and graphical user interface modules are also presented in a user-friendly way to run the integrated models and visualize 2-D and 3-D data sets interactively. With these tools we can rapidly analyze model results, locally or remotely, for example by extracting data at specific locations in time-sequence data sets, plotting interplanetary magnetic field lines, multi-slicing solar wind speed, volume rendering solar wind density, animating time-sequence data sets, and comparing model results with observational data. To speed up the analysis, an in-situ visualization interface is used to support visualizing the data 'on-the-fly'. We also accelerated some critical, time-consuming analysis and visualization methods with the aid of GPUs and multi-core CPUs. We have used this tool to visualize the data of the SIP-CESE MHD model in real time, and have integrated the database model of shock arrival, the Shock Propagation Model, the Dst forecasting model, and the SIP-CESE MHD model developed by the SIGMA Weather Group at the State Key Laboratory of Space Weather/CAS.
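The data extraction and multi-slicing operations described above amount to array indexing over a gridded model output. A minimal NumPy sketch, with a purely synthetic speed cube standing in for SIP-CESE MHD output (array names and dimensions are hypothetical):

```python
import numpy as np

# Hypothetical 4-D data cube: (time, x, y, z) solar wind speed in km/s.
rng = np.random.default_rng(0)
speed = 400.0 + 50.0 * rng.random((10, 32, 32, 32))

# Multi-slicing: three orthogonal planes through the domain centre
# at the first time step.
ix = iy = iz = 16
slice_x = speed[0, ix, :, :]   # y-z plane at fixed x
slice_y = speed[0, :, iy, :]   # x-z plane at fixed y
slice_z = speed[0, :, :, iz]   # x-y plane at fixed z

# Extraction of a time series at a specific grid location, as done
# when comparing model output against in-situ spacecraft data.
series = speed[:, ix, iy, iz]

print(slice_x.shape, series.shape)
```

Each 2-D slice or 1-D series could then be passed to matplotlib or VTK for display; the indexing itself is the "extraction of data on a specific location" step.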
Papanastasiou, Giorgos; Williams, Michelle C; Kershaw, Lucy E; Dweck, Marc R; Alam, Shirjel; Mirsadraee, Saeed; Connell, Martin; Gray, Calum; MacGillivray, Tom; Newby, David E; Semple, Scott Ik
2015-02-17
Mathematical modeling of cardiovascular magnetic resonance perfusion data allows absolute quantification of myocardial blood flow. Saturation of left ventricular signal during standard contrast administration can compromise the input function used when applying these models. This saturation effect is evident during application of standard Fermi models in single bolus perfusion data. Dual bolus injection protocols have been suggested to eliminate saturation but are much less practical in the clinical setting. The distributed parameter model can also be used for absolute quantification but has not been applied in patients with coronary artery disease. We assessed whether distributed parameter modeling might be less dependent on arterial input function saturation than Fermi modeling in healthy volunteers. We validated the accuracy of each model in detecting reduced myocardial blood flow in stenotic vessels versus gold-standard invasive methods. Eight healthy subjects were scanned using a dual bolus cardiac perfusion protocol at 3T. We performed both single and dual bolus analysis of these data using the distributed parameter and Fermi models. For the dual bolus analysis, a scaled pre-bolus arterial input function was used. In single bolus analysis, the arterial input function was extracted from the main bolus. We also performed analysis of single bolus data obtained from five patients with coronary artery disease using both models, and findings were compared against independent invasive coronary angiography and fractional flow reserve. Statistical significance was defined as a two-sided P value < 0.05. Fermi models overestimated myocardial blood flow in healthy volunteers due to arterial input function saturation in single bolus analysis compared to dual bolus analysis (P < 0.05). In these volunteers, no difference in distributed parameter estimates of myocardial blood flow was observed between single and dual bolus analysis.
In patients, distributed parameter modeling was able to detect reduced myocardial blood flow at stress (<2.5 mL/min/mL of tissue) in all 12 stenotic vessels compared to only 9 for Fermi modeling. Comparison of single bolus versus dual bolus values suggests that distributed parameter modeling is less dependent on arterial input function saturation than Fermi modeling. Distributed parameter modeling showed excellent accuracy in detecting reduced myocardial blood flow in all stenotic vessels.
TRAC-PD2 posttest analysis of the CCTF Evaluation-Model Test C1-19 (Run 38). [PWR]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Motley, F.
The results of a Transient Reactor Analysis Code (TRAC) posttest analysis of the Cylindrical Core Test Facility (CCTF) Evaluation-Model Test agree very well with the results of the experiment. The good agreement obtained verifies the multidimensional analysis capability of the TRAC code. Because of the steep radial power profile, the importance of using fine noding in the core region was demonstrated (as compared with the poorer results obtained from an earlier pretest prediction that used a coarsely noded model).
User-defined Material Model for Thermo-mechanical Progressive Failure Analysis
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.
2008-01-01
Previously a user-defined material model for orthotropic bimodulus materials was developed for linear and nonlinear stress analysis of composite structures using either shell or solid finite elements within a nonlinear finite element analysis tool. Extensions of this user-defined material model to thermo-mechanical progressive failure analysis are described, and the required input data are documented. The extensions include providing for temperature-dependent material properties, archival of the elastic strains, and a thermal strain calculation for materials exhibiting a stress-free temperature.
NASA Technical Reports Server (NTRS)
Hodges, Robert V.; Nixon, Mark W.; Rehfield, Lawrence W.
1987-01-01
A methodology was developed for the structural analysis of composite rotor blades. This coupled-beam analysis is relatively simple to use compared with alternative analysis techniques. The beam analysis was developed for thin-wall single-cell rotor structures and includes the effects of elastic coupling. This paper demonstrates the effectiveness of the new composite-beam analysis method through comparison of its results with those of an established baseline analysis technique. The baseline analysis is an MSC/NASTRAN finite-element model built up from anisotropic shell elements. Deformations are compared for three linear static load cases of centrifugal force at design rotor speed, applied torque, and lift for an ideal rotor in hover. A D-spar designed to twist under axial loading is the subject of the analysis. Results indicate the coupled-beam analysis is well within engineering accuracy.
Evaluation of the C* Model for Addressing Short Fatigue Crack Growth
2008-10-01
FASTRAN/CGAP, the internal solution that evaluates crack growth independently in the thickness and width directions, was used. The initial crack size used for all the models is 77 μm, as per [8]. (Figure 17: Comparison of crack growth analysis using the modified El Haddad approach, with a0 = 0.05 mm and a0 = 0.103 mm.)
How Stationary Are the Internal Tides in a High-Resolution Global Ocean Circulation Model?
2014-05-12
…Egbert et al., 1994] and that the model global internal tide amplitudes compare well with an altimetric-based tidal analysis [Ray and Byrne, 2010]. The … analysis [Foreman, 1977] applied to the HYCOM total SSH. We will follow Shriver et al. [2012], analyzing the tides along satellite altimeter tracks. … "spots," the comparison between the model and altimetric analysis is not as good due, in part, to two problems: errors in the model barotropic tides and …
NASA Technical Reports Server (NTRS)
Farrell, C. E.; Krauze, L. D.
1983-01-01
NASA's IDEAS computer program is a tool for interactive preliminary design and analysis of LSS (Large Space Systems). Nine analysis modules were either modified or created. These modules include the capabilities of automatic model generation, model mass properties calculation, model area calculation, nonkinematic deployment modeling, rigid-body controls analysis, RF performance prediction, subsystem properties definition, and EOS science sensor selection. For each module, a section is provided that contains technical information, user instructions, and programmer documentation.
A modeling analysis program for the JPL table mountain Io sodium cloud data
NASA Technical Reports Server (NTRS)
Smyth, W. H.; Goldberg, B. A.
1984-01-01
A detailed review of 110 of the 263 Region B/C images of the 1981 data set is undertaken and a preliminary assessment of 39 images of the 1976-79 data set is presented. The basic spatial characteristics of these images are discussed. Modeling analysis of these images after further data processing will provide useful information about Io and the planetary magnetosphere. Plans for data processing and modeling analysis are outlined. Results of very preliminary modeling activities are presented.
Representing Uncertainty on Model Analysis Plots
ERIC Educational Resources Information Center
Smith, Trevor I.
2016-01-01
Model analysis provides a mechanism for representing student learning as measured by standard multiple-choice surveys. The model plot contains information regarding both how likely students in a particular class are to choose the correct answer and how likely they are to choose an answer consistent with a well-documented conceptual model.…
Background / Question / Methods Planning for the recovery of threatened species is increasingly informed by spatially-explicit population models. However, using simulation model results to guide land management decisions can be difficult due to the volume and complexity of model...
Infrastructure Analysis Tools: A Focus on Cash Flow Analysis (Presentation)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Melaina, M.; Penev, M.
2012-09-01
NREL has developed and maintains a variety of infrastructure analysis models for the U.S. Department of Energy. Business case analysis has recently been added to this tool set. This presentation focuses on cash flow analysis. Cash flows depend upon infrastructure costs, optimized spatially and temporally, and assumptions about financing and revenue. NREL has incorporated detailed metrics on financing and incentives into the models. Next steps in modeling include continuing to collect feedback on regional/local infrastructure development activities and 'roadmap' dynamics, and incorporating consumer preference assumptions on infrastructure to provide direct feedback between vehicles and station rollout.
Bird impact analysis package for turbine engine fan blades
NASA Technical Reports Server (NTRS)
Hirschbein, M. S.
1982-01-01
A computer program has been developed to analyze the gross structural response of turbine engine fan blades subjected to bird strikes. The program couples a NASTRAN finite element model and modal analysis of a fan blade with a multi-mode bird impact analysis computer program. The impact analysis uses the NASTRAN blade model and a fluid jet model of the bird to interactively calculate blade loading during a bird strike event. The analysis package is computationally efficient, easy to use, and provides a comprehensive history of the gross structural blade response. Example cases are presented for a representative fan blade.
Assessment of environmental impacts part one. Intervention analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hipel, Keith William; Lettenmaier, Dennis P.; McLeod, A. Ian
The use of intervention analysis as a statistical method of gauging the effects of environmental changes is discussed. The Box-Jenkins model serves as the basis for the intervention analysis methodology. Environmental studies of the Aswan Dam, the South Saskatchewan River, and a forest fire near the Pipers Hole River, Canada, are included as case studies in which intervention analysis was employed. Methods of data collection for intervention analysis are found to have a significant impact on model reliability; effective data collection processes for the Box-Jenkins model are provided. (15 graphs, 27 references, 2 tables)
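A step-change intervention model of the kind described can be sketched with a simple AR(1) regression. This is a deliberately simplified stand-in for full Box-Jenkins transfer-function estimation, using synthetic data and hypothetical parameter values:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate an AR(1) series with a step intervention (e.g. a dam closure)
# at time t0: y_t = c + phi*y_{t-1} + omega*step_t + e_t.
n, t0, phi, omega = 300, 150, 0.6, -5.0
step = (np.arange(n) >= t0).astype(float)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 2.0 + phi * y[t - 1] + omega * step[t] + rng.normal(0.0, 1.0)

# Least-squares fit of the intervention model: regress y_t on a
# constant, its own lag, and the step indicator.
X = np.column_stack([np.ones(n - 1), y[:-1], step[1:]])
coef, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
c_hat, phi_hat, omega_hat = coef
print(round(phi_hat, 2), round(omega_hat, 1))
```

The immediate intervention effect is omega; the long-run level shift implied by the fitted model is omega_hat / (1 - phi_hat), which is the quantity usually reported in environmental case studies like those above.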
APPLE - An aeroelastic analysis system for turbomachines and propfans
NASA Technical Reports Server (NTRS)
Reddy, T. S. R.; Bakhle, Milind A.; Srivastava, R.; Mehmed, Oral
1992-01-01
This paper reviews aeroelastic analysis methods for propulsion elements (advanced propellers, compressors and turbines) being developed and used at NASA Lewis Research Center. These aeroelastic models include both structural and aerodynamic components. The structural models include the typical section model, the beam model with and without disk flexibility, and the finite element blade model with plate bending elements. The aerodynamic models are based on the solution of equations ranging from the two-dimensional linear potential equation for a cascade to the three-dimensional Euler equations for multi-blade configurations. Typical results are presented for each aeroelastic model. Suggestions for further research are indicated. All the available aeroelastic models and analysis methods are being incorporated into a unified computer program named APPLE (Aeroelasticity Program for Propulsion at LEwis).
Haitsma, Jack J.; Furmli, Suleiman; Masoom, Hussain; Liu, Mingyao; Imai, Yumiko; Slutsky, Arthur S.; Beyene, Joseph; Greenwood, Celia M. T.; dos Santos, Claudia
2012-01-01
Objectives To perform a meta-analysis of gene expression microarray data from animal studies of lung injury, and to identify an injury-specific gene expression signature capable of predicting the development of lung injury in humans. Methods We performed a microarray meta-analysis using 77 microarray chips across six platforms, two species and different animal lung injury models exposed to lung injury with or without mechanical ventilation. Individual gene chips were classified and grouped based on the strategy used to induce lung injury. Effect size (change in gene expression) was calculated between non-injurious and injurious conditions comparing two main strategies to pool chips: (1) one-hit and (2) two-hit lung injury models. A random effects model was used to integrate individual effect sizes calculated from each experiment. Classification models were built using the gene expression signatures generated by the meta-analysis to predict the development of lung injury in human lung transplant recipients. Results Two injury-specific lists of differentially expressed genes generated from our meta-analysis of lung injury models were validated using external data sets and prospective data from animal models of ventilator-induced lung injury (VILI). Pathway analysis of gene sets revealed that both new and previously implicated VILI-related pathways are enriched with differentially regulated genes. A classification model based on gene expression signatures identified in animal models of lung injury predicted the development of primary graft failure (PGF) in lung transplant recipients with greater than 80% accuracy based upon injury profiles from transplant donors. We also found that better classifier performance can be achieved by using meta-analysis to identify differentially expressed genes than by using single-study-based differential analysis.
Conclusion Taken together, our data suggest that microarray analysis of gene expression data allows for the detection of “injury” gene predictors that can classify lung injury samples and identify patients at risk for clinically relevant lung injury complications. PMID:23071521
NASA Technical Reports Server (NTRS)
Sun, C. T.; Yoon, K. J.
1990-01-01
A one-parameter plasticity model was shown to adequately describe the orthotropic plastic deformation of AS4/PEEK (APC-2) unidirectional thermoplastic composite. This model was verified further for unidirectional and laminated composite panels with and without a hole. The nonlinear stress-strain relations were measured and compared with those predicted by the finite element analysis using the one-parameter elastic-plastic constitutive model. The results show that the one-parameter orthotropic plasticity model is suitable for the analysis of elastic-plastic deformation of AS4/PEEK composite laminates.
Elastic-plastic analysis of AS4/PEEK composite laminate using a one-parameter plasticity model
NASA Technical Reports Server (NTRS)
Sun, C. T.; Yoon, K. J.
1992-01-01
A one-parameter plasticity model was shown to adequately describe the plastic deformation of AS4/PEEK (APC-2) unidirectional thermoplastic composite. This model was verified further for unidirectional and laminated composite panels with and without a hole. The elastic-plastic stress-strain relations of coupon specimens were measured and compared with those predicted by the finite element analysis using the one-parameter plasticity model. The results show that the one-parameter plasticity model is suitable for the analysis of elastic-plastic deformation of AS4/PEEK composite laminates.
A kinematic model to assess spinal motion during walking.
Konz, Regina J; Fatone, Stefania; Stine, Rebecca L; Ganju, Aruna; Gard, Steven A; Ondra, Stephen L
2006-11-15
A 3-dimensional multi-segment kinematic spine model was developed for noninvasive analysis of spinal motion during walking. Preliminary data from able-bodied ambulators were collected and analyzed using the model. Neither the spine's role during walking nor the effect of surgical spinal stabilization on gait is fully understood. Typically, gait analysis models disregard the spine entirely or regard it as a single rigid structure. Data on regional spinal movements, in conjunction with lower limb data, associated with walking are scarce. KinTrak software (Motion Analysis Corp., Santa Rosa, CA) was used to create a biomechanical model for analysis of 3-dimensional regional spinal movements. Measuring known angles from a mechanical model and comparing them to the calculated angles validated the kinematic model. Spine motion data were collected from 10 able-bodied adults walking at 5 self-selected speeds. These results were compared to data reported in the literature. The uniaxial angles measured on the mechanical model were within 5 degrees of the calculated kinematic model angles, and the coupled angles were within 2 degrees. Regional spine kinematics from able-bodied subjects calculated with this model compared well to data reported by other authors. A multi-segment kinematic spine model has been developed and validated for analysis of spinal motion during walking. By understanding the spine's role during ambulation and the cause-and-effect relationship between spine motion and lower limb motion, preoperative planning may be augmented to restore normal alignment and balance with minimal negative effects on walking.
Global Sensitivity Analysis for Process Identification under Model Uncertainty
NASA Astrophysics Data System (ADS)
Ye, M.; Dai, H.; Walker, A. P.; Shi, L.; Yang, J.
2015-12-01
The environmental system consists of various physical, chemical, and biological processes, and environmental models are built to simulate these processes and their interactions. For model building, improvement, and validation, it is necessary to identify important processes so that limited resources can be used to better characterize them. While global sensitivity analysis has been widely used to identify important processes, the identification is usually based on a deterministic process conceptualization that uses a single model to represent each process. Environmental systems are complex, however, and it often happens that a single process can be simulated by multiple alternative models. Ignoring this model uncertainty may bias the identification, in that processes identified as important may not be so in the real world. This study addresses the problem by developing a new method of global sensitivity analysis for process identification. The new method is based on the concepts of Sobol sensitivity analysis and model averaging. Analogous to the Sobol analysis for identifying important parameters, our method evaluates the change in variance when a process is fixed at each of its alternative conceptualizations. The variance accounts for both parametric and model uncertainty through model averaging. The method is demonstrated using a synthetic groundwater modeling study that considers a recharge process and a parameterization process, each with two alternative models. Important processes of groundwater flow and transport are evaluated using the new method. The method is mathematically general and can be applied to a wide range of environmental problems.
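The core idea, the change in variance when a process is fixed at one of its alternative conceptualizations, can be illustrated with a toy Monte Carlo calculation. The two "recharge" models, the uniform parameter, and the equal model probabilities below are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 20000

# Two alternative conceptualizations of a "recharge" process, each a
# different function of an uncertain parameter theta (hypothetical).
theta = rng.uniform(0.0, 1.0, n)
recharge_models = [lambda th: 2.0 * th, lambda th: th ** 2]
prior = np.array([0.5, 0.5])          # model probabilities

# Model output under each conceptualization (same parameter samples).
outputs = np.array([m(theta) for m in recharge_models])

# Model-averaged total variance: within-model (parametric) part plus
# between-model part.
mean_k = outputs.mean(axis=1)
var_k = outputs.var(axis=1)
total_mean = prior @ mean_k
total_var = prior @ var_k + prior @ (mean_k - total_mean) ** 2

# Sobol-like process index: fraction of total variance removed, on
# average, by fixing the process conceptualization at one model.
expected_within = prior @ var_k
process_index = 1.0 - expected_within / total_var
print(round(process_index, 3))
```

For these two toy models the between-model variance is 1/9 and the averaged within-model variance is 19/90, so the index is about 0.34; a value near zero would mean the choice of conceptualization hardly matters.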
Modeling abundance using multinomial N-mixture models
Royle, Andy
2016-01-01
Multinomial N-mixture models are a generalization of the binomial N-mixture models described in Chapter 6 to allow for more complex and informative sampling protocols beyond simple counts. Many commonly used protocols such as multiple observer sampling, removal sampling, and capture-recapture produce a multivariate count frequency that has a multinomial distribution and for which multinomial N-mixture models can be developed. Such protocols typically result in more precise estimates than binomial mixture models because they provide direct information about parameters of the observation process. We demonstrate the analysis of these models in BUGS using several distinct formulations that afford great flexibility in the types of models that can be developed, and we demonstrate likelihood analysis using the unmarked package. Spatially stratified capture-recapture models are one class of models that fall into the multinomial N-mixture framework, and we discuss analysis of stratified versions of classical models such as model Mb, Mh and other classes of models that are only possible to describe within the multinomial N-mixture framework.
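For removal sampling, one protocol named above, the multinomial cell probabilities and the Poisson-mixture likelihood for a single site can be sketched directly (a minimal Python illustration, not the BUGS or unmarked implementations):

```python
import math
import numpy as np

def removal_cell_probs(p, n_occasions):
    """Multinomial cell probabilities for removal sampling:
    pi_j = p * (1-p)**(j-1), plus the probability of never being caught."""
    pi = np.array([p * (1 - p) ** (j - 1) for j in range(1, n_occasions + 1)])
    return pi, 1.0 - pi.sum()

def site_likelihood(counts, p, lam, n_max=200):
    """Poisson(lam)-multinomial mixture likelihood for one site's
    removal counts, summing over the unknown abundance N."""
    pi, p0 = removal_cell_probs(p, len(counts))
    total = sum(counts)
    lik = 0.0
    for n in range(total, n_max + 1):
        # log multinomial coefficient n! / (c_1! ... c_J! (n-total)!)
        log_mn = math.lgamma(n + 1) - math.lgamma(n - total + 1)
        log_mn -= sum(math.lgamma(c + 1) for c in counts)
        log_mn += sum(c * math.log(q) for c, q in zip(counts, pi))
        log_mn += (n - total) * math.log(p0)
        log_pois = n * math.log(lam) - lam - math.lgamma(n + 1)
        lik += math.exp(log_mn + log_pois)
    return lik
```

A useful check on this construction: by Poisson thinning, the observed counts are marginally independent Poisson with means lam * pi_j, so the mixture likelihood must equal the product of those Poisson probabilities.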
Ensemble habitat mapping of invasive plant species
Stohlgren, T.J.; Ma, P.; Kumar, S.; Rocca, M.; Morisette, J.T.; Jarnevich, C.S.; Benson, N.
2010-01-01
Ensemble species distribution models combine the strengths of several species environmental matching models, while minimizing the weakness of any one model. Ensemble models may be particularly useful in risk analysis of recently arrived, harmful invasive species because species may not yet have spread to all suitable habitats, leaving species-environment relationships difficult to determine. We tested five individual models (logistic regression, boosted regression trees, random forest, multivariate adaptive regression splines (MARS), and maximum entropy model or Maxent) and ensemble modeling for selected nonnative plant species in Yellowstone and Grand Teton National Parks, Wyoming; Sequoia and Kings Canyon National Parks, California, and areas of interior Alaska. The models are based on field data provided by the park staffs, combined with topographic, climatic, and vegetation predictors derived from satellite data. For the four invasive plant species tested, ensemble models were the only models that ranked in the top three models for both field validation and test data. Ensemble models may be more robust than individual species-environment matching models for risk analysis. © 2010 Society for Risk Analysis.
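A common way to build such an ensemble is a performance-weighted average of the individual models' suitability predictions. The sketch below uses hypothetical predictions and AUC weights; the paper's actual weighting scheme may differ:

```python
import numpy as np

# Hypothetical habitat-suitability predictions (probabilities) from five
# individual models at four locations: rows = models, cols = locations.
preds = np.array([
    [0.90, 0.20, 0.60, 0.10],   # logistic regression
    [0.80, 0.30, 0.70, 0.20],   # boosted regression trees
    [0.85, 0.25, 0.65, 0.15],   # random forest
    [0.70, 0.40, 0.50, 0.30],   # MARS
    [0.75, 0.35, 0.55, 0.25],   # Maxent
])

# Hold-out AUC scores (hypothetical) used as ensemble weights, so that
# better-validated models contribute more to the consensus map.
auc = np.array([0.80, 0.85, 0.90, 0.75, 0.82])
w = auc / auc.sum()

ensemble = w @ preds          # weighted mean suitability per location
print(np.round(ensemble, 3))
```

The weighted mean damps the idiosyncratic errors of any single model, which is the robustness property the abstract highlights for risk analysis.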
Methods for assessing the stability of slopes during earthquakes-A retrospective
Jibson, R.W.
2011-01-01
During the twentieth century, several methods to assess the stability of slopes during earthquakes were developed. Pseudostatic analysis was the earliest method; it involved simply adding a permanent body force representing the earthquake shaking to a static limit-equilibrium analysis. Stress-deformation analysis, a later development, involved much more complex modeling of slopes using a mesh in which the internal stresses and strains within elements are computed based on the applied external loads, including gravity and seismic loads. Stress-deformation analysis provided the most realistic model of slope behavior, but it is very complex and requires a high density of high-quality soil-property data as well as an accurate model of soil behavior. In 1965, Newmark developed a method that effectively bridges the gap between these two types of analysis. His sliding-block model is easy to apply and provides a useful index of co-seismic slope performance. Subsequent modifications to sliding-block analysis have made it applicable to a wider range of landslide types. Sliding-block analysis provides perhaps the greatest utility of all the types of analysis. It is far easier to apply than stress-deformation analysis, and it yields much more useful information than does pseudostatic analysis. © 2010.
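Newmark's sliding-block method is simple enough to sketch directly: the portion of the ground acceleration exceeding the critical (yield) acceleration is integrated twice to obtain the permanent displacement. A minimal illustration with a synthetic acceleration pulse (values are hypothetical):

```python
import numpy as np

def newmark_displacement(accel, dt, a_crit):
    """Newmark sliding-block analysis: double-integrate the portion of
    the ground acceleration exceeding the critical (yield) acceleration.
    accel and a_crit in m/s^2; returns permanent displacement in metres."""
    vel = 0.0
    disp = 0.0
    for a in accel:
        # The block slides only while it is already moving or while the
        # ground acceleration exceeds the yield acceleration.
        if vel > 0.0 or a > a_crit:
            vel += (a - a_crit) * dt
            vel = max(vel, 0.0)   # the block cannot slide upslope
            disp += vel * dt
    return disp

# Synthetic record: a 1 s pulse at 3 m/s^2, zero afterwards, against a
# critical acceleration of 1 m/s^2.
dt = 0.001
accel = np.where(np.arange(0.0, 4.0, dt) < 1.0, 3.0, 0.0)
d = newmark_displacement(accel, dt, a_crit=1.0)
```

For this pulse the block accelerates at 2 m/s^2 for 1 s (0.5 x 2 x 1^2 = 1 m), then decelerates at 1 m/s^2 from 2 m/s (2^2 / 2 = 2 m), so the analytic permanent displacement is 3 m, which the discrete integration reproduces closely.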
Advances and trends in the development of computational models for tires
NASA Technical Reports Server (NTRS)
Noor, A. K.; Tanner, J. A.
1985-01-01
Status and some recent developments of computational models for tires are summarized. Discussion focuses on a number of aspects of tire modeling and analysis including: tire materials and their characterization; evolution of tire models; characteristics of effective finite element models for analyzing tires; analysis needs for tires; and impact of the advances made in finite element technology, computational algorithms, and new computing systems on tire modeling and analysis. An initial set of benchmark problems has been proposed in concert with the U.S. tire industry. Extensive sets of experimental data will be collected for these problems and used for evaluating and validating different tire models. Also, the new Aircraft Landing Dynamics Facility (ALDF) at NASA Langley Research Center is described.
Logical Modeling and Dynamical Analysis of Cellular Networks
Abou-Jaoudé, Wassim; Traynard, Pauline; Monteiro, Pedro T.; Saez-Rodriguez, Julio; Helikar, Tomáš; Thieffry, Denis; Chaouiya, Claudine
2016-01-01
The logical (or logic) formalism is increasingly used to model regulatory and signaling networks. Complementing these applications, several groups contributed various methods and tools to support the definition and analysis of logical models. After an introduction to the logical modeling framework and to several of its variants, we review here a number of recent methodological advances to ease the analysis of large and intricate networks. In particular, we survey approaches to determine model attractors and their reachability properties, to assess the dynamical impact of variations of external signals, and to consistently reduce large models. To illustrate these developments, we further consider several published logical models for two important biological processes, namely the differentiation of T helper cells and the control of mammalian cell cycle. PMID:27303434
Energy and technology review: Engineering modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cabayan, H.S.; Goudreau, G.L.; Ziolkowski, R.W.
1986-10-01
This report presents information concerning: Modeling Canonical Problems in Electromagnetic Coupling Through Apertures; Finite-Element Codes for Computing Electrostatic Fields; Finite-Element Modeling of Electromagnetic Phenomena; Modeling Microwave-Pulse Compression in a Resonant Cavity; Lagrangian Finite-Element Analysis of Penetration Mechanics; Crashworthiness Engineering; Computer Modeling of Metal-Forming Processes; Thermal-Mechanical Modeling of Tungsten Arc Welding; Modeling Air Breakdown Induced by Electromagnetic Fields; Iterative Techniques for Solving Boltzmann's Equations for p-Type Semiconductors; Semiconductor Modeling; and Improved Numerical-Solution Techniques in Large-Scale Stress Analysis.
Analysis of stress-strain relationships in silicon ribbon
NASA Technical Reports Server (NTRS)
Dillon, O. W., Jr.
1984-01-01
An analysis of stress-strain relationships in silicon ribbon is presented. A model to represent the entire process, a dynamical Transit Analysis, is developed. It is found that knowledge of past strain history is significant in modeling activities.
Dynamic Blowout Risk Analysis Using Loss Functions.
Abimbola, Majeed; Khan, Faisal
2018-02-01
Most risk analysis approaches are static, failing to capture evolving conditions. Blowout, the most feared accident during a drilling operation, is a complex and dynamic event. Traditional risk analysis methods are useful in the early design stage of a drilling operation but fall short during evolving operational decision making. A new dynamic risk analysis approach is presented to capture evolving situations through dynamic probability and consequence models. The dynamic consequence models, the focus of this study, are developed in terms of loss functions. These models are subsequently integrated with the probability to estimate operational risk, providing a real-time risk analysis. The real-time evolving situation is considered dependent on the changing bottom-hole pressure as drilling progresses. The application of the methodology and models is demonstrated with a case study of an offshore drilling operation evolving to a blowout. © 2017 Society for Risk Analysis.
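The probability-times-loss structure of such a dynamic risk estimate can be sketched with assumed functional forms. The quadratic loss and logistic kick-probability below are illustrative choices, not the models developed in the study:

```python
import numpy as np

def quadratic_loss(p_bh, p_target, scale):
    """Loss grows with the squared deviation of bottom-hole pressure
    from the safe operating target (hypothetical loss function)."""
    return scale * (p_bh - p_target) ** 2

def kick_probability(p_bh, p_pore, steepness=0.05):
    """Logistic model for kick probability as bottom-hole pressure
    drops below pore pressure (assumed functional form)."""
    return 1.0 / (1.0 + np.exp(steepness * (p_bh - p_pore)))

# Dynamic risk profile as drilling progresses and pressure declines.
p_bh = np.linspace(5200.0, 4800.0, 5)     # psi, hypothetical trajectory
risk = kick_probability(p_bh, p_pore=5000.0) * quadratic_loss(p_bh, 5200.0, 1e-3)
```

Re-evaluating `risk` each time a new bottom-hole pressure reading arrives is the real-time updating idea: both the event probability and the consequence grow as conditions deteriorate.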
Application of structured analysis to a telerobotic system
NASA Technical Reports Server (NTRS)
Dashman, Eric; Mclin, David; Harrison, F. W.; Soloway, Donald; Young, Steven
1990-01-01
The analysis and evaluation of a multiple arm telerobotic research and demonstration system developed by the NASA Intelligent Systems Research Laboratory (ISRL) is described. Structured analysis techniques were used to develop a detailed requirements model of an existing telerobotic testbed. Performance models generated during this process were used to further evaluate the total system. A commercial CASE tool called Teamwork was used to carry out the structured analysis and development of the functional requirements model. A structured analysis and design process using the ISRL telerobotic system as a model is described. Evaluation of this system focused on the identification of bottlenecks in this implementation. The results demonstrate that the use of structured methods and analysis tools can give useful performance information early in a design cycle. This information can be used to ensure that the proposed system meets its design requirements before it is built.
Pasta, D J; Taylor, J L; Henning, J M
1999-01-01
Decision-analytic models are frequently used to evaluate the relative costs and benefits of alternative therapeutic strategies for health care. Various types of sensitivity analysis are used to evaluate the uncertainty inherent in the models. Although probabilistic sensitivity analysis is more difficult theoretically and computationally, the results can be much more powerful and useful than deterministic sensitivity analysis. The authors show how a Monte Carlo simulation can be implemented using standard software to perform a probabilistic sensitivity analysis incorporating the bootstrap. The method is applied to a decision-analytic model evaluating the cost-effectiveness of Helicobacter pylori eradication. The necessary steps are straightforward and are described in detail. The use of the bootstrap avoids certain difficulties encountered with theoretical distributions. The probabilistic sensitivity analysis provided insights into the decision-analytic model beyond the traditional base-case and deterministic sensitivity analyses and should become the standard method for assessing sensitivity.
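The bootstrap-within-Monte-Carlo idea can be sketched generically: one input (here an incremental cost) is resampled from patient-level data rather than drawn from a theoretical distribution, while other inputs remain parametric. All numbers are hypothetical and unrelated to the H. pylori model:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical patient-level costs for the new strategy; resampled with
# the bootstrap instead of assuming a theoretical distribution.
costs_observed = rng.gamma(shape=2.0, scale=500.0, size=80)

n_sim = 5000
icers = np.empty(n_sim)
for i in range(n_sim):
    # Bootstrap the mean incremental cost from the observed data.
    boot = rng.choice(costs_observed, size=costs_observed.size, replace=True)
    d_cost = boot.mean() - 600.0          # comparator mean cost (assumed)
    # Effectiveness difference drawn from a parametric distribution.
    d_eff = rng.normal(0.5, 0.1)          # QALYs gained (assumed)
    icers[i] = d_cost / d_eff

# Probabilistic output: central estimate and a 95% uncertainty interval,
# rather than the single point of a deterministic base case.
lo, hi = np.percentile(icers, [2.5, 97.5])
print(round(float(np.median(icers))), round(float(lo)), round(float(hi)))
```

The resulting distribution of cost-effectiveness ratios is the "much more powerful" output the abstract contrasts with one-way deterministic sensitivity analysis.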
A Flexible Hierarchical Bayesian Modeling Technique for Risk Analysis of Major Accidents.
Yu, Hongyang; Khan, Faisal; Veitch, Brian
2017-09-01
Safety analysis of rare events with potentially catastrophic consequences is challenged by data scarcity and uncertainty. Traditional causation-based approaches, such as fault tree and event tree (used to model rare event), suffer from a number of weaknesses. These include the static structure of the event causation, lack of event occurrence data, and need for reliable prior information. In this study, a new hierarchical Bayesian modeling based technique is proposed to overcome these drawbacks. The proposed technique can be used as a flexible technique for risk analysis of major accidents. It enables both forward and backward analysis in quantitative reasoning and the treatment of interdependence among the model parameters. Source-to-source variability in data sources is also taken into account through a robust probabilistic safety analysis. The applicability of the proposed technique has been demonstrated through a case study in marine and offshore industry. © 2017 Society for Risk Analysis.
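The Bayesian machinery underlying techniques like the one above can be illustrated with a much-simplified conjugate update. The sketch below pools hypothetical event counts from three data sources with a Beta-Binomial model; the counts and prior are invented, and the paper's full hierarchical model would additionally place a hyperprior over the Beta parameters to capture source-to-source variability.

```python
# Hypothetical near-miss counts from three data sources: (k events, n observations).
# A full hierarchical model would put a hyperprior over (a, b); here we show only
# the conjugate Beta-Binomial update that such models build on.
sources = [(2, 120), (5, 300), (1, 80)]

def beta_binomial_update(a, b, data):
    """Posterior Beta(a', b') for the event probability after all sources."""
    for k, n in data:
        a += k
        b += n - k
    return a, b

a0, b0 = 1.0, 1.0            # uniform Beta(1, 1) prior
a, b = beta_binomial_update(a0, b0, sources)
posterior_mean = a / (a + b)
print(f"posterior Beta({a:.0f}, {b:.0f}), mean = {posterior_mean:.4f}")
```

Forward analysis here is the prior-to-posterior update; backward analysis in the full framework would condition on an observed accident to revise beliefs about the contributing parameters.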
Integrated Data Analysis for Fusion: A Bayesian Tutorial for Fusion Diagnosticians
NASA Astrophysics Data System (ADS)
Dinklage, Andreas; Dreier, Heiko; Fischer, Rainer; Gori, Silvio; Preuss, Roland; Toussaint, Udo von
2008-03-01
Integrated Data Analysis (IDA) offers a unified way of combining information relevant to fusion experiments. Thereby, IDA meets with typical issues arising in fusion data analysis. In IDA, all information is consistently formulated as probability density functions quantifying uncertainties in the analysis within the Bayesian probability theory. For a single diagnostic, IDA allows the identification of faulty measurements and improvements in the setup. For a set of diagnostics, IDA gives joint error distributions allowing the comparison and integration of different diagnostics results. Validation of physics models can be performed by model comparison techniques. Typical data analysis applications benefit from IDA capabilities of nonlinear error propagation, the inclusion of systematic effects and the comparison of different physics models. Applications range from outlier detection, background discrimination, model assessment and design of diagnostics. In order to cope with next step fusion device requirements, appropriate techniques are explored for fast analysis applications.
Recent literature on structural modeling, identification, and analysis
NASA Technical Reports Server (NTRS)
Craig, Roy R., Jr.
1990-01-01
The literature on the mathematical modeling of large space structures is first reviewed, with attention given to continuum models, model order reduction, substructuring, and computational techniques. System identification and mode verification are then discussed with reference to the verification of mathematical models of large space structures. In connection with analysis, the paper surveys recent research on eigensolvers and dynamic response solvers for large-order finite-element-based models.
An Analysis of the Navy’s Voluntary Education Program
2007-03-01
Contents (extract): Naval Analysis VOLED Study (Data; Statistical Models); Employer Financed General Training (Data; Statistical Model); and a further section (Data; Statistical Model; Findings).
Speed estimation for air quality analysis.
DOT National Transportation Integrated Search
2005-05-01
Average speed is an essential input to the air quality analysis model MOBILE6 for emission factor calculation. Traditionally, speed is obtained from travel demand models. However, such models are not usually calibrated to speeds. Furthermore, for rur...
NASA Technical Reports Server (NTRS)
Johnston, John D.; Howard, Joseph M.; Mosier, Gary E.; Parrish, Keith A.; McGinnis, Mark A.; Bluth, Marcel; Kim, Kevin; Ha, Kong Q.
2004-01-01
The James Webb Space Telescope (JWST) is a large, infrared-optimized space telescope scheduled for launch in 2011. This is a continuation of a series of papers on modeling activities for JWST. The structural-thermal-optical (STOP) analysis process is used to predict the effect of thermal distortion on optical performance. The benchmark STOP analysis for JWST assesses the effect of an observatory slew on wavefront error. Temperatures predicted using geometric and thermal math models are mapped to a structural finite element model in order to predict thermally induced deformations. Motions and deformations at optical surfaces are then input to optical models, and optical performance is predicted using either an optical ray trace or a linear optical analysis tool. In addition to baseline performance predictions, a process for performing sensitivity studies to assess modeling uncertainties is described.
Automatic network coupling analysis for dynamical systems based on detailed kinetic models.
Lebiedz, Dirk; Kammerer, Julia; Brandt-Pollmann, Ulrich
2005-10-01
We introduce a numerical complexity reduction method for the automatic identification and analysis of dynamic network decompositions in (bio)chemical kinetics based on error-controlled computation of a minimal model dimension represented by the number of (locally) active dynamical modes. Our algorithm exploits a generalized sensitivity analysis along state trajectories and subsequent singular value decomposition of sensitivity matrices for the identification of these dominant dynamical modes. It allows for a dynamic coupling analysis of (bio)chemical species in kinetic models that can be exploited for the piecewise computation of a minimal model on small time intervals and offers valuable functional insight into highly nonlinear reaction mechanisms and network dynamics. We present results for the identification of network decompositions in a simple oscillatory chemical reaction, time scale separation based model reduction in a Michaelis-Menten enzyme system and network decomposition of a detailed model for the oscillatory peroxidase-oxidase enzyme system.
Information Extraction for System-Software Safety Analysis: Calendar Year 2007 Year-End Report
NASA Technical Reports Server (NTRS)
Malin, Jane T.
2008-01-01
This annual report describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis on the models to identify possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations; 4) perform discrete-time-based simulation on the models to investigate scenarios where these paths may play a role in failures and mishaps; and 5) identify resulting candidate scenarios for software integration testing. This paper describes new challenges in a NASA abort system case, and enhancements made to develop the integrated tool set.
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Malley, Daniel; Vesselinov, Velimir V.
MADSpython (Model analysis and decision support tools in Python) is a Python code that streamlines the use of data and models for analysis and decision support with the code MADS. MADS is open-source code developed at LANL and written in C/C++ (MADS; http://mads.lanl.gov; LA-CC-11-035). MADS can work with external models of arbitrary complexity as well as built-in models of flow and transport in porous media. The Python scripts in MADSpython facilitate the generation of the input and output files needed by MADS, as well as by the external simulators, which include FEHM and PFLOTRAN. MADSpython enables a number of data- and model-based analyses, including model calibration, sensitivity analysis, uncertainty quantification, and decision analysis. MADSpython will be released under the GPL v3 license and distributed as a Git repo at gitlab.com and github.com. The MADSpython manual and documentation will be posted at http://madspy.lanl.gov.
Analysis of mortality data from the former USSR: age-period-cohort analysis.
Willekens, F; Scherbov, S
1992-01-01
The objective of this article is to review research on age-period-cohort (APC) analysis of mortality and to trace the effects of contemporary and historical factors on mortality change in the former USSR. Several events in USSR history have exerted a lasting influence on its people. These influences may be captured by an APC model in which the period effects measure the impact of contemporary factors and the cohort effects capture the past history of individuals that cannot be attributed to age or stage in the life cycle. APC models are extensively applied in the study of mortality. This article presents the statistical theory of APC models and shows that they belong to the family of generalized linear models. The parameters of an APC model may therefore be estimated by any package for loglinear analysis that allows for hybrid loglinear models.
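The multiplicative (loglinear) structure of an APC model can be illustrated directly. The sketch below uses made-up effects on the log scale and also makes visible the well-known identification issue: cohort is exactly period minus age, so the three sets of effects are only estimable up to a linear constraint.

```python
from math import exp

# Illustrative (made-up) effects on the log scale for a multiplicative APC model:
# log(rate) = mu + age_effect + period_effect + cohort_effect
mu = -6.0
age_eff    = {40: 0.0, 50: 0.8, 60: 1.6}
period_eff = {1970: 0.0, 1980: -0.1, 1990: -0.2}

def rate(age, period):
    cohort = period - age                   # the exact linear dependency that makes
    cohort_eff = -0.005 * (cohort - 1930)   # APC effects identifiable only up to a constraint
    return exp(mu + age_eff[age] + period_eff[period] + cohort_eff)

for age in (40, 50, 60):
    for period in (1970, 1980, 1990):
        print(age, period, round(rate(age, period) * 1e5, 2))
```

With a log link and Poisson-distributed death counts, this is exactly a generalized linear model, which is why standard loglinear software can fit it once an identifying constraint is imposed.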
MSEE: Stochastic Cognitive Linguistic Behavior Models for Semantic Sensing
2013-09-01
Contents (extract): activity recognition; a Gaussian Process Dynamic Model with Social Network Analysis (GPDM-SNA) for small human group action recognition; an extended GPDM-SNA; Section 3.2, Small Human Group Activity Modeling Based on Gaussian Process Dynamic Model and Social Network Analysis (SN-GPDM); Section 3.2.3, Gaussian Process Dynamical Model. Approved for public release; distribution unlimited.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Henager, Charles H.; Nguyen, Ba Nghiep; Kurtz, Richard J.
2016-03-31
Finite element continuum damage models (FE-CDM) have been developed to simulate and model dual-phase joints and cracked joints for improved analysis of SiC materials in nuclear environments. This report extends the analysis from the last reporting cycle by including results from dual-phase models and from cracked joint models.
Systems Engineering Models and Tools | Wind | NREL
NREL's wind systems engineering effort provides wind turbine and plant engineering and cost models for holistic system analysis, along with the turbine/component models and wind plant analysis models that the systems engineering team produces. It supports integrated modeling of wind turbines and plants and provides guidance for overall wind turbine and plant design.
Vernon, Ian; Liu, Junli; Goldstein, Michael; Rowe, James; Topping, Jen; Lindsey, Keith
2018-01-02
Many mathematical models have now been employed across every area of systems biology. These models increasingly involve large numbers of unknown parameters, have complex structure which can result in substantial evaluation time relative to the needs of the analysis, and need to be compared to observed data of various forms. The correct analysis of such models usually requires a global parameter search, over a high dimensional parameter space, that incorporates and respects the most important sources of uncertainty. This can be an extremely difficult task, but it is essential for any meaningful inference or prediction to be made about any biological system. It hence represents a fundamental challenge for the whole of systems biology. Bayesian statistical methodology for the uncertainty analysis of complex models is introduced, which is designed to address the high dimensional global parameter search problem. Bayesian emulators that mimic the systems biology model but which are extremely fast to evaluate are embedded within an iterative history match: an efficient method to search high dimensional spaces within a more formal statistical setting, while incorporating major sources of uncertainty. The approach is demonstrated via application to a model of hormonal crosstalk in Arabidopsis root development, which has 32 rate parameters, for which we identify the sets of rate parameter values that lead to acceptable matches between model output and observed trend data. The multiple insights into the model's structure that this analysis provides are discussed. The methodology is applied to a second related model, and the biological consequences of the resulting comparison, including the evaluation of gene functions, are described. Bayesian uncertainty analysis for complex models using both emulators and history matching is shown to be a powerful technique that can greatly aid the study of a large class of systems biology models.
It both provides insight into model behaviour and identifies the sets of rate parameters of interest.
Lee, Yeonok; Wu, Hulin
2012-01-01
Differential equation models are widely used for the study of natural phenomena in many fields. The study usually involves unknown factors such as initial conditions and/or parameters. It is important to investigate the impact of unknown factors (parameters and initial conditions) on model outputs in order to better understand the system the model represents. Apportioning the uncertainty (variation) of output variables of a model according to the input factors is referred to as sensitivity analysis. In this paper, we focus on the global sensitivity analysis of ordinary differential equation (ODE) models over a time period using the multivariate adaptive regression spline (MARS) as a meta model based on the concept of the variance of conditional expectation (VCE). We suggest to evaluate the VCE analytically using the MARS model structure of univariate tensor-product functions which is more computationally efficient. Our simulation studies show that the MARS model approach performs very well and helps to significantly reduce the computational cost. We present an application example of sensitivity analysis of ODE models for influenza infection to further illustrate the usefulness of the proposed method.
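The variance-of-conditional-expectation (VCE) idea at the heart of the method above can be illustrated without MARS by estimating Var[E(y|x1)]/Var(y) with simple binning. The toy model y = x1 + 0.2*x2 below is an invented example with a known analytic first-order index, not one of the paper's ODE models, and binning is a crude stand-in for the MARS meta-model.

```python
import random
import statistics

random.seed(0)

# Toy model output y = x1 + 0.2*x2 with independent uniform inputs;
# the analytic first-order index for x1 is 1/(1 + 0.2**2) ~ 0.962.
N = 50_000
x1 = [random.uniform(0, 1) for _ in range(N)]
x2 = [random.uniform(0, 1) for _ in range(N)]
y  = [a + 0.2 * b for a, b in zip(x1, x2)]

def first_order_index(x, y, bins=50):
    """Estimate Var[E(y|x)] / Var(y) by binning x (a crude stand-in
    for the MARS meta-model used in the paper)."""
    buckets = [[] for _ in range(bins)]
    for xi, yi in zip(x, y):
        buckets[min(int(xi * bins), bins - 1)].append(yi)
    cond_means = [statistics.mean(b) for b in buckets if b]
    vce = statistics.pvariance(cond_means)
    return vce / statistics.pvariance(y)

s1 = first_order_index(x1, y)
print(f"S1 estimate: {s1:.3f}")
```

The paper's contribution is to evaluate the VCE analytically from the fitted MARS basis functions instead of by brute-force conditioning, which is what makes the approach cheap over a whole time course.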
Some aspects of the analysis of geodetic strain observations in kinematic models
NASA Astrophysics Data System (ADS)
Welsch, W. M.
1986-11-01
Frequently, deformation processes are analyzed in static models. In many cases, this procedure is justified, in particular if the deformation occurring is a singular event. If, however, the deformation is a continuous process, as is the case, for instance, with recent crustal movements, the analysis in kinematic models is more commensurate with the problem because the factor "time" is considered an essential part of the model. Some special features have to be considered when analyzing geodetic strain observations in kinematic models; they are dealt with in this paper. After a brief derivation of the basic kinematic model and the kinematic strain model, the following subjects are treated: the adjustment of the pointwise velocity field and the derivation of strain-rate parameters; the fixing of the kinematic reference system as part of the geodetic datum; statistical tests of models by testing linear hypotheses; the invariance of kinematic strain-rate parameters with respect to transformations of the coordinate system and the geodetic datum; and the interpolation of strain rates by finite-element methods. After the presentation of some advanced models for the description of secular and episodic kinematic processes, data analysis in dynamic models is regarded as a further generalization of deformation analysis.
State Event Models for the Formal Analysis of Human-Machine Interactions
NASA Technical Reports Server (NTRS)
Combefis, Sebastien; Giannakopoulou, Dimitra; Pecheur, Charles
2014-01-01
The work described in this paper was motivated by our experience with applying a framework for formal analysis of human-machine interactions (HMI) to a realistic model of an autopilot. The framework is built around a formally defined conformance relation called "fullcontrol" between an actual system and the mental model according to which the system is operated. Systems are well-designed if they can be described by relatively simple, full-control, mental models for their human operators. For this reason, our framework supports automated generation of minimal full-control mental models for HMI systems, where both the system and the mental models are described as labelled transition systems (LTS). The autopilot that we analysed has been developed in the NASA Ames HMI prototyping tool ADEPT. In this paper, we describe how we extended the models that our HMI analysis framework handles to allow adequate representation of ADEPT models. We then provide a property-preserving reduction from these extended models to LTSs, to enable application of our LTS-based formal analysis algorithms. Finally, we briefly discuss the analyses we were able to perform on the autopilot model with our extended framework.
Transportation Systems Evaluation
NASA Technical Reports Server (NTRS)
Fanning, M. L.; Michelson, R. A.
1972-01-01
A methodology for the analysis of transportation systems consisting of five major interacting elements is reported. The analysis begins with the causes of travel demand: geographic, economic, and demographic characteristics as well as attitudes toward travel. Through the analysis, the interaction of these factors with the physical and economic characteristics of the transportation system is determined. The result is an evaluation of the system from the point of view of both passenger and operator. The methodology is applicable to the intraurban transit systems as well as major airlines. Applications of the technique to analysis of a PRT system and a study of intraurban air travel are given. In the discussion several unique models or techniques are mentioned: i.e., passenger preference modeling, an integrated intraurban transit model, and a series of models to perform airline analysis.
NASA Technical Reports Server (NTRS)
Biess, J. J.; Yu, Y.; Middlebrook, R. D.; Schoenfeld, A. D.
1974-01-01
A review is given of future power processing systems planned for the next 20 years, and the state-of-the-art of power processing design modeling and analysis techniques used to optimize power processing systems. A methodology of modeling and analysis of power processing equipment and systems has been formulated to fulfill future tradeoff studies and optimization requirements. Computer techniques were applied to simulate power processor performance and to optimize the design of power processing equipment. A program plan to systematically develop and apply the tools for power processing systems modeling and analysis is presented so that meaningful results can be obtained each year to aid the power processing system engineer and power processing equipment circuit designers in their conceptual and detail design and analysis tasks.
Streamflow characterization using functional data analysis of the Potomac River
NASA Astrophysics Data System (ADS)
Zelmanow, A.; Maslova, I.; Ticlavilca, A. M.; McKee, M.
2013-12-01
Flooding and droughts are extreme hydrological events that affect the United States economically and socially. The severity and unpredictability of flooding has caused billions of dollars in damage and the loss of lives in the eastern United States. In this context, there is an urgent need to build a firm scientific basis for adaptation by developing and applying new modeling techniques for accurate streamflow characterization and reliable hydrological forecasting. The goal of this analysis is to use numerical streamflow characteristics in order to classify, model, and estimate the likelihood of extreme events in the eastern United States, mainly the Potomac River. Functional data analysis techniques are used to study yearly streamflow patterns, with the extreme streamflow events characterized via functional principal component analysis. These methods are merged with more classical techniques such as cluster analysis, classification analysis, and time series modeling. The developed functional data analysis approach is used to model continuous streamflow hydrographs. The forecasting potential of this technique is explored by incorporating climate factors to produce a yearly streamflow outlook.
Advanced superposition methods for high speed turbopump vibration analysis
NASA Technical Reports Server (NTRS)
Nielson, C. E.; Campany, A. D.
1981-01-01
The small, high pressure Mark 48 liquid hydrogen turbopump was analyzed and dynamically tested to determine the cause of high speed vibration at an operating speed of 92,400 rpm. This approaches the design point operating speed of 95,000 rpm. The initial dynamic analysis in the design stage and subsequent further analysis of the rotor only dynamics failed to predict the vibration characteristics found during testing. An advanced procedure for dynamics analysis was used in this investigation. The procedure involves developing accurate dynamic models of the rotor assembly and casing assembly by finite element analysis. The dynamically instrumented assemblies are independently rap tested to verify the analytical models. The verified models are then combined by modal superposition techniques to develop a completed turbopump model where dynamic characteristics are determined. The results of the dynamic testing and analysis obtained are presented and methods of moving the high speed vibration characteristics to speeds above the operating range are recommended. Recommendations for use of these advanced dynamic analysis procedures during initial design phases are given.
NASA Technical Reports Server (NTRS)
Frederick, D. K.; Lashmet, P. K.; Sandor, G. N.; Shen, C. N.; Smith, E. V.; Yerazunis, S. W.
1973-01-01
Problems related to the design and control of a mobile planetary vehicle to implement a systematic plan for the exploration of Mars are reported. Problem areas include: vehicle configuration, control, dynamics, systems and propulsion; systems analysis, terrain modeling and path selection; and chemical analysis of specimens. These tasks are summarized: vehicle model design, mathematical model of vehicle dynamics, experimental vehicle dynamics, obstacle negotiation, electrochemical controls, remote control, collapsibility and deployment, construction of a wheel tester, wheel analysis, payload design, system design optimization, effect of design assumptions, accessory optimal design, on-board computer subsystem, laser range measurement, discrete obstacle detection, obstacle detection systems, terrain modeling, path selection system simulation and evaluation, gas chromatograph/mass spectrometer system concepts, and chromatograph model evaluation and improvement.
Penetration analysis of projectile with inclined concrete target
NASA Astrophysics Data System (ADS)
Kim, S. B.; Kim, H. W.; Yoo, Y. H.
2015-09-01
This paper presents numerical analysis results of projectile penetration into a concrete target. We applied dynamic material properties of 4340 steel, aluminum, and explosive for the projectile body. The dynamic material properties were measured with a static tensile testing machine and Hopkinson pressure bar tests. Moreover, we used three concrete damage models included in LS-DYNA 3D: the SOIL_CONCRETE, CSCM (continuous surface cap model), and CONCRETE_DAMAGE (K&C concrete) models. The strain-rate effect for concrete is important for predicting the fracture deformation and shape of the concrete and the penetration depth of projectiles, so the CONCRETE_DAMAGE model with strain-rate effect was also applied to the penetration analysis. The analysis results with the CSCM model show good agreement with experimental penetration data. The projectile trace and the fracture shapes of the concrete target were compared with experimental data.
The multiple complex exponential model and its application to EEG analysis
NASA Astrophysics Data System (ADS)
Chen, Dao-Mu; Petzold, J.
The paper presents a novel approach to the analysis of the EEG signal, which is based on a multiple complex exponential (MCE) model. Parameters of the model are estimated using a nonharmonic Fourier expansion algorithm. The central idea of the algorithm is outlined, and the results, estimated on the basis of simulated data, are presented and compared with those obtained by the conventional methods of signal analysis. Preliminary work on various application possibilities of the MCE model in EEG data analysis is described. It is shown that the parameters of the MCE model reflect the essential information contained in an EEG segment. These parameters characterize the EEG signal in a more objective way because they are closer to the recent supposition of the nonlinear character of the brain's dynamic behavior.
Next Generation Electromagnetic Pump Analysis Tools (PLM DOC-0005-2188). Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stregy, Seth; Dasilva, Ana; Yilmaz, Serkan
2015-10-29
This report provides a broad historical review of EM Pump development and details of MATRIX development under this project. It summarizes the efforts made to modernize the legacy performance models used in previous EM Pump designs and the improvements made to the analysis tools. The report covers Tasks 1, 3, and 4 of the project: the research for Task 4 builds upon Task 1 (Update EM Pump Databank) and Task 3 (Modernize the Existing EM Pump Analysis Model), which are summarized within this report. Where research for Task 2 (Insulation Materials Development and Evaluation) identified parameters applicable to the analysis model in Task 4, the analysis code was updated and analyses were made for additional materials. The important design variables for the manufacture and operation of an EM Pump that the model improvement can evaluate are: space constraints; voltage capability of the insulation system; maximum flux density through iron; flow rate and outlet pressure; efficiency; and manufacturability. The development of the next-generation EM Pump analysis tools during this two-year program provides information in three broad areas: status of analysis model development; improvements made to older simulations; and comparison to experimental data.
Hurtado Rúa, Sandra M; Mazumdar, Madhu; Strawderman, Robert L
2015-12-30
Bayesian meta-analysis is an increasingly important component of clinical research, with multivariate meta-analysis a promising tool for studies with multiple endpoints. Model assumptions, including the choice of priors, are crucial aspects of multivariate Bayesian meta-analysis (MBMA) models. In a given model, two different prior distributions can lead to different inferences about a particular parameter. A simulation study was performed in which the impact of families of prior distributions for the covariance matrix of a multivariate normal random effects MBMA model was analyzed. Inferences about effect sizes were not particularly sensitive to prior choice, but the related covariance estimates were. A few families of prior distributions with small relative biases, tight mean squared errors, and close to nominal coverage for the effect size estimates were identified. Our results demonstrate the need for sensitivity analysis and suggest some guidelines for choosing prior distributions in this class of problems. The MBMA models proposed here are illustrated in a small meta-analysis example from the periodontal field and a medium meta-analysis from the study of stroke. Copyright © 2015 John Wiley & Sons, Ltd.
Lipid emulsion improves survival in animal models of local anesthetic toxicity: a meta-analysis.
Fettiplace, Michael R; McCabe, Daniel J
2017-08-01
The Lipid Emulsion Therapy workgroup, organized by the American Academy of Clinical Toxicology, recently conducted a systematic review, which subjectively evaluated lipid emulsion as a treatment for local anesthetic toxicity. We re-extracted data and conducted a meta-analysis of survival in animal models. We extracted survival data from 26 publications and conducted a random-effects meta-analysis based on the odds ratio weighted by inverse variance. We assessed the benefit of lipid emulsion as an independent variable in resuscitative models (16 studies). We measured Cochran's Q for heterogeneity and I² to determine the variance contributed by heterogeneity. Finally, we conducted a funnel plot analysis and Egger's test to assess for publication bias in studies. Lipid emulsion reduced the odds of death in resuscitative models (OR = 0.24; 95% CI: 0.10-0.56, p = .0012). Heterogeneity analysis indicated a homogeneous distribution. Funnel plot analysis did not indicate publication bias in experimental models. Meta-analysis of animal data supports the use of lipid emulsion (in combination with other resuscitative measures) for the treatment of local anesthetic toxicity, specifically from bupivacaine. Our conclusion differed from that of the original review. Analysis of outliers reinforced the need for good life support measures (securement of the airway and chest compressions) along with prompt treatment with lipid.
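An inverse-variance random-effects pooling of odds ratios, as used in the meta-analysis above, can be sketched as follows. The 2x2 tables are invented for illustration (they are not the 26 published studies), and the between-study variance is estimated with the standard DerSimonian-Laird moment estimator, which is our assumption about the specific random-effects method rather than a detail stated in the abstract.

```python
from math import log, exp, sqrt

# Hypothetical study tables: (deaths_treated, n_treated, deaths_control, n_control).
studies = [(2, 10, 7, 10), (1, 12, 6, 12), (3, 15, 9, 15)]

def log_or(d_t, n_t, d_c, n_c):
    """Log odds ratio with a 0.5 continuity correction, and its variance."""
    a, b = d_t + 0.5, n_t - d_t + 0.5
    c, d = d_c + 0.5, n_c - d_c + 0.5
    return log(a * d / (b * c)), 1/a + 1/b + 1/c + 1/d

ys, vs = zip(*(log_or(*s) for s in studies))
w  = [1 / v for v in vs]                                 # fixed-effect weights
yf = sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)
q  = sum(wi * (yi - yf) ** 2 for wi, yi in zip(w, ys))   # Cochran's Q
df = len(studies) - 1
c  = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (q - df) / c)                            # DerSimonian-Laird tau^2
wr = [1 / (v + tau2) for v in vs]                        # random-effects weights
yr = sum(wi * yi for wi, yi in zip(wr, ys)) / sum(wr)
se = sqrt(1 / sum(wr))
print(f"pooled OR = {exp(yr):.2f}, 95% CI = ({exp(yr - 1.96*se):.2f}, {exp(yr + 1.96*se):.2f})")
```

Pooling on the log scale and back-transforming is what keeps the confidence interval asymmetric around the odds ratio, as in the reported OR = 0.24 (0.10-0.56).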
User-Defined Material Model for Progressive Failure Analysis
NASA Technical Reports Server (NTRS)
Knight, Norman F. Jr.; Reeder, James R. (Technical Monitor)
2006-01-01
An overview of different types of composite material system architectures and a brief review of progressive failure material modeling methods used for structural analysis, including failure initiation and material degradation, are presented. Different failure initiation criteria and material degradation models are described that define progressive failure formulations. These progressive failure formulations are implemented in a user-defined material model (or UMAT) for use with the ABAQUS/Standard nonlinear finite element analysis tool. The failure initiation criteria include the maximum stress criteria, maximum strain criteria, the Tsai-Wu failure polynomial, and the Hashin criteria. The material degradation model is based on the ply-discounting approach, where the local material constitutive coefficients are degraded. Applications and extensions of the progressive failure analysis material model address two-dimensional plate and shell finite elements and three-dimensional solid finite elements. Implementation details and use of the UMAT subroutine are described in the present paper. Parametric studies for composite structures are discussed to illustrate the features of the progressive failure modeling methods that have been implemented.
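The ply-discounting idea paired with a failure initiation criterion can be sketched in a few lines. The allowables, stresses, and degradation factor below are illustrative placeholders, not values from any real material system or from the UMAT described in the paper.

```python
# A minimal sketch of maximum stress failure initiation plus ply discounting.
# Allowables, stresses, and the retention factor are illustrative only.
ALLOWABLES = {"s11": 1500.0, "s22": 40.0, "s12": 70.0}   # MPa
DEGRADE = 0.01   # stiffness retention factor for a failed mode

def max_stress_failed(stress):
    """Return the list of modes that fail the maximum stress criterion."""
    return [k for k, allow in ALLOWABLES.items() if abs(stress[k]) > allow]

def discount(moduli, failed_modes):
    """Ply discounting: degrade the modulus associated with each failed mode."""
    mode_to_modulus = {"s11": "E1", "s22": "E2", "s12": "G12"}
    out = dict(moduli)
    for m in failed_modes:
        out[mode_to_modulus[m]] *= DEGRADE
    return out

ply = {"E1": 140e3, "E2": 10e3, "G12": 5e3}              # MPa
stress = {"s11": 800.0, "s22": 55.0, "s12": 30.0}
failed = max_stress_failed(stress)
print("failed modes:", failed)
print("degraded moduli:", discount(ply, failed))
```

In a real UMAT this check runs at every integration point each increment, and the degraded stiffness feeds back into the next equilibrium iteration, which is what makes the failure progressive.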
The analysis and modelling of dilatational terms in compressible turbulence
NASA Technical Reports Server (NTRS)
Sarkar, S.; Erlebacher, G.; Hussaini, M. Y.; Kreiss, H. O.
1991-01-01
It is shown that the dilatational terms that need to be modeled in compressible turbulence include not only the pressure-dilatation term but also another term - the compressible dissipation. The nature of these dilatational terms in homogeneous turbulence is explored by asymptotic analysis of the compressible Navier-Stokes equations. A non-dimensional parameter which characterizes some compressible effects in moderate Mach number, homogeneous turbulence is identified. Direct numerical simulations (DNS) of isotropic, compressible turbulence are performed, and their results are found to be in agreement with the theoretical analysis. A model for the compressible dissipation is proposed; the model is based on the asymptotic analysis and the direct numerical simulations. This model is calibrated with reference to the DNS results regarding the influence of compressibility on the decay rate of isotropic turbulence. An application of the proposed model to the compressible mixing layer has shown that the model is able to predict the dramatically reduced growth rate of the compressible mixing layer.
The analysis and modeling of dilatational terms in compressible turbulence
NASA Technical Reports Server (NTRS)
Sarkar, S.; Erlebacher, G.; Hussaini, M. Y.; Kreiss, H. O.
1989-01-01
It is shown that the dilatational terms that need to be modeled in compressible turbulence include not only the pressure-dilatation term but also another term - the compressible dissipation. The nature of these dilatational terms in homogeneous turbulence is explored by asymptotic analysis of the compressible Navier-Stokes equations. A non-dimensional parameter which characterizes some compressible effects in moderate Mach number, homogeneous turbulence is identified. Direct numerical simulations (DNS) of isotropic, compressible turbulence are performed, and their results are found to be in agreement with the theoretical analysis. A model for the compressible dissipation is proposed; the model is based on the asymptotic analysis and the direct numerical simulations. This model is calibrated with reference to the DNS results regarding the influence of compressibility on the decay rate of isotropic turbulence. An application of the proposed model to the compressible mixing layer has shown that the model is able to predict the dramatically reduced growth rate of the compressible mixing layer.
Moderation analysis using a two-level regression model.
Yuan, Ke-Hai; Cheng, Ying; Maxwell, Scott
2014-10-01
Moderation analysis is widely used in social and behavioral research. The most commonly used model for moderation analysis is moderated multiple regression (MMR) in which the explanatory variables of the regression model include product terms, and the model is typically estimated by least squares (LS). This paper argues for a two-level regression model in which the regression coefficients of a criterion variable on predictors are further regressed on moderator variables. An algorithm for estimating the parameters of the two-level model by normal-distribution-based maximum likelihood (NML) is developed. Formulas for the standard errors (SEs) of the parameter estimates are provided and studied. Results indicate that, when heteroscedasticity exists, NML with the two-level model gives more efficient and more accurate parameter estimates than the LS analysis of the MMR model. When error variances are homoscedastic, NML with the two-level model leads to essentially the same results as LS with the MMR model. Most importantly, the two-level regression model permits estimating the percentage of variance of each regression coefficient that is due to moderator variables. When applied to data from General Social Surveys 1991, NML with the two-level model identified a significant moderation effect of race on the regression of job prestige on years of education while LS with the MMR model did not. An R package is also developed and documented to facilitate the application of the two-level model.
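The MMR baseline described above can be sketched in a few lines: the moderation effect is the coefficient on the product term x*z, estimated by least squares. The data-generating values (0.5 slope, 0.3 moderation) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
x = rng.normal(size=n)   # predictor
z = rng.normal(size=n)   # moderator
# True moderated model: the slope of y on x depends linearly on z
y = 1.0 + (0.5 + 0.3 * z) * x + 0.2 * z + rng.normal(scale=0.5, size=n)

# MMR: include the product term x*z and estimate by least squares
X = np.column_stack([np.ones(n), x, z, x * z])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
# beta[3] estimates the moderation effect (coefficient of x*z)
```

The two-level formulation the paper argues for regresses the coefficients themselves on the moderator; the sketch above is only the conventional LS/MMR comparison point.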
Scalable Parameter Estimation for Genome-Scale Biochemical Reaction Networks
Kaltenbacher, Barbara; Hasenauer, Jan
2017-01-01
Mechanistic mathematical modeling of biochemical reaction networks using ordinary differential equation (ODE) models has improved our understanding of small- and medium-scale biological processes. While the same should in principle hold for large- and genome-scale processes, computational methods for the analysis of ODE models describing hundreds or thousands of biochemical species and reactions have so far been missing. While individual simulations are feasible, the inference of the model parameters from experimental data is computationally too intensive. In this manuscript, we evaluate adjoint sensitivity analysis for parameter estimation in large-scale biochemical reaction networks. We present the approach for time-discrete measurements and compare it to state-of-the-art methods used in systems and computational biology. Our comparison reveals a significantly improved computational efficiency and a superior scalability of adjoint sensitivity analysis. The computational complexity is effectively independent of the number of parameters, enabling the analysis of large- and genome-scale models. Our study of a comprehensive kinetic model of ErbB signaling shows that parameter estimation using adjoint sensitivity analysis requires a fraction of the computation time of established methods. The proposed method will facilitate mechanistic modeling of genome-scale cellular processes, as required in the age of omics. PMID:28114351
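For context, a toy version of the estimation problem the paper addresses: fitting the rate parameters of a small ODE to time-discrete measurements. This sketch uses a plain forward least-squares fit (one of the conventional approaches the adjoint method is compared against), not the adjoint method itself; the model dy/dt = k1 - k2*y and all values are illustrative.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

# Toy one-species network: constant production k1, linear degradation k2
def simulate(theta, t_obs):
    k1, k2 = theta
    sol = solve_ivp(lambda t, y: k1 - k2 * y, (0.0, t_obs[-1]), [0.0],
                    t_eval=t_obs, rtol=1e-8, atol=1e-10)
    return sol.y[0]

t_obs = np.linspace(0.5, 10.0, 20)          # time-discrete measurements
true_theta = np.array([1.0, 0.5])
rng = np.random.default_rng(1)
data = simulate(true_theta, t_obs) + rng.normal(scale=0.01, size=t_obs.size)

# Least-squares parameter estimation from the noisy observations
fit = least_squares(lambda th: simulate(th, t_obs) - data,
                    x0=[0.5, 0.2], bounds=([0, 0], [10, 10]))
```

For genome-scale models the cost of this forward approach grows with the parameter count, which is exactly the scaling the adjoint method avoids.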
The transcription factor p53: Not a repressor, solely an activator
Fischer, Martin; Steiner, Lydia; Engeland, Kurt
2014-01-01
The predominant function of the tumor suppressor p53 is transcriptional regulation. It is generally accepted that p53-dependent transcriptional activation occurs by binding to a specific recognition site in promoters of target genes. Additionally, several models for p53-dependent transcriptional repression have been postulated. Here, we evaluate these models based on a computational meta-analysis of genome-wide data. Surprisingly, several major models of p53-dependent gene regulation are implausible. Meta-analysis of large-scale data is unable to confirm reports on directly repressed p53 target genes and falsifies models of direct repression. This notion is supported by experimental re-analysis of representative genes reported as directly repressed by p53. Therefore, p53 is not a direct repressor of transcription, but solely activates its target genes. Moreover, models based on interference of p53 with activating transcription factors as well as models based on the function of ncRNAs are also not supported by the meta-analysis. As an alternative to models of direct repression, the meta-analysis leads to the conclusion that p53 represses transcription indirectly by activation of the p53-p21-DREAM/RB pathway. PMID:25486564
2012-01-01
Background: We explore the benefits of applying a new proportional hazards model to analyze survival of breast cancer patients. As a parametric model, the hypertabastic survival model offers a closer fit to experimental data than Cox regression, and furthermore provides explicit survival and hazard functions which can be used as additional tools in the survival analysis. In addition, one of our main concerns is utilization of multiple gene expression variables. Our analysis treats the important issue of interaction of different gene signatures in the survival analysis. Methods: The hypertabastic proportional hazards model was applied in survival analysis of breast cancer patients. This model was compared, using statistical measures of goodness of fit, with models based on the semi-parametric Cox proportional hazards model and the parametric log-logistic and Weibull models. The explicit functions for hazard and survival were then used to analyze the dynamic behavior of the hazard and survival functions. Results: The hypertabastic model provided the best fit among all the models considered. Use of multiple gene expression variables also provided a considerable improvement in the goodness of fit of the model, as compared to use of only one. By utilizing the explicit survival and hazard functions provided by the model, we were able to determine the magnitude of the maximum rate of increase in hazard, and the maximum rate of decrease in survival, as well as the times when these occurred. We explore the influence of each gene expression variable on these extrema. Furthermore, in the cases of continuous gene expression variables, represented by a measure of correlation, we were able to investigate the dynamics with respect to changes in gene expression. Conclusions: We observed that use of three different gene signatures in the model provided a greater combined effect and allowed us to assess the relative importance of each in determination of outcome in this data set. These results point to the potential to combine gene signatures to a greater effect in cases where each gene signature represents some distinct aspect of the cancer biology. Furthermore, we conclude that hypertabastic survival models can be an effective survival analysis tool for breast cancer patients. PMID:23241496
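The abstract's use of explicit survival and hazard functions to locate the maximum rate of decrease in survival can be illustrated with a simpler parametric family. This sketch uses the Weibull model (not the hypertabastic function, which is not reproduced here); the scale and shape values are illustrative. The fastest drop in survival occurs at the mode of the density, which for a Weibull with shape k > 1 is lambda*((k-1)/k)^(1/k).

```python
import numpy as np

lam, k = 2.0, 1.5   # illustrative Weibull scale and shape
t = np.linspace(1e-6, 8.0, 100000)
S = np.exp(-(t / lam) ** k)                 # explicit survival function
h = (k / lam) * (t / lam) ** (k - 1)        # explicit hazard function
f = h * S                                   # density = -dS/dt

t_max_drop = t[np.argmax(f)]                # time of fastest survival decrease
mode_analytic = lam * ((k - 1) / k) ** (1 / k)
```

With an explicit parametric form, such extrema come directly from the fitted functions rather than from smoothing a nonparametric estimate.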
Displacement-based back-analysis of the model parameters of the Nuozhadu high earth-rockfill dam.
Wu, Yongkang; Yuan, Huina; Zhang, Bingyin; Zhang, Zongliang; Yu, Yuzhen
2014-01-01
The parameters of the constitutive model, the creep model, and the wetting model of materials of the Nuozhadu high earth-rockfill dam were back-analyzed together based on field monitoring displacement data by employing an intelligent back-analysis method. In this method, an artificial neural network is used as a substitute for time-consuming finite element analysis, and an evolutionary algorithm is applied for both network training and parameter optimization. To avoid simultaneous back-analysis of many parameters, the model parameters of the three main dam materials are decoupled and back-analyzed separately in a particular order. Displacement back-analyses were performed at different stages of the construction period, with and without considering the creep and wetting deformations. Good agreement between the numerical results and the monitoring data was obtained for most observation points, which implies that the back-analysis method and decoupling method are effective for solving complex problems with multiple models and parameters. The comparison of calculation results based on different sets of back-analyzed model parameters indicates the necessity of taking the effects of creep and wetting into consideration in the numerical analyses of high earth-rockfill dams. With the resulting model parameters, the stress and deformation distributions at completion are predicted and analyzed.
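The surrogate-plus-evolutionary-search pattern described above can be sketched as follows. A cheap analytic function stands in for the finite element model, a quadratic regression stands in for the paper's neural network surrogate, and a simple mutation-selection loop stands in for its evolutionary algorithm; all functions and parameter ranges are illustrative, not the dam model's.

```python
import numpy as np

rng = np.random.default_rng(2)

# Stand-in for a costly FE run: "displacements" at two monitoring points
# as a smooth function of two material parameters (purely illustrative).
def fe_model(p):
    E, nu = p
    return np.array([1.0 / E + 0.3 * nu, 2.0 / E - 0.1 * nu])

true_p = np.array([2.0, 0.3])
observed = fe_model(true_p)                 # plays the role of monitoring data

# 1) Train a cheap surrogate on sampled FE runs
P = rng.uniform([0.5, 0.0], [5.0, 0.5], size=(200, 2))
Y = np.array([fe_model(p) for p in P])
feats = lambda p: np.array([1, p[0], p[1], p[0]**2, p[1]**2, p[0]*p[1], 1/p[0]])
A = np.array([feats(p) for p in P])
W, *_ = np.linalg.lstsq(A, Y, rcond=None)
surrogate = lambda p: feats(p) @ W

# 2) Evolutionary search minimising surrogate-vs-monitoring misfit
pop = rng.uniform([0.5, 0.0], [5.0, 0.5], size=(60, 2))
for _ in range(80):
    misfit = np.array([np.sum((surrogate(p) - observed) ** 2) for p in pop])
    elite = pop[np.argsort(misfit)[:15]]
    pop = np.clip(elite[rng.integers(0, 15, 60)] +
                  rng.normal(scale=0.05, size=(60, 2)),
                  [0.5, 0.0], [5.0, 0.5])
best = pop[np.argmin([np.sum((surrogate(p) - observed) ** 2) for p in pop])]
```

The surrogate replaces the expensive solver inside the search loop, which is what makes back-analysis of many monitoring stages computationally tractable.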
Data on copula modeling of mixed discrete and continuous neural time series.
Hu, Meng; Li, Mingyao; Li, Wu; Liang, Hualou
2016-06-01
Copula is an important tool for modeling neural dependence. Recent work on copula has been expanded to jointly model mixed time series in neuroscience ("Hu et al., 2016, Joint Analysis of Spikes and Local Field Potentials using Copula" [1]). Here we present further data for joint analysis of spike and local field potential (LFP) with copula modeling. In particular, the details of different model orders and the influence of possible spike contamination in LFP data from the same and different electrode recordings are presented. To further facilitate the use of our copula model for the analysis of mixed data, we provide the Matlab codes, together with example data.
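A minimal sketch of jointly modeling a mixed discrete/continuous pair with a copula, in the spirit of the spike/LFP analysis (the authors provide Matlab code; this Python version is only illustrative, with assumed marginals and correlation): correlated uniforms from a Gaussian copula are pushed through a Poisson inverse CDF for spike counts and a normal inverse CDF for the LFP amplitude.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
rho, n = 0.6, 5000                           # assumed copula correlation

# Correlated standard normals -> uniforms (Gaussian copula)
L = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))
z = rng.standard_normal((n, 2)) @ L.T
u = stats.norm.cdf(z)

# Mixed marginals: discrete spike counts, continuous LFP amplitude
spikes = stats.poisson.ppf(u[:, 0], mu=3.0)        # discrete margin
lfp = stats.norm.ppf(u[:, 1], loc=0.0, scale=2.0)  # continuous margin
r = np.corrcoef(spikes, lfp)[0, 1]                 # induced dependence
```

The copula separates the dependence structure (the correlation rho) from the two marginal families, which is what lets spikes and LFP be modeled jointly despite their different data types.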
Hao, Yong; Sun, Xu-Dong; Yang, Qiang
2012-12-01
A variable selection strategy combined with locally linear embedding (LLE) was introduced for the analysis of complex samples by near-infrared spectroscopy (NIRS). Three methods, Monte Carlo uninformative variable elimination (MCUVE), the successive projections algorithm (SPA), and MCUVE coupled with SPA, were used to eliminate redundant spectral variables. Partial least squares regression (PLSR) and LLE-PLSR were used for modeling complex samples. The results showed that MCUVE can both extract effective informative variables and improve the precision of models. Compared with PLSR models, LLE-PLSR models achieved more accurate analysis results. MCUVE combined with LLE-PLSR is an effective modeling method for NIRS quantitative analysis.
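The core of an MCUVE-style screen can be sketched in a few lines: refit a regression on many random subsets of the samples and score each variable by the stability of its coefficient, |mean| / std; uninformative variables have unstable, near-zero coefficients. This is a simplified illustration with ordinary least squares standing in for PLSR, and invented dimensions.

```python
import numpy as np

rng = np.random.default_rng(4)
n, p_inf, p_noise = 200, 5, 45               # 5 informative + 45 noise "wavelengths"
X = rng.normal(size=(n, p_inf + p_noise))
y = X[:, :p_inf] @ np.array([2.0, -1.5, 1.0, 0.8, -0.6]) \
    + rng.normal(scale=0.3, size=n)

# MCUVE-style stability: refit on random 70% subsets of the samples
B = []
for _ in range(100):
    idx = rng.choice(n, size=int(0.7 * n), replace=False)
    b, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
    B.append(b)
B = np.array(B)
stability = np.abs(B.mean(axis=0)) / (B.std(axis=0) + 1e-12)
selected = np.argsort(stability)[::-1][:p_inf]   # keep the most stable variables
```

In the paper's pipeline, the surviving variables would then feed a PLSR or LLE-PLSR model; SPA would further prune collinear survivors.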
The Volatility of Data Space: Topology Oriented Sensitivity Analysis
Du, Jing; Ligmann-Zielinska, Arika
2015-01-01
Despite the difference among specific methods, existing Sensitivity Analysis (SA) technologies are all value-based, that is, the uncertainties in the model input and output are quantified as changes of values. This paradigm provides only limited insight into the nature of models and the modeled systems. In addition to the value of data, potentially richer information about the model lies in the topological difference between the pre-model data space and the post-model data space. This paper introduces an innovative SA method called Topology Oriented Sensitivity Analysis, which defines sensitivity as the volatility of data space. It extends SA to a deeper level that lies in the topology of data. PMID:26368929
Monte Carlo based statistical power analysis for mediation models: methods and software.
Zhang, Zhiyong
2014-12-01
The existing literature on statistical power analysis for mediation models often assumes data normality and is based on a less powerful Sobel test instead of the more powerful bootstrap test. This study proposes to estimate statistical power to detect mediation effects on the basis of the bootstrap method through Monte Carlo simulation. Nonnormal data with excessive skewness and kurtosis are allowed in the proposed method. A free R package called bmem is developed to conduct the power analysis discussed in this study. Four examples, including a simple mediation model, a multiple-mediator model with a latent mediator, a multiple-group mediation model, and a longitudinal mediation model, are provided to illustrate the proposed method.
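The proposed procedure (Monte Carlo power estimation for the bootstrap test of an indirect effect) can be sketched as follows. For each simulated data set, the indirect effect a*b is bootstrapped and declared significant when the percentile CI excludes zero; power is the fraction of significant replications. Sample size, path coefficients, skewed error distributions, and replication counts are all illustrative, and the counts are kept small for speed.

```python
import numpy as np

rng = np.random.default_rng(5)

def indirect_boot_sig(n=100, a=0.4, b=0.4, n_boot=199):
    # Simulate x -> m -> y with nonnormal (skewed, mean-zero) errors
    x = rng.normal(size=n)
    m = a * x + rng.exponential(size=n) - 1.0
    y = b * m + rng.exponential(size=n) - 1.0
    ab = np.empty(n_boot)
    for i in range(n_boot):
        idx = rng.integers(0, n, n)
        xs, ms, ys = x[idx], m[idx], y[idx]
        a_hat = np.cov(xs, ms)[0, 1] / xs.var()       # a path
        Z = np.column_stack([np.ones(n), xs, ms])
        b_hat = np.linalg.lstsq(Z, ys, rcond=None)[0][2]  # b path given x
        ab[i] = a_hat * b_hat
    lo, hi = np.percentile(ab, [2.5, 97.5])
    return not (lo <= 0.0 <= hi)   # CI excludes zero -> significant

power = np.mean([indirect_boot_sig() for _ in range(100)])
```

The bmem package automates exactly this kind of simulation for more elaborate mediation models (latent mediators, multiple groups, longitudinal designs).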
NASA Astrophysics Data System (ADS)
O'Donncha, Fearghal; Hartnett, Michael; Nash, Stephen; Ren, Lei; Ragnoli, Emanuele
2015-02-01
In this study, High Frequency Radar (HFR) observations, in conjunction with numerical model simulations, are used to investigate surface flow dynamics in a tidally active, wind-driven bay: Galway Bay, situated on the west coast of Ireland. Comparisons against ADCP sensor data permit an independent assessment of HFR and model performance, respectively. Results show root-mean-square (rms) differences in the range 10-12 cm/s for HFR, while model rms differences equalled 12-14 cm/s. Subsequent analysis focuses on a detailed comparison of HFR and model output. Harmonic analysis decomposes both sets of surface currents based on distinct flow processes, enabling a correlation analysis between the resultant output and dominant forcing parameters. Comparisons of barotropic model simulations and the HFR tidal signal demonstrate consistently high agreement, particularly for the dominant M2 tidal signal. Analysis of residual flows demonstrates considerably poorer agreement, with the model failing to replicate complex flows. A number of hypotheses explaining this discrepancy are discussed, namely: discrepancies between regional-scale, coastal-ocean models and globally influenced bay-scale dynamics; model uncertainties arising from highly variable wind-driven flows across a large body of water forced by point measurements of wind vectors; and the high dependence of model simulations on empirical wind-stress coefficients. The research demonstrates that an advanced, widely used hydro-environmental model does not accurately reproduce aspects of surface flow processes, particularly with regard to wind forcing. Considering the significance of surface boundary conditions in both coastal and open ocean dynamics, the viability of using a systematic analysis of results to improve model predictions is discussed.
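The tidal-harmonic step of such an analysis is a least-squares fit of cosine and sine terms at the constituent frequency; removing the fitted tide leaves the residual (e.g. wind-driven) signal. This sketch fits the M2 constituent to a synthetic current record; amplitude, phase, and noise level are invented.

```python
import numpy as np

rng = np.random.default_rng(6)
M2_PERIOD_H = 12.4206                       # M2 tidal period, hours
t = np.arange(0, 30 * 24, 0.5)              # 30 days of half-hourly samples
omega = 2 * np.pi / M2_PERIOD_H

# Synthetic surface current: M2 tide plus a noisy residual (cm/s)
true_amp, true_phase = 20.0, 0.7            # illustrative values
u = true_amp * np.cos(omega * t - true_phase) \
    + rng.normal(scale=5.0, size=t.size)

# Harmonic analysis: least-squares fit of cos/sin at the M2 frequency
A = np.column_stack([np.cos(omega * t), np.sin(omega * t), np.ones(t.size)])
c, *_ = np.linalg.lstsq(A, u, rcond=None)
amp = np.hypot(c[0], c[1])                  # recovered M2 amplitude
phase = np.arctan2(c[1], c[0])              # recovered M2 phase
residual_rms = np.sqrt(np.mean((u - A @ c) ** 2))  # non-tidal rms
```

Applied to both HFR and model currents, the fitted constituents support the tidal comparison, and the residual series the wind-driven comparison, described in the abstract.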
A two-step sensitivity analysis for hydrological signatures in Jinhua River Basin, East China
NASA Astrophysics Data System (ADS)
Pan, S.; Fu, G.; Chiang, Y. M.; Xu, Y. P.
2016-12-01
Owing to model complexity and the large number of parameters, calibration and sensitivity analysis are difficult processes for distributed hydrological models. In this study, a two-step sensitivity analysis approach is proposed for analyzing the hydrological signatures in Jinhua River Basin, East China, using the Distributed Hydrology-Soil-Vegetation Model (DHSVM). A rough sensitivity analysis is first conducted to obtain preliminary influential parameters via Analysis of Variance. The number of parameters was greatly reduced from eighty-three to sixteen. Afterwards, the sixteen parameters are further analyzed with a variance-based global sensitivity analysis, i.e., Sobol's method, to achieve robust sensitivity rankings and parameter contributions. Parallel computing is applied to reduce the computational burden of the variance-based sensitivity analysis. The results reveal that only a few model parameters are significantly sensitive, including the rain LAI multiplier, lateral conductivity, porosity, field capacity, wilting point of clay loam, understory monthly LAI, understory minimum resistance, and root-zone depths of croplands. Finally, several hydrological signatures are used to investigate the performance of DHSVM. Results show that a high value of the efficiency criteria did not indicate excellent performance on the hydrological signatures. For most samples from Sobol's sensitivity analysis, water yield was simulated very well. However, lowest and maximum annual daily runoffs were underestimated, and most seven-day minimum runoffs were overestimated. Nevertheless, good performance on these three signatures still exists in a number of samples. Analysis of peak flow shows that small and medium floods are simulated well, while slight underestimations occur for large floods. The work in this study supports further multi-objective calibration of the DHSVM model and indicates where to improve the reliability and credibility of model simulation.
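The second step above, a variance-based first-order Sobol index, can be sketched with the standard pick-freeze estimator on a cheap test function (the Ishigami function stands in for the hydrological model, whose runs are far too expensive to embed here):

```python
import numpy as np

rng = np.random.default_rng(7)

def ishigami(X, a=7.0, b=0.1):
    # Standard SA test function with known Sobol indices
    return (np.sin(X[:, 0]) + a * np.sin(X[:, 1]) ** 2
            + b * X[:, 2] ** 4 * np.sin(X[:, 0]))

d, N = 3, 20000
A = rng.uniform(-np.pi, np.pi, (N, d))      # base sample
B = rng.uniform(-np.pi, np.pi, (N, d))      # independent resample
fA, fB = ishigami(A), ishigami(B)
var = np.var(np.concatenate([fA, fB]))

S1 = []
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                     # "freeze" all but column i
    # Saltelli-style first-order estimator
    S1.append(np.mean(fB * (ishigami(ABi) - fA)) / var)
S1 = np.array(S1)
```

Each first-order index needs N extra model runs per parameter, which is why the ANOVA screening step (eighty-three parameters down to sixteen) and parallel execution matter so much in practice.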
Bouskill, N. J.; Riley, W. J.; Tang, J. Y.
2014-12-11
Accurate representation of ecosystem processes in land models is crucial for reducing predictive uncertainty in energy and greenhouse gas feedbacks with the climate. Here we describe an observational and modeling meta-analysis approach to benchmark land models, and apply the method to the land model CLM4.5 with two versions of belowground biogeochemistry. We focused our analysis on the aboveground and belowground responses to warming and nitrogen addition in high-latitude ecosystems, and identified absent or poorly parameterized mechanisms in CLM4.5. While the two model versions predicted similar soil carbon stock trajectories following both warming and nitrogen addition, other predicted variables (e.g., belowground respiration) differed from observations in both magnitude and direction, indicating that CLM4.5 has inadequate underlying mechanisms for representing high-latitude ecosystems. On the basis of observational synthesis, we attribute the model-observation differences to missing representations of microbial dynamics, aboveground and belowground coupling, and nutrient cycling, and we use the observational meta-analysis to discuss potential approaches to improving the current models. However, we also urge caution concerning the selection of data sets and experiments for meta-analysis. For example, the concentrations of nitrogen applied in the synthesized field experiments (average = 72 kg ha-1 yr-1) are many times higher than projected soil nitrogen concentrations (from nitrogen deposition and release during mineralization), which precludes a rigorous evaluation of the model responses to likely nitrogen perturbations. Overall, we demonstrate that elucidating ecological mechanisms via meta-analysis can identify deficiencies in ecosystem models and empirical experiments.
Wu, Yiping; Liu, Shuguang; Huang, Zhihong; Yan, Wende
2014-01-01
Ecosystem models are useful tools for understanding ecological processes and for sustainable management of resources. In the biogeochemical field, numerical models have been widely used for investigating carbon dynamics under global changes from site to regional and global scales. However, it is still challenging to optimize parameters and estimate parameterization uncertainty for complex process-based models such as the Erosion Deposition Carbon Model (EDCM), a modified version of CENTURY, which considers the carbon, water, and nutrient cycles of ecosystems. This study was designed to conduct the parameter identifiability, optimization, sensitivity, and uncertainty analysis of EDCM using our developed EDCM-Auto, which incorporated a comprehensive R package, the Flexible Modeling Framework (FME), and the Shuffled Complex Evolution (SCE) algorithm. Using a forest flux tower site as a case study, we implemented a comprehensive modeling analysis involving nine parameters and four target variables (carbon and water fluxes) with their corresponding measurements based on the eddy covariance technique. The local sensitivity analysis shows that the plant production-related parameters (e.g., PPDF1 and PRDX) are most sensitive to the model cost function. Both SCE and FME are comparable and performed well in deriving the optimal parameter set with satisfactory simulations of target variables. Global sensitivity and uncertainty analysis indicate that the parameter uncertainty and the resulting output uncertainty can be quantified, and that the magnitude of parameter-uncertainty effects depends on variables and seasons. This study also demonstrates that using cutting-edge R functions such as FME can be feasible and attractive for conducting comprehensive parameter analysis for ecosystem modeling.
ERIC Educational Resources Information Center
Hsieh, Chueh-An; Maier, Kimberly S.
2009-01-01
The capacity of Bayesian methods in estimating complex statistical models is undeniable. Bayesian data analysis is seen as having a range of advantages, such as an intuitive probabilistic interpretation of the parameters of interest, the efficient incorporation of prior information to empirical data analysis, model averaging and model selection.…
Time Series Model Identification by Estimating Information.
1982-11-01
principle, Applications of Statistics, P. R. Krishnaiah, ed., North-Holland: Amsterdam, 27-41. Anderson, T. W. (1971). The Statistical Analysis of Time Series...E. (1969). Multiple Time Series Modeling, Multivariate Analysis II, edited by P. Krishnaiah, Academic Press: New York, 389-409. Parzen, E. (1981...Newton, H. J. (1980). Multiple Time Series Modeling, II Multivariate Analysis - V, edited by P. Krishnaiah, North Holland: Amsterdam, 181-197. Shibata, R
Reference Models for Multi-Layer Tissue Structures
2016-09-01
Keywords: simulation, finite element analysis. Physiologically realistic, fully specimen-specific, nonlinear reference models. Tasks: finite element analysis of non-linear mechanics of cadaver specimens and of multi-layer tissue regions of human subjects. Deliverables: partially subject- and
Process for computing geometric perturbations for probabilistic analysis
Fitch, Simeon H. K. [Charlottesville, VA; Riha, David S [San Antonio, TX; Thacker, Ben H [San Antonio, TX
2012-04-10
A method for computing geometric perturbations for probabilistic analysis. The probabilistic analysis is based on finite element modeling, in which uncertainties in the modeled system are represented by changes in the nominal geometry of the model, referred to as "perturbations". These changes are accomplished using displacement vectors, which are computed for each node of a region of interest and are based on mean-value coordinate calculations.
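The perturbation scheme described above can be sketched directly: each node of the region of interest carries a displacement vector, and a geometric realization is the nominal geometry plus a random scale times those vectors. The 2-D patch and the outward-from-centre vectors below are illustrative stand-ins for the mean-value-coordinate-derived vectors of the patent.

```python
import numpy as np

rng = np.random.default_rng(8)

# Nominal node coordinates of a region of interest (illustrative 2-D patch)
nodes = np.array([[x, y] for x in np.linspace(0, 1, 5)
                          for y in np.linspace(0, 1, 5)], float)

# Unit displacement vectors defining the perturbation direction per node
# (here simply outward from the patch centre, for illustration)
centre = nodes.mean(axis=0)
d = nodes - centre
norms = np.linalg.norm(d, axis=1, keepdims=True)
d = np.divide(d, norms, out=np.zeros_like(d), where=norms > 0)

# One geometric realization: nominal geometry + random scale * vectors
scale = rng.normal(loc=0.0, scale=0.02)     # random perturbation magnitude
perturbed = nodes + scale * d
```

Sampling `scale` from the uncertainty distribution and re-meshing/re-solving for each realization yields the ensemble of geometries the probabilistic analysis needs.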
1976-03-01
This report summarizes the results of the research program on Image Analysis and Modeling supported by the Defense Advanced Research Projects Agency. The objective is to achieve a better understanding of image structure and to use this knowledge to develop improved image models for use in image analysis and processing tasks such as information extraction, image enhancement and restoration, and coding. The ultimate objective of this research is
Sensitivity Analysis of QSAR Models for Assessing Novel Military Compounds
2009-01-01
ERDC TR-09-3, Strategic Environmental Research and Development Program, January 2009. Sensitivity Analysis of QSAR Models for Assessing Novel Military Compounds. Jay L. Clausen, Cold Regions Research and Engineering Laboratory, U.S. Army Engineer Research and Development Center, 72 Lyme Road, Hanover, NH.
Frederix, Gerardus W J; van Hasselt, Johan G C; Schellens, Jan H M; Hövels, Anke M; Raaijmakers, Jan A M; Huitema, Alwin D R; Severens, Johan L
2014-01-01
Structural uncertainty relates to differences in model structure and parameterization. For many published health economic analyses in oncology, substantial differences in model structure exist, leading to differences in analysis outcomes and potentially impacting decision-making processes. The objectives of this analysis were (1) to identify differences in model structure and parameterization for cost-effectiveness analyses (CEAs) comparing tamoxifen and anastrazole for adjuvant breast cancer (ABC) treatment; and (2) to quantify the impact of these differences on analysis outcome metrics. The analysis consisted of four steps: (1) review of the literature for identification of eligible CEAs; (2) definition and implementation of a base model structure, which included the core structural components for all identified CEAs; (3) definition and implementation of changes or additions in the base model structure or parameterization; and (4) quantification of the impact of changes in model structure or parameterizations on the analysis outcome metrics life-years gained (LYG), incremental costs (IC) and the incremental cost-effectiveness ratio (ICER). Eleven CEA analyses comparing anastrazole and tamoxifen as ABC treatment were identified. The base model consisted of the following health states: (1) on treatment; (2) off treatment; (3) local recurrence; (4) metastatic disease; (5) death due to breast cancer; and (6) death due to other causes. The base model estimates of anastrazole versus tamoxifen for the LYG, IC and ICER were 0.263 years, €3,647 and €13,868/LYG, respectively. In the published models that were evaluated, differences in model structure included the addition of different recurrence health states, and associated transition rates were identified. Differences in parameterization were related to the incidences of recurrence, local recurrence to metastatic disease, and metastatic disease to death. 
The separate impact of these model components on the LYG ranged from 0.207 to 0.356 years, while incremental costs ranged from €3,490 to €3,714 and ICERs ranged from €9,804/LYG to €17,966/LYG. When we re-analyzed the published CEAs in our framework by including their respective model properties, the LYG ranged from 0.207 to 0.383 years, IC ranged from €3,556 to €3,731 and ICERs ranged from €9,683/LYG to €17,570/LYG. Differences in model structure and parameterization lead to substantial differences in analysis outcome metrics. This analysis supports the need for more guidance regarding structural uncertainty and the use of standardized disease-specific models for health economic analyses of adjuvant endocrine breast cancer therapies. The developed approach in the current analysis could potentially serve as a template for further evaluations of structural uncertainty and development of disease-specific models.
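The base-model calculation described above, accruing life-years and costs in a Markov cohort and forming LYG, IC, and ICER, can be sketched as follows. The state structure is collapsed to three states and every transition probability and cost is invented for illustration; none of the numbers are taken from the published CEAs.

```python
import numpy as np

# Minimal 3-state Markov cohort sketch: well -> recurrence -> dead.
# All probabilities, costs, and the cycle count are illustrative.
def cohort(p_rec, p_die_rec, cost_year, cycles=20):
    state = np.array([1.0, 0.0, 0.0])            # well, recurrence, dead
    P = np.array([[1 - p_rec, p_rec,         0.0],
                  [0.0,       1 - p_die_rec, p_die_rec],
                  [0.0,       0.0,           1.0]])
    ly, cost = 0.0, 0.0
    for _ in range(cycles):
        state = state @ P
        alive = state[0] + state[1]
        ly += alive                               # life-years accrued this cycle
        cost += cost_year * alive                 # annual treatment cost
    return ly, cost

ly_a, cost_a = cohort(p_rec=0.05, p_die_rec=0.20, cost_year=1500.0)  # costlier, fewer recurrences
ly_t, cost_t = cohort(p_rec=0.08, p_die_rec=0.20, cost_year=300.0)   # cheaper comparator
lyg = ly_a - ly_t        # incremental life-years gained
ic = cost_a - cost_t     # incremental costs
icer = ic / lyg          # incremental cost-effectiveness ratio
```

Structural uncertainty enters exactly here: adding or splitting health states, or changing the transition parameterization, changes `lyg`, `ic`, and `icer`, which is the variation the analysis quantifies.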
Multiscale modeling of brain dynamics: from single neurons and networks to mathematical tools.
Siettos, Constantinos; Starke, Jens
2016-09-01
The extreme complexity of the brain naturally requires mathematical modeling approaches on a large variety of scales; the spectrum ranges from single neuron dynamics over the behavior of groups of neurons to neuronal network activity. Thus, the connection between the microscopic scale (single neuron activity) to macroscopic behavior (emergent behavior of the collective dynamics) and vice versa is a key to understand the brain in its complexity. In this work, we attempt a review of a wide range of approaches, ranging from the modeling of single neuron dynamics to machine learning. The models include biophysical as well as data-driven phenomenological models. The discussed models include Hodgkin-Huxley, FitzHugh-Nagumo, coupled oscillators (Kuramoto oscillators, Rössler oscillators, and the Hindmarsh-Rose neuron), Integrate and Fire, networks of neurons, and neural field equations. In addition to the mathematical models, important mathematical methods in multiscale modeling and reconstruction of the causal connectivity are sketched. The methods include linear and nonlinear tools from statistics, data analysis, and time series analysis up to differential equations, dynamical systems, and bifurcation theory, including Granger causal connectivity analysis, phase synchronization connectivity analysis, principal component analysis (PCA), independent component analysis (ICA), and manifold learning algorithms such as ISOMAP, and diffusion maps and equation-free techniques. WIREs Syst Biol Med 2016, 8:438-458. doi: 10.1002/wsbm.1348 For further resources related to this article, please visit the WIREs website. © 2016 Wiley Periodicals, Inc.
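One of the single-neuron models listed above, FitzHugh-Nagumo, is simple enough to integrate in a few lines. Parameter values are the commonly used textbook ones; with this input current the model sits in its oscillatory (spiking) regime.

```python
import numpy as np

# FitzHugh-Nagumo: dv/dt = v - v^3/3 - w + I,  dw/dt = eps*(v + a - b*w)
a, b, eps, I = 0.7, 0.8, 0.08, 0.5   # standard illustrative parameters
dt, T = 0.01, 200.0
n = int(T / dt)
v, w = -1.0, 1.0
vs = np.empty(n)
for i in range(n):                    # forward-Euler integration
    dv = v - v ** 3 / 3 - w + I
    dw = eps * (v + a - b * w)
    v += dt * dv
    w += dt * dw
    vs[i] = v
```

The fast variable v plays the role of membrane voltage and the slow variable w of recovery; coupling many such units yields the network models the review goes on to discuss.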
Sensitivity analysis of a sound absorption model with correlated inputs
NASA Astrophysics Data System (ADS)
Chai, W.; Christen, J.-L.; Zine, A.-M.; Ichchou, M.
2017-04-01
Sound absorption in porous media is a complex phenomenon, which is usually addressed with homogenized models depending on macroscopic parameters. Since these parameters emerge from the structure at the microscopic scale, they may be correlated. This paper deals with sensitivity analysis methods for a sound absorption model with correlated inputs. Specifically, the Johnson-Champoux-Allard (JCA) model is chosen as the objective model, with correlation effects generated by a secondary micro-macro semi-empirical model. To deal with this case, a relatively new sensitivity analysis method, the Fourier Amplitude Sensitivity Test with Correlation design (FASTC), based on Iman's transform, is applied. This method requires a priori information such as the variables' marginal distribution functions and their correlation matrix. The results are compared to the Correlation Ratio Method (CRM) for reference and validation. The distribution of the macroscopic variables arising from the microstructure, as well as their correlation matrix, are studied. Finally, the test results show that correlation has a very important impact on the results of sensitivity analysis. The effect of correlation strength among the input variables on the sensitivity analysis is also assessed.
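The Iman transform underlying the FASTC design induces a target rank correlation on samples with arbitrary marginals by reordering them to follow the ranks of correlated normal scores. A minimal sketch (the two marginal distributions and the target correlation are illustrative, not the JCA parameters):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
n, target_rho = 2000, 0.7
C = np.array([[1.0, target_rho], [target_rho, 1.0]])

# Independent marginals (illustrative stand-ins for two pore parameters)
X = np.column_stack([rng.lognormal(0.0, 0.3, n),
                     rng.uniform(1.0, 3.0, n)])

# Iman-Conover: impose the rank structure of correlated normal scores
scores = rng.standard_normal((n, 2)) @ np.linalg.cholesky(C).T
ranks = scores.argsort(axis=0).argsort(axis=0)
Xc = np.column_stack([np.sort(X[:, j])[ranks[:, j]] for j in range(2)])

rho_achieved = stats.spearmanr(Xc[:, 0], Xc[:, 1])[0]
```

The marginals are preserved exactly (each column is only reordered) while the rank correlation approaches the target, which is what a correlation-aware sensitivity design requires.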
Modeling stock price dynamics by continuum percolation system and relevant complex systems analysis
NASA Astrophysics Data System (ADS)
Xiao, Di; Wang, Jun
2012-10-01
The continuum percolation system is developed to model a random stock price process in this work. Recent empirical research has demonstrated various statistical features of stock price changes; a financial model aiming at understanding price fluctuations needs to define a mechanism for the formation of the price, in an attempt to reproduce and explain this set of empirical facts. The continuum percolation model is usually referred to as a random coverage process or a Boolean model; the local interaction or influence among traders is constructed by the continuum percolation, and a cluster of the continuum percolation is used to define the cluster of traders sharing the same opinion about the market. We investigate and analyze the statistical behaviors of normalized returns of the price model by several analysis methods, including power-law tail distribution analysis, chaotic behavior analysis, and Zipf analysis. Moreover, we consider the daily returns of the Shanghai Stock Exchange Composite Index from January 1997 to July 2011, and comparisons of return behaviors between the actual data and the simulation data are exhibited.
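The power-law tail analysis mentioned above typically estimates the tail exponent of the return distribution; a standard tool is the Hill estimator applied to the k largest observations. This sketch applies it to synthetic Pareto-tailed "returns" with a known exponent (the value 3, the so-called cubic law, is used here purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(10)

# Synthetic heavy-tailed magnitudes: P(X > x) ~ x^(-alpha), alpha = 3
alpha_true = 3.0
r = rng.pareto(alpha_true, 20000) + 1.0     # classical Pareto, x_m = 1

# Hill estimator of the tail index from the k largest observations
k = 1000
x = np.sort(r)[::-1]
hill_alpha = 1.0 / np.mean(np.log(x[:k] / x[k]))
```

Applied to the normalized returns of the percolation model and of the SSE Composite Index, the same estimator supports the tail comparison described in the abstract.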
Static Aeroelastic Analysis with an Inviscid Cartesian Method
NASA Technical Reports Server (NTRS)
Rodriguez, David L.; Aftosmis, Michael J.; Nemec, Marian; Smith, Stephen C.
2014-01-01
An embedded-boundary, Cartesian-mesh flow solver is coupled with a three degree-of-freedom structural model to perform static, aeroelastic analysis of complex aircraft geometries. The approach solves a nonlinear, aerostructural system of equations using a loosely-coupled strategy. An open-source, 3-D discrete-geometry engine is utilized to deform a triangulated surface geometry according to the shape predicted by the structural model under the computed aerodynamic loads. The deformation scheme is capable of modeling large deflections and is applicable to the design of modern, very-flexible transport wings. The coupling interface is modular so that aerodynamic or structural analysis methods can be easily swapped or enhanced. After verifying the structural model with comparisons to Euler beam theory, two applications of the analysis method are presented as validation. The first is a relatively stiff, transport wing model which was a subject of a recent workshop on aeroelasticity. The second is a very flexible model recently tested in a low speed wind tunnel. Both cases show that the aeroelastic analysis method produces results in excellent agreement with experimental data.
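The loosely coupled strategy above alternates aerodynamic and structural solves until the deformed shape stops changing. A one-degree-of-freedom caricature (a torsional section with linear aerodynamics, all constants invented) shows the fixed-point structure, with an under-relaxed update for robustness:

```python
import numpy as np

# 1-DOF torsional aeroelastic fixed point (illustrative constants):
# aero moment M = q * c * CLa * (alpha0 + theta) * e; structure theta = M / k_theta
q, c, CLa, e = 30.0, 1.0, 5.7, 0.1
k_theta, alpha0 = 50.0, np.deg2rad(2.0)

theta = 0.0
for _ in range(200):
    M = q * c * CLa * (alpha0 + theta) * e   # aerodynamic load at current shape
    theta_new = M / k_theta                  # structural response to that load
    theta = 0.5 * theta + 0.5 * theta_new    # under-relaxed coupling update

# Closed-form fixed point for this linear caricature
theta_exact = q * c * CLa * e * alpha0 / (k_theta - q * c * CLa * e)
```

In the actual method the "aero solve" is the Cartesian Euler solver and the "structural solve" a beam model with large-deflection kinematics, but the converged-shape iteration has the same shape as this loop.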