Link, William; Sauer, John R.
2016-01-01
The analysis of ecological data has changed in two important ways over the last 15 years. The development and easy availability of Bayesian computational methods has allowed and encouraged the fitting of complex hierarchical models. At the same time, there has been increasing emphasis on acknowledging and accounting for model uncertainty. Unfortunately, the ability to fit complex models has outstripped the development of tools for model selection and model evaluation: familiar model selection tools such as Akaike's information criterion and the deviance information criterion are widely known to be inadequate for hierarchical models. In addition, little attention has been paid to the evaluation of model adequacy in the context of hierarchical modeling, i.e., to the evaluation of fit for a single model. In this paper, we describe Bayesian cross-validation, which provides tools for model selection and evaluation. We describe the Bayesian predictive information criterion (BPIC) and a Bayesian approximation to the BPIC known as the Watanabe-Akaike information criterion. We illustrate the use of these tools for model selection, and the use of Bayesian cross-validation as a tool for model evaluation, using three large data sets from the North American Breeding Bird Survey.
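As a sketch of the approximation discussed above, WAIC can be computed from a matrix of pointwise log-likelihoods drawn from the posterior (the toy draws below are illustrative, not BBS output):

```python
import numpy as np
from scipy.special import logsumexp

def waic(log_lik):
    """Widely applicable (Watanabe-Akaike) information criterion from an
    S x N matrix of pointwise log-likelihoods (S posterior draws, N data)."""
    s = log_lik.shape[0]
    # log pointwise predictive density: log of the posterior-mean likelihood
    lppd = np.sum(logsumexp(log_lik, axis=0) - np.log(s))
    # effective number of parameters: posterior variance of the log-likelihood
    p_waic = np.sum(np.var(log_lik, axis=0, ddof=1))
    return -2.0 * (lppd - p_waic)   # deviance scale; smaller is better

rng = np.random.default_rng(0)
ll = rng.normal(-1.0, 0.1, size=(2000, 50))  # toy posterior log-lik draws
print(round(waic(ll), 2))
```

In a real analysis the matrix would come from MCMC output, one row per retained posterior draw.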
IT vendor selection model by using structural equation model & analytical hierarchy process
NASA Astrophysics Data System (ADS)
Maitra, Sarit; Dominic, P. D. D.
2012-11-01
Selecting and evaluating the right vendors is imperative for an organization's competitiveness in the global marketplace. Improper selection and evaluation of potential vendors can undermine an organization's supply chain performance. Numerous studies have demonstrated that firms consider multiple criteria when selecting key vendors. This research intends to develop a new hybrid model for the vendor selection process with better decision making. The proposed model provides a suitable tool for assisting decision makers and managers in making the right decisions and selecting the most suitable vendor. This paper proposes a hybrid model based on Structural Equation Modeling (SEM) and the Analytical Hierarchy Process (AHP) for long-term strategic vendor selection problems. The five-step framework of the model was designed after a thorough literature study. The proposed hybrid model will be applied to a real-life case study to assess its effectiveness. In addition, a what-if analysis technique will be used for model validation.
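The AHP step can be sketched as follows; the four criteria and pairwise judgments are invented for illustration. Priority weights come from the principal eigenvector of the comparison matrix, with a consistency check:

```python
import numpy as np

# Pairwise comparison matrix for four hypothetical vendor criteria
# (cost, quality, delivery, service); entry a_ij = importance of i over j.
A = np.array([
    [1.0, 3.0, 5.0, 7.0],
    [1/3, 1.0, 3.0, 5.0],
    [1/5, 1/3, 1.0, 3.0],
    [1/7, 1/5, 1/3, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)            # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w = w / w.sum()                        # priority weights, sum to 1

n = A.shape[0]
lam_max = eigvals[k].real
ci = (lam_max - n) / (n - 1)           # consistency index
ri = 0.90                              # Saaty's random index for n = 4
cr = ci / ri                           # consistency ratio; < 0.10 is acceptable
print(w.round(3), round(cr, 3))
```

In the proposed hybrid, SEM would first validate which criteria belong in the hierarchy before AHP weights them.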
Metrics for evaluating performance and uncertainty of Bayesian network models
Bruce G. Marcot
2012-01-01
This paper presents a selected set of existing and new metrics for gauging Bayesian network model performance and uncertainty. Selected existing and new metrics are discussed for conducting model sensitivity analysis (variance reduction, entropy reduction, case file simulation); evaluating scenarios (influence analysis); depicting model complexity (numbers of model...
ERIC Educational Resources Information Center
Burstein, Leigh
Two specific methods of analysis in large-scale evaluations are considered: structural equation modeling and selection modeling/analysis of non-equivalent control group designs. Their utility in large-scale educational program evaluation is discussed. The examination of these methodological developments indicates how people (evaluators,…
Liabeuf, Debora; Sim, Sung-Chur; Francis, David M
2018-03-01
Bacterial spot affects tomato crops (Solanum lycopersicum) grown under humid conditions. Major genes and quantitative trait loci (QTL) for resistance have been described, and multiple loci from diverse sources need to be combined to improve disease control. We investigated genomic selection (GS) prediction models for resistance to Xanthomonas euvesicatoria and experimentally evaluated the accuracy of these models. The training population consisted of 109 families combining resistance from four sources and directionally selected from a population of 1,100 individuals. The families were evaluated on a plot basis in replicated inoculated trials and genotyped with single nucleotide polymorphisms (SNP). We compared the prediction ability of models developed with 14 to 387 SNP. Genomic estimated breeding values (GEBV) were derived using Bayesian least absolute shrinkage and selection operator regression (BL) and ridge regression (RR). Evaluations were based on leave-one-out cross-validation and on empirical observations in replicated field trials using the next generation of inbred progeny and a hybrid population resulting from selections in the training population. Prediction ability was evaluated based on correlations between GEBV and phenotypes (r_g), the percentage of coselection between genomic and phenotypic selection, and the relative efficiency of selection (r_g/r_p). Results were similar with BL and RR models. Models using only markers previously identified as significantly associated with resistance but weighted based on GEBV, and mixed models with markers associated with resistance treated as fixed effects and markers distributed across the genome treated as random effects, offered greater accuracy and a high percentage of coselection. The accuracy of these models in predicting the performance of progeny and hybrids exceeded the accuracy of phenotypic selection.
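A minimal sketch of the RR side of this workflow, with simulated genotypes and phenotypes standing in for the tomato data: ridge-derived GEBV evaluated by leave-one-out cross-validation, with prediction ability measured as the GEBV-phenotype correlation:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 109, 50                          # families x SNP markers (toy sizes)
X = rng.choice([-1.0, 0.0, 1.0], size=(n, m))   # SNP genotype codes
beta = rng.normal(0, 0.3, m)
y = X @ beta + rng.normal(0, 1.0, n)    # simulated resistance scores

lam = 10.0                              # ridge penalty (shrinkage strength)

def ridge_gebv(Xtr, ytr, Xte):
    # GEBV = Xte @ beta_hat, with beta_hat from ridge regression
    b = np.linalg.solve(Xtr.T @ Xtr + lam * np.eye(m), Xtr.T @ ytr)
    return Xte @ b

# leave-one-out cross-validation
preds = np.empty(n)
for i in range(n):
    mask = np.arange(n) != i
    preds[i] = ridge_gebv(X[mask], y[mask], X[i:i+1])[0]

r_g = np.corrcoef(preds, y)[0, 1]       # prediction ability
print(round(r_g, 3))
```

The BL alternative replaces the ridge penalty with a Laplace prior on marker effects; the cross-validation loop is identical.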
An Evaluation Research Model for System-Wide Textbook Selection.
ERIC Educational Resources Information Center
Talmage, Harriet; Walberg, Herbert T.
One component of an evaluation research model for system-wide selection of curriculum materials is reported: implementation of an evaluation design for obtaining data that permits professional and lay persons to base curriculum materials decisions on a "best fit" principle. The design includes teacher characteristics, learning environment…
An Evaluation of Some Models for Culture-Fair Selection.
ERIC Educational Resources Information Center
Petersen, Nancy S.; Novick, Melvin R.
Models proposed by Cleary, Thorndike, Cole, Linn, Einhorn and Bass, Darlington, and Gross and Su for analyzing bias in the use of tests in a selection strategy are surveyed. Several additional models are also introduced. The purpose is to describe, compare, contrast, and evaluate these models while extracting such useful ideas as may be found in…
Multicriteria framework for selecting a process modelling language
NASA Astrophysics Data System (ADS)
Scanavachi Moreira Campos, Ana Carolina; Teixeira de Almeida, Adiel
2016-01-01
The choice of process modelling language can affect business process management (BPM), since each modelling language shows different features of a given process and may limit the ways in which a process can be described and analysed. However, choosing the appropriate modelling language for process modelling has become a difficult task because of the availability of a large number of modelling languages and the lack of guidelines on evaluating and comparing languages so as to assist in selecting the most appropriate one. This paper proposes a framework for selecting a modelling language in accordance with the purposes of modelling. The framework is based on the semiotic quality framework (SEQUAL) for evaluating process modelling languages and a multicriteria decision aid (MCDA) approach for selecting the most appropriate language for BPM. This study does not attempt to set out new forms of assessment and evaluation criteria, but rather demonstrates how two existing approaches can be combined to solve the problem of modelling-language selection. The framework is described and then demonstrated by means of an example. Finally, the advantages and disadvantages of using SEQUAL and MCDA in an integrated manner are discussed.
An Evaluation Model To Select an Integrated Learning System in a Large, Suburban School District.
ERIC Educational Resources Information Center
Curlette, William L.; And Others
The systematic evaluation process used in Georgia's DeKalb County School System to purchase comprehensive instructional software--an integrated learning system (ILS)--is described, and the decision-making model for selection is presented. Selection and implementation of an ILS were part of an instructional technology plan for the DeKalb schools…
Model selection for the North American Breeding Bird Survey: A comparison of methods
Link, William; Sauer, John; Niven, Daniel
2017-01-01
The North American Breeding Bird Survey (BBS) provides data for >420 bird species at multiple geographic scales over 5 decades. Modern computational methods have facilitated the fitting of complex hierarchical models to these data. It is easy to propose and fit new models, but little attention has been given to model selection. Here, we discuss and illustrate model selection using leave-one-out cross validation, and the Bayesian Predictive Information Criterion (BPIC). Cross-validation is enormously computationally intensive; we thus evaluate the performance of the Watanabe-Akaike Information Criterion (WAIC) as a computationally efficient approximation to the BPIC. Our evaluation is based on analyses of 4 models as applied to 20 species covered by the BBS. Model selection based on BPIC provided no strong evidence of one model being consistently superior to the others; for 14/20 species, none of the models emerged as superior. For the remaining 6 species, a first-difference model of population trajectory was always among the best fitting. Our results show that WAIC is not reliable as a surrogate for BPIC. Development of appropriate model sets and their evaluation using BPIC is an important innovation for the analysis of BBS data.
QaaS (quality as a service) model for web services using big data technologies
NASA Astrophysics Data System (ADS)
Ahmad, Faisal; Sarkar, Anirban
2017-10-01
Quality of service (QoS) determines the service usability and utility, both of which influence the service selection process. QoS varies from one service provider to another, and each web service has its own methodology for evaluating QoS. The lack of a transparent QoS evaluation model makes service selection challenging. Moreover, most QoS evaluation processes do not consider historical data, which not only yields more accurate QoS values but also supports future prediction, recommendation and knowledge discovery. QoS-driven service selection demands a model where QoS can be provided as a service to end users. This paper proposes a layered QaaS (quality as a service) model, along the same lines as platform as a service and software as a service, where users provide QoS attributes as inputs and the model returns services satisfying the user's QoS expectation. The paper covers all the key aspects in this context: selection of data sources, their transformation, and the evaluation, classification and storage of QoS. It uses server logs as the source for evaluating QoS values, a common methodology for their evaluation, and big data technologies for their transformation and analysis. The paper also establishes that Spark outperforms Pig with respect to evaluation of QoS from logs.
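A toy sketch of deriving QoS metrics from server logs; the log format, field order, and service name are hypothetical, and a production pipeline would run the same aggregation in Spark rather than plain Python:

```python
from statistics import mean

# Hypothetical access-log lines: "timestamp service status response_ms"
log_lines = [
    "2017-10-01T10:00:01 weatherSvc 200 120",
    "2017-10-01T10:00:02 weatherSvc 200 95",
    "2017-10-01T10:00:03 weatherSvc 500 0",
    "2017-10-01T10:00:04 weatherSvc 200 110",
]

records = [line.split() for line in log_lines]
ok = [int(r[3]) for r in records if r[2] == "200"]
availability = len(ok) / len(records)      # fraction of successful calls
avg_response = mean(ok)                    # mean latency of successful calls
print(availability, round(avg_response, 1))
```

These two aggregates (availability, mean response time) are typical QoS attributes a QaaS layer could store per service and match against user requirements.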
Island-Model Genomic Selection for Long-Term Genetic Improvement of Autogamous Crops.
Yabe, Shiori; Yamasaki, Masanori; Ebana, Kaworu; Hayashi, Takeshi; Iwata, Hiroyoshi
2016-01-01
Acceleration of genetic improvement of autogamous crops such as wheat and rice is necessary to increase cereal production in response to the global food crisis. Population and pedigree methods of breeding, which are based on inbred line selection, are used commonly in the genetic improvement of autogamous crops. These methods, however, produce only a few novel combinations of genes in a breeding population. Recurrent selection promotes recombination among genes and produces novel combinations of genes in a breeding population, but it relies on single-plant evaluation, which is inaccurate for selection. Genomic selection (GS), which can predict the genetic potential of individuals from their marker genotypes, may provide sufficiently reliable single-plant evaluation to make recurrent selection effective. To evaluate the efficiency of recurrent selection with GS, we conducted simulations using real marker genotype data of rice cultivars. Additionally, we introduced the concept of an "island model," inspired by evolutionary algorithms, which may help maintain genetic variation through the breeding process. Results demonstrated the importance of producing novel combinations of genes through recurrent selection. An initial population derived from an admixture of multiple bi-parental crosses showed larger genetic gains than a population derived from a single bi-parental cross in all cycles, suggesting the importance of genetic variation in the initial population. The island-model GS better maintained genetic improvement in later generations than the other GS methods, suggesting that it can exploit genetic variation during breeding and retain alleles with small effects in the breeding population. The island-model GS will become a new breeding method that enhances the potential of genomic selection in autogamous crops, especially for long-term improvement.
Newcom, D W; Baas, T J; Stalder, K J; Schwab, C R
2005-04-01
Three selection models were evaluated to compare selection candidate rankings based on EBV and to evaluate subsequent effects of model-derived EBV on the selection differential and expected genetic response in the population. Data were collected from carcass- and ultrasound-derived estimates of loin i.m. fat percent (IMF) in a population of Duroc swine under selection to increase IMF. The models compared were Model 1, a two-trait animal model used in the selection experiment that included ultrasound IMF from all pigs scanned and carcass IMF from pigs slaughtered to estimate breeding values for both carcass (C1) and ultrasound IMF (U1); Model 2, a single-trait animal model that included ultrasound IMF values on all pigs scanned to estimate breeding values for ultrasound IMF (U2); and Model 3, a multiple-trait animal model including carcass IMF from slaughtered pigs and the first three principal components from a total of 10 image parameters averaged across four longitudinal ultrasound images to estimate breeding values for carcass IMF (C3). Rank correlations between breeding value estimates for U1 and C1, U1 and U2, and C1 and C3 were 0.95, 0.97, and 0.92, respectively. Other rank correlations were 0.86 or less. In the selection experiment, approximately the top 10% of boars and 50% of gilts were selected. Selection differentials for pigs in Generation 3 were greatest when ranking pigs based on C1, followed by U1, U2, and C3. In addition, selection differential and estimated response were evaluated when simulating selection of the top 1, 5, and 10% of sires and 50% of dams. Results of this analysis indicated the greatest selection differential was for selection based on C1. The greatest loss in selection differential was found for selection based on C3 when selecting the top 10 and 1% of boars and 50% of gilts. 
The loss in estimated response when selecting varying percentages of boars and the top 50% of gilts was greatest when selection was based on C3 (16.0 to 25.8%) and least for selection based on U1 (1.3 to 10.9%). Estimated genetic change from selection based on carcass IMF was greater than selection based on ultrasound IMF. Results show that selection based on a combination of ultrasonically predicted IMF and sib carcass IMF produced the greatest selection differentials and should lead to the greatest genetic change.
Hill, Mary C.; Foglia, L.; Mehl, S. W.; Burlando, P.
2013-01-01
Model adequacy is evaluated with alternative models rated using model selection criteria (AICc, BIC, and KIC) and three other statistics. Model selection criteria are tested with cross-validation experiments, and insights for using alternative models to evaluate model structural adequacy are provided. The study is conducted using the computer codes UCODE_2005 and MMA (MultiModel Analysis). One recharge alternative is simulated using the TOPKAPI hydrological model. The predictions evaluated include eight heads and three flows located where ecological consequences and model precision are of concern. Cross-validation is used to obtain measures of prediction accuracy. Sixty-four models were designed deterministically and differ in their representation of the river, recharge, bedrock topography, and hydraulic conductivity. Results include: (1) What may seem like inconsequential choices in model construction may be important to predictions; analysis of predictions from alternative models is advised. (2) None of the model selection criteria consistently identified models with more accurate predictions, a disturbing result that suggests a need to reconsider the utility of model selection criteria and/or the cross-validation measures used in this work to measure model accuracy. (3) KIC displayed poor performance for the present regression problems; theoretical considerations suggest that the difficulties are associated with wide variations in the sensitivity term of KIC, resulting from the models being nonlinear and the problems being ill-posed due to parameter correlations and insensitivity. The other criteria performed somewhat better, and similarly to each other. (4) Quantities with high leverage are more difficult to predict. The results are expected to be generally applicable to models of environmental systems.
MISFITS: evaluating the goodness of fit between a phylogenetic model and an alignment.
Nguyen, Minh Anh Thi; Klaere, Steffen; von Haeseler, Arndt
2011-01-01
As models of sequence evolution become more and more complicated, many criteria for model selection have been proposed, and tools are available to select the best model for an alignment under a particular criterion. However, in many instances the selected model fails to explain the data adequately as reflected by large deviations between observed pattern frequencies and the corresponding expectation. We present MISFITS, an approach to evaluate the goodness of fit (http://www.cibiv.at/software/misfits). MISFITS introduces a minimum number of "extra substitutions" on the inferred tree to provide a biologically motivated explanation why the alignment may deviate from expectation. These extra substitutions plus the evolutionary model then fully explain the alignment. We illustrate the method on several examples and then give a survey about the goodness of fit of the selected models to the alignments in the PANDIT database.
NASA Astrophysics Data System (ADS)
Jafarzadeh Ghoushchi, Saeid; Dodkanloi Milan, Mehran; Jahangoshai Rezaee, Mustafa
2017-11-01
Nowadays, with growing awareness of enterprise sustainability, sustainable supplier selection is considered a vital factor in sustainable supply chain management. On the other hand, the data in real problems are usually imprecise. One method that is helpful for the evaluation and selection of sustainable suppliers, and that can accommodate a variety of data types, is data envelopment analysis (DEA). In the present article, first, supplier efficiency is measured with respect to all economic, social and environmental dimensions using DEA applied to imprecise data. Then, to obtain an overall evaluation of the suppliers, the DEA model is extended to imprecise data based on goal programming (GP). Integrating the set of criteria turns the new model into a coherent framework for sustainable supplier selection. Moreover, employing this model in multilateral sustainable supplier selection can be an incentive for suppliers to pursue environmental, social and economic activities. Improving environmental, economic and social performance means improving supply chain performance. Finally, the application of the proposed approach is presented with a real dataset.
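As a sketch of the DEA building block (with precise rather than imprecise data; the suppliers and numbers are invented), the input-oriented CCR efficiency of each supplier can be computed with a small linear program:

```python
import numpy as np
from scipy.optimize import linprog

# Toy data: 5 suppliers, 2 inputs (cost, emissions), 1 output (service level).
X = np.array([[4.0, 3.0], [7.0, 3.0], [8.0, 1.0], [4.0, 2.0], [2.0, 4.0]])
Y = np.array([[1.0], [1.0], [1.0], [1.0], [1.0]])

def ccr_efficiency(o):
    """CCR multiplier form for DMU o: maximize u'y_o subject to v'x_o = 1
    and u'y_j - v'x_j <= 0 for every DMU j, with nonnegative weights."""
    n_in, n_out = X.shape[1], Y.shape[1]
    # decision variables: [v (input weights), u (output weights)]
    c = np.concatenate([np.zeros(n_in), -Y[o]])        # linprog minimizes
    A_ub = np.hstack([-X, Y])                          # u'y_j - v'x_j <= 0
    b_ub = np.zeros(X.shape[0])
    A_eq = np.concatenate([X[o], np.zeros(n_out)]).reshape(1, -1)  # v'x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (n_in + n_out), method="highs")
    return -res.fun

scores = [round(ccr_efficiency(o), 3) for o in range(len(X))]
print(scores)   # efficiency = 1.0 marks the efficient frontier
```

The article's extension replaces the crisp inputs and outputs with interval data and adds goal-programming terms; the per-DMU linear program above is the crisp core that is being generalized.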
Predicting the difficulty of pure, strict, epistatic models: metrics for simulated model selection.
Urbanowicz, Ryan J; Kiralis, Jeff; Fisher, Jonathan M; Moore, Jason H
2012-09-26
Algorithms designed to detect complex genetic disease associations are initially evaluated using simulated datasets. Typical evaluations vary constraints that influence the correct detection of underlying models (i.e. number of loci, heritability, and minor allele frequency). Such studies neglect to account for model architecture (i.e. the unique specification and arrangement of penetrance values comprising the genetic model), which alone can influence the detectability of a model. In order to design a simulation study which efficiently takes architecture into account, a reliable metric is needed for model selection. We evaluate three metrics as predictors of relative model detection difficulty derived from previous works: (1) Penetrance table variance (PTV), (2) customized odds ratio (COR), and (3) our own Ease of Detection Measure (EDM), calculated from the penetrance values and respective genotype frequencies of each simulated genetic model. We evaluate the reliability of these metrics across three very different data search algorithms, each with the capacity to detect epistatic interactions. We find that a model's EDM and COR are each stronger predictors of model detection success than heritability. This study formally identifies and evaluates metrics which quantify model detection difficulty. We utilize these metrics to intelligently select models from a population of potential architectures. This allows for an improved simulation study design which accounts for differences in detection difficulty attributed to model architecture. We implement the calculation and utilization of EDM and COR into GAMETES, an algorithm which rapidly and precisely generates pure, strict, n-locus epistatic models.
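As a hedged illustration of the first metric, a penetrance-table variance can be computed from a two-locus penetrance table. The frequency weighting and the Hardy-Weinberg genotype frequencies below are assumptions made for this sketch, not necessarily the paper's exact definition of PTV:

```python
import numpy as np

# 3x3 penetrance table for a two-locus (epistatic) model:
# P(disease | genotype pair), rows = AA/Aa/aa, cols = BB/Bb/bb.
pen = np.array([
    [0.10, 0.70, 0.10],
    [0.70, 0.10, 0.70],
    [0.10, 0.70, 0.10],
])

# Genotype frequencies under Hardy-Weinberg with allele frequency 0.5 at both loci.
f = np.array([0.25, 0.50, 0.25])
freq = np.outer(f, f)                      # joint genotype frequencies

mu = np.sum(freq * pen)                    # population prevalence
ptv = np.sum(freq * (pen - mu) ** 2)       # frequency-weighted penetrance variance
print(round(mu, 3), round(ptv, 3))
```

Intuitively, a flatter penetrance table (lower variance) spreads less signal across genotypes and should be harder for a search algorithm to detect, which is why such metrics can rank model architectures by difficulty.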
Exploring Several Methods of Groundwater Model Selection
NASA Astrophysics Data System (ADS)
Samani, Saeideh; Ye, Ming; Asghari Moghaddam, Asghar
2017-04-01
Selecting reliable models for simulating groundwater flow and solute transport is essential to groundwater resources management and protection. This work explores several model selection methods for avoiding over-complex and/or over-parameterized groundwater models. We consider six groundwater flow models with different numbers (6, 10, 10, 13, 13 and 15) of model parameters. These models represent alternative geological interpretations, recharge estimates, and boundary conditions at a study site in Iran. The models were developed with ModelMuse and calibrated against observations of hydraulic head using UCODE. Model selection was conducted using four approaches: (1) ranking the models by the root mean square error (RMSE) obtained after UCODE-based model calibration, (2) calculating model probability using the GLUE method, (3) evaluating model probability using model selection criteria (AIC, AICc, BIC, and KIC), and (4) evaluating model weights using the fuzzy multi-criteria decision-making (MCDM) approach. MCDM is based on the fuzzy analytical hierarchy process (AHP) and the fuzzy technique for order performance, which identify the ideal solution by a gradual expansion from the local to the global scale of model parameters. The KIC and MCDM methods are superior to the other methods, as they consider not only the fit between observed and simulated data and the number of parameters, but also uncertainty in model parameters. Considering these factors can prevent over-complexity and over-parameterization when selecting groundwater flow models. These methods selected, as the best model, one with average complexity (10 parameters) and the best parameter estimation (model 3).
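The likelihood-based criteria in approach (3) are simple to compute once each model has been calibrated. A minimal sketch for least-squares calibration with Gaussian errors, using invented RSS values rather than the study's; KIC is omitted because it additionally requires the Fisher-information (sensitivity) term:

```python
import math

def criteria(rss, n, k):
    """AIC, AICc, and BIC for a least-squares model with Gaussian errors:
    rss = residual sum of squares, n = observations, k = parameters."""
    aic = n * math.log(rss / n) + 2 * k
    aicc = aic + 2 * k * (k + 1) / (n - k - 1)   # small-sample correction
    bic = n * math.log(rss / n) + k * math.log(n)
    return aic, aicc, bic

# Toy comparison: four models of increasing complexity fit to 100 heads.
for k, rss in [(6, 52.0), (10, 40.0), (13, 38.5), (15, 38.0)]:
    aic, aicc, bic = criteria(rss, n=100, k=k)
    print(k, round(aic, 1), round(aicc, 1), round(bic, 1))
```

Note how BIC's log(n) penalty punishes the 15-parameter model harder than AIC does, which is the over-parameterization guard the abstract refers to.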
A Value-Added Approach to Selecting the Best Master of Business Administration (MBA) Program
ERIC Educational Resources Information Center
Fisher, Dorothy M.; Kiang, Melody; Fisher, Steven A.
2007-01-01
Although numerous studies rank master of business administration (MBA) programs, prospective students' selection of the best MBA program is a formidable task. In this study, the authors used a linear-programming-based model called data envelopment analysis (DEA) to evaluate MBA programs. The DEA model connects costs to benefits to evaluate the…
SELECTION OF CANDIDATE EUTROPHICATION MODELS FOR TOTAL MAXIMUM DAILY LOADS ANALYSES
A tiered approach was developed to evaluate candidate eutrophication models to select a common suite of models that could be used for Total Maximum Daily Loads (TMDL) analyses in estuaries, rivers, and lakes/reservoirs. Consideration for linkage to watershed models and ecologica...
A Heckman selection model for the safety analysis of signalized intersections
Wong, S. C.; Zhu, Feng; Pei, Xin; Huang, Helai; Liu, Youjun
2017-01-01
Purpose: The objective of this paper is to provide a new method for estimating crash rate and severity simultaneously. Methods: This study explores a Heckman selection model of crash rate and severity at different levels, using a two-step procedure. The first step uses a probit regression model to determine the sample selection process, and the second step develops a multiple regression model to simultaneously evaluate the crash rate and severity for slight-injury and killed-or-seriously-injured (KSI) crashes, respectively. The model uses 555 observations from 262 signalized intersections in the Hong Kong metropolitan area, integrated with information on traffic flow, geometric road design, road environment, traffic control, and crashes that occurred over two years. Results: The results of the proposed two-step Heckman selection model illustrate the necessity of estimating different crash rates for different crash severity levels. Conclusions: A comparison with existing approaches suggests that the Heckman selection model offers an efficient and convenient alternative for evaluating safety performance at signalized intersections.
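The two-step procedure can be sketched on simulated data (all numbers illustrative; the hypothetical covariates stand in for the traffic-flow and road-design variables): a probit selection equation, then an outcome regression augmented with the inverse Mills ratio:

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

rng = np.random.default_rng(2)
n = 555
z = rng.normal(size=(n, 2))                       # selection covariates
x = z[:, :1]                                      # outcome covariate
u = rng.multivariate_normal([0, 0], [[1, .5], [.5, 1]], n)  # correlated errors
select = (0.5 + z @ np.array([1.0, -0.8]) + u[:, 0]) > 0    # crash observed?
y = 1.0 + 2.0 * x[:, 0] + u[:, 1]                 # crash-rate outcome

# Step 1: probit MLE for the selection equation.
Z = np.column_stack([np.ones(n), z])
def nll(g):
    p = np.clip(norm.cdf(Z @ g), 1e-9, 1 - 1e-9)
    return -np.sum(np.log(np.where(select, p, 1 - p)))
g = minimize(nll, np.zeros(3), method="BFGS").x

# Step 2: OLS on the selected sample, augmented with the inverse Mills ratio,
# which corrects for the correlation between the two error terms.
zi = Z[select] @ g
imr = norm.pdf(zi) / norm.cdf(zi)
Xs = np.column_stack([np.ones(select.sum()), x[select, 0], imr])
beta = np.linalg.lstsq(Xs, y[select], rcond=None)[0]
print(np.round(beta, 2))   # [intercept, slope, IMR coefficient]
```

A significant IMR coefficient signals that ignoring the selection step (here, whether a crash of a given severity is observed at all) would bias the naive regression.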
ERIC Educational Resources Information Center
Chang, Ting-Cheng; Wang, Hui
2016-01-01
This paper proposes a cloud multi-criteria group decision-making model for teacher evaluation in higher education, which involves subjectivity, imprecision, and fuzziness. First, the appropriate evaluation index is selected depending on the evaluation objectives, indicating a clear structural relationship between the evaluation index and…
Goess, Christian; Harris, Christopher M; Murdock, Sara; McCarthy, Richard W; Sampson, Erik; Twomey, Rachel; Mathieu, Suzanne; Mario, Regina; Perham, Matthew; Goedken, Eric R; Long, Andrew J
2018-06-02
Bruton's Tyrosine Kinase (BTK) is a non-receptor tyrosine kinase required for intracellular signaling downstream of multiple immunoreceptors. We evaluated ABBV-105, a covalent BTK inhibitor, using in vitro and in vivo assays to determine potency, selectivity, and efficacy to validate the therapeutic potential of ABBV-105 in inflammatory disease. ABBV-105 potency and selectivity were evaluated in enzymatic and cellular assays. The impact of ABBV-105 on B cell function in vivo was assessed using mechanistic models of antibody production. Efficacy of ABBV-105 in chronic inflammatory disease was evaluated in animal models of arthritis and lupus. Measurement of BTK occupancy was employed as a target engagement biomarker. ABBV-105 irreversibly inhibits BTK, demonstrating superior kinome selectivity and is potent in B cell receptor, Fc receptor, and TLR-9-dependent cellular assays. Oral administration resulted in rapid clearance in plasma, but maintenance of BTK splenic occupancy. ABBV-105 inhibited antibody responses to thymus-independent and thymus-dependent antigens, paw swelling and bone destruction in rat collagen induced arthritis (CIA), and reduced disease in an IFNα-accelerated lupus nephritis model. BTK occupancy in disease models correlated with in vivo efficacy. ABBV-105, a selective BTK inhibitor, demonstrates compelling efficacy in pre-clinical mechanistic models of antibody production and in models of rheumatoid arthritis and lupus.
Evaluating models of healthcare delivery using the Model of Care Evaluation Tool (MCET).
Hudspeth, Randall S; Vogt, Marjorie; Wysocki, Ken; Pittman, Oralea; Smith, Susan; Cooke, Cindy; Dello Stritto, Rita; Hoyt, Karen Sue; Merritt, T Jeanne
2016-08-01
Our aim was to provide the outcome of a structured Model of Care (MoC) Evaluation Tool (MCET), developed by an FAANP Best-practices Workgroup, that can be used to guide the evaluation of existing MoCs being considered for use in clinical practice. Multiple MoCs are available, but deciding which model of health care delivery to use can be confusing. This five-component tool provides a structured assessment approach to model selection and has universal application. A literature review using CINAHL, PubMed, Ovid, and EBSCO was conducted. The MCET evaluation process includes five sequential components with a feedback loop from component 5 back to component 3 for reevaluation of any refinements. The components are as follows: (1) Background, (2) Selection of an MoC, (3) Implementation, (4) Evaluation, and (5) Sustainability and Future Refinement. This practical resource considers an evidence-based approach to use in determining the best model to implement based on need, stakeholder considerations, and feasibility. ©2015 American Association of Nurse Practitioners.
Sensitivity of resource selection and connectivity models to landscape definition
Katherine A. Zeller; Kevin McGarigal; Samuel A. Cushman; Paul Beier; T. Winston Vickers; Walter M. Boyce
2017-01-01
Context: The definition of the geospatial landscape is the underlying basis for species-habitat models, yet sensitivity of habitat use inference, predicted probability surfaces, and connectivity models to landscape definition has received little attention. Objectives: We evaluated the sensitivity of resource selection and connectivity models to four landscape...
Howard B. Stauffer; Cynthia J. Zabel; Jeffrey R. Dunk
2005-01-01
We compared a set of competing logistic regression habitat selection models for Northern Spotted Owls (Strix occidentalis caurina) in California. The habitat selection models were estimated, compared, evaluated, and tested using multiple sample datasets collected on federal forestlands in northern California. We used Bayesian methods in interpreting...
Crockett, Molly J.
2013-01-01
Moral dilemmas engender conflicts between two traditions: consequentialism, which evaluates actions based on their outcomes, and deontology, which evaluates actions themselves. These strikingly resemble two distinct decision-making architectures: a model-based system that selects actions based on inferences about their consequences; and a model-free system that selects actions based on their reinforcement history. Here, I consider how these systems, along with a Pavlovian system that responds reflexively to rewards and punishments, can illuminate puzzles in moral psychology. PMID:23845564
Mota, L F M; Martins, P G M A; Littiere, T O; Abreu, L R A; Silva, M A; Bonafé, C M
2018-04-01
The objective was to estimate (co)variance functions using random regression models (RRM) with Legendre polynomials, B-spline functions, and multi-trait models, aimed at evaluating genetic parameters of growth traits in meat-type quail. A database containing the complete pedigree information of 7000 meat-type quail was utilized. The models included the fixed effects of contemporary group and generation. Direct additive genetic and permanent environmental effects, considered as random, were modeled using B-spline functions with quadratic and cubic polynomials for each individual segment, and Legendre polynomials for age. Residual variances were grouped into four age classes. Direct additive genetic and permanent environmental effects were modeled using 2 to 4 B-spline segments, or by Legendre polynomials with orders of fit ranging from 2 to 4. The model with quadratic B-spline adjustment, using four segments for direct additive genetic and permanent environmental effects, was the most appropriate and parsimonious for describing the covariance structure of the data. The RRM using Legendre polynomials underestimated the residual variance. Lower heritability estimates were observed for multi-trait models than for RRM at the evaluated ages. In general, the genetic correlations between measures of BW from hatching to 35 days of age decreased as the range between the evaluated ages increased. The genetic trend for BW was positive and significant along the selection generations. The genetic response to selection for BW at the evaluated ages was greater for RRM than for multi-trait models. In summary, RRM using B-spline functions with four residual variance classes and four segments provided the best fit for genetic evaluation of growth traits in meat-type quail, and RRM should be considered in the genetic evaluation of breeding programs.
Which risk models perform best in selecting ever-smokers for lung cancer screening?
A new analysis by scientists at NCI evaluates nine different individualized lung cancer risk prediction models based on their selections of ever-smokers for computed tomography (CT) lung cancer screening.
NASA Astrophysics Data System (ADS)
Martinez, Guillermo F.; Gupta, Hoshin V.
2011-12-01
Methods to select parsimonious and hydrologically consistent model structures are useful for evaluating dominance of hydrologic processes and representativeness of data. While information criteria (appropriately constrained to obey underlying statistical assumptions) can provide a basis for evaluating appropriate model complexity, it is not sufficient to rely upon the principle of maximum likelihood (ML) alone. We suggest that one must also call upon a "principle of hydrologic consistency," meaning that selected ML structures and parameter estimates must be constrained (as well as possible) to reproduce desired hydrological characteristics of the processes under investigation. This argument is demonstrated in the context of evaluating the suitability of candidate model structures for lumped water balance modeling across the continental United States, using data from 307 snow-free catchments. The models are constrained to satisfy several tests of hydrologic consistency, a flow space transformation is used to ensure better consistency with underlying statistical assumptions, and information criteria are used to evaluate model complexity relative to the data. The results clearly demonstrate that the principle of consistency provides a sensible basis for guiding selection of model structures and indicate strong spatial persistence of certain model structures across the continental United States. Further work to untangle reasons for model structure predominance can help to relate conceptual model structures to physical characteristics of the catchments, facilitating the task of prediction in ungaged basins.
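The two-part selection logic this abstract describes (information criteria plus a consistency screen) can be sketched in a few lines. Everything below is illustrative: the candidate structures, their log-likelihoods, and the 5% water-balance bias threshold are invented placeholders, not values from the study.

```python
import math

def aic(log_lik, k):
    """Akaike information criterion: -2 log L + 2k."""
    return -2.0 * log_lik + 2.0 * k

def bic(log_lik, k, n):
    """Bayesian information criterion: -2 log L + k ln(n)."""
    return -2.0 * log_lik + k * math.log(n)

def select_structure(candidates, consistency_test):
    """Rank candidate model structures by AIC, but keep only those that
    pass a hydrologic-consistency test (hypothetical user-supplied check)."""
    consistent = [c for c in candidates if consistency_test(c)]
    return min(consistent, key=lambda c: aic(c["log_lik"], c["k"]))

# Hypothetical candidates: log-likelihood, parameter count, water-balance bias
candidates = [
    {"name": "bucket-1", "log_lik": -520.0, "k": 3, "bias": 0.02},
    {"name": "bucket-2", "log_lik": -495.0, "k": 6, "bias": 0.18},
    {"name": "bucket-3", "log_lik": -500.0, "k": 4, "bias": 0.03},
]
# Consistency check: long-term water-balance bias under 5% (illustrative threshold)
best = select_structure(candidates, lambda c: abs(c["bias"]) < 0.05)
```

Note that the best-fitting candidate (`bucket-2`) is excluded by the consistency screen before the information criterion is ever consulted, which is the point of the "principle of hydrologic consistency" argument.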
Yang, Kai-Fu; Li, Chao-Yi; Li, Yong-Jie
2015-01-01
Neurons with orientation-selective and with non-selective surround inhibition have both been observed in the primary visual cortex (V1) of primates and cats. Although the inhibition coming from the surround region (known as the non-classical receptive field, nCRF) is considered to play a critical role in visual perception, the specific roles of orientation-selective and non-selective inhibition in the task of contour detection are less well understood. To clarify this question, we first carried out a computational analysis of the contour detection performance of V1 neurons with different types of surround inhibition, on the basis of which we then proposed two integrated models that evaluate their role in this specific perceptual task by combining the two types of surround inhibition in two different ways. The two models were evaluated with synthetic images and a set of challenging natural images, and the results show that both of the integrated models outperform the typical models with orientation-selective or non-selective inhibition alone. The findings of this study suggest that V1 neurons with different types of center–surround interaction work in cooperative and adaptive ways, at least when extracting organized structures from cluttered natural scenes. This work is expected to inspire efficient phenomenological models for engineering applications in the field of computational machine vision. PMID:26136664
Sutton, Steven C; Hu, Mingxiu
2006-05-05
Many mathematical models have been proposed for establishing an in vitro/in vivo correlation (IVIVC). The traditional IVIVC model building process consists of 5 steps: deconvolution, model fitting, convolution, prediction error evaluation, and cross-validation. This is a time-consuming process, and typically at most a few models are tested for any given data set. The objectives of this work were to (1) propose a statistical tool to screen models for further development of an IVIVC, (2) evaluate the performance of each model under different circumstances, and (3) investigate the effectiveness of common statistical model selection criteria for choosing IVIVC models. A computer program was developed to explore which model(s) would be most likely to work well with a random variation from the original formulation. The process used Monte Carlo simulation techniques to build IVIVC models. Data-based model selection criteria (Akaike Information Criterion [AIC], R2) and the probability of passing the Food and Drug Administration "prediction error" requirement were calculated. Several real data sets representing a broad range of release profiles are used to illustrate the process and to demonstrate the advantages of this automated process over the traditional approach. The Hixson-Crowell and Weibull models were often preferred over the linear model. When evaluating whether a Level A IVIVC model was possible, the AIC generally selected the best model. We believe that the approach we proposed may be a rapid tool to determine which IVIVC model (if any) is the most applicable.
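The model-screening idea can be illustrated with a least-squares AIC comparison of two candidate dissolution models against a release profile. The time points, observed fractions, and parameter values below are made up for illustration; the real workflow would fit the parameters and Monte Carlo-perturb the formulation, both of which are omitted here.

```python
import math

def weibull_release(t, a, b):
    """Weibull dissolution model: F(t) = 1 - exp(-(t/a)**b)."""
    return 1.0 - math.exp(-((t / a) ** b))

def first_order(t, kconst):
    """First-order release: F(t) = 1 - exp(-k t)."""
    return 1.0 - math.exp(-kconst * t)

def rss(model, params, times, observed):
    """Residual sum of squares of a model against observed fractions."""
    return sum((model(t, *params) - y) ** 2 for t, y in zip(times, observed))

def aic_from_rss(rss_val, n, k):
    """Least-squares AIC: n*ln(RSS/n) + 2k (additive constants dropped)."""
    return n * math.log(rss_val / n) + 2 * k

# Hypothetical in vitro data: fraction dissolved at each time point (hours)
times = [0.5, 1, 2, 4, 8]
obs = [0.22, 0.39, 0.63, 0.86, 0.98]

# With b = 1 the Weibull reduces to first-order release with k = 1/a,
# so both models fit identically and AIC penalizes the extra parameter.
aic_weibull = aic_from_rss(rss(weibull_release, (2.0, 1.0), times, obs), n=5, k=2)
aic_first = aic_from_rss(rss(first_order, (0.5,), times, obs), n=5, k=1)
```

Here the simpler first-order model wins on AIC despite an identical fit, which is exactly the parsimony behavior a screening criterion should show.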
Model Selection in Historical Research Using Approximate Bayesian Computation
Rubio-Campillo, Xavier
2016-01-01
Formal Models and History Computational models are increasingly being used to study historical dynamics. This new trend, which could be named Model-Based History, makes use of recently published datasets and innovative quantitative methods to improve our understanding of past societies based on their written sources. The extensive use of formal models allows historians to re-evaluate hypotheses formulated decades ago and still subject to debate due to the lack of an adequate quantitative framework. The initiative has the potential to transform the discipline if it solves the challenges posed by the study of historical dynamics. These difficulties are based on the complexities of modelling social interaction, and the methodological issues raised by the evaluation of formal models against data with low sample size, high variance and strong fragmentation. Case Study This work examines an alternate approach to this evaluation based on a Bayesian-inspired model selection method. The validity of the classical Lanchester’s laws of combat is examined against a dataset comprising over a thousand battles spanning 300 years. Four variations of the basic equations are discussed, including the three most common formulations (linear, squared, and logarithmic) and a new variant introducing fatigue. Approximate Bayesian Computation is then used to infer both parameter values and model selection via Bayes Factors. Impact Results indicate decisive evidence favouring the new fatigue model. The interpretation of both parameter estimations and model selection provides new insights into the factors guiding the evolution of warfare. At a methodological level, the case study shows how model selection methods can be used to guide historical research through the comparison between existing hypotheses and empirical evidence. PMID:26730953
Input variable selection and calibration data selection for storm water quality regression models.
Sun, Siao; Bertrand-Krajewski, Jean-Luc
2013-01-01
Storm water quality models are useful tools in storm water management. Interest has been growing in analyzing existing data for developing models for urban storm water quality evaluations. It is important to select appropriate model inputs when many candidate explanatory variables are available. Model calibration and verification are essential steps in any storm water quality modeling. This study investigates input variable selection and calibration data selection in storm water quality regression models. The two selection problems interact, and a procedure is developed to fulfil the two selection tasks sequentially. The procedure first selects model input variables using a cross-validation method. An appropriate number of variables is identified as model inputs to ensure that a model is neither overfitted nor underfitted. Based on the model input selection results, calibration data selection is studied. Uncertainty of model performance due to calibration data selection is investigated with a random selection method. An approach using the cluster method is applied in order to enhance model calibration practice, based on the principle of selecting representative data for calibration. The comparison between results from the cluster selection method and random selection shows that the former can significantly improve the performance of calibrated models. It is found that the information content of the calibration data is important in addition to its size.
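Cross-validated input selection in its simplest form can be sketched as scoring each candidate explanatory variable by leave-one-out error. The storm-event data below are fabricated (a clearly informative rainfall-depth variable against an unrelated one), and a real application would extend this to multivariate fits and forward selection.

```python
def univariate_fit(xs, ys):
    """Ordinary least squares for y = a + b*x (closed form)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def loo_cv_rmse(xs, ys):
    """Leave-one-out cross-validation RMSE for a univariate linear model."""
    errs = []
    for i in range(len(xs)):
        a, b = univariate_fit(xs[:i] + xs[i + 1:], ys[:i] + ys[i + 1:])
        errs.append((ys[i] - (a + b * xs[i])) ** 2)
    return (sum(errs) / len(errs)) ** 0.5

# Hypothetical storm events: y = pollutant load,
# x1 = rainfall depth (informative), x2 = an unrelated candidate variable
y  = [2.1, 4.0, 6.2, 7.9, 10.1, 12.0]
x1 = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
x2 = [3.0, 1.0, 4.0, 1.0, 5.0, 9.0]

scores = {name: loo_cv_rmse(x, y) for name, x in [("x1", x1), ("x2", x2)]}
best_input = min(scores, key=scores.get)  # variable with lowest CV error
```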
Compromise Approach-Based Genetic Algorithm for Constrained Multiobjective Portfolio Selection Model
NASA Astrophysics Data System (ADS)
Li, Jun
In this paper, fuzzy set theory is incorporated into a multiobjective portfolio selection model for investors, taking into account three criteria: return, risk, and liquidity. The cardinality constraint, the buy-in threshold constraint, and the round-lot constraints are considered in the proposed model. To overcome the difficulty of evaluating a large set of efficient solutions and selecting the best one on the non-dominated surface, a compromise approach-based genetic algorithm is presented to obtain a compromised solution for the proposed constrained multiobjective portfolio selection model.
The effect of interface properties on nickel base alloy composites
NASA Technical Reports Server (NTRS)
Groves, M.; Grossman, T.; Senemeier, M.; Wright, K.
1995-01-01
This program was performed to assess the extent to which mechanical behavior models can predict the properties of sapphire fiber/nickel aluminide matrix composites and help guide their development by defining improved combinations of matrix and interface coating. The program consisted of four tasks: 1) selection of the matrix and interface coating constituents using a modeling-based approach; 2) fabrication of the selected materials; 3) testing and evaluation of the materials; and 4) evaluation of the behavior models to develop recommendations. Ni-50Al and Ni-20Al-30Fe (a/o) matrices were selected, which gave brittle and ductile behavior, respectively, and an interface coating of PVD YSZ was selected which provided strong bonding to the sapphire fiber. Significant fiber damage and strength loss were observed in the composites, which made straightforward comparison of properties with models difficult. Nevertheless, the models selected generally provided property predictions that agreed well with results when fiber degradation was incorporated. The presence of a strong interface bond was felt to be detrimental in the NiAl MMC system, where low toughness and low strength were observed.
Sustainable Supplier Performance Evaluation and Selection with Neofuzzy TOPSIS Method
Chaharsooghi, S. K.; Ashrafi, Mehdi
2014-01-01
Supplier selection plays an important role in supply chain management, and traditional criteria such as price, quality, and flexibility are considered for supplier performance evaluation in research. In recent years sustainability has received more attention in the supply chain management literature, with the triple bottom line (TBL) describing sustainability in supply chain management through social, environmental, and economic initiatives. This paper explores sustainability in supply chain management and examines the problem of identifying a new model for supplier selection based on an extended TBL approach in the supply chain, by presenting a fuzzy multicriteria method. Linguistic values of experts' subjective preferences are expressed with fuzzy numbers, and Neofuzzy TOPSIS is proposed for finding the best solution to the supplier selection problem. Numerical results show that the proposed model is efficient for integrating sustainability into the supplier selection problem. The importance of using complementary aspects of sustainability and the Neofuzzy TOPSIS concept in the sustainable supplier selection process is shown with a sensitivity analysis. PMID:27379267
Boutkhoum, Omar; Hanine, Mohamed; Agouti, Tarik; Tikniouine, Abdessadek
2015-01-01
In this paper, we examine the issue of strategic industrial location selection in uncertain decision-making environments for establishing a new industrial corporation. The industrial location issue is typically considered a crucial factor in business research, as it involves many considerations about natural resources, distributors, suppliers, customers, and more. Based on the integration of the environmental, economic, and social decisive elements of sustainable development, this paper presents a hybrid decision-making model combining fuzzy multi-criteria analysis with the analytical capabilities that OLAP systems can provide for successful and optimal industrial location selection. The proposed model consists of three stages. In the first stage, a decision-making committee is established to identify the evaluation criteria impacting the location selection process. In the second stage, we develop fuzzy AHP software based on the extent analysis method to assign importance weights to the selected criteria, which allows us to model linguistic vagueness, ambiguity, and incomplete knowledge. In the last stage, OLAP analysis integrated with multi-criteria analysis employs these weighted criteria as inputs to evaluate, rank, and select the strategic industrial location for establishing a new business corporation in the region of Casablanca, Morocco. Finally, a sensitivity analysis is performed to evaluate the impact of the criteria weights and the preferences given by decision makers on the final rankings of strategic industrial locations.
Ecological Modeling Guide for Ecosystem Restoration and Management
2012-08-01
may result from proposed restoration and management actions. This report provides information to guide environmental planners in the selection, development, evaluation, and documentation of ecological models.
Hippert, Henrique S; Taylor, James W
2010-04-01
Artificial neural networks have frequently been proposed for electricity load forecasting because of their capabilities for the nonlinear modelling of large multivariate data sets. Modelling with neural networks is not an easy task though; two of the main challenges are defining the appropriate level of model complexity, and choosing the input variables. This paper evaluates techniques for automatic neural network modelling within a Bayesian framework, as applied to six samples containing daily load and weather data for four different countries. We analyse input selection as carried out by the Bayesian 'automatic relevance determination', and the usefulness of the Bayesian 'evidence' for the selection of the best structure (in terms of number of neurones), as compared to methods based on cross-validation. Copyright 2009 Elsevier Ltd. All rights reserved.
Evaluating the habitat capability model for Merriam's turkeys
Mark A. Rumble; Stanley H. Anderson
1995-01-01
Habitat capability (HABCAP) models for wildlife assist land managers in predicting the consequences of their management decisions. Models must be tested and refined prior to using them in management planning. We tested the predicted patterns of habitat selection of the R2 HABCAP model using observed patterns of habitats selected by radio-marked Merriam's turkeys…
Computer-Aided Group Problem Solving for Unified Life Cycle Engineering (ULCE)
1989-02-01
defining the problem, generating alternative solutions, evaluating alternatives, selecting alternatives, and implementing the solution. Systems...specialist in group dynamics, assists the group in formulating the problem and selecting a model framework. The analyst provides the group with computer...allocating resources, evaluating and selecting options, making judgments explicit, and analyzing dynamic systems. c. University of Rhode Island Drs. Geoffery
Schöniger, Anneli; Wöhling, Thomas; Samaniego, Luis; Nowak, Wolfgang
2014-01-01
Bayesian model selection or averaging objectively ranks a number of plausible, competing conceptual models based on Bayes' theorem. It implicitly performs an optimal trade-off between performance in fitting available data and minimum model complexity. The procedure requires determining Bayesian model evidence (BME), which is the likelihood of the observed data integrated over each model's parameter space. The computation of this integral is highly challenging because it is as high-dimensional as the number of model parameters. Three classes of techniques to compute BME are available, each with its own challenges and limitations: (1) Exact and fast analytical solutions are limited by strong assumptions. (2) Numerical evaluation quickly becomes unfeasible for expensive models. (3) Approximations known as information criteria (ICs) such as the AIC, BIC, or KIC (Akaike, Bayesian, or Kashyap information criterion, respectively) yield contradicting results with regard to model ranking. Our study features a theory-based intercomparison of these techniques. We further assess their accuracy in a simplistic synthetic example where for some scenarios an exact analytical solution exists. In more challenging scenarios, we use a brute-force Monte Carlo integration method as reference. We continue this analysis with a real-world application of hydrological model selection. This is a first-time benchmarking of the various methods for BME evaluation against true solutions. Results show that BME values from ICs are often heavily biased and that the choice of approximation method substantially influences the accuracy of model ranking. For reliable model selection, bias-free numerical methods should be preferred over ICs whenever computationally feasible. PMID:25745272
Mixed conditional logistic regression for habitat selection studies.
Duchesne, Thierry; Fortin, Daniel; Courbin, Nicolas
2010-05-01
1. Resource selection functions (RSFs) are becoming a dominant tool in habitat selection studies. RSF coefficients can be estimated with unconditional (standard) and conditional logistic regressions. While the advantage of mixed-effects models is recognized for standard logistic regression, mixed conditional logistic regression remains largely overlooked in ecological studies. 2. We demonstrate the significance of mixed conditional logistic regression for habitat selection studies. First, we use spatially explicit models to illustrate how mixed-effects RSFs can be useful in the presence of inter-individual heterogeneity in selection and when the assumption of independence from irrelevant alternatives (IIA) is violated. The IIA hypothesis states that the strength of preference for habitat type A over habitat type B does not depend on the other habitat types also available. Secondly, we demonstrate the significance of mixed-effects models to evaluate habitat selection of free-ranging bison Bison bison. 3. When movement rules were homogeneous among individuals and the IIA assumption was respected, fixed-effects RSFs adequately described habitat selection by simulated animals. In situations violating the inter-individual homogeneity and IIA assumptions, however, RSFs were best estimated with mixed-effects regressions, and fixed-effects models could even provide faulty conclusions. 4. Mixed-effects models indicate that bison did not select farmlands, but exhibited strong inter-individual variations in their response to farmlands. Less than half of the bison preferred farmlands over forests. Conversely, the fixed-effect model simply suggested an overall selection for farmlands. 5. Conditional logistic regression is recognized as a powerful approach to evaluate habitat selection when resource availability changes. This regression is increasingly used in ecological studies, but almost exclusively in the context of fixed-effects models. 
Fitness maximization can imply differences in trade-offs among individuals, which can yield inter-individual differences in selection and lead to departure from IIA. These situations are best modelled with mixed-effects models. Mixed-effects conditional logistic regression should become a valuable tool for ecological research.
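The fixed-effects core of conditional logistic regression for habitat selection can be sketched as a matched-stratum likelihood: each used location is compared only against its own set of available locations. The strata and the forest-cover covariate below are fabricated, and the random-coefficient (mixed-effects) extension the paper advocates, which integrates over per-individual coefficients, is omitted.

```python
import math

def cond_log_lik(beta, strata):
    """Conditional logistic regression log-likelihood for habitat selection.
    Each stratum pairs the used location's covariates with those of its
    matched available locations: P(used) = exp(x'b) / sum_j exp(x_j'b)."""
    ll = 0.0
    for used, available in strata:
        score = lambda x: sum(b * xi for b, xi in zip(beta, x))
        denom = math.exp(score(used)) + sum(math.exp(score(x)) for x in available)
        ll += score(used) - math.log(denom)
    return ll

# Hypothetical strata: one used location vs. two available, covariate = forest cover
strata = [
    ((0.9,), [(0.2,), (0.1,)]),
    ((0.8,), [(0.3,), (0.4,)]),
    ((0.7,), [(0.1,), (0.2,)]),
]
ll_null = cond_log_lik((0.0,), strata)  # no selection: every choice is 1/3
ll_sel  = cond_log_lik((2.0,), strata)  # positive selection for forest cover
```

Because the used location has the highest forest cover in every stratum, a positive selection coefficient raises the likelihood above the null, which is the direction of evidence a fitted RSF would report.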
NASA Astrophysics Data System (ADS)
Sakuma, Jun; Wright, Rebecca N.
Privacy-preserving classification is the task of learning or training a classifier on the union of privately distributed datasets without sharing the datasets. The emphasis of existing studies in privacy-preserving classification has primarily been put on the design of privacy-preserving versions of particular data mining algorithms. However, in classification problems, preprocessing and postprocessing, such as model selection or attribute selection, play a prominent role in achieving higher classification accuracy. In this paper, we show that the generalization error of classifiers in privacy-preserving classification can be securely evaluated without sharing prediction results. Our main technical contribution is a new generalized Hamming distance protocol that is universally applicable to the preprocessing and postprocessing of various privacy-preserving classification problems, such as model selection in support vector machines and attribute selection in naive Bayes classification.
Nguyen, Huu-Tho; Dawal, Siti Zawiah Md; Nukman, Yusoff; Rifai, Achmad P; Aoyama, Hideki
2016-01-01
The conveyor system plays a vital role in improving the performance of flexible manufacturing cells (FMCs). The conveyor selection problem involves the evaluation of a set of potential alternatives based on qualitative and quantitative criteria. This paper presents an integrated multi-criteria decision making (MCDM) model of a fuzzy AHP (analytic hierarchy process) and fuzzy ARAS (additive ratio assessment) for conveyor evaluation and selection. In this model, linguistic terms represented as triangular fuzzy numbers are used to quantify experts' uncertain assessments of alternatives with respect to the criteria. The fuzzy set is then integrated into the AHP to determine the weights of the criteria. Finally, a fuzzy ARAS is used to calculate the weights of the alternatives. To demonstrate the effectiveness of the proposed model, a case study is performed on a practical example, and the results obtained demonstrate practical potential for the implementation of FMCs. PMID:27070543
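The AHP weighting stage used in hybrid models like this one can be sketched in crisp form with the geometric-mean method; the fuzzy (triangular-number) extension and the ARAS ranking step are omitted, and the three criteria and pairwise judgments below are made up.

```python
import math

def ahp_weights(pairwise):
    """Criteria weights from an AHP pairwise-comparison matrix using the
    geometric-mean method (a common approximation to the principal
    eigenvector of the reciprocal matrix)."""
    n = len(pairwise)
    gm = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical 3-criteria comparison on the Saaty 1-9 scale
# (reciprocal matrix: a[j][i] = 1 / a[i][j]); e.g. cost vs. speed vs. reliability
pairwise = [
    [1.0,   3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]
w = ahp_weights(pairwise)  # normalized criteria weights, summing to 1
```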
Evaluating the Assessment Models for Young Children with Special Needs in Taiwan
ERIC Educational Resources Information Center
Ho, Hua-Kuo
2009-01-01
The purpose of this study was intended to evaluate the assessment models of two representative centers of team evaluation for children's development in Taiwan. Documentary analysis and phone interview were employed in the study to collect the research data needed. Two centers of team evaluation for children's development were selected and…
Selecting Models for Measuring Change When True Experimental Conditions Do Not Exist.
ERIC Educational Resources Information Center
Fortune, Jim C.; Hutson, Barbara A.
1984-01-01
Measuring change when true experimental conditions do not exist is a difficult process. This article reviews the artifacts of change measurement in evaluations and quasi-experimental designs, delineates considerations in choosing a model to measure change under nonideal conditions, and suggests ways to organize models to facilitate selection.…
ERIC Educational Resources Information Center
Beretvas, S. Natasha; Murphy, Daniel L.
2013-01-01
The authors assessed correct model identification rates of Akaike's information criterion (AIC), corrected criterion (AICC), consistent AIC (CAIC), Hannan and Quinn's information criterion (HQIC), and Bayesian information criterion (BIC) for selecting among cross-classified random effects models. Performance of default values for the 5…
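The five criteria compared in that study differ only in their complexity penalty, so they can be computed side by side from a model's maximized log-likelihood, parameter count, and sample size. These are the standard textbook forms; the log-likelihood, k, and n below are arbitrary example values.

```python
import math

def info_criteria(log_lik, k, n):
    """AIC, AICC, CAIC, HQIC, and BIC from a maximized log-likelihood,
    number of estimated parameters k, and sample size n."""
    d = -2.0 * log_lik  # shared deviance term
    return {
        "AIC":  d + 2 * k,
        "AICC": d + 2 * k + (2 * k * (k + 1)) / (n - k - 1),
        "CAIC": d + k * (math.log(n) + 1),
        "HQIC": d + 2 * k * math.log(math.log(n)),
        "BIC":  d + k * math.log(n),
    }

crit = info_criteria(log_lik=-612.3, k=5, n=400)
```

For any fixed fit, CAIC always exceeds BIC by exactly k, and for n above roughly 16 the penalties order as AIC < AICC < HQIC < BIC < CAIC, which is why the stricter criteria favor smaller random-effects structures.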
Measures of GCM Performance as Functions of Model Parameters Affecting Clouds and Radiation
NASA Astrophysics Data System (ADS)
Jackson, C.; Mu, Q.; Sen, M.; Stoffa, P.
2002-05-01
This abstract is one of three related presentations at this meeting dealing with several issues surrounding optimal parameter and uncertainty estimation of model predictions of climate. Uncertainty in model predictions of climate depends in part on the uncertainty produced by model approximations or parameterizations of unresolved physics. Evaluating these uncertainties is computationally expensive because one needs to evaluate how arbitrary choices for any given combination of model parameters affect model performance. Because the computational effort grows exponentially with the number of parameters being investigated, it is important to choose parameters carefully. Evaluating whether a parameter is worth investigating depends on two considerations: 1) do reasonable choices of parameter values produce a large range in model response relative to observational uncertainty? and 2) does the model response depend non-linearly on various combinations of model parameters? We have decided to narrow our attention to parameters that affect clouds and radiation, as it is likely that these parameters will dominate uncertainties in model predictions of future climate. We present preliminary results of ~20 to 30 AMIP II-style climate model integrations using NCAR's CCM3.10 that show model performance as a function of individual parameters controlling 1) the critical relative humidity for cloud formation (RHMIN), and 2) the boundary layer critical Richardson number (RICR). We also explore various definitions of model performance that include some or all observational data sources (surface air temperature and pressure, meridional and zonal winds, clouds, long- and short-wave cloud forcings, etc.) and evaluate in a few select cases whether the model's response depends non-linearly on the parameter values we have selected.
SENSITIVE PARAMETER EVALUATION FOR A VADOSE ZONE FATE AND TRANSPORT MODEL
This report presents information pertaining to quantitative evaluation of the potential impact of selected parameters on output of vadose zone transport and fate models used to describe the behavior of hazardous chemicals in soil. The Vadose Zone Interactive Processes (VIP) model...
Scalable gastroscopic video summarization via similar-inhibition dictionary selection.
Wang, Shuai; Cong, Yang; Cao, Jun; Yang, Yunsheng; Tang, Yandong; Zhao, Huaici; Yu, Haibin
2016-01-01
This paper aims at developing an automated gastroscopic video summarization algorithm to assist clinicians to more effectively go through the abnormal contents of the video. To select the most representative frames from the original video sequence, we formulate the problem of gastroscopic video summarization as a dictionary selection issue. Different from the traditional dictionary selection methods, which take into account only the number and reconstruction ability of selected key frames, our model introduces the similar-inhibition constraint to reinforce the diversity of selected key frames. We calculate the attention cost by merging both gaze and content change into a prior cue to help select the frames with more high-level semantic information. Moreover, we adopt an image quality evaluation process to eliminate the interference of the poor quality images and a segmentation process to reduce the computational complexity. For experiments, we build a new gastroscopic video dataset captured from 30 volunteers with more than 400k images and compare our method with state-of-the-art methods using content consistency, index consistency and content-index consistency with the ground truth. Compared with all competitors, our method obtains the best results in 23 of 30 videos evaluated based on content consistency, 24 of 30 videos evaluated based on index consistency and all videos evaluated based on content-index consistency. For gastroscopic video summarization, we propose an automated annotation method via similar-inhibition dictionary selection. Our model can achieve better performance compared with other state-of-the-art models and supplies more suitable key frames for diagnosis. The developed algorithm can be automatically adapted to various real applications, such as the training of young clinicians, computer-aided diagnosis or medical report generation. Copyright © 2015 Elsevier B.V. All rights reserved.
Svolos, Patricia; Tsougos, Ioannis; Kyrgias, Georgios; Kappas, Constantine; Theodorou, Kiki
2011-04-01
In this study we sought to evaluate and emphasize the importance of radiobiological parameter selection and implementation in normal tissue complication probability (NTCP) models. The relative seriality (RS) and the Lyman-Kutcher-Burman (LKB) models were studied. For each model, minimum and maximum radiobiological parameter sets were selected from the overall published sets applied in the literature, and a theoretical mean parameter set was computed. In order to investigate potential model weaknesses in NTCP estimation and to point out the correct use of model parameters, these sets were used as input to the RS and LKB models, estimating radiation-induced complications for a group of 36 breast cancer patients treated with radiotherapy. The clinical endpoint examined was radiation pneumonitis. Each model was represented by a certain dose-response range when the selected parameter sets were applied. Comparing the models over their ranges, a large area of coincidence was revealed. If the parameter uncertainties (standard deviations) are included in the models, their area of coincidence might be enlarged, further constraining their predictive ability. The selection of the proper radiobiological parameter set for a given clinical endpoint is crucial. Published parameter values are not definitive but should be accompanied by uncertainties, and one should be very careful when applying them to NTCP models. Correct selection and proper implementation of published parameters provide a quite accurate fit of the NTCP models to the considered endpoint.
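The LKB model the study evaluates has a standard closed form: NTCP is the normal CDF of (gEUD - TD50) / (m * TD50), where the generalized equivalent uniform dose gEUD is computed from the dose-volume histogram via the volume-effect parameter n. A minimal sketch follows; the parameter values in the test are illustrative placeholders, not the published sets the study compares:

```python
import math

def gEUD(doses, volumes, n):
    """Generalized equivalent uniform dose from a DVH.

    doses: bin doses (Gy); volumes: fractional volumes summing to 1;
    n: volume-effect parameter (n=1 reduces gEUD to the mean dose).
    """
    return sum(v * d ** (1.0 / n) for d, v in zip(doses, volumes)) ** n

def lkb_ntcp(doses, volumes, TD50, m, n):
    """Lyman-Kutcher-Burman NTCP: probit of (gEUD - TD50) / (m * TD50)."""
    t = (gEUD(doses, volumes, n) - TD50) / (m * TD50)
    # Normal CDF via the error function
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))
```

The sensitivity the abstract describes is visible directly here: small shifts in TD50, m, or n move the entire dose-response curve, which is why parameter sets from different publications can produce divergent NTCP estimates for the same DVH.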
NASA Technical Reports Server (NTRS)
Frederick, D. K.; Lashmet, P. K.; Sandor, G. N.; Shen, C. N.; Smith, E. J.; Yerazunis, S. W.
1972-01-01
The problems related to the design and control of a mobile planetary vehicle to implement a systematic plan for the exploration of Mars were investigated. Problem areas receiving attention include: vehicle configuration, control, dynamics, systems and propulsion; systems analysis; navigation, terrain modeling and path selection; and chemical analysis of specimens. The following specific tasks were studied: vehicle model design, mathematical modeling of dynamic vehicle, experimental vehicle dynamics, obstacle negotiation, electromechanical controls, collapsibility and deployment, construction of a wheel tester, wheel analysis, payload design, system design optimization, effect of design assumptions, accessory optimal design, on-board computer subsystem, laser range measurement, discrete obstacle detection, obstacle detection systems, terrain modeling, path selection system simulation and evaluation, gas chromatograph/mass spectrometer system concepts, chromatograph model evaluation and improvement and transport parameter evaluation.
A Process Model of Principal Selection.
ERIC Educational Resources Information Center
Flanigan, J. L.; And Others
A process model to assist school district superintendents in the selection of principals is presented in this paper. Components of the process are described, which include developing an action plan, formulating an explicit job description, advertising, assessing candidates' philosophy, conducting interview analyses, evaluating response to stress,…
Assessing Predictive Properties of Genome-Wide Selection in Soybeans
Xavier, Alencar; Muir, William M.; Rainey, Katy Martin
2016-01-01
Many economically important traits in plant breeding have low heritability or are difficult to measure. For these traits, genomic selection has attractive features and may boost genetic gains. Our goal was to evaluate alternative scenarios to implement genomic selection for yield components in soybean (Glycine max (L.) Merr.). We used a nested association panel with cross validation to evaluate the impacts of training population size, genotyping density, and prediction model on the accuracy of genomic prediction. Our results indicate that training population size was the factor most relevant to improvement in genome-wide prediction, with greatest improvement observed in training sets up to 2000 individuals. We discuss assumptions that influence the choice of the prediction model. Although alternative models had minor impacts on prediction accuracy, the most robust prediction model was the combination of reproducing kernel Hilbert space regression and BayesB. Higher genotyping density marginally improved accuracy. Our study finds that breeding programs seeking efficient genomic selection in soybeans would best allocate resources by investing in a representative training set. PMID:27317786
Evaluation of new collision-pair selection models in DSMC
NASA Astrophysics Data System (ADS)
Akhlaghi, Hassan; Roohi, Ehsan
2017-10-01
The current paper investigates new collision-pair selection procedures in a direct simulation Monte Carlo (DSMC) method. Collision partner selection based on the random procedure from nearest neighbor particles and deterministic selection of nearest neighbor particles have already been introduced as schemes that provide accurate results in a wide range of problems. In the current research, new collision-pair selections based on the time spacing and direction of the relative movement of particles are introduced and evaluated. Comparisons between the new and existing algorithms are made considering appropriate test cases including fluctuations in homogeneous gas, 2D equilibrium flow, and Fourier flow problem. Distribution functions for number of particles and collisions in cell, velocity components, and collisional parameters (collision separation, time spacing, relative velocity, and the angle between relative movements of particles) are investigated and compared with existing analytical relations for each model. The capability of each model in the prediction of the heat flux in the Fourier problem at different cell numbers, numbers of particles, and time steps is examined. For new and existing collision-pair selection schemes, the effect of an alternative formula for the number of collision-pair selections and avoiding repetitive collisions are investigated via the prediction of the Fourier heat flux. The simulation results demonstrate the advantages and weaknesses of each model in different test cases.
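The nearest-neighbor collision-partner schemes the paper takes as its baseline can be sketched in a few lines. This toy 2D version with a brute-force search pairs a particle in a cell with its nearest neighbor; the paper's new time-spacing and relative-direction criteria are not modeled here, and the function signature is purely illustrative:

```python
import random

def select_collision_pair(positions, i=None):
    """Nearest-neighbor collision-partner selection within one DSMC cell.

    positions: list of (x, y) coordinates of particles in the cell.
    If i is None, the first particle of the pair is drawn at random;
    its partner j is the deterministic nearest neighbor.
    """
    if i is None:
        i = random.randrange(len(positions))
    xi, yi = positions[i]
    # Brute-force nearest neighbor; real DSMC codes use cell/subcell grids
    j = min((k for k in range(len(positions)) if k != i),
            key=lambda k: (positions[k][0] - xi) ** 2 + (positions[k][1] - yi) ** 2)
    return i, j
```

Minimizing the collision separation in this way is what makes such schemes accurate at coarse cell resolutions, which is also why the paper measures the distribution of collision separation when comparing schemes.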
Multi-agent Reinforcement Learning Model for Effective Action Selection
NASA Astrophysics Data System (ADS)
Youk, Sang Jo; Lee, Bong Keun
Reinforcement learning is a subarea of machine learning concerned with how an agent ought to take actions in an environment so as to maximize some notion of long-term reward. In the multi-agent case, the state and action spaces become enormous compared to the single-agent case, so the most effective available action-selection strategy is needed for efficient reinforcement learning. This paper proposes a multi-agent reinforcement learning model based on a fuzzy inference system in order to improve learning speed and select effective actions in multi-agent settings. The paper verifies the effectiveness of the action-selection strategy through evaluation tests based on RoboCup Keepaway, one of the standard test-beds for multi-agent systems. The proposed model can be applied to evaluate the efficiency of various intelligent multi-agent systems and also to the strategy and tactics of robot soccer systems.
Nallikuzhy, Jiss J; Dandapat, S
2017-06-01
In this work, a new patient-specific approach to enhance the spatial resolution of ECG is proposed and evaluated. The proposed model transforms a three-lead ECG into a standard twelve-lead ECG, thereby enhancing its spatial resolution. The three leads used for prediction are obtained from the standard twelve-lead ECG. The proposed model takes advantage of the improved inter-lead correlation in the wavelet domain. Since the model is patient-specific, it also selects the optimal predictor leads for a given patient using a lead selection algorithm. The lead selection algorithm is based on a new diagnostic similarity score which computes the diagnostic closeness between the original and the spatially enhanced leads. Standard closeness measures are used to assess the performance of the model. The similarity in diagnostic information between the original and the spatially enhanced leads is evaluated using various diagnostic measures. Repeatability and diagnosability analyses are performed to quantify the applicability of the model. A comparison of the proposed model is performed with existing models that transform a subset of the standard twelve-lead ECG into the standard twelve-lead ECG. From the analysis of the results, it is evident that the proposed model preserves diagnostic information better compared to other models. Copyright © 2017 Elsevier Ltd. All rights reserved.
Selection of Variables in Cluster Analysis: An Empirical Comparison of Eight Procedures
ERIC Educational Resources Information Center
Steinley, Douglas; Brusco, Michael J.
2008-01-01
Eight different variable selection techniques for model-based and non-model-based clustering are evaluated across a wide range of cluster structures. It is shown that several methods have difficulties when non-informative variables (i.e., random noise) are included in the model. Furthermore, the distribution of the random noise greatly impacts the…
[Evaluation on a fast weight reduction model in vitro].
Li, Songtao; Li, Ying; Wen, Ying; Sun, Changhao
2010-03-01
To establish a fast and effective in vitro model for screening weight-reducing drugs and to provide a preliminary evaluation of the model, mature adipocytes of SD rats induced by oleic acid were used to establish an obesity model in vitro. Isoproterenol, genistein, and caffeine were selected as positive agents and curcumin as a negative agent to evaluate the obesity model. Lipolysis of adipocytes was stimulated significantly by isoproterenol, genistein, and caffeine but not by curcumin. This model can be used efficiently for screening weight-reducing drugs.
Various models have been proposed for describing the time- and concentration-dependence of toxic effects to aquatic organisms, which would improve characterization of risks in natural systems. Selected models were evaluated using results from a study on the lethality of copper t...
The Evaluation and Selection of Adequate Causal Models: A Compensatory Education Example.
ERIC Educational Resources Information Center
Tanaka, Jeffrey S.
1982-01-01
Implications of model evaluation (using traditional chi square goodness of fit statistics, incremental fit indices for covariance structure models, and latent variable coefficients of determination) on substantive conclusions are illustrated with an example examining the effects of participation in a compensatory education program on posttreatment…
Critical evaluation of models used to study agricultural phosphorus and water quality
USDA-ARS?s Scientific Manuscript database
This article uses examples from recent research to evaluate select issues related to developing models that simulate phosphorus in the environment. Models are valuable because they force scientists to formalize understanding of P systems and identify knowledge gaps, and allow quantification of P tra...
Valente, Bruno D.; Morota, Gota; Peñagaricano, Francisco; Gianola, Daniel; Weigel, Kent; Rosa, Guilherme J. M.
2015-01-01
The term “effect” in additive genetic effect suggests a causal meaning. However, inferences of such quantities for selection purposes are typically viewed and conducted as a prediction task. Predictive ability as tested by cross-validation is currently the most acceptable criterion for comparing models and evaluating new methodologies. Nevertheless, it does not directly indicate if predictors reflect causal effects. Such evaluations would require causal inference methods that are not typical in genomic prediction for selection. This suggests that the usual approach to infer genetic effects contradicts the label of the quantity inferred. Here we investigate if genomic predictors for selection should be treated as standard predictors or if they must reflect a causal effect to be useful, requiring causal inference methods. Conducting the analysis as a prediction or as a causal inference task affects, for example, how covariates of the regression model are chosen, which may heavily affect the magnitude of genomic predictors and therefore selection decisions. We demonstrate that selection requires learning causal genetic effects. However, genomic predictors from some models might capture noncausal signal, providing good predictive ability but poorly representing true genetic effects. Simulated examples are used to show that aiming for predictive ability may lead to poor modeling decisions, while causal inference approaches may guide the construction of regression models that better infer the target genetic effect even when they underperform in cross-validation tests. In conclusion, genomic selection models should be constructed to aim primarily for identifiability of causal genetic effects, not for predictive ability. PMID:25908318
Examining speed versus selection in connectivity models using elk migration as an example
Brennan, Angela; Hanks, Ephraim M.; Merkle, Jerod A.; Cole, Eric K.; Dewey, Sarah R.; Courtemanch, Alyson B.; Cross, Paul C.
2018-01-01
Context: Landscape resistance is vital to connectivity modeling and frequently derived from resource selection functions (RSFs). RSFs estimate relative probability of use and tend to focus on understanding habitat preferences during slow, routine animal movements (e.g., foraging). Dispersal and migration, however, can produce rarer, faster movements, in which case models of movement speed rather than resource selection may be more realistic for identifying habitats that facilitate connectivity. Objective: To compare two connectivity modeling approaches applied to resistance estimated from models of movement rate and resource selection. Methods: Using movement data from migrating elk, we evaluated continuous time Markov chain (CTMC) and movement-based RSF models (i.e., step selection functions [SSFs]). We applied circuit theory and shortest random path (SRP) algorithms to CTMC, SSF and null (i.e., flat) resistance surfaces to predict corridors between elk seasonal ranges. We evaluated prediction accuracy by comparing model predictions to empirical elk movements. Results: All connectivity models predicted elk movements well, but models applied to CTMC resistance were more accurate than models applied to SSF and null resistance. Circuit theory models were more accurate on average than SRP models. Conclusions: CTMC can be more realistic than SSFs for estimating resistance for fast movements, though SSFs may demonstrate some predictive ability when animals also move slowly through corridors (e.g., stopover use during migration). High null model accuracy suggests seasonal range data may also be critical for predicting direct migration routes. For animals that migrate or disperse across large landscapes, we recommend incorporating CTMC into the connectivity modeling toolkit.
Tsoi, B; O'Reilly, D; Jegathisawaran, J; Tarride, J-E; Blackhouse, G; Goeree, R
2015-06-17
In constructing or appraising a health economic model, an early consideration is whether the modelling approach selected is appropriate for the given decision problem. Frameworks and taxonomies that distinguish between modelling approaches can help make this decision more systematic and this study aims to identify and compare the decision frameworks proposed to date on this topic area. A systematic review was conducted to identify frameworks from peer-reviewed and grey literature sources. The following databases were searched: OVID Medline and EMBASE; Wiley's Cochrane Library and Health Economic Evaluation Database; PubMed; and ProQuest. Eight decision frameworks were identified, each focused on a different set of modelling approaches and employing a different collection of selection criterion. The selection criteria can be categorized as either: (i) structural features (i.e. technical elements that are factual in nature) or (ii) practical considerations (i.e. context-dependent attributes). The most commonly mentioned structural features were population resolution (i.e. aggregate vs. individual) and interactivity (i.e. static vs. dynamic). Furthermore, understanding the needs of the end-users and stakeholders was frequently incorporated as a criterion within these frameworks. There is presently no universally-accepted framework for selecting an economic modelling approach. Rather, each highlights different criteria that may be of importance when determining whether a modelling approach is appropriate. Further discussion is thus necessary as the modelling approach selected will impact the validity of the underlying economic model and have downstream implications on its efficiency, transparency and relevance to decision-makers.
Multicriteria Personnel Selection by the Modified Fuzzy VIKOR Method
Alguliyev, Rasim M.; Aliguliyev, Ramiz M.; Mahmudova, Rasmiyya S.
2015-01-01
Personnel evaluation is an important process in human resource management. The multicriteria nature and the presence of both qualitative and quantitative factors make it considerably more complex. In this study, a fuzzy hybrid multicriteria decision-making (MCDM) model is proposed for personnel evaluation. This model solves the personnel evaluation problem in a fuzzy environment where both criteria and weights can be fuzzy sets. Triangular fuzzy numbers are used to evaluate the suitability of personnel and the approximate reasoning of linguistic values. For evaluation, we have selected five information culture criteria. The weights of the criteria were calculated using the worst-case method. After that, a modified fuzzy VIKOR is proposed to rank the alternatives. The outcome of this research is ranking and selecting the best alternative with the help of the fuzzy VIKOR and modified fuzzy VIKOR techniques. A comparative analysis of results by the fuzzy VIKOR and modified fuzzy VIKOR methods is presented. Experiments showed that the proposed modified fuzzy VIKOR method has some advantages over the fuzzy VIKOR method. Firstly, from a computational complexity point of view, the presented model is effective. Secondly, it has a higher acceptability advantage compared to the fuzzy VIKOR method. PMID:26516634
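The VIKOR ranking underlying the modified fuzzy variant reduces, in the crisp case, to computing a group-utility value S, an individual regret R, and a compromise index Q for each alternative, then ranking by Q. A minimal crisp sketch (benefit criteria only, illustrative scores and weights; the paper's fuzzy extension with triangular fuzzy numbers is not reproduced here):

```python
def vikor_rank(scores, weights, v=0.5):
    """Crisp VIKOR: rank alternatives by compromise index Q (lower is better).

    scores: one list of benefit-criterion values per alternative.
    weights: criterion weights; v trades group utility against regret.
    Assumes criteria actually differentiate the alternatives
    (otherwise the Q normalization divides by zero).
    """
    ncrit = len(weights)
    best = [max(a[j] for a in scores) for j in range(ncrit)]
    worst = [min(a[j] for a in scores) for j in range(ncrit)]
    S, R = [], []
    for a in scores:
        # Weighted normalized distance from the ideal on each criterion
        terms = [weights[j] * (best[j] - a[j]) / (best[j] - worst[j])
                 for j in range(ncrit)]
        S.append(sum(terms))   # group utility
        R.append(max(terms))   # individual regret
    Smin, Smax = min(S), max(S)
    Rmin, Rmax = min(R), max(R)
    Q = [v * (S[i] - Smin) / (Smax - Smin)
         + (1 - v) * (R[i] - Rmin) / (Rmax - Rmin)
         for i in range(len(scores))]
    return sorted(range(len(scores)), key=lambda i: Q[i])
```

A full VIKOR implementation would also check the acceptable-advantage and acceptable-stability conditions before declaring the top-ranked alternative a compromise solution, which is the acceptability property the abstract's comparison refers to.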
A CLIPS-based expert system for the evaluation and selection of robots
NASA Technical Reports Server (NTRS)
Nour, Mohamed A.; Offodile, Felix O.; Madey, Gregory R.
1994-01-01
This paper describes the development of a prototype expert system for intelligent selection of robots for manufacturing operations. The paper first develops a comprehensive, three-stage process to model the robot selection problem. The decisions involved in this model easily lend themselves to an expert system application. A rule-based system, based on the selection model, is developed using the CLIPS expert system shell. Data about actual robots is used to test the performance of the prototype system. Further extensions to the rule-based system for data handling and interfacing capabilities are suggested.
Comparisons of Means Using Exploratory and Confirmatory Approaches
ERIC Educational Resources Information Center
Kuiper, Rebecca M.; Hoijtink, Herbert
2010-01-01
This article discusses comparisons of means using exploratory and confirmatory approaches. Three methods are discussed: hypothesis testing, model selection based on information criteria, and Bayesian model selection. Throughout the article, an example is used to illustrate and evaluate the two approaches and the three methods. We demonstrate that…
Genomic selection in a commercial winter wheat population.
He, Sang; Schulthess, Albert Wilhelm; Mirdita, Vilson; Zhao, Yusheng; Korzun, Viktor; Bothe, Reiner; Ebmeyer, Erhard; Reif, Jochen C; Jiang, Yong
2016-03-01
Genomic selection models can be trained using historical data, and filtering genotypes based on phenotyping intensity and a reliability criterion can increase prediction ability. We implemented genomic selection based on a large commercial population incorporating 2325 European winter wheat lines. Our objectives were (1) to study whether modeling epistasis besides additive genetic effects results in enhancement of the prediction ability of genomic selection, (2) to assess prediction ability when the training population comprised historical or less-intensively phenotyped lines, and (3) to explore the prediction ability in subpopulations selected based on the reliability criterion. We found a 5 % increase in prediction ability when shifting from additive to additive plus epistatic effects models. In addition, only a marginal loss from 0.65 to 0.50 in accuracy was observed using the data collected from 1 year to predict genotypes of the following year, revealing that stable genomic selection models can be accurately calibrated to predict subsequent breeding stages. Moreover, prediction ability was maximized when the genotypes evaluated in a single location were excluded from the training set but subsequently decreased again when the phenotyping intensity was increased above two locations, suggesting that the update of the training population should be performed considering all the selected genotypes but excluding those evaluated in a single location. The genomic prediction ability was substantially higher in subpopulations selected based on the reliability criterion, indicating that phenotypic selection for highly reliable individuals could be directly replaced by applying genomic selection to them. We empirically conclude that there is a high potential to assist commercial wheat breeding programs employing genomic selection approaches.
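Genomic prediction of the kind described is commonly implemented as ridge regression on marker genotypes (RR-BLUP); this is a sketch under that assumption, not the study's method. The epistatic terms the study adds are omitted, and the shrinkage parameter `lam` stands in for the residual-to-marker variance ratio that a full mixed-model fit would estimate:

```python
import numpy as np

def ridge_gebv(X_train, y_train, X_test, lam=1.0):
    """Ridge-regression marker effects and genomic estimated breeding values.

    X_train, X_test: (lines x markers) genotype matrices, centered;
    y_train: phenotypes of the training lines (e.g., a historical year);
    lam: shrinkage parameter (residual-to-marker variance ratio).
    Returns GEBVs for the candidate lines in X_test.
    """
    p = X_train.shape[1]
    # Solve (X'X + lam*I) beta = X'y for the marker effects
    beta = np.linalg.solve(X_train.T @ X_train + lam * np.eye(p),
                           X_train.T @ y_train)
    return X_test @ beta
```

The abstract's year-to-year evaluation corresponds to training on one year's (X, y) and computing GEBVs for the next year's candidates; the reported drop from 0.65 to 0.50 is the correlation between such GEBVs and observed phenotypes.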
Ovenden, Ben; Milgate, Andrew; Wade, Len J; Rebetzke, Greg J; Holland, James B
2018-05-31
Abiotic stress tolerance traits are often complex and recalcitrant targets for conventional breeding improvement in many crop species. This study evaluated the potential of genomic selection to predict water-soluble carbohydrate concentration (WSCC), an important drought tolerance trait, in wheat under field conditions. A panel of 358 varieties and breeding lines constrained for maturity was evaluated under rainfed and irrigated treatments across two locations and two years. Whole-genome marker profiles and factor analytic mixed models were used to generate genomic estimated breeding values (GEBVs) for specific environments and environment groups. Additive genetic variance was smaller than residual genetic variance for WSCC, such that genotypic values were dominated by residual genetic effects rather than additive breeding values. As a result, GEBVs were not accurate predictors of genotypic values of the extant lines, but GEBVs should be reliable selection criteria to choose parents for intermating to produce new populations. The accuracy of GEBVs for untested lines was sufficient to increase predicted genetic gain from genomic selection per unit time compared to phenotypic selection if the breeding cycle is reduced by half by the use of GEBVs in off-season generations. Further, genomic prediction accuracy depended on having phenotypic data from environments with strong correlations with target production environments to build prediction models. By combining high-density marker genotypes, stress-managed field evaluations, and mixed models that model simultaneously covariances among genotypes and covariances of complex trait performance between pairs of environments, we were able to train models with good accuracy to facilitate genetic gain from genomic selection. Copyright © 2018 Ovenden et al.
The use of modelling to evaluate and adapt strategies for animal disease control.
Saegerman, C; Porter, S R; Humblet, M F
2011-08-01
Disease is often associated with debilitating clinical signs, disorders or production losses in animals and/or humans, leading to severe socio-economic repercussions. This explains the high priority that national health authorities and international organisations give to selecting control strategies for and the eradication of specific diseases. When a control strategy is selected and implemented, an effective method of evaluating its efficacy is through modelling. To illustrate the usefulness of models in evaluating control strategies, the authors describe several examples in detail, including three examples of classification and regression tree modelling to evaluate and improve the early detection of disease: West Nile fever in equids, bovine spongiform encephalopathy (BSE) and multifactorial diseases, such as colony collapse disorder (CCD) in the United States. Also examined are regression modelling to evaluate skin test practices and the efficacy of an awareness campaign for bovine tuberculosis (bTB); mechanistic modelling to monitor the progress of a control strategy for BSE; and statistical nationwide modelling to analyse the spatio-temporal dynamics of bTB and search for potential risk factors that could be used to target surveillance measures more effectively. In the accurate application of models, an interdisciplinary rather than a multidisciplinary approach is required, with the fewest assumptions possible.
Program Evaluation: An Overview.
ERIC Educational Resources Information Center
McCluskey, Lawrence
1973-01-01
Various models of educational evaluation are presented. These include: (1) the classical type model, which contains the following guidelines: formulate objectives, classify objectives, define objectives in behavioral terms, suggest situations in which achievement of objectives will be shown, develop or select appraisal techniques, and gather and…
Spatiotemporal Variation in Distance Dependent Animal Movement Contacts: One Size Doesn’t Fit All
Brommesson, Peter; Wennergren, Uno; Lindström, Tom
2016-01-01
The structure of contacts that mediate transmission has a pronounced effect on the outbreak dynamics of infectious disease and simulation models are powerful tools to inform policy decisions. Most simulation models of livestock disease spread rely to some degree on predictions of animal movement between holdings. Typically, movements are more common between nearby farms than between those located far away from each other. Here, we assessed spatiotemporal variation in such distance dependence of animal movement contacts from an epidemiological perspective. We evaluated and compared nine statistical models, applied to Swedish movement data from 2008. The models differed in at what level (if at all), they accounted for regional and/or seasonal heterogeneities in the distance dependence of the contacts. Using a kernel approach to describe how probability of contacts between farms changes with distance, we developed a hierarchical Bayesian framework and estimated parameters by using Markov Chain Monte Carlo techniques. We evaluated models by three different approaches of model selection. First, we used Deviance Information Criterion to evaluate their performance relative to each other. Secondly, we estimated the log predictive posterior distribution, this was also used to evaluate their relative performance. Thirdly, we performed posterior predictive checks by simulating movements with each of the parameterized models and evaluated their ability to recapture relevant summary statistics. Independent of selection criteria, we found that accounting for regional heterogeneity improved model accuracy. We also found that accounting for seasonal heterogeneity was beneficial, in terms of model accuracy, according to two of three methods used for model selection. Our results have important implications for livestock disease spread models where movement is an important risk factor for between farm transmission. 
We argue that modelers should refrain from using methods to simulate animal movements that assume the same pattern across all regions and seasons without explicitly testing for spatiotemporal variation. PMID:27760155
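The kernel approach described in the abstract above can be illustrated with a minimal sketch. The functional form and every parameter value here are assumptions for illustration, not the study's estimates:

```python
import numpy as np

def contact_kernel(d, d0=20.0, alpha=2.5):
    """Generalized distance kernel: relative probability of a movement
    contact between holdings separated by distance d (km). Both the
    functional form and the defaults are illustrative assumptions."""
    return 1.0 / (1.0 + (d / d0) ** alpha)

# Regional heterogeneity can be expressed by giving each region its own
# kernel parameters, which is what the compared models do at different
# levels of aggregation (values below are made up).
params = {"north": {"d0": 35.0, "alpha": 2.0},
          "south": {"d0": 15.0, "alpha": 3.0}}
probs = {region: contact_kernel(30.0, **p) for region, p in params.items()}
```

Under this sketch, a steeper kernel (larger alpha, smaller d0) concentrates contacts locally, which is the kind of regional difference the nine compared models do or do not allow for.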
Selection of den sites by black bears in the southern Appalachians
Reynolds-Hogland, M. J.; Mitchell, M.S.; Powell, R.A.; Brown, D.C.
2007-01-01
We evaluated selection of den sites by American black bears (Ursus americanus) in the Pisgah Bear Sanctuary, western North Carolina, by comparing characteristics of dens at 53 den sites with availability of habitat characteristics in annual home ranges of bears and in the study area. We also tested whether den-site selection differed by sex, age, and reproductive status of bears. In addition, we evaluated whether the den component of an existing habitat model for black bears predicted where bears would select den sites. We found bears selected den sites far from gravel roads, on steep slopes, and at high elevations relative to what was available in both annual home ranges and in the study area. Den-site selection did not differ by sex or age, but it differed by reproductive status. Adult females with cubs preferred to den in areas that were relatively far from gravel roads, but adult females without cubs did not. The habitat model overestimated the value of areas near gravel roads, underestimated the value of moderately steep areas, and did not include elevation as a predictor variable. Our results highlight the importance of evaluating den selection in terms of both use and availability of den characteristics. © 2007 American Society of Mammalogists.
Automated Predictive Big Data Analytics Using Ontology Based Semantics.
Nural, Mustafa V; Cotterell, Michael E; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A
2015-10-01
Predictive analytics in the big data era is playing an increasingly important role. Issues related to the choice of modeling technique, estimation procedure (or algorithm) and efficient execution can present significant challenges. For example, selection of appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise, which might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models, as well as in documenting the rationale for the techniques and models selected. To formally describe the modeling techniques, models and results, we developed the Analytics Ontology, which supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics, is used as a testbed for evaluating the use of semantic technology.
Fernandes, David Douglas Sousa; Gomes, Adriano A; Costa, Gean Bezerra da; Silva, Gildo William B da; Véras, Germano
2011-12-15
This work evaluates the use of the visible and near-infrared (NIR) ranges, separately and combined, to determine the biodiesel content in biodiesel/diesel blends using Multiple Linear Regression (MLR) and variable selection by the Successive Projections Algorithm (SPA). Full-spectrum models employing Partial Least Squares (PLS), variable selection by Stepwise (SW) regression coupled with MLR, and PLS models with variable selection by Jack-knife (Jk) were compared with the proposed methodology. Several preprocessing options were evaluated; a Savitzky-Golay derivative with a second-order polynomial and a 17-point window was chosen for the NIR and visible-NIR ranges, with offset correction. A total of 100 blends with biodiesel content between 5 and 50% (v/v) were prepared starting from ten samples of biodiesel. In the NIR and visible regions the best models were SPA-MLR, using only two and eight wavelengths with RMSEP of 0.6439% (v/v) and 0.5741% (v/v), respectively, while in the visible-NIR region the best model was SW-MLR, using five wavelengths with an RMSEP of 0.9533% (v/v). The results indicate that both spectral ranges show potential for developing a rapid and nondestructive method to quantify biodiesel in blends with mineral diesel. Finally, the improvement in prediction error obtained with the variable selection procedure was significant. Copyright © 2011 Elsevier B.V. All rights reserved.
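The Successive Projections Algorithm referred to above admits a compact sketch: greedily pick wavelengths whose spectral columns are least collinear with those already chosen. This is a minimal illustrative version, not the authors' implementation:

```python
import numpy as np

def spa_select(X, k, start=0):
    """Minimal Successive Projections Algorithm sketch: greedily pick k
    column indices of X (samples x wavelengths) so that each new column
    has the largest residual after projection onto the span of the
    columns already selected, i.e. is minimally collinear with them."""
    X = np.asarray(X, float)
    selected = [start]
    for _ in range(k - 1):
        S = X[:, selected]                              # columns chosen so far
        coef, *_ = np.linalg.lstsq(S, X, rcond=None)    # project all columns on span(S)
        residual_norms = np.linalg.norm(X - S @ coef, axis=0)
        residual_norms[selected] = -1.0                 # never re-pick a column
        selected.append(int(np.argmax(residual_norms)))
    return selected

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 12))
X[:, 5] = 2.0 * X[:, 0]                # a perfectly redundant "wavelength"
picked = spa_select(X, k=3, start=0)   # never includes column 5
```

Because column 5 lies exactly in the span of column 0, its projection residual is zero and SPA skips it, which is the collinearity-avoidance property that makes SPA-MLR models so parsimonious.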
Study on the Selection of Equipment Suppliers for Wind Power Generation EPC Project
NASA Astrophysics Data System (ADS)
Yang, Yuanyue; Li, Huimin
2017-12-01
In an EPC project, the purchase cost of equipment accounts for about 60% of the total project cost; thus, the selection of equipment suppliers has an important influence on the EPC project. Taking the EPC project for phase I of the Guizhou Huaxi Yunding wind power plant as the research background, this paper constructs an evaluation index system for the selection of equipment suppliers for wind power generation EPC projects from multiple perspectives, and introduces a matter-element extension evaluation model to evaluate the selection of equipment suppliers for this project from both qualitative and quantitative points of view. The result is consistent with the actual situation, which verifies the validity and operability of the method.
Raymond L. Czaplewski
1973-01-01
A generalized, non-linear population dynamics model of an ecosystem is used to investigate the direction of selective pressures upon a mutant by studying the competition between parent and mutant populations. The model has the advantages of considering selection as operating on the phenotype, of retaining the interaction of the mutant population with the ecosystem as a...
Resource selection by an ectothermic predator in a dynamic thermal landscape
Andrew D. George; Grant M. Connette; Frank R. Thompson; John Faaborg
2017-01-01
Predicting the effects of global climate change on species interactions has remained difficult because there is a spatiotemporal mismatch between regional climate models and microclimates experienced by organisms. We evaluated resource selection in a predominant ectothermic predator using a modeling approach that permitted us to assess the importance of habitat...
Scale dependency of American marten (Martes americana) habitat relations [Chapter 12
Andrew J. Shirk; Tzeidle N. Wasserman; Samuel A. Cushman; Martin G. Raphael
2012-01-01
Animals select habitat resources at multiple spatial scales; therefore, explicit attention to scale-dependency when modeling habitat relations is critical to understanding how organisms select habitat in complex landscapes. Models that evaluate habitat variables calculated at a single spatial scale (e.g., patch, home range) fail to account for the effects of...
ERIC Educational Resources Information Center
Moses, Tim; Holland, Paul W.
2010-01-01
In this study, eight statistical strategies were evaluated for selecting the parameterizations of loglinear models for smoothing the bivariate test score distributions used in nonequivalent groups with anchor test (NEAT) equating. Four of the strategies were based on significance tests of chi-square statistics (Likelihood Ratio, Pearson,…
ERIC Educational Resources Information Center
National Council on Crime and Delinquency, Hackensack, NJ. NewGate Resource Center.
A guide is provided for establishing a college-level education program for inmates of correctional institutions based on the NewGate concept. Necessary first steps are evaluation of current facilities, selection of the sponsoring agency, and selection of the student body. Guidelines for student selection deal with application procedure, record…
Neudecker, D.; Talou, P.; Kawano, T.; ...
2015-08-01
We present evaluations of the prompt fission neutron spectrum (PFNS) of ²³⁹Pu induced by 500 keV neutrons, and associated covariances. In a previous evaluation by Talou et al. (2010), surprisingly low evaluated uncertainties were obtained, partly due to simplifying assumptions in the quantification of uncertainties from experiment and model. Therefore, special emphasis is placed here on a thorough uncertainty quantification of the experimental data and of the Los Alamos model predicted values entering the evaluation. In addition, the Los Alamos model was extended, and an evaluation technique was employed that takes into account the qualitative differences between normalized model predicted values and experimental shape data. These improvements lead to changes in the evaluated PFNS and overall larger evaluated uncertainties than in the previous work. However, these evaluated uncertainties are still smaller than those obtained in a statistical analysis using experimental information only, due to strong model correlations. Hence, suggestions to estimate model defect uncertainties are presented, which lead to more reasonable evaluated uncertainties. The calculated k_eff of selected criticality benchmarks obtained with these new evaluations agree with each other within their uncertainties despite the different approaches to estimating model defect uncertainties. The k_eff one-standard-deviation intervals overlap with some of those obtained using ENDF/B-VII.1, albeit their mean values are further from unity. Spectral indexes for the Jezebel critical assembly calculated with the newly evaluated PFNS agree with the experimental data for selected (n,γ) and (n,f) reactions, and show improvements for high-energy threshold (n,2n) reactions compared to ENDF/B-VII.1.
Cillo, Umberto; Giuliani, Tommaso; Polacco, Marina; Herrero Manley, Luz Maria; Crivellari, Gino; Vitale, Alessandro
2016-01-01
Morphological criteria have always been considered the benchmark for selecting hepatocellular carcinoma (HCC) patients for liver transplantation (LT). These criteria, which are often inappropriate to express the tumor’s biological behavior and aggressiveness, offer only a static view of the disease burden and are frequently unable to correctly stratify the tumor recurrence risk after LT. Alpha-fetoprotein (AFP) and its progression as well as AFP-mRNA, AFP-L3%, des-γ-carboxyprothrombin, inflammatory markers and other serological tests appear to be correlated with post-transplant outcomes. Several other markers for patient selection including functional imaging studies such as 18F-FDG-PET imaging, histological evaluation of tumor grade, tissue-specific biomarkers, and molecular signatures have been outlined in the literature. HCC growth rate and response to pre-transplant therapies can further contribute to the transplant evaluation process of HCC patients. While AFP, its progression, and HCC response to pre-transplant therapy have already been used as a part of an integrated prognostic model for selecting patients, the utility of other markers in the transplant setting is still under investigation. This article intends to review the data in the literature concerning predictors that could be included in an integrated LT selection model and to evaluate the importance of biological aggressiveness in the evaluation process of these patients. PMID:26755873
ERIC Educational Resources Information Center
New Educational Directions, Crawfordsville, IN.
Phase 2 of this project presents a skeletal model for evaluating vocational education programs which can be applied to secondary, post-secondary, and adult education programs. The model addresses 13 main components of the vocational education system: descriptive information, demonstration of need, student recruitment and selection, curriculum,…
Five Guidelines for Selecting Hydrological Signatures
NASA Astrophysics Data System (ADS)
McMillan, H. K.; Westerberg, I.; Branger, F.
2017-12-01
Hydrological signatures are index values derived from observed or modeled series of hydrological data such as rainfall, flow or soil moisture. They are designed to extract relevant information about hydrological behavior, such as to identify dominant processes, and to determine the strength, speed and spatiotemporal variability of the rainfall-runoff response. Hydrological signatures play an important role in model evaluation. They allow us to test whether particular model structures or parameter sets accurately reproduce the runoff generation processes within the watershed of interest. Most modeling studies use a selection of different signatures to capture different aspects of the catchment response, for example evaluating overall flow distribution as well as high and low flow extremes and flow timing. Such studies often choose their own set of signatures, or may borrow subsets of signatures used in multiple other works. The link between signature values and hydrological processes is not always straightforward, leading to uncertainty and variability in hydrologists' signature choices. In this presentation, we aim to encourage a more rigorous approach to hydrological signature selection, which considers the ability of signatures to represent hydrological behavior and underlying processes for the catchment and application in question. To this end, we propose a set of guidelines for selecting hydrological signatures. We describe five criteria that any hydrological signature should conform to: Identifiability, Robustness, Consistency, Representativeness, and Discriminatory Power. We describe an example of the design process for a signature, assessing possible signature designs against the guidelines above. Due to their ubiquity, we chose a signature related to the Flow Duration Curve, selecting the FDC mid-section slope as a proposed signature to quantify catchment overall behavior and flashiness. 
We demonstrate how assessment against each guideline could be used to compare or choose between alternative signature definitions. We believe that reaching a consensus on selection criteria for hydrological signatures will assist modelers to choose between competing signatures, facilitate comparison between hydrological studies, and help hydrologists to fully evaluate their models.
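The FDC mid-section slope signature proposed above can be computed roughly as follows; the 33%/66% exceedance bounds are a common convention and should be treated as assumptions rather than the authors' final design:

```python
import numpy as np

def fdc_midslope(flows, lo=0.33, hi=0.66):
    """Mid-section slope of the Flow Duration Curve: slope of log flow
    between two exceedance probabilities (the 33%/66% bounds are a
    common convention, assumed here). Steeper slope = flashier catchment."""
    q = np.sort(np.asarray(flows, float))[::-1]   # flows, highest first
    n = len(q)
    q_lo = q[int(lo * (n - 1))]                   # flow exceeded `lo` of the time
    q_hi = q[int(hi * (n - 1))]
    return (np.log(q_lo) - np.log(q_hi)) / (hi - lo)

# Synthetic ten-year daily series: a flashy vs a damped regime.
rng = np.random.default_rng(1)
flashy = np.exp(rng.normal(0.0, 1.5, size=3650))  # high-variance daily flows
damped = np.exp(rng.normal(0.0, 0.3, size=3650))  # low-variance daily flows
```

A signature like this is easy to check against the five guidelines: it is identifiable from a flow series alone, robust to single outliers (it avoids the curve's tails), and discriminates flashy from damped regimes.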
NASA Technical Reports Server (NTRS)
Seldner, K.
1976-01-01
The development of control systems for jet engines requires a real-time computer simulation. The simulation provides an effective tool for evaluating control concepts and problem areas prior to actual engine testing. The development and use of a real-time simulation of the Pratt and Whitney F100-PW100 turbofan engine is described. The simulation was used in a multivariable optimal controls research program using linear quadratic regulator theory. The simulation is used to generate linear engine models at selected operating points and to evaluate the control algorithm. To reduce the complexity of the design, it is desirable to reduce the order of the linear model. A technique to reduce the order of the model is discussed, and selected results from high- and low-order models are compared. The LQR control algorithms can be programmed on a digital computer, which will control the engine simulation over the desired flight envelope.
Evaluation of the 29-km Eta Model. Part I: Objective Verification at Three Selected Stations
NASA Technical Reports Server (NTRS)
Manobianco, John; Nutter, Paul
1998-01-01
A subjective evaluation of the National Centers for Environmental Prediction 29-km (meso-) eta model during the 1996 warm (May-August) and cool (October-January) seasons is described. The overall evaluation assessed the utility of the model for operational weather forecasting by the U.S. Air Force 45th Weather Squadron, National Weather Service (NWS) Spaceflight Meteorology Group (SMG) and NWS Office in Melbourne, FL.
Vasconcelos, A G; Almeida, R M; Nobre, F F
2001-08-01
This paper introduces an approach that includes non-quantitative factors in the selection and assessment of multivariate complex models in health. A goodness-of-fit-based methodology combined with a fuzzy multi-criteria decision-making approach is proposed for model selection. Models were obtained using the Path Analysis (PA) methodology in order to explain the interrelationship between health determinants and the post-neonatal component of infant mortality in 59 municipalities of Brazil in the year 1991. Socioeconomic and demographic factors were used as exogenous variables, and environmental, health service and agglomeration factors as endogenous variables. Five PA models were developed and accepted by statistical criteria of goodness-of-fit. These models were then submitted to a group of experts in order to characterize their preferences, according to predefined criteria intended to evaluate model relevance and plausibility. Fuzzy set techniques were used to rank the alternative models according to the number of times a model was superior to ("dominated") the others. The best-ranked model explained over 90% of the variation in the endogenous variables, and showed the favorable influences of income and education levels on post-neonatal mortality. It also showed the unfavorable effect on mortality of fast population growth, through precarious dwelling conditions and decreased access to sanitation. It was possible to aggregate expert opinions in model evaluation. The proposed procedure for model selection allowed the inclusion of subjective information in a clear and systematic manner.
NASA Astrophysics Data System (ADS)
Zheng, Feifei; Maier, Holger R.; Wu, Wenyan; Dandy, Graeme C.; Gupta, Hoshin V.; Zhang, Tuqiao
2018-02-01
Hydrological models are used for a wide variety of engineering purposes, including streamflow forecasting and flood-risk estimation. To develop such models, it is common to allocate the available data to calibration and evaluation data subsets. Surprisingly, the issue of how this allocation can affect model evaluation performance has been largely ignored in the research literature. This paper discusses the evaluation performance bias that can arise from how available data are allocated to calibration and evaluation subsets. As a first step to assessing this issue in a statistically rigorous fashion, we present a comprehensive investigation of the influence of data allocation on the development of data-driven artificial neural network (ANN) models of streamflow. Four well-known formal data splitting methods are applied to 754 catchments from Australia and the U.S. to develop 902,483 ANN models. Results clearly show that the choice of the method used for data allocation has a significant impact on model performance, particularly for runoff data that are more highly skewed, highlighting the importance of considering the impact of data splitting when developing hydrological models. The statistical behavior of the data splitting methods investigated is discussed and guidance is offered on the selection of the most appropriate data splitting methods to achieve representative evaluation performance for streamflow data with different statistical properties. Although our results are obtained for data-driven models, they highlight the fact that this issue is likely to have a significant impact on all types of hydrological models, especially conceptual rainfall-runoff models.
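The data-allocation idea discussed above can be illustrated by contrasting a random split with a deterministic stratified one. The stratified routine below is a DUPLEX-flavoured sketch, not one of the four formal methods evaluated in the paper:

```python
import numpy as np

def split_random(y, frac, rng):
    """Simple random allocation into calibration / evaluation indices."""
    idx = rng.permutation(len(y))
    cut = int(frac * len(y))
    return idx[:cut], idx[cut:]

def split_stratified(y, frac=0.8):
    """Deterministic stratified split (a DUPLEX-flavoured sketch, not
    one of the paper's four methods): rank samples by magnitude and
    send every k-th ranked sample to evaluation, so both subsets span
    the full range, including the skewed upper tail."""
    order = np.argsort(y)
    period = max(int(round(1.0 / (1.0 - frac))), 2)
    eval_idx = order[::period]                 # every period-th ranked sample
    cal_idx = np.setdiff1d(order, eval_idx)
    return cal_idx, eval_idx

rng = np.random.default_rng(2)
y = np.exp(rng.normal(0.0, 1.2, size=1000))   # skewed, streamflow-like data
cal, ev = split_stratified(y, frac=0.8)
```

With highly skewed data, a random split can easily leave all the extreme values in one subset; the ranked allocation guarantees both subsets see the tails, which is the representativeness issue the paper quantifies at scale.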
Wheeler, David C.; Hickson, DeMarc A.; Waller, Lance A.
2010-01-01
Many diagnostic tools and goodness-of-fit measures, such as the Akaike information criterion (AIC) and the Bayesian deviance information criterion (DIC), are available to evaluate the overall adequacy of linear regression models. In addition, visually assessing adequacy in models has become an essential part of any regression analysis. In this paper, we focus on a spatial consideration of the local DIC measure for model selection and goodness-of-fit evaluation. We use a partitioning of the DIC into the local DIC, leverage, and deviance residuals to assess local model fit and influence for both individual observations and groups of observations in a Bayesian framework. We use visualization of the local DIC and differences in local DIC between models to assist in model selection and to visualize the global and local impacts of adding covariates or model parameters. We demonstrate the utility of the local DIC in assessing model adequacy using HIV prevalence data from pregnant women in the Butare province of Rwanda during 1989-1993 using a range of linear model specifications, from global effects only to spatially varying coefficient models, and a set of covariates related to sexual behavior. Results of applying the diagnostic visualization approach include more refined model selection and greater understanding of the models as applied to the data. PMID:21243121
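The DIC that the abstract above partitions locally is computed globally from MCMC output as DIC = Dbar + pD. A toy example for a normal-mean model (illustrative, not the authors' spatial models):

```python
import numpy as np

def dic(loglik_draws, loglik_at_mean):
    """Deviance Information Criterion from MCMC output.
    loglik_draws: data log-likelihood at each posterior draw;
    loglik_at_mean: log-likelihood at the posterior-mean parameters.
    DIC = Dbar + pD with pD = Dbar - D(theta_bar), where D = -2*loglik."""
    D_bar = np.mean(-2.0 * np.asarray(loglik_draws))
    D_hat = -2.0 * loglik_at_mean
    p_d = D_bar - D_hat            # effective number of parameters
    return D_hat + 2.0 * p_d, p_d

# Toy normal-mean model with one free parameter (illustrative only).
rng = np.random.default_rng(3)
y = rng.normal(1.0, 1.0, size=50)
mu = rng.normal(y.mean(), 1.0 / np.sqrt(len(y)), size=2000)  # approx posterior draws
ll = np.array([-0.5 * np.sum((y - m) ** 2) for m in mu])     # log-lik up to a constant
dic_value, p_d = dic(ll, -0.5 * np.sum((y - mu.mean()) ** 2))
```

Here pD comes out near 1, matching the single free parameter; the local decomposition in the paper splits the same quantities across individual observations.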
NASA Astrophysics Data System (ADS)
Bouaziz, Laurène; de Boer-Euser, Tanja; Brauer, Claudia; Drogue, Gilles; Fenicia, Fabrizio; Grelier, Benjamin; de Niel, Jan; Nossent, Jiri; Pereira, Fernando; Savenije, Hubert; Thirel, Guillaume; Willems, Patrick
2016-04-01
International collaboration between institutes and universities is a promising way to reach consensus on hydrological model development. Education, experience and expert knowledge of the hydrological community have resulted in the development of a great variety of model concepts, calibration methods and analysis techniques. Although comparison studies are very valuable for international cooperation, they often do not lead to very clear new insights regarding the relevance of the modelled processes. We hypothesise that this is partly caused by model complexity and the comparison methods used, which focus on good overall performance instead of on specific events. We propose an approach that focuses on the evaluation of specific events. Eight international research groups calibrated their model for the Ourthe catchment in Belgium (1607 km²) and carried out a validation in time for the Ourthe (i.e. on two different periods, one of them in blind mode for the modellers) and a validation in space for nested and neighbouring catchments of the Meuse in completely blind mode. For each model, the same protocol was followed and an ensemble of best-performing parameter sets was selected. Signatures were first used to assess model performance in the different catchments during validation. Comparison of the models was then followed by evaluation of selected events, including low flows, high flows and the transition from low to high flows. While the models show rather similar performance based on general metrics (i.e. Nash-Sutcliffe Efficiency), clear differences can be observed for specific events. While most models are able to simulate high flows well, large differences are observed during low flows and in the ability to capture the first peaks after drier months. The transferability of model parameters to neighbouring and nested catchments is assessed as an additional measure in the model evaluation.
This suggested approach helps to select, among competing model alternatives, the most suitable model for a specific purpose.
ERIC Educational Resources Information Center
Baxa, Julie; Christ, Tanya
2018-01-01
Selecting and integrating the use of digital texts/tools in literacy lessons are complex tasks. The DigiLit framework provides a succinct model to guide planning, reflection, coaching, and formative evaluation of teachers' successful digital text/tool selection and integration for literacy lessons. For digital text/tool selection, teachers need to…
Evaluation of a black-footed ferret resource utilization function model
Eads, D.A.; Millspaugh, J.J.; Biggins, D.E.; Jachowski, D.S.; Livieri, T.M.
2011-01-01
Resource utilization function (RUF) models permit evaluation of potential habitat for endangered species; ideally such models should be evaluated before use in management decision-making. We evaluated the predictive capabilities of a previously developed black-footed ferret (Mustela nigripes) RUF. Using the population-level RUF, generated from ferret observations at an adjacent yet distinct colony, we predicted the distribution of ferrets within a black-tailed prairie dog (Cynomys ludovicianus) colony in the Conata Basin, South Dakota, USA. We evaluated model performance using data collected during post-breeding spotlight surveys (2007-2008) by assessing model agreement via weighted compositional analysis and count-metrics. Compositional analysis of home range use and colony-level availability, and core area use and home range availability, demonstrated ferret selection of the predicted Very high and High occurrence categories in 2007 and 2008. Simple count-metrics corroborated these findings and suggested selection of the Very high category in 2007 and the Very high and High categories in 2008. Collectively, these results suggested that the RUF was useful in predicting occurrence and intensity of space use of ferrets at our study site, the 2 objectives of the RUF. Application of this validated RUF would increase the resolution of habitat evaluations, permitting prediction of the distribution of ferrets within distinct colonies. Additional model evaluation at other sites, on other black-tailed prairie dog colonies of varying resource configuration and size, would increase understanding of influences upon model performance and the general utility of the RUF. © 2011 The Wildlife Society.
NASA Technical Reports Server (NTRS)
Frederick, D. K.; Lashmet, P. K.; Sandor, G. N.; Shen, C. N.; Smith, E. V.; Yerazunis, S. W.
1973-01-01
Problems related to the design and control of a mobile planetary vehicle to implement a systematic plan for the exploration of Mars are reported. Problem areas include: vehicle configuration, control, dynamics, systems and propulsion; systems analysis, terrain modeling and path selection; and chemical analysis of specimens. These tasks are summarized: vehicle model design, mathematical model of vehicle dynamics, experimental vehicle dynamics, obstacle negotiation, electrochemical controls, remote control, collapsibility and deployment, construction of a wheel tester, wheel analysis, payload design, system design optimization, effect of design assumptions, accessory optimal design, on-board computer subsystem, laser range measurement, discrete obstacle detection, obstacle detection systems, terrain modeling, path selection system simulation and evaluation, gas chromatograph/mass spectrometer system concepts, and chromatograph model evaluation and improvement.
Adaptive Modeling Procedure Selection by Data Perturbation.
Zhang, Yongli; Shen, Xiaotong
2015-10-01
Many procedures have been developed to deal with the high-dimensional problem that is emerging in various business and economics areas. To evaluate and compare these procedures, modeling uncertainty caused by model selection and parameter estimation has to be assessed and integrated into a modeling process. To do this, a data perturbation method estimates the modeling uncertainty inherited in a selection process by perturbing the data. Critical to data perturbation is the size of perturbation, as the perturbed data should resemble the original dataset. To account for the modeling uncertainty, we derive the optimal size of perturbation, which adapts to the data, the model space, and other relevant factors in the context of linear regression. On this basis, we develop an adaptive data-perturbation method that, unlike its nonadaptive counterpart, performs well in different situations. This leads to a data-adaptive model selection method. Both theoretical and numerical analysis suggest that the data-adaptive model selection method adapts to distinct situations in that it yields consistent model selection and optimal prediction, without knowing which situation exists a priori. The proposed method is applied to real data from the commodity market and outperforms its competitors in terms of price forecasting accuracy.
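The data-perturbation idea above can be sketched by re-running a selector on noise-perturbed responses and recording selection frequencies. The selector and the perturbation size below are illustrative choices; the paper's contribution is deriving the optimal, data-adaptive size:

```python
import numpy as np

def selection_frequencies(X, y, tau, n_rep=200, seed=0):
    """Data-perturbation sketch: re-run a selector on y + tau*noise and
    record how often each variable survives. The selector (OLS t-ratio
    > 2) and the perturbation size tau are illustrative choices; the
    paper derives an optimal, data-adaptive size."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    counts = np.zeros(p)
    scale = tau * np.std(y)
    for _ in range(n_rep):
        y_pert = y + scale * rng.normal(size=n)
        beta, *_ = np.linalg.lstsq(X, y_pert, rcond=None)
        resid = y_pert - X @ beta
        s2 = resid @ resid / (n - p)
        se = np.sqrt(s2 * np.diag(np.linalg.inv(X.T @ X)))
        counts += np.abs(beta) / se > 2.0     # selector: |t| > 2
    return counts / n_rep

rng = np.random.default_rng(4)
X = rng.normal(size=(200, 5))
y = 3.0 * X[:, 0] + rng.normal(size=200)   # only variable 0 is real
freq = selection_frequencies(X, y, tau=0.5)
```

Variables that survive the perturbation nearly always are stable selections; those that flicker in and out expose the modeling uncertainty the method is designed to quantify.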
We developed a simplified spreadsheet modeling approach for characterizing and prioritizing sources of sediment loadings from watersheds in the United States. A simplified modeling approach was developed to evaluate sediment loadings from watersheds and selected land segments. ...
Four receptor-oriented source apportionment models were evaluated by applying them to simulated personal exposure data for select volatile organic compounds (VOCs) that were generated by Monte Carlo sampling from known source contributions and profiles. The exposure sources mo...
NASA Astrophysics Data System (ADS)
Luo, Lin
2017-08-01
In the practical selection of Wushu athletes, the objective evaluation of athletes' levels lacks sufficient technical indicators and often relies on the coach's subjective judgment. Without a fully quantified indicator system it is difficult to accurately and objectively reflect the overall quality of the athletes, which affects the improvement of Wushu competition. The analytic hierarchy process (AHP) is a systematic analysis method combining quantitative and qualitative analysis. This paper structures, hierarchizes and quantifies the decision-making process of evaluating broadsword, rod, sword and spear athletes with the AHP. Combining the characteristics of the athletes, the analysis proceeds from three aspects, i.e., the athlete's body shape, physical function and sports quality, and 18 specific evaluation indicators are established; then, combining expert advice and practical experience, a pairwise comparison matrix is determined, and the weights of the indicators and a comprehensive evaluation coefficient are obtained to establish the evaluation model for the athletes, providing a scientific theoretical basis for the selection of Wushu athletes. The proposed evaluation model realizes the evaluation system for broadsword, rod, sword and spear athletes, and has effectively improved the scientific level of Wushu athlete selection in practical application.
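The AHP machinery described above centers on extracting weights from a pairwise comparison matrix via its principal eigenvector. A minimal sketch with made-up judgments (not the paper's 18-indicator system):

```python
import numpy as np

def ahp_weights(A):
    """Saaty's eigenvector method: weights are the normalized principal
    eigenvector of the pairwise comparison matrix; the consistency
    ratio CR = CI/RI flags incoherent judgments (CR < 0.1 is the usual
    acceptance threshold)."""
    A = np.asarray(A, float)
    n = A.shape[0]
    vals, vecs = np.linalg.eig(A)
    k = int(np.argmax(vals.real))
    w = np.abs(vecs[:, k].real)
    w /= w.sum()
    ci = (vals[k].real - n) / (n - 1)       # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]     # Saaty's tabulated random index
    return w, ci / ri

# Made-up 3x3 judgments over the three aspects named in the abstract
# (body shape vs physical function vs sports quality); not the paper's.
A = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 3.0],
     [1/5, 1/3, 1.0]]
w, cr = ahp_weights(A)
```

The same computation is applied at each level of the hierarchy, and a weighted sum of indicator scores then gives the comprehensive evaluation coefficient for each athlete.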
The transferability of safety-driven access management models for application to other sites.
DOT National Transportation Integrated Search
2001-01-01
Several research studies have produced mathematical models that predict the safety impacts of selected access management techniques. Since new models require substantial resources to construct, this study evaluated five existing models with regard to...
Sutrave, Sweta; Scoglio, Caterina; Isard, Scott A; Hutchinson, J M Shawn; Garrett, Karen A
2012-01-01
Surveying invasive species can be highly resource intensive, yet near-real-time evaluations of invasion progress are important resources for management planning. In the case of the soybean rust invasion of the United States, a linked monitoring, prediction, and communication network saved U.S. soybean growers approximately $200 M/yr. Modeling of future movement of the pathogen (Phakopsora pachyrhizi) was based on data about current disease locations from an extensive network of sentinel plots. We developed a dynamic network model for U.S. soybean rust epidemics, with counties as nodes and link weights a function of host hectarage and wind speed and direction. We used the network model to compare four strategies for selecting an optimal subset of sentinel plots, listed here in order of increasing performance: random selection, zonal selection (based on more heavily weighting regions nearer the south, where the pathogen overwinters), frequency-based selection (based on how frequently the county had been infected in the past), and frequency-based selection weighted by the node strength of the sentinel plot in the network model. When dynamic network properties such as node strength are characterized for invasive species, this information can be used to reduce the resources necessary to survey and predict invasion progress.
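The node-strength weighting the authors found most effective can be sketched on a toy network; the county names, link weights, and past-infection frequencies below are invented for illustration (the real link weights are a function of host hectarage and wind speed and direction).

```python
import numpy as np

# Toy weighted network of five hypothetical counties.
counties = ["A", "B", "C", "D", "E"]
W = np.array([
    [0.0, 0.8, 0.1, 0.0, 0.3],
    [0.8, 0.0, 0.5, 0.2, 0.0],
    [0.1, 0.5, 0.0, 0.7, 0.4],
    [0.0, 0.2, 0.7, 0.0, 0.6],
    [0.3, 0.0, 0.4, 0.6, 0.0],
])

# Node strength: total weight of links incident to each county.
strength = W.sum(axis=1)

def select_sentinels(strength, freq, k=2):
    """Rank counties by past-infection frequency weighted by node strength
    and keep the top k as sentinel-plot locations."""
    score = freq * strength
    order = np.argsort(score)[::-1]
    return [counties[i] for i in order[:k]]

# Hypothetical historical infection frequencies per county.
freq = np.array([0.9, 0.2, 0.6, 0.5, 0.1])
sentinels = select_sentinels(strength, freq, k=2)
```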
Ivezic, Nenad; Potok, Thomas E.
2003-09-30
A method for automatically evaluating a manufacturing technique comprises the steps of: receiving from a user manufacturing process step parameters characterizing a manufacturing process; accepting from the user a selection for an analysis of a particular lean manufacturing technique; automatically compiling process step data for each process step in the manufacturing process; automatically calculating process metrics from a summation of the compiled process step data for each process step; and, presenting the automatically calculated process metrics to the user. A method for evaluating a transition from a batch manufacturing technique to a lean manufacturing technique can comprise the steps of: collecting manufacturing process step characterization parameters; selecting a lean manufacturing technique for analysis; communicating the selected lean manufacturing technique and the manufacturing process step characterization parameters to an automatic manufacturing technique evaluation engine having a mathematical model for generating manufacturing technique evaluation data; and, using the lean manufacturing technique evaluation data to determine whether to transition from an existing manufacturing technique to the selected lean manufacturing technique.
Adaptive Greedy Dictionary Selection for Web Media Summarization.
Cong, Yang; Liu, Ji; Sun, Gan; You, Quanzeng; Li, Yuncheng; Luo, Jiebo
2017-01-01
Initializing an effective dictionary is an indispensable step for sparse representation. In this paper, we focus on the dictionary selection problem, with the objective of selecting a compact subset of basis vectors from the original training data instead of learning a new dictionary matrix as dictionary learning models do. We first design a new dictionary selection model via the ℓ2,0 norm. For model optimization, we propose two methods: one is the standard forward-backward greedy algorithm, which is not suitable for large-scale problems; the other is based on gradient cues at each forward iteration and speeds up the process dramatically. In comparison with state-of-the-art dictionary selection models, our model is not only more effective and efficient, but can also control the sparsity. To evaluate the performance of our new model, we select two practical web media summarization problems: 1) we build a new data set consisting of around 500 users, 3000 albums, and 1 million images, and achieve effective assisted albuming based on our model; and 2) by formulating the video summarization problem as a dictionary selection issue, we employ our model to extract keyframes from a video sequence in a more flexible way. Generally, our model outperforms state-of-the-art methods on both tasks.
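A minimal sketch of the forward greedy idea, under the simplifying assumptions of random toy data and a plain least-squares reconstruction objective (a stand-in for the paper's ℓ2,0 model, without the gradient-based speed-up):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((20, 30))   # toy training data: 30 candidate atoms

def greedy_select(X, k):
    """Forward greedy selection: at each step, add the atom that most
    reduces the least-squares reconstruction error of X from the
    currently selected columns."""
    selected = []
    for _ in range(k):
        best, best_err = None, np.inf
        for j in range(X.shape[1]):
            if j in selected:
                continue
            D = X[:, selected + [j]]
            coef, *_ = np.linalg.lstsq(D, X, rcond=None)
            err = np.linalg.norm(X - D @ coef)
            if err < best_err:
                best, best_err = j, err
        selected.append(best)
    return selected

atoms = greedy_select(X, k=3)   # indices of the selected dictionary atoms
```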
2018-03-01
We apply our methodology to the criticism text written in the flight-training program student evaluations in order to construct a model...
2007-09-01
behavior libraries selection box, Savage Tactics behavior sub-folder and hostile behavior sub-folder that contains the behavior that is being assigned to...21) applications. The interface allows users to select models (locations, friendly assets, hostile assets, neutral assets, etc.) that will be used in...altitude, etc.) for each model and define their behaviors (friendly patrol craft, hostile explosive-laden vessel, etc.). Once the models and their
Rodgers, K E; Schwartz, H E; Roda, N; Thornton, M; Kobak, W; diZerega, G S
2000-04-01
To assess the efficacy of Oxiplex (FzioMed, Inc., San Luis Obispo, CA) barriers. Films of polyethylene oxide and carboxymethylcellulose (Oxiplex) were tested for strength and tissue adherence. Films were selected for evaluation in models of biocompatibility and adherence. Three films were selected for evaluation in efficacy studies, and one was evaluated for effects on bacterial peritonitis. Handling characteristics of Oxiplex film were evaluated via laparoscopy. University laboratory. Rabbits, rats, pigs. Placement of Oxiplex prototypes at the site of injury. Mechanical properties, biocompatibility, tissue adherence, adhesion development, infection potentiation, and device handling. Mechanical tests indicated that tensile strength and elongation were inversely correlated. All films tested had excellent tissue adherence properties. Selected films, based on residence time and biocompatibility, prevented adhesion formation in all animals and were highly efficacious in preventing adhesion reformation. The optimal Oxiplex prototype prevented adhesion reformation in 91% of the animals. This Oxiplex film, dyed to allow visualization, prevented adhesion reformation and did not affect bacterial peritonitis. In a laparoscopic model, the Oxiplex film, delivered in FilmSert forceps via a 5.0-mm trocar, rapidly unfurled and could be easily applied to tissue with strong adherence. These data show development of an adhesion prevention material that is tissue adherent, can be placed via laparoscopy, and does not affect host resistance.
Issues in the Evaluation of Educational Television Programs
ERIC Educational Resources Information Center
Aversa, Frances M.; Forman, David C.
1978-01-01
Seven issues raised during an evaluation study of televised instructional components are identified: selection and commitment of sample, environment of evaluation sessions, student and expert review, formulation of an explicit evaluation model, conflicting results from different instruments, standards and the interpretation of data, and style and…
Al-Badriyeh, Daoud; Fahey, Michael; Alabbadi, Ibrahim; Al-Khal, Abdullatif; Zaidan, Manal
2015-12-01
Statin selection for the largest hospital formulary in Qatar is not systematic or comparative, and does not consider the multi-indication nature of statins. There are no reports in the literature of multi-indication-based comparative scoring models of statins, or of statin selection criteria weights based primarily on local clinicians' preferences and experiences. This study sought to comparatively evaluate statins for first-line therapy in Qatar, and to quantify the economic impact of this. An evidence-based, multi-indication, multi-criteria pharmacotherapeutic model was developed for the scoring of statins from the perspective of the main health care provider in Qatar. The literature and an expert panel informed the selection criteria for statins. Relative weighting of the selection criteria was based on the input of the relevant local clinician population. Statins were comparatively scored based on literature evidence, with those exceeding a defined scoring threshold being recommended for use. The scoring model was successfully developed with a 95% CI and a 5% margin of error. The selection criteria comprised 28 subcriteria under the following main criteria: clinical efficacy, best published evidence and experience, adverse effects, drug interactions, dosing time, and fixed-dose combination availability. Outcome measures for multiple indications were related to effects on LDL cholesterol, HDL cholesterol, triglycerides, total cholesterol, and C-reactive protein. Atorvastatin, pravastatin, and rosuvastatin exceeded the defined pharmacotherapeutic thresholds. Atorvastatin and pravastatin were recommended for first-line use and rosuvastatin as a nonformulary alternative. It was estimated that this would produce a 17.6% cost saving in statin expenditure. Sensitivity analyses confirmed the robustness of the evaluation's outcomes against input uncertainties.
Incorporating a comparative evaluation of statins in Qatari practices based on a locally developed, transparent, multi-indication, multi-criteria scoring model has the potential to considerably reduce expenditures on statins. Atorvastatin and pravastatin should be the first-line statin therapies in the main Qatari health care provider, with rosuvastatin as an alternative. Copyright © 2015 Elsevier HS Journals, Inc. All rights reserved.
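The scoring logic of such a multi-criteria formulary model can be sketched in a few lines; the criteria, weights, per-drug scores, and threshold below are hypothetical, not the study's 28-subcriterion data.

```python
# Hypothetical criterion weights (sum to 1) and 0-10 scores per statin.
weights = {"efficacy": 0.40, "evidence": 0.25, "safety": 0.20, "interactions": 0.15}
scores = {
    "atorvastatin": {"efficacy": 9, "evidence": 9, "safety": 7, "interactions": 6},
    "pravastatin":  {"efficacy": 6, "evidence": 8, "safety": 9, "interactions": 9},
    "fluvastatin":  {"efficacy": 4, "evidence": 5, "safety": 7, "interactions": 7},
}

def weighted_score(drug):
    """Weighted sum of criterion scores for one drug."""
    return sum(weights[c] * scores[drug][c] for c in weights)

THRESHOLD = 7.0  # drugs scoring at or above this are recommended for the formulary
recommended = sorted(d for d in scores if weighted_score(d) >= THRESHOLD)
```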
Teodoro, P E; Bhering, L L; Costa, R D; Rocha, R B; Laviola, B G
2016-08-19
The aim of this study was to estimate genetic parameters via mixed models and simultaneously select Jatropha progenies grown in three regions of Brazil that combine high adaptability and stability. From a previous phenotypic selection, three progeny tests were installed in 2008 in the municipalities of Planaltina-DF (Midwest), Nova Porteirinha-MG (Southeast), and Pelotas-RS (South). We evaluated 18 half-sib families in a randomized block design with three replications. Genetic parameters were estimated using restricted maximum likelihood/best linear unbiased prediction. Selection was based on the harmonic mean of the relative performance of genetic values method under three strategies: 1) performance in each environment (with interaction effect); 2) mean performance across environments (without interaction effect); and 3) simultaneous selection for grain yield, stability, and adaptability. The accuracy obtained (91%) reveals excellent experimental quality and, consequently, safety and credibility in the selection of superior progenies for grain yield. The gain from selecting the best five progenies was more than 20%, regardless of the selection strategy. Thus, based on the three selection strategies used in this study, progenies 4, 11, and 3 (selected in all environments and the mean environment, and by adaptability and phenotypic stability methods) are the most suitable for growing in the three regions evaluated.
ERIC Educational Resources Information Center
Ebbeck, Marjory; Winter, Pam; Russo, Sharon; Yim, Hoi Yin Bonnie; Teo-Zuzarte, Geraldine Lian Choo; Goh, Mandy
2012-01-01
This paper presents one aspect of a research project evaluating a curriculum model of a selected child study centre in Singapore. An issue of worldwide interest and concern is the "quality of learning" debate as it relates to early childhood centres. In Singapore, the government is focusing on expansion in child care settings and…
A systematic literature review of open source software quality assessment models.
Adewumi, Adewole; Misra, Sanjay; Omoregbe, Nicholas; Crawford, Broderick; Soto, Ricardo
2016-01-01
Many open source software (OSS) quality assessment models have been proposed in the literature. However, there is little or no adoption of these models in practice. In order to guide the formulation of newer models that practitioners will accept, there is a need for clear discrimination among the existing models based on their specific properties. The aim of this study is therefore to perform a systematic literature review investigating the properties of existing OSS quality assessment models, classifying them with respect to their quality characteristics, the methodology they use for assessment, and their domain of application, so as to guide the formulation and development of newer models. Searches in IEEE Xplore, ACM, Science Direct, Springer, and Google Search were performed to retrieve all relevant primary studies. Journal and conference papers between 2003 and 2015 were considered, since the first known OSS quality model emerged in 2003. A total of 19 OSS quality assessment model papers were selected; to select these models, we developed assessment criteria to evaluate the quality of the existing studies. The quality assessment models are classified into five categories based on the quality characteristics they possess, namely: single-attribute, rounded-category, community-only-attribute, non-community-attribute, and non-quality-in-use models. Our study reflects that software selection based on hierarchical structures is the most popular selection method in the existing OSS quality assessment models. Furthermore, we found that nearly half (47%) of the existing models do not specify any domain of application. In conclusion, our study will be a valuable contribution to the community, helping quality assessment model developers formulate newer models and practitioners (software evaluators) select suitable OSS from among alternatives.
Kurokawa, Kenji; Hamamoto, Hiroshi; Matsuo, Miki; Nishida, Satoshi; Yamane, Noriko; Lee, Bok Luel; Murakami, Kazuhisa; Maki, Hideki; Sekimizu, Kazuhisa
2009-01-01
The availability of a silkworm larva infection model to evaluate the therapeutic effectiveness of antibiotics was examined. The 50% effective doses (ED50) of d-cycloserine against the Staphylococcus aureus ddlA mutant-mediated killing of larvae were remarkably lower than those against the parental strain-mediated killing of larvae. Changes in MICs and ED50 of other antibiotics were negligible, suggesting that these alterations are d-cycloserine selective. Therefore, this model is useful for selecting desired compounds based on their therapeutic effectiveness during antibiotic development. PMID:19546371
A decision tool for selecting trench cap designs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paige, G.B.; Stone, J.J.; Lane, L.J.
1995-12-31
A computer-based prototype decision support system (PDSS) is being developed to assist the risk manager in selecting an appropriate trench cap design for waste disposal sites. The selection of the "best" design among feasible alternatives requires consideration of multiple and often conflicting objectives. The methodology used in the selection process consists of: selecting and parameterizing decision variables using data, simulation models, or expert opinion; selecting feasible trench cap design alternatives; and ordering the decision variables and ranking the design alternatives. The decision model is based on multi-objective decision theory and uses a unique approach to order the decision variables and rank the design alternatives. Trench cap designs are evaluated based on federal regulations, hydrologic performance, cover stability, and cost. Four trench cap designs, which were monitored for a four-year period at Hill Air Force Base in Utah, are used to demonstrate the application of the PDSS and evaluate the results of the decision model. The results of the PDSS, using both data and simulations, illustrate the relative advantages of each of the cap designs and which cap is the "best" alternative for a given set of criteria and a particular importance order of those decision criteria.
USDA-ARS?s Scientific Manuscript database
Bacterial cold water disease (BCWD) causes significant economic losses in salmonid aquaculture, and traditional family-based breeding programs aimed at improving BCWD resistance have been limited to exploiting only between-family variation. We used genomic selection (GS) models to predict genomic br...
A re-evaluation of a case-control model with contaminated controls for resource selection studies
Christopher T. Rota; Joshua J. Millspaugh; Dylan C. Kesler; Chad P. Lehman; Mark A. Rumble; Catherine M. B. Jachowski
2013-01-01
A common sampling design in resource selection studies involves measuring resource attributes at sample units used by an animal and at sample units considered available for use. Few models can estimate the absolute probability of using a sample unit from such data, but such approaches are generally preferred over statistical methods that estimate a relative probability...
NASA Technical Reports Server (NTRS)
Leduc, S. (Principal Investigator)
1982-01-01
Models based on multiple regression were developed to estimate corn and soybean yield from weather data for agrophysical units (APUs) in Iowa. The predictor variables are derived from monthly average temperature and monthly total precipitation data at meteorological stations in the cooperative network. The models are similar in form to the previous models developed for crop reporting districts (CRDs): the trends and derived variables were the same, and the approach to selecting the significant predictors was similar to that used in developing the CRD models. The APUs were selected to be more homogeneous with respect to crop production than the CRDs. The APU models are quite similar to the CRD models, with similar explained variation and numbers of predictor variables. The APU models are to be independently evaluated and compared to the previously evaluated CRD models; that comparison should indicate the preferred model area for this application, i.e., APU or CRD.
Yabe, Shiori; Hara, Takashi; Ueno, Mariko; Enoki, Hiroyuki; Kimura, Tatsuro; Nishimura, Satoru; Yasui, Yasuo; Ohsawa, Ryo; Iwata, Hiroyoshi
2018-01-01
To evaluate the potential of genomic selection (GS), a selection experiment with GS and phenotypic selection (PS) was performed in an allogamous crop, common buckwheat (Fagopyrum esculentum Moench). To indirectly select for seed yield per unit area, which cannot be measured on a single-plant basis, a selection index was constructed from seven agro-morphological traits measurable on a single-plant basis. Over 3 years, we performed two GS and one PS cycles per year for improvement in the selection index. In GS, a prediction model was updated every year on the basis of genotypes of 14,598-50,000 markers and phenotypes. Plants grown from seeds derived from a series of generations of GS and PS populations were evaluated for the traits in the selection index and other yield-related traits. GS resulted in a 20.9% increase and PS in a 15.0% increase in the selection index in comparison with the initial population. Although the level of linkage disequilibrium in the breeding population was low, the target trait was improved with GS. Traits with higher weights in the selection index were improved more than those with lower weights, especially when prediction accuracy was high. No trait changed in an unintended direction in either GS or PS. The accuracy of genomic prediction models built in the first cycle decreased in the later cycles because the genetic bottleneck through the selection cycles changed linkage disequilibrium patterns in the breeding population. The present study emphasizes the importance of updating models in GS, demonstrates the potential of GS in mass selection of allogamous crop species, and provides a pilot example of successful application of GS to plant breeding.
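The genomic prediction step in GS can be sketched as ridge regression on marker genotypes (an RR-BLUP-style model); the simulated marker matrix and single index phenotype below are toy stand-ins for the study's 14,598-50,000 markers and seven-trait selection index.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 100, 500                               # individuals x markers (toy scale)
X = rng.choice([0.0, 1.0, 2.0], size=(n, m))  # marker genotypes as allele dosages
beta_true = 0.1 * rng.standard_normal(m)      # simulated marker effects
y = X @ beta_true + rng.standard_normal(n)    # simulated selection-index phenotype

def rrblup_predict(X, y, lam=10.0):
    """Ridge-regression estimate of marker effects (RR-BLUP-style sketch),
    returning in-sample genomic estimated breeding values (GEBVs)."""
    Xc = X - X.mean(axis=0)
    beta = np.linalg.solve(Xc.T @ Xc + lam * np.eye(Xc.shape[1]),
                           Xc.T @ (y - y.mean()))
    return Xc @ beta + y.mean()

gebv = rrblup_predict(X, y)
top = np.argsort(gebv)[::-1][:10]   # select the 10 individuals with highest GEBV
```

In a real GS cycle the model would be refit (updated) each year on the newest genotype and phenotype records, as the study emphasizes.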
NASA Technical Reports Server (NTRS)
Frederick, D. K.; Lashmet, P. K.; Sandor, G. N.; Shen, C. N.; Smith, E. J.; Yerazunis, S. W.
1972-01-01
Investigation of problems related to the design and control of a mobile planetary vehicle to implement a systematic plan for the exploration of Mars has been undertaken. Problem areas receiving attention include: vehicle configuration, control, dynamics, systems and propulsion; systems analysis; terrain modeling and path selection; and chemical analysis of specimens. The following specific tasks have been under study: vehicle model design, mathematical modeling of a dynamic vehicle, experimental vehicle dynamics, obstacle negotiation, electromechanical controls, collapsibility and deployment, construction of a wheel tester, wheel analysis, payload design, system design optimization, effect of design assumptions, accessory optimal design, on-board computer subsystem, laser range measurement, discrete obstacle detection, obstacle detection systems, terrain modeling, path selection system simulation and evaluation, gas chromatograph/mass spectrometer system concepts, chromatograph model evaluation and improvement.
Roth, Justin C.; Ismail, Mourad; Reese, Jane S.; Lingas, Karen T.; Ferrari, Giuliana; Gerson, Stanton L.
2012-01-01
The P140K point mutant of MGMT allows robust hematopoietic stem cell (HSC) enrichment in vivo. Thus, dual-gene vectors that couple MGMT and therapeutic gene expression have allowed enrichment of gene-corrected HSCs in animal models. However, expression levels from dual-gene vectors are often reduced for one or both genes. Further, it may be desirable to express selection and therapeutic genes at distinct stages of cell differentiation. In this regard, we evaluated whether hematopoietic cells could be efficiently cotransduced using low MOIs of two separate single-gene lentiviruses, including MGMT for dual-positive cell enrichment. Cotransduction efficiencies were evaluated using a range of MGMT : GFP virus ratios, MOIs, and selection stringencies in vitro. Cotransduction was optimal when equal proportions of each virus were used, but low MGMT : GFP virus ratios resulted in the highest proportion of dual-positive cells after selection. This strategy was then evaluated in murine models for in vivo selection of HSCs cotransduced with a ubiquitous MGMT expression vector and an erythroid-specific GFP vector. Although the MGMT and GFP expression percentages were variable among engrafted recipients, drug selection enriched MGMT-positive leukocyte and GFP-positive erythroid cell populations. These data demonstrate cotransduction as a means to rapidly enrich and evaluate therapeutic lentivectors in vivo. PMID:22888445
Bayes factors and multimodel inference
Link, W.A.; Barker, R.J.; Thomson, David L.; Cooch, Evan G.; Conroy, Michael J.
2009-01-01
Multimodel inference has two main themes: model selection and model averaging. Model averaging is a means of making inference conditional on a model set, rather than on a selected model, allowing formal recognition of the uncertainty associated with model choice. The Bayesian paradigm provides a natural framework for model averaging, and provides a context for evaluating the commonly used AIC weights. We review Bayesian multimodel inference, noting the importance of Bayes factors. Given the sensitivity of Bayes factors to the choice of priors on parameters, we define and propose nonpreferential priors as a reasonable standard for objective multimodel inference.
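The mechanics of turning Bayes factors into posterior model probabilities, and then into a model-averaged estimate, can be sketched directly; the Bayes factors, priors, and per-model estimates below are made-up numbers.

```python
import numpy as np

# Bayes factors of each model relative to model 1, and equal prior
# model probabilities (illustrative values).
bf = np.array([1.0, 3.2, 0.4])
prior = np.array([1/3, 1/3, 1/3])

# Posterior model probabilities: proportional to Bayes factor x prior.
post = bf * prior
post /= post.sum()

# Model-averaged estimate of a parameter, weighting each model's
# estimate by its posterior probability.
theta_hat = np.array([0.10, 0.14, 0.05])
theta_avg = float(post @ theta_hat)
```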
This study is an evaluation of empirical data and select modeling studies of the behavior of petroleum hydrocarbon (PHC) vapors in subsurface soils and how they can affect subsurface-to-indoor air vapor intrusion (VI), henceforth referred to as petroleum vapor intrusion or “PVI” ...
DOT National Transportation Integrated Search
2016-06-16
The primary objective of this project is to develop multiple simulation Testbeds/transportation models to evaluate the impacts of DMA connected vehicle applications and the active and dynamic transportation management (ATDM) strategies. The outputs (...
Multiple attribute decision making model and application to food safety risk evaluation.
Ma, Lihua; Chen, Hong; Yan, Huizhe; Yang, Lifeng; Wu, Lifeng
2017-01-01
Supermarket food purchase decisions are characterized by network relationships. This paper analyzes the factors that influence supermarket food selection and proposes a supplier evaluation index system based on the whole process of food production. We establish an interval-valued intuitionistic fuzzy set evaluation model based on the characteristics of the network relationships among decision makers, and validate it on a multiple attribute decision making case study. The proposed model thus provides a reliable, accurate method for multiple attribute decision making.
Uncertainty in age-specific harvest estimates and consequences for white-tailed deer management
Collier, B.A.; Krementz, D.G.
2007-01-01
Age structure proportions (the proportion of harvested individuals within each age class) are commonly used as support for regulatory restrictions and as input for deer population models. Such use requires critical evaluation when harvest regulations force hunters to selectively harvest specific age classes, owing to the impact on the underlying population age structure. We used a stochastic population simulation model to evaluate the impact of using harvest proportions to assess changes in population age structure under a selective harvest management program at two scales. Using harvest proportions to parameterize the age-specific harvest segment of the model at the local scale showed that predictions of post-harvest age structure did not vary depending on whether selective harvest criteria were in use or not. At the county scale, yearling frequency in the post-harvest population increased, but model predictions indicated that the post-harvest population size of 2.5-year-old males would decline below levels found before implementation of the antler restriction, reducing the number of individuals recruited into older age classes. Across the range of age-specific harvest rates modeled, our simulation predicted that underestimation of age-specific harvest rates has considerable influence on predictions of post-harvest population age structure. We found that the consequence of uncertainty in harvest rates corresponds to uncertainty in predictions of residual population structure, and this correspondence is proportional to scale. Our simulations also indicate that regardless of the use of harvest proportions or harvest rates, at either the local or county scale the modeled SHC had a high probability (>0.60 and >0.75, respectively) of eliminating recruitment into age classes older than 2.5 years.
Although frequently used to increase population age structure, our modeling indicated that selective harvest criteria can decrease or eliminate the number of white-tailed deer recruited into older age classes. Thus, we suggest that using harvest proportions for management planning and evaluation should be viewed with caution. In addition, we recommend that managers focus more attention on estimating age-specific harvest rates, and on modeling approaches that combine harvest rates with information from harvested individuals, to further increase their ability to manage deer populations effectively under selective harvest programs. © 2006 Elsevier B.V. All rights reserved.
Optimal Contractor Selection in Construction Industry: The Fuzzy Way
NASA Astrophysics Data System (ADS)
Krishna Rao, M. V.; Kumar, V. S. S.; Rathish Kumar, P.
2018-02-01
A purely price-based approach to contractor selection has been identified as the root cause of many serious project delivery problems. The capability of a contractor to execute the project should therefore be evaluated using a multiple set of selection criteria, including reputation, past performance, performance potential, financial soundness, and other project-specific criteria. An industry-wide questionnaire survey was conducted with the objective of identifying the important criteria for adoption in the selection process. In this work, a fuzzy set based model was developed for contractor prequalification/evaluation, using effective criteria obtained from the perceptions of construction professionals and also taking the subjective judgments of decision makers into consideration. A case study consisting of four alternatives (contractors in the present case), solicited from a public works department of Pondicherry in India, is used to illustrate the effectiveness of the proposed approach. The final selection of the contractor is made based on the integrated score, or Overall Evaluation Score, of the decision alternative in both the prequalification and bid evaluation stages.
ERIC Educational Resources Information Center
Wholeben, Brent Edward
This report describes the use of operations research techniques to determine which courseware packages or microcomputer systems best address varied instructional objectives, focusing on the MICROPIK model, a highly structured evaluation technique for making such complex instructional decisions. MICROPIK is a multiple alternatives model (MAA)…
Ding, Shuai; Xia, Chen-Yi; Zhou, Kai-Le; Yang, Shan-Lin; Shang, Jennifer S.
2014-01-01
Facing a customer market with rising demands for cloud service dependability and security, trustworthiness evaluation techniques are becoming essential to cloud service selection. But these methods are out of the reach to most customers as they require considerable expertise. Additionally, since the cloud service evaluation is often a costly and time-consuming process, it is not practical to measure trustworthy attributes of all candidates for each customer. Many existing models cannot easily deal with cloud services which have very few historical records. In this paper, we propose a novel service selection approach in which the missing value prediction and the multi-attribute trustworthiness evaluation are commonly taken into account. By simply collecting limited historical records, the current approach is able to support the personalized trustworthy service selection. The experimental results also show that our approach performs much better than other competing ones with respect to the customer preference and expectation in trustworthiness assessment. PMID:24972237
Villamor Ordozgoiti, Alberto; Delgado Hito, Pilar; Guix Comellas, Eva María; Fernandez Sanchez, Carlos Manuel; Garcia Hernandez, Milagros; Lluch Canut, Teresa
2016-01-01
The use of Information and Communications Technologies (ICT) in healthcare has increased the need to consider quality criteria through standardised processes. The aim of this study was to analyse the software quality evaluation models applicable to healthcare from the perspective of ICT purchasers. Through a systematic literature review with the keywords software, product, quality, evaluation and health, we selected and analysed 20 original research papers published from 2005-2016 in health science and technology databases. The results showed four main topics: non-ISO models, software quality evaluation models based on ISO/IEC standards, studies analysing software quality evaluation models, and studies analysing ISO standards for software quality evaluation. The models provide cost-efficiency criteria for specific software and improve the outcomes of its use. The ISO/IEC 25000 standard emerged as the most suitable for evaluating the quality of ICTs for healthcare use from the perspective of institutional acquisition.
Fitness consequences of sex-specific selection.
Connallon, Tim; Cox, Robert M; Calsbeek, Ryan
2010-06-01
Theory suggests that sex-specific selection can facilitate adaptation in sexually reproducing populations. However, sexual conflict theory and recent experiments indicate that sex-specific selection is potentially costly due to sexual antagonism: alleles harmful to one sex can accumulate within a population because they are favored in the other sex. Whether sex-specific selection provides a net fitness benefit or cost depends, in part, on the relative frequency and strength of sexually concordant versus sexually antagonistic selection throughout a species' genome. Here, we model the net fitness consequences of sex-specific selection while explicitly considering both sexually concordant and sexually antagonistic selection. The model shows that, even when sexual antagonism is rare, the fitness costs it imposes will generally overwhelm the fitness benefits of sexually concordant selection. Furthermore, the cost of sexual antagonism is, at best, only partially resolved by the evolution of sex-limited gene expression. To evaluate the key parameters of the model, we analyze an extensive dataset of sex-specific selection gradients from wild populations, along with data from the experimental evolution literature. The model and data imply that sex-specific selection likely imposes a net cost on sexually reproducing species, although additional research will be required to confirm this conclusion.
Selecting Meteorological Input for the Global Modeling Initiative Assessments
NASA Technical Reports Server (NTRS)
Strahan, Susan; Douglass, Anne; Prather, Michael; Coy, Larry; Hall, Tim; Rasch, Phil; Sparling, Lynn
1999-01-01
The Global Modeling Initiative (GMI) science team has developed a three-dimensional chemistry and transport model (CTM) to evaluate the impact of the exhaust of supersonic aircraft on the stratosphere. An important goal of the GMI is to test modules for numerical transport, photochemical integration, and model dynamics within a common framework. This work is focused on the dependence of the overall assessment on the wind and temperature fields used by the CTM. Three meteorological data sets for the stratosphere were available to GMI: the National Center for Atmospheric Research Community Climate Model (CCM2), the Goddard Earth Observing System Data Assimilation System (GEOS-DAS), and the Goddard Institute for Space Studies general circulation model (GISS-2'). Objective criteria were established by the GMI team to evaluate which of these three data sets provided the best representation of trace gases in the stratosphere today. Tracer experiments were devised to test various aspects of model transport. Stratospheric measurements of long-lived trace gases were selected as a test of the CTM transport. This presentation describes the criteria used in grading the meteorological fields and the resulting choice of wind fields to be used in the GMI assessment. This type of objective model evaluation will lead to a higher level of confidence in these assessments. We suggest that the diagnostic tests shown here be used to augment traditional general circulation model evaluation methods.
Life cycle cost assessment of future low heat rejection engines
NASA Technical Reports Server (NTRS)
Petersen, D. R.
1986-01-01
The Adiabatic Diesel Engine Component Development (ADECD) project has the objective of accelerating the development of highway truck engines with advanced technology aimed at reduced fuel consumption. The project comprises three steps: the synthesis of a number of candidate engine designs, the coupling of each with a number of systems for utilizing exhaust gas energy, and the evaluation of each combination in terms of desirability. Particular attention is given to the evaluation method employed and its development. The objective of Life Cycle Cost (LCC) evaluation in the ADECD program was to select the best from among 42 different low heat rejection engine (LHRE)/exhaust energy recovery system configurations. The LCC model is discussed along with a maintenance cost model, the evaluation strategy, the selection of parameter ranges, and a full factorial analysis.
Evaluating candidate reactions to selection practices using organisational justice theory.
Patterson, Fiona; Zibarras, Lara; Carr, Victoria; Irish, Bill; Gregory, Simon
2011-03-01
This study aimed to examine candidate reactions to selection practices in postgraduate medical training using organisational justice theory. We carried out three independent cross-sectional studies using samples from three consecutive annual recruitment rounds. Data were gathered from candidates applying for entry into UK general practice (GP) training during 2007, 2008 and 2009. Participants completed an evaluation questionnaire immediately after the short-listing stage and after the selection centre (interview) stage. Participants were doctors applying for GP training in the UK. Main outcome measures were participants' evaluations of the selection methods and perceptions of the overall fairness of each selection stage (short-listing and selection centre). A total of 23,855 evaluation questionnaires were completed (6893 in 2007, 10,497 in 2008 and 6465 in 2009). Absolute levels of perceptions of fairness of all the selection methods at both the short-listing and selection centre stages were consistently high over the three years. Similarly, all selection methods were considered to be job-related by candidates. However, in general, candidates considered the selection centre stage to be significantly fairer than the short-listing stage. Of all the selection methods, the simulated patient consultation completed at the selection centre stage was rated as the most job-relevant. This is the first study to use a model of organisational justice theory to evaluate candidate reactions during selection into postgraduate specialty training. The high-fidelity selection methods are consistently viewed as more job-relevant and fairer by candidates. This has important implications for the design of recruitment systems for all specialties and, potentially, for medical school admissions. Using this approach, recruiters can systematically compare perceptions of the fairness and job relevance of various selection methods. © Blackwell Publishing Ltd 2011.
ERIC Educational Resources Information Center
De los Santos, Saturnino; Norland, Emmalou Van Tilburg
A study evaluated the cacao farmer training program in the Dominican Republic by testing hypothesized relationships among reactions, knowledge and skills, attitudes, aspirations, and some selected demographic characteristics of farmers who attended programs. Bennett's hierarchical model of program evaluation was used as the framework of the study.…
Development of Ku-band rendezvous radar tracking and acquisition simulation programs
NASA Technical Reports Server (NTRS)
1986-01-01
The fidelity of the Space Shuttle radar tracking simulation model was improved. Data from the Shuttle Orbiter Radar Test and Evaluation (SORTE) experiments performed at the White Sands Missile Range (WSMR) were reviewed and analyzed, and selected flight rendezvous radar data were evaluated. Problems with the Inertial Line-of-Sight (ILOS) angle rate tracker were evaluated using the improved-fidelity angle rate tracker simulation model.
Yang, Ziheng; Zhu, Tianqi
2018-02-20
The Bayesian method is noted to produce spuriously high posterior probabilities for phylogenetic trees in analysis of large datasets, but the precise reasons for this overconfidence are unknown. In general, the performance of Bayesian selection of misspecified models is poorly understood, even though this is of great scientific interest since models are never true in real data analysis. Here we characterize the asymptotic behavior of Bayesian model selection and show that when the competing models are equally wrong, Bayesian model selection exhibits surprising and polarized behaviors in large datasets, supporting one model with full force while rejecting the others. If one model is slightly less wrong than the other, the less wrong model will eventually win when the amount of data increases, but the method may become overconfident before it becomes reliable. We suggest that this extreme behavior may be a major factor behind the spuriously high posterior probabilities for evolutionary trees. The philosophical implications of our results for the application of Bayesian model selection to evaluate opposing scientific hypotheses are yet to be explored, as are the behaviors of non-Bayesian methods in similar situations.
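The polarized large-sample behavior described above can be reproduced with a toy simulation (our own illustrative setup, not the authors' phylogenetic analysis): data are drawn from N(0, 1) and scored under two equally wrong models, N(+0.5, 1) and N(-0.5, 1), for which the per-observation log-likelihood ratio conveniently reduces to x itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# True data: N(0, 1). Two equally wrong candidate models: N(+0.5, 1) and
# N(-0.5, 1). For this pair, log N(x|0.5,1) - log N(x|-0.5,1) = x.
n = 10000
x = rng.normal(0.0, 1.0, n)
logBF = np.cumsum(x)                    # log Bayes factor, model 1 vs model 2
post1 = 1.0 / (1.0 + np.exp(-logBF))    # posterior of model 1 (equal priors)
print(round(post1[-1], 3))
```

With equal priors the posterior of model 1 is a logistic transform of the log Bayes factor; because the log Bayes factor performs a random walk, the posterior swings to extremes as n grows even though neither model is better.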
NASA Astrophysics Data System (ADS)
Anna, I. D.; Cahyadi, I.; Yakin, A.
2018-01-01
Selection of a marketing strategy is a prominent source of competitive advantage for small and medium enterprise business development. The selection process is a multiple-criteria decision-making problem that includes the evaluation of various attributes or criteria during strategy formulation. The objective of this paper is to develop a model for the selection of a marketing strategy in the Batik Madura industry. The study proposes an integrated approach based on the analytic network process (ANP) and the technique for order preference by similarity to ideal solution (TOPSIS) to determine the best strategy for Batik Madura marketing problems. Based on the results of a group decision-making technique, this study selected fourteen criteria (consistency, cost, trend following, customer loyalty, business volume, uniqueness, manpower, customer numbers, promotion, branding, business network, outlet location, credibility and innovation) as Batik Madura marketing strategy evaluation criteria. A survey questionnaire developed from the literature review was distributed to a sample frame of Batik Madura SMEs in Pamekasan. In the decision procedure step, expert evaluators were asked to establish the decision matrix by comparing the marketing strategy alternatives under each of the individual criteria. Considerations obtained from the ANP and TOPSIS methods were then applied to build the specific criteria constraints and the range of the launch strategy in the model. The model demonstrates that, under the current business situation, the Straight-focus marketing strategy is the best marketing strategy for Batik Madura SMEs in Pamekasan.
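The TOPSIS stage of such an integrated approach can be sketched as follows; the decision matrix and weights below are illustrative stand-ins (in the full method the weights would come from the ANP step, and there would be fourteen criteria rather than three):

```python
import numpy as np

# Decision matrix: rows = strategy alternatives, columns = criteria
# (hypothetical benefit-type scores, not the paper's survey data).
X = np.array([[7., 5., 8.],
              [6., 8., 6.],
              [8., 6., 7.]])
w = np.array([0.5, 0.3, 0.2])            # criterion weights (assumed)

R = X / np.linalg.norm(X, axis=0)        # vector normalisation
V = R * w                                # weighted normalised matrix
ideal, anti = V.max(0), V.min(0)         # ideal and anti-ideal solutions
d_pos = np.linalg.norm(V - ideal, axis=1)
d_neg = np.linalg.norm(V - anti, axis=1)
closeness = d_neg / (d_pos + d_neg)      # relative closeness to the ideal
ranking = np.argsort(-closeness)
print(closeness, ranking)
```

Alternatives are ranked by relative closeness to the ideal solution; for these numbers the third alternative ranks first.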
Discriminative least squares regression for multiclass classification and feature selection.
Xiang, Shiming; Nie, Feiping; Meng, Gaofeng; Pan, Chunhong; Zhang, Changshui
2012-11-01
This paper presents a framework of discriminative least squares regression (LSR) for multiclass classification and feature selection. The core idea is to enlarge the distance between different classes under the conceptual framework of LSR. First, a technique called ε-dragging is introduced to force the regression targets of different classes to move along opposite directions so that the distances between classes are enlarged. Then, the ε-draggings are integrated into the LSR model for multiclass classification. Our learning framework, referred to as discriminative LSR, has a compact model form, with no need to train multiple independent two-class machines. With its compact form, the model can be naturally extended for feature selection, achieved in terms of the L2,1 matrix norm, which generates a sparse learning model. The models for multiclass classification and feature selection are solved elegantly and efficiently. Experimental evaluation over a range of benchmark datasets indicates the validity of our method.
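A minimal numpy sketch of the ε-dragging idea, under our own simplified reading (the ridge regularization, iteration count and toy data are assumptions): correct-class targets are dragged upward and the others downward via a nonnegative matrix M, alternating with the closed-form regression solve.

```python
import numpy as np

def dlsr_fit(X, y, c, lam=0.1, iters=30):
    """Sketch of discriminative LSR with epsilon-dragging."""
    n, d = X.shape
    Y = -np.ones((n, c))
    Y[np.arange(n), y] = 1.0
    B = np.sign(Y)                          # dragging directions (+1/-1)
    Xb = np.hstack([X, np.ones((n, 1))])    # absorb the bias term
    A = Xb.T @ Xb + lam * np.eye(d + 1)
    M = np.zeros((n, c))
    for _ in range(iters):
        T = Y + B * M                       # dragged regression targets
        W = np.linalg.solve(A, Xb.T @ T)    # ridge-regression closed form
        M = np.maximum(B * (Xb @ W - Y), 0.0)  # nonnegative dragging update
    return W

# Three well-separated Gaussian blobs as hypothetical data.
rng = np.random.default_rng(0)
centers = np.array([[0., 0.], [4., 0.], [0., 4.]])
y = np.repeat([0, 1, 2], 50)
X = centers[y] + 0.5 * rng.normal(size=(150, 2))
W = dlsr_fit(X, y, 3)
pred = np.argmax(np.hstack([X, np.ones((150, 1))]) @ W, axis=1)
print((pred == y).mean())
```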
Protein attributes contribute to halo-stability, bioinformatics approach
2011-01-01
Halophilic proteins can tolerate high salt concentrations. Understanding the features underlying halophilicity is the first step toward engineering halostable crops. To this end, we examined protein features contributing to the halo-tolerance of halophilic organisms. We compared more than 850 features between halophilic and non-halophilic proteins with various screening, clustering, decision tree, and generalized rule induction models to search for patterns that code for halo-tolerance. Up to 251 protein attributes were selected by various attribute weighting algorithms as important contributors to halo-stability; of these, 14 attributes were selected by 90% of the models, and the count of hydrogen received the highest weight (1.0) in 70% of the attribute weighting models, showing the importance of this attribute in feature selection modeling. The other attributes were mostly the frequencies of di-peptides. No changes were found in the numbers of groups when K-Means and TwoStep clustering were performed on datasets with or without feature selection filtering. Although the depths of the induced trees were not high, the accuracies of the trees were higher than 94%, and the frequency of hydrophobic residues was identified as the most important feature for building trees. The performance evaluations of the decision tree models had the same values, and the best correctness percentage was recorded with the Exhaustive CHAID and CHAID models. We did not find any significant difference in the percentage of correctness, performance evaluation, or mean correctness of the various decision tree models with or without feature selection. For the first time, we analyzed the performance of different screening, clustering, and decision tree algorithms for discriminating halophilic and non-halophilic proteins, and the results showed that amino acid composition can be used to discriminate between halo-tolerant and halo-sensitive proteins. PMID:21592393
Vivekanandan, T; Sriman Narayana Iyengar, N Ch
2017-11-01
Enormous data growth in multiple domains has posed a great challenge for data processing and analysis techniques. In particular, the traditional record maintenance strategy has been replaced in the healthcare system. It is vital to develop a model that is able to handle the huge amount of e-healthcare data efficiently. In this paper, the challenging tasks of selecting critical features from the enormous set of available features and diagnosing heart disease are carried out. Feature selection is one of the most widely used pre-processing steps in classification problems. A modified differential evolution (DE) algorithm is used to perform feature selection for cardiovascular disease and optimization of selected features. Of the 10 available strategies for the traditional DE algorithm, the seventh strategy, which is represented by DE/rand/2/exp, is considered for comparative study. The performance analysis of the developed modified DE strategy is given in this paper. With the selected critical features, prediction of heart disease is carried out using fuzzy AHP and a feed-forward neural network. Various performance measures of integrating the modified differential evolution algorithm with fuzzy AHP and a feed-forward neural network in the prediction of heart disease are evaluated in this paper. The accuracy of the proposed hybrid model is 83%, which is higher than that of some other existing models. In addition, the prediction time of the proposed hybrid model is also evaluated and has shown promising results. Copyright © 2017 Elsevier Ltd. All rights reserved.
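A toy sketch of DE-based feature selection (hypothetical data and a simple nearest-centroid fitness stand in for the paper's cardiovascular data and fuzzy-AHP/neural-network pipeline): continuous DE vectors are thresholded at 0.5 to form feature masks, and a DE/rand/2-style mutation with crossover drives the search.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: only the first 3 of 10 features carry class signal.
n, d = 200, 10
y = rng.integers(0, 2, n)
X = rng.normal(size=(n, d))
X[:, :3] += 2.0 * y[:, None]

def fitness(mask):
    if mask.sum() == 0:
        return 0.0
    Xs = X[:, mask]
    c0, c1 = Xs[y == 0].mean(0), Xs[y == 1].mean(0)
    pred = (np.linalg.norm(Xs - c1, axis=1)
            < np.linalg.norm(Xs - c0, axis=1)).astype(int)
    return (pred == y).mean() - 0.01 * mask.sum()  # penalise extra features

# DE/rand/2-style search on continuous vectors, thresholded to masks.
NP, F, CR, GEN = 20, 0.5, 0.9, 40
pop = rng.random((NP, d))
fit = np.array([fitness(p > 0.5) for p in pop])
for _ in range(GEN):
    for i in range(NP):
        r = rng.choice([j for j in range(NP) if j != i], 5, replace=False)
        v = pop[r[0]] + F * (pop[r[1]] - pop[r[2]]) + F * (pop[r[3]] - pop[r[4]])
        trial = np.where(rng.random(d) < CR, v, pop[i])
        f = fitness(trial > 0.5)
        if f >= fit[i]:                    # greedy selection step
            pop[i], fit[i] = trial, f
best = pop[np.argmax(fit)] > 0.5
print(best)
```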
SOME USES OF MODELS OF QUANTITATIVE GENETIC SELECTION IN SOCIAL SCIENCE.
Weight, Michael D; Harpending, Henry
2017-01-01
The theory of selection of quantitative traits is widely used in evolutionary biology, agriculture and other related fields. The fundamental model known as the breeder's equation is simple, robust over short time scales, and it is often possible to estimate plausible parameters. In this paper it is suggested that the results of this model provide useful yardsticks for the description of social traits and the evaluation of transmission models. The differences on a standard personality test between samples of Old Order Amish and Indiana rural young men from the same county and the decline of homicide in Medieval Europe are used as illustrative examples of the overall approach. It is shown that the decline of homicide is unremarkable under a threshold model while the differences between rural Amish and non-Amish young men are too large to be a plausible outcome of simple genetic selection in which assortative mating by affiliation is equivalent to truncation selection.
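The breeder's-equation yardstick referred to here can be computed directly; the numbers below are illustrative, not those of the Amish or homicide examples. Under truncation selection of the top proportion p, the selection intensity is i = φ(z)/p at the standard-normal cutoff z, and the per-generation response is R = h²·i in phenotypic standard deviations.

```python
import math

# Truncation selection: the top 20% of a normally distributed population
# is selected as parents (all numbers are illustrative assumptions).
p = 0.20
z = 0.8416                        # standard-normal cutoff leaving 20% above
i = math.exp(-z * z / 2) / math.sqrt(2 * math.pi) / p   # selection intensity
h2 = 0.4                          # assumed narrow-sense heritability
R = h2 * i                        # breeder's equation, in phenotypic SDs
print(round(i, 2), round(R, 2))
```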
Random forest feature selection approach for image segmentation
NASA Astrophysics Data System (ADS)
Lefkovits, László; Lefkovits, Szidónia; Emerich, Simina; Vaida, Mircea Florin
2017-03-01
In the field of image segmentation, discriminative models have shown promising performance. Generally, every such model begins with the extraction of numerous features from annotated images. Most authors build their discriminative models from many features without applying any selection criteria. A more reliable model can be built with a framework that selects the variables that are important from the point of view of classification and eliminates the unimportant ones. In this article we present a framework for feature selection and data dimensionality reduction. The methodology is built around the random forest (RF) algorithm and its variable importance evaluation. In order to deal with datasets so large as to be practically unmanageable, we propose an RF-based algorithm that reduces the dimension of the database by eliminating irrelevant features. Furthermore, this framework is applied to optimize our discriminative model for brain tumor segmentation.
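The selection step can be sketched with scikit-learn's random forest importances on synthetic data (a generic illustration, not the brain-tumor pipeline itself):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 400
y = rng.integers(0, 2, n)
X = rng.normal(size=(n, 12))
X[:, 0] += 1.5 * y                    # only features 0 and 1 are informative
X[:, 1] -= 1.5 * y

rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
imp = rf.feature_importances_         # impurity-based variable importance
keep = np.argsort(imp)[::-1][:2]      # retain the top-ranked features
print(sorted(int(k) for k in keep))
```

In a full pipeline the retained feature subset would then be used to retrain the discriminative segmentation model on a much smaller database.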
A study for development of aerothermodynamic test model materials and fabrication technique
NASA Technical Reports Server (NTRS)
Dean, W. G.; Connor, L. E.
1972-01-01
A literature survey, materials reformulation and tailoring, fabrication problems, and materials selection and evaluation for fabricating models to be used with the phase-change technique for obtaining quantitative aerodynamic heat transfer data are presented. The study resulted in the selection of the two best materials, Stycast 2762 FT and an alumina ceramic. Characteristics of these materials and detailed fabrication methods are presented.
NASA Astrophysics Data System (ADS)
Lu, Hongwei; Ren, Lixia; Chen, Yizhong; Tian, Peipei; Liu, Jia
2017-12-01
Due to the uncertainty (i.e., fuzziness, stochasticity and imprecision) that exists simultaneously in the groundwater remediation process, the accuracy of ranking results obtained by traditional methods has been limited. This paper proposes a cloud model based multi-attribute decision-making framework (CM-MADM) with Monte Carlo simulation for the selection of contaminated-groundwater remediation strategies. The cloud model is used to handle imprecise numerical quantities, describing the fuzziness and stochasticity of the information fully and precisely. In the proposed approach, the contaminant concentrations are aggregated via the backward cloud generator and the attribute weights are calculated with the weight cloud module. A case study on remedial alternative selection for a site contaminated by a 1,1,1-trichloroethylene leak in Shanghai, China, is conducted to illustrate the efficiency and applicability of the developed approach. In total, a system of ten attributes, including daily total pumping rate, total cost and cloud model based health risk, was used to evaluate each alternative under uncertainty. Results indicated that alternative A14 was the most preferred for the 5-year remediation period, A5 for the 10-year, A4 for the 15-year and A6 for the 20-year.
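The forward normal cloud generator underlying such cloud-model aggregation can be sketched as follows (parameter values are illustrative, not those of the Shanghai case study):

```python
import numpy as np

def normal_cloud(Ex, En, He, n, rng):
    """Forward normal cloud generator: n drops for concept (Ex, En, He)."""
    Enp = rng.normal(En, He, n)                   # per-drop entropy sample
    x = rng.normal(Ex, np.abs(Enp))               # cloud drops
    mu = np.exp(-(x - Ex) ** 2 / (2 * Enp ** 2))  # membership degrees
    return x, mu

# Hypothetical concept: expectation 5.0, entropy 1.0, hyper-entropy 0.1.
rng = np.random.default_rng(3)
x, mu = normal_cloud(Ex=5.0, En=1.0, He=0.1, n=5000, rng=rng)
print(round(float(x.mean()), 1))
```

The three digits (Ex, En, He) encode expectation, fuzziness and the uncertainty of that fuzziness, which is what lets the framework carry imprecision through the ranking.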
Li, Xingang; Li, Jia; Sui, Hong; He, Lin; Cao, Xingtao; Li, Yonghong
2018-07-05
Soil remediation is considered one of the most difficult pollution treatment tasks due to its high complexity in contaminants, geological conditions, usage, urgency, etc. The diversity of remediation technologies makes the quick selection of suitable remediation schemes still tougher, even when the site investigation has been done. Herein, a sustainable decision-support hierarchical model has been developed to select, evaluate and determine preferred soil remediation schemes comprehensively, based on a modified analytic hierarchy process (MAHP). The MAHP method combines a competence model and the Grubbs criterion with conventional AHP. It not only considers competence differences among experts in group decisions, but also adjusts, through sample analysis, the large deviations caused by differing expert preferences. This makes the final remediation decision more reasonable. In this model, different evaluation criteria, including economic, environmental and technological effects, are employed to evaluate the integrated performance of remediation schemes, followed by a strict computation using the MAHP. To confirm the feasibility of the developed model, it was tested on a contaminated benzene-workshop site at the Beijing coking plant. Beyond soil remediation, this MAHP model could also be applied in other fields involving multi-criteria group decision making. Copyright © 2018 Elsevier B.V. All rights reserved.
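The conventional-AHP core of the model, deriving priority weights from a pairwise comparison matrix and checking its consistency, can be sketched as follows (the matrix is a hypothetical Saaty-scale example for the three effect criteria, not the experts' actual judgments):

```python
import numpy as np

# Hypothetical pairwise comparisons of economic, environmental and
# technological effects on the Saaty 1-9 scale.
A = np.array([[1.,    3.,    5.],
              [1/3.,  1.,    3.],
              [1/5.,  1/3.,  1.]])

vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)
w = np.abs(vecs[:, k].real)
w /= w.sum()                     # priority vector (criterion weights)

lam_max = vals.real[k]
n = A.shape[0]
CI = (lam_max - n) / (n - 1)     # consistency index
RI = 0.58                        # random index for n = 3
CR = CI / RI                     # consistency ratio (< 0.1 is acceptable)
print(np.round(w, 3), round(float(CR), 3))
```

Group-decision variants such as MAHP would aggregate several such matrices, weighting each expert's judgment by competence and screening outliers before this eigenvector step.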
Xu, Cheng-Jian; van der Schaaf, Arjen; Schilstra, Cornelis; Langendijk, Johannes A; van't Veld, Aart A
2012-03-15
To study the impact of different statistical learning methods on the prediction performance of multivariate normal tissue complication probability (NTCP) models. In this study, three learning methods, stepwise selection, least absolute shrinkage and selection operator (LASSO), and Bayesian model averaging (BMA), were used to build NTCP models of xerostomia following radiotherapy treatment for head and neck cancer. Performance of each learning method was evaluated by a repeated cross-validation scheme in order to obtain a fair comparison among methods. It was found that the LASSO and BMA methods produced models with significantly better predictive power than that of the stepwise selection method. Furthermore, the LASSO method yields an easily interpretable model as the stepwise method does, in contrast to the less intuitive BMA method. The commonly used stepwise selection method, which is simple to execute, may be insufficient for NTCP modeling. The LASSO method is recommended. Copyright © 2012 Elsevier Inc. All rights reserved.
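The LASSO's variable-selection behavior can be illustrated with a small proximal-gradient (ISTA) implementation on synthetic data; this is a generic sketch, not the xerostomia NTCP model.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 100, 8
X = rng.normal(size=(n, p))
beta_true = np.array([2., -1.5, 0., 0., 0., 0., 0., 0.])
y = X @ beta_true + 0.1 * rng.normal(size=n)

# LASSO via ISTA (proximal gradient descent); lam controls sparsity.
lam = 0.1
L = np.linalg.norm(X, 2) ** 2 / n       # Lipschitz constant of the gradient
b = np.zeros(p)
for _ in range(2000):
    grad = X.T @ (X @ b - y) / n
    z = b - grad / L
    b = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
print(np.round(b, 2))
```

Coefficients on the noise predictors are shrunk exactly to zero, which is what makes the resulting model sparse and as easy to read off as a stepwise fit.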
Selecting Growth Measures for Use in School Evaluation Systems: Should Proportionality Matter?
ERIC Educational Resources Information Center
Ehlert, Mark; Koedel, Cory; Parsons, Eric; Podgursky, Michael
2016-01-01
The specifics of how growth models should be constructed and used for educational evaluation is a topic of lively policy debate in states and school districts nationwide. In this article, we take up the question of model choice--framed within a policy context--and examine three competing approaches. The first approach, reflected in the popular…
On-line Model Structure Selection for Estimation of Plasma Boundary in a Tokamak
NASA Astrophysics Data System (ADS)
Škvára, Vít; Šmídl, Václav; Urban, Jakub
2015-11-01
Control of the plasma field in a tokamak requires reliable estimation of the plasma boundary. The plasma boundary is given by a complex mathematical model, and the only available measurements are the responses of induction coils around the plasma. For the purpose of boundary estimation, the model can be reduced to a simple linear regression with potentially infinitely many elements. The number of elements must be selected manually, and this choice significantly influences the resulting shape. In this paper, we investigate the use of formal model structure estimation techniques for the problem. Specifically, we formulate a sparse least squares estimator using the automatic relevance principle. The resulting algorithm is a repeated evaluation of a least squares problem that could be computed in real time. Its performance is illustrated on simulated data and evaluated against the more detailed and computationally costly model FREEBIE.
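The automatic relevance principle mentioned here can be sketched with the classical evidence-maximization (ARD) updates; this is a generic sparse Bayesian least squares loop on toy data, not the tokamak boundary model, and the noise precision is assumed known for brevity.

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 80, 10
X = rng.normal(size=(n, p))
w_true = np.zeros(p)
w_true[[0, 3]] = [1.5, -2.0]          # only two basis elements matter
y = X @ w_true + 0.05 * rng.normal(size=n)

alpha = np.ones(p)                    # per-weight precisions ("relevances")
beta = 1.0 / 0.05 ** 2                # noise precision, assumed known
for _ in range(100):
    S = np.linalg.inv(np.diag(alpha) + beta * X.T @ X)    # posterior cov
    m = beta * S @ X.T @ y                                # posterior mean
    gamma = 1.0 - alpha * np.diag(S)                      # effective dof
    alpha = np.clip(gamma / (m ** 2 + 1e-12), 1e-6, 1e8)  # MacKay update
print(np.round(m, 2))
```

Weights whose precision alpha diverges are effectively pruned from the regression, which is how the method selects model structure automatically instead of requiring a manual choice of the number of elements.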
An analytical framework to assist decision makers in the use of forest ecosystem model predictions
Larocque, Guy R.; Bhatti, Jagtar S.; Ascough, J.C.; Liu, J.; Luckai, N.; Mailly, D.; Archambault, L.; Gordon, Andrew M.
2011-01-01
The predictions from most forest ecosystem models originate from deterministic simulations. However, few evaluation exercises for model outputs are performed by either model developers or users. This issue has important consequences for decision makers using these models to develop natural resource management policies, as they cannot evaluate the extent to which predictions stemming from the simulation of alternative management scenarios may result in significant environmental or economic differences. Various numerical methods, such as sensitivity/uncertainty analyses, or bootstrap methods, may be used to evaluate models and the errors associated with their outputs. However, the application of each of these methods carries unique challenges which decision makers do not necessarily understand; guidance is required when interpreting the output generated from each model. This paper proposes a decision flow chart in the form of an analytical framework to help decision makers apply, in an orderly fashion, different steps involved in examining the model outputs. The analytical framework is discussed with regard to the definition of problems and objectives and includes the following topics: model selection, identification of alternatives, modelling tasks and selecting alternatives for developing policy or implementing management scenarios. Its application is illustrated using an on-going exercise in developing silvicultural guidelines for a forest management enterprise in Ontario, Canada.
The Evaluation of Hospital Performance in Iran: A Systematic Review Article
BAHADORI, Mohammadkarim; IZADI, Ahmad Reza; GHARDASHI, Fatemeh; RAVANGARD, Ramin; HOSSEINI, Seyed Mojtaba
2016-01-01
Background: This research aimed to systematically study and outline the methods of hospital performance evaluation used in Iran. Methods: In this systematic review, all Persian and English-language articles published in the Iranian and non-Iranian scientific journals indexed from Sep 2004 to Sep 2014 were studied. For finding the related articles, the researchers searched the Iranian electronic databases, including SID, IranMedex, IranDoc, Magiran, as well as the non-Iranian electronic databases, including Medline, Embase, Scopus, and Google Scholar. For reviewing the selected articles, a data extraction form developed by the researchers was used. Results: The entire review process led to the selection of 51 articles. The publication of articles on hospital performance evaluation in Iran has increased considerably in recent years. Of these 51 articles, 38 (74.51%) had been published in Persian and 13 (25.49%) in English. Eight models were recognized as evaluation models for Iranian hospitals. In total, 15 studies had used the data envelopment analysis model to evaluate hospital performance. Conclusion: Using a combination of models to integrate indicators in the hospital evaluation process is inevitable. Therefore, the Ministry of Health and Medical Education should use a set of indicators such as the balanced scorecard in the process of hospital evaluation and accreditation and encourage hospital managers to use them. PMID:27516991
Covariate Selection for Multilevel Models with Missing Data
Marino, Miguel; Buxton, Orfeu M.; Li, Yi
2017-01-01
Missing covariate data hampers variable selection in multilevel regression settings. Current variable selection techniques for multiply-imputed data commonly address missingness in the predictors through list-wise deletion and stepwise selection, which are problematic. Moreover, most variable selection methods are developed for independent linear regression models and do not accommodate multilevel mixed-effects regression models with incomplete covariate data. We develop a novel methodology that performs covariate selection across multiply-imputed data for multilevel random-effects models when missing data are present. Specifically, we propose to stack the multiply-imputed data sets from a multiple imputation procedure and to apply a group variable selection procedure through group lasso regularization to assess the overall impact of each predictor on the outcome across the imputed data sets. Simulations confirm the advantageous performance of the proposed method compared with competing methods. We applied the method to reanalyze the Healthy Directions-Small Business cancer prevention study, which evaluated a behavioral intervention program targeting multiple risk-related behaviors in a working-class, multi-ethnic population. PMID:28239457
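The group-lasso selection step can be sketched with block soft-thresholding on toy data (a generic illustration under our own assumptions, not the Healthy Directions reanalysis); stacking M imputed data sets would simply concatenate rows of X and y before this step.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 300
# Predictors: one 3-level categorical (2 dummies, jointly relevant) and
# two continuous noise variables; a group enters or leaves as a whole.
g = rng.integers(0, 3, n)
D = np.eye(3)[g][:, 1:]                 # dummy coding, 2 columns
Z = rng.normal(size=(n, 2))
X = np.hstack([D, Z])
y = 2.0 * D[:, 0] - 1.0 * D[:, 1] + 0.1 * rng.normal(size=n)

groups = [[0, 1], [2], [3]]             # coefficient groups
lam = 0.05
L = np.linalg.norm(X, 2) ** 2 / n       # Lipschitz constant
b = np.zeros(4)
for _ in range(3000):                   # proximal gradient iterations
    z = b - (X.T @ (X @ b - y) / n) / L
    for grp in groups:                  # block soft-thresholding
        nrm = np.linalg.norm(z[grp])
        b[grp] = 0.0 if nrm == 0 else max(0.0, 1 - (lam / L) / nrm) * z[grp]
print(np.round(b, 2))
```

The penalty zeroes whole groups at once, so a categorical predictor's dummies are kept or dropped together, which is the behavior the stacked-imputation procedure relies on.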
An evaluation of preference for video and in vivo modeling.
Geiger, Kaneen B; Leblanc, Linda A; Dillon, Courtney M; Bates, Stephanie L
2010-01-01
We assessed preference for video or in vivo modeling using a concurrent-chains arrangement with 3 children with autism. The two modeling conditions produced similar acquisition rates and no differential selection (i.e., preference) for all 3 participants.
Lu, Dan; Ye, Ming; Meyer, Philip D.; Curtis, Gary P.; Shi, Xiaoqing; Niu, Xu-Feng; Yabusaki, Steve B.
2013-01-01
When conducting model averaging for assessing groundwater conceptual model uncertainty, the averaging weights are often evaluated using model selection criteria such as AIC, AICc, BIC, and KIC (Akaike Information Criterion, corrected Akaike Information Criterion, Bayesian Information Criterion, and Kashyap Information Criterion, respectively). However, this method often leads to an unrealistic situation in which the best model receives an overwhelmingly large averaging weight (close to 100%), which cannot be justified by available data and knowledge. This study found that the problem was caused by using the covariance matrix, C_E, of measurement errors to estimate the negative log-likelihood function common to all the model selection criteria. The problem can be resolved by instead using the covariance matrix, C_Ek, of total errors (including both model errors and measurement errors) to account for the correlation between the total errors. An iterative two-stage method was developed in the context of maximum likelihood inverse modeling to iteratively infer the unknown C_Ek from the residuals during model calibration. The inferred C_Ek was then used in the evaluation of the model selection criteria and model averaging weights. While the method was limited to serial data using time series techniques in this study, it can be extended to spatial data using geostatistical techniques. The method was first evaluated in a synthetic study and then applied to an experimental study, in which alternative surface complexation models were developed to simulate column experiments of uranium reactive transport. The total errors of the alternative models were found to be temporally correlated due to the model errors. The iterative two-stage method using C_Ek resolved the problem of the best model receiving 100% model averaging weight, and the resulting model averaging weights were supported by the calibration results and physical understanding of the alternative models.
Using C_Ek obtained from the iterative two-stage method also improved the predictive performance of the individual models and of model averaging in both the synthetic and experimental studies.
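The information-criterion averaging weights that the abstract says can collapse onto a single model follow a standard formula: w_k is proportional to exp(-Δ_k/2), where Δ_k is the difference between model k's criterion value and the minimum over models. A minimal sketch (the IC values are hypothetical):

```python
import numpy as np

def ic_weights(ic_values):
    """Convert information-criterion values (AIC, BIC, KIC, ...) into
    model-averaging weights: w_k proportional to exp(-0.5 * (IC_k - IC_min))."""
    ic = np.asarray(ic_values, dtype=float)
    delta = ic - ic.min()
    w = np.exp(-0.5 * delta)
    return w / w.sum()

# Example: three candidate conceptual models.
weights = ic_weights([100.0, 102.0, 110.0])
print(weights.round(3))   # prints [0.727 0.268 0.005]
```

The exponential makes the weights very sensitive to IC differences: a gap of only a few units already concentrates most of the weight on one model, which is why an underestimated error covariance (ignoring model error) can drive the best model's weight toward 100%.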
NASA Astrophysics Data System (ADS)
Lopez Bobeda, J. R.
2017-12-01
The increasing use of groundwater for irrigation of crops has exacerbated groundwater sustainability issues faced by water-limited regions. Gridded, process-based crop models have the potential to help farmers and policymakers assess the effects of water shortages on yield and devise new strategies for sustainable water use. Gridded crop models are typically calibrated and evaluated using county-level survey data of yield, planting dates, and maturity dates. However, little is known about the ability of these models to reproduce observed crop evapotranspiration and water use at regional scales. The aim of this work is to evaluate a gridded version of the Decision Support System for Agrotechnology Transfer (DSSAT) crop model over the continental United States. We evaluated crop seasonal evapotranspiration over 5 arc-minute grids, and irrigation water use at the county level. Evapotranspiration was assessed only for rainfed agriculture to test the model evapotranspiration equations separately from the irrigation algorithm. Model evapotranspiration was evaluated against the Atmospheric Land Exchange Inverse (ALEXI) modeling product. Using a combination of the USDA Cropland Data Layer (CDL) and the USGS Moderate Resolution Imaging Spectroradiometer Irrigated Agriculture Dataset for the United States (MIrAD-US), we selected only grids with more than 60% of their area planted with the simulated crops (corn, cotton, and soybean) and less than 20% of their area irrigated. Irrigation water use was compared against the USGS county-level irrigated agriculture water use survey data. Simulated gridded data were aggregated to the county level using the USDA CDL and USGS MIrAD-US. Only counties where 70% or more of the irrigated land was corn, cotton, or soybean were selected for the evaluation. Our results suggest that gridded crop models can reasonably reproduce crop evapotranspiration at the country scale (RRMSE = 10%).
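The headline statistic, relative root-mean-square error, is simple to reproduce. A minimal sketch with hypothetical seasonal evapotranspiration values (the paper's actual gridded aggregation is not reproduced here):

```python
import numpy as np

def rrmse(simulated, observed):
    """Relative RMSE, expressed as a percentage of the observed mean."""
    simulated = np.asarray(simulated, dtype=float)
    observed = np.asarray(observed, dtype=float)
    rmse = np.sqrt(np.mean((simulated - observed) ** 2))
    return 100.0 * rmse / observed.mean()

# Hypothetical seasonal ET (mm) for a few aggregation units.
obs = np.array([480.0, 520.0, 450.0, 500.0])
sim = np.array([470.0, 560.0, 430.0, 515.0])
print(round(rrmse(sim, obs), 1))   # prints 4.9
```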
Composite Load Spectra for Select Space Propulsion Structural Components
NASA Technical Reports Server (NTRS)
Ho, Hing W.; Newell, James F.
1994-01-01
Generic load models are described with multiple levels of progressive sophistication to simulate the composite (combined) load spectra (CLS) that are induced in space propulsion system components representative of Space Shuttle Main Engines (SSME), such as transfer ducts, turbine blades, and liquid oxygen (LOX) posts. These generic (coupled) models combine deterministic models for the composite load components (dynamic, acoustic, high-pressure, high rotational speed, etc.) with statistically varying coefficients. These coefficients are then determined using advanced probabilistic simulation methods, with and without strategically selected experimental data. The entire simulation process is implemented in a CLS computer code. Applications of the computer code to various components, in conjunction with the PSAM (Probabilistic Structural Analysis Method), to perform probabilistic load evaluation and life prediction are also described to illustrate the effectiveness of the coupled model approach.
Agent-based modeling as a tool for program design and evaluation.
Lawlor, Jennifer A; McGirr, Sara
2017-12-01
Recently, systems thinking and systems science approaches have gained popularity in the field of evaluation; however, there has been relatively little exploration of how evaluators could use quantitative tools to assist in the implementation of systems approaches therein. The purpose of this paper is to explore potential uses of one such quantitative tool, agent-based modeling, in evaluation practice. To this end, we define agent-based modeling and offer potential uses for it in typical evaluation activities, including: engaging stakeholders, selecting an intervention, modeling program theory, setting performance targets, and interpreting evaluation results. We provide demonstrative examples from published agent-based modeling efforts both inside and outside the field of evaluation for each of the evaluative activities discussed. We further describe potential pitfalls of this tool and offer cautions for evaluators who may choose to implement it in their practice. Finally, the article concludes with a discussion of the future of agent-based modeling in evaluation practice and a call for more formal exploration of this tool as well as other approaches to simulation modeling in the field. Copyright © 2017 Elsevier Ltd. All rights reserved.
Coutinho, C C; Mercadante, M E Z; Jorge, A M; Paz, C C P; El Faro, L; Monteiro, F M
2015-10-30
The effect of selection for postweaning weight on growth curve parameters was evaluated for both growth and carcass traits. Records of 2404 Nellore animals from three selection lines were analyzed: two lines selected for high postweaning weight, selection (NeS) and traditional (NeT), and a control line (NeC) in which animals were selected for postweaning weight close to the average. Body weight (BW), hip height (HH), rib eye area (REA), back fat thickness (BFT), and rump fat thickness (RFT) were measured, with records collected from animals 8 to 20 (males) and 11 to 26 (females) months of age. The parameters A (asymptotic value) and k (growth rate) were estimated using the nonlinear model procedure of the Statistical Analysis System program, with the fixed effect of line (NeS, NeT, and NeC) included in the model to evaluate differences in the estimated parameters between lines. Selected animals (NeS and NeT) showed higher growth rates than control-line animals (NeC) for all traits. The line effect on curve parameters was significant (P < 0.001) for BW, HH, and REA in males, and for BFT and RFT in females. Selection for postweaning weight was effective in altering growth curves, resulting in animals with higher growth potential.
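The abstract does not name the functional form, so the sketch below assumes a Brody-type curve, W(t) = A(1 - b·e^(-kt)), a common choice for cattle growth with asymptote A and rate k, and fits it to synthetic weight-age data with SciPy rather than the SAS nonlinear procedure; all values are invented.

```python
import numpy as np
from scipy.optimize import curve_fit

def brody(t, A, b, k):
    # Brody growth curve: asymptotic weight A, integration constant b, rate k.
    return A * (1.0 - b * np.exp(-k * t))

rng = np.random.default_rng(1)
age = np.linspace(8, 26, 40)                 # months, as in the study's range
true_A, true_b, true_k = 520.0, 0.8, 0.08    # hypothetical "true" parameters
weight = brody(age, true_A, true_b, true_k) + rng.normal(scale=5.0, size=age.size)

params, _ = curve_fit(brody, age, weight, p0=[500.0, 0.7, 0.05])
A_hat, b_hat, k_hat = params
print(round(A_hat), round(k_hat, 2))
```

A line effect like the one tested in the paper would correspond to fitting separate (A, k) per line and comparing the estimates.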
Disturbance characteristics of half-selected cells in a cross-point resistive switching memory array
NASA Astrophysics Data System (ADS)
Chen, Zhe; Li, Haitong; Chen, Hong-Yu; Chen, Bing; Liu, Rui; Huang, Peng; Zhang, Feifei; Jiang, Zizhen; Ye, Hongfei; Gao, Bin; Liu, Lifeng; Liu, Xiaoyan; Kang, Jinfeng; Wong, H.-S. Philip; Yu, Shimeng
2016-05-01
Disturbance characteristics of cross-point resistive random access memory (RRAM) arrays are comprehensively studied in this paper. An analytical model based on physical understanding is developed to quantify the number of pulses (#Pulse) a cell can tolerate before disturbance occurs under various sub-switching voltage stresses. An evaluation methodology combining the analytical model and SPICE simulation is proposed to assess the disturbance behavior of half-selected (HS) cells in cross-point RRAM arrays. Characteristics of cross-point RRAM arrays such as energy consumption, reliable operating cycles, and total error bits are evaluated with this methodology. A possible solution to mitigate disturbance is proposed.
Why Medical Informatics (still) Needs Cognitive and Social Sciences.
Declerck, G; Aimé, X
2013-01-01
To summarize current excellent medical informatics research in the field of human factors and organizational issues. Using PubMed, a total of 3,024 papers were selected from 17 journals. The papers were evaluated on the basis of their title, keywords, and abstract, using several exclusion and inclusion criteria. 15 preselected papers were carefully evaluated by six referees using a standard evaluation grid. Six best papers were selected, exemplifying the central role cognitive and social sciences can play in medical informatics research. Among other contributions, those studies: (i) make use of the distributed cognition paradigm to model and understand clinical care situations; (ii) take into account organizational issues to analyze the impact of HIT on information exchange and coordination processes; (iii) illustrate how models and empirical data from cognitive psychology can be used in medical informatics; and (iv) highlight the need for qualitative studies to analyze the unexpected side effects of HIT on cognitive and work processes. The selected papers demonstrate that paradigms, methodologies, models, and results from cognitive and social sciences can help bridge the gap between HIT and end users, and contribute to limiting the adoption failures that are regularly reported.
Shan, Jiajia; Wang, Xue; Zhou, Hao; Han, Shuqing; Riza, Dimas Firmanda Al; Kondo, Naoshi
2018-03-13
Synchronous fluorescence spectra combined with multivariate analysis were used to predict flavonoid content in green tea rapidly and nondestructively. This paper presents a new and efficient spectral interval selection method called clustering-based partial least squares (CL-PLS), which selects informative wavelengths by combining clustering with partial least squares (PLS) regression to improve model performance on synchronous fluorescence spectra. The fluorescence spectra of tea samples were obtained; k-means and Kohonen self-organizing map clustering algorithms were applied to group the full spectra into several clusters, and a sub-PLS regression model was developed on each cluster. Finally, CL-PLS models consisting of gradually selected clusters were built. The correlation coefficient (R) was used to evaluate the prediction performance of the PLS models. In addition, variable influence on projection PLS (VIP-PLS), selectivity ratio PLS (SR-PLS), interval PLS (iPLS), and full-spectrum PLS models were investigated and the results compared. The results showed that CL-PLS gave the best flavonoid prediction from synchronous fluorescence spectra.
A SWOT analysis of the organization and financing of the Danish health care system.
Christiansen, Terkel
2002-02-01
The organization and financing of the Danish health care system was evaluated within a framework of a SWOT analysis (analysis of Strengths, Weaknesses, Opportunities and Threats) by a panel of five members with a background in health economics. The present paper describes the methods and materials used for the evaluation: selection of panel members, structure of the evaluation task according to the health care triangle model, selection of background material consisting of documents and literature on the Danish health care system, and a 1-week study visit.
Ramnauth, Jailall; Speed, Joanne; Maddaford, Shawn P; Dove, Peter; Annedi, Subhash C; Renton, Paul; Rakhit, Suman; Andrews, John; Silverman, Sarah; Mladenova, Gabriela; Zinghini, Salvatore; Nair, Sheela; Catalano, Concettina; Lee, David K H; De Felice, Milena; Porreca, Frank
2011-08-11
Neuronal nitric oxide synthase (nNOS) inhibitors are effective in preclinical models of many neurological disorders. In this study, two related series of compounds, 3,4-dihydroquinolin-2(1H)-one and 1,2,3,4-tetrahydroquinoline, containing a 6-substituted thiophene amidine group were synthesized and evaluated as inhibitors of human nitric oxide synthase (NOS). A structure-activity relationship (SAR) study led to the identification of a number of potent and selective nNOS inhibitors. Furthermore, a few representative compounds were shown to possess druglike properties, features that are often difficult to achieve when designing nNOS inhibitors. Compound (S)-35, with excellent potency and selectivity for nNOS, was shown to fully reverse thermal hyperalgesia when given to rats at a dose of 30 mg/kg intraperitoneally (ip) in the L5/L6 spinal nerve ligation model of neuropathic pain (Chung model). In addition, this compound reduced tactile hyperesthesia (allodynia) after oral administration (30 mg/kg) in a rat model of dural inflammation relevant to migraine pain.
Odegård, J; Klemetsdal, G; Heringstad, B
2005-04-01
Several selection criteria for reducing incidence of mastitis were developed from a random regression sire model for test-day somatic cell score (SCS). For comparison, sire transmitting abilities were also predicted based on a cross-sectional model for lactation mean SCS. Only first-crop daughters were used in genetic evaluation of SCS, and the different selection criteria were compared based on their correlation with incidence of clinical mastitis in second-crop daughters (measured as mean daughter deviations). Selection criteria were predicted based on both complete and reduced first-crop daughter groups (261 or 65 daughters per sire, respectively). For complete daughter groups, predicted transmitting abilities at around 30 d in milk showed the best predictive ability for incidence of clinical mastitis, closely followed by average predicted transmitting abilities over the entire lactation. Both of these criteria were derived from the random regression model. These selection criteria improved accuracy of selection by approximately 2% relative to a cross-sectional model. However, for reduced daughter groups, the cross-sectional model yielded increased predictive ability compared with the selection criteria based on the random regression model. This result may be explained by the cross-sectional model being more robust, i.e., less sensitive to precision of (co)variance components estimates and effects of data structure.
A Systematic Approach to Sensor Selection for Aircraft Engine Health Estimation
NASA Technical Reports Server (NTRS)
Simon, Donald L.; Garg, Sanjay
2009-01-01
A systematic approach for selecting an optimal suite of sensors for on-board aircraft gas turbine engine health estimation is presented. The methodology optimally chooses the engine sensor suite and the model tuning parameter vector to minimize the Kalman filter mean squared estimation error in the engine's health parameters or other unmeasured engine outputs. This technique specifically addresses the underdetermined estimation problem where there are more unknown system health parameters representing degradation than available sensor measurements. This paper presents the theoretical estimation error equations, and describes the optimization approach that is applied to select the sensors and model tuning parameters to minimize these errors. Two different model tuning parameter vector selection approaches are evaluated: the conventional approach of selecting a subset of health parameters to serve as the tuning parameters, and an alternative approach that selects tuning parameters as a linear combination of all health parameters. Results from the application of the technique to an aircraft engine simulation are presented, and compared to those from an alternative sensor selection strategy.
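A brute-force version of the sensor-selection idea can be sketched with the dual discrete Riccati equation: for each candidate sensor subset, compute the steady-state Kalman prediction-error covariance and keep the subset with the smallest trace. The system below is an invented two-state toy, not an engine model, and exhaustive enumeration stands in for the paper's optimization approach.

```python
from itertools import combinations
import numpy as np
from scipy.linalg import solve_discrete_are

# Hypothetical 2-state system with 4 candidate sensors; pick the best 2.
A = np.array([[0.9, 0.1],
              [0.0, 0.8]])
C_all = np.array([[1.0, 0.0],
                  [0.0, 1.0],
                  [1.0, 1.0],
                  [0.5, 0.2]])
Q = 0.01 * np.eye(2)                     # process noise covariance
R_all = np.diag([0.1, 0.2, 0.05, 0.3])   # sensor noise variances

best = None
for subset in combinations(range(4), 2):
    idx = list(subset)
    C = C_all[idx]
    R = R_all[np.ix_(idx, idx)]
    # Steady-state prediction-error covariance via the dual Riccati equation.
    P = solve_discrete_are(A.T, C.T, Q, R)
    err = float(np.trace(P))
    if best is None or err < best[0]:
        best = (err, subset)
print(best[1])   # sensor pair with minimum summed estimation error
```

For realistic sensor counts, exhaustive search is replaced by the kind of structured optimization the paper describes, but the evaluation of each candidate suite is the same Riccati computation.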
Using maximum entropy modeling for optimal selection of sampling sites for monitoring networks
Stohlgren, Thomas J.; Kumar, Sunil; Barnett, David T.; Evangelista, Paul H.
2011-01-01
Environmental monitoring programs must efficiently describe state shifts. We propose using maximum entropy modeling to select dissimilar sampling sites to capture environmental variability at low cost, and demonstrate a specific application: sample site selection for the Central Plains domain (453,490 km²) of the National Ecological Observatory Network (NEON). We relied on four environmental factors: mean annual temperature and precipitation, elevation, and vegetation type. A “sample site” was defined as a 20 km × 20 km area (equal to NEON’s airborne observation platform [AOP] footprint), within which each 1 km² cell was evaluated for each environmental factor. After each model run, the most environmentally dissimilar site was selected from all potential sample sites. The iterative selection of eight sites captured approximately 80% of the environmental envelope of the domain, an improvement over stratified random sampling and simple random designs for sample site selection. This approach can be widely used for cost-efficient selection of survey and monitoring sites.
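A much-simplified stand-in for the iterative selection (greedy max-min dissimilarity in standardized environmental space, rather than actual maximum entropy modeling) can be sketched as:

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical candidate sites × 4 standardized environmental factors
# (mean annual temperature, precipitation, elevation, vegetation index).
sites = rng.normal(size=(500, 4))

def greedy_dissimilar(sites, n_select):
    """Iteratively add the candidate most dissimilar (max-min Euclidean
    distance) to the sites already chosen -- a simplified stand-in for
    iterative MaxEnt-based site selection."""
    # Seed with the site farthest from the environmental centroid.
    chosen = [int(np.argmax(np.linalg.norm(sites - sites.mean(0), axis=1)))]
    while len(chosen) < n_select:
        # Distance from each candidate to its nearest already-chosen site.
        d = np.min(
            [np.linalg.norm(sites - sites[i], axis=1) for i in chosen], axis=0
        )
        chosen.append(int(np.argmax(d)))
    return chosen

picked = greedy_dissimilar(sites, 8)
print(picked)   # eight mutually dissimilar sampling sites
```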
SU-F-R-10: Selecting the Optimal Solution for Multi-Objective Radiomics Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Z; Folkert, M; Wang, J
2016-06-15
Purpose: To develop an evidential reasoning approach for selecting the optimal solution from a Pareto solution set obtained by a multi-objective radiomics model for predicting distant failure in lung SBRT. Methods: In the multi-objective radiomics model, both sensitivity and specificity are considered as objective functions simultaneously. A Pareto set with many feasible solutions results from the multi-objective optimization. In this work, an optimal solution Selection methodology for Multi-Objective radiomics Learning model using the Evidential Reasoning approach (SMOLER) was proposed to select the optimal solution from the Pareto set. The proposed SMOLER method used the evidential reasoning approach to calculate the utility of each solution based on pre-set optimal solution selection rules. The solution with the highest utility was chosen as the optimal solution. In SMOLER, an optimal learning model coupled with a clonal selection algorithm was used to optimize model parameters. In this study, PET and CT image features and clinical parameters were utilized for predicting distant failure in lung SBRT. Results: A total of 126 solution sets were generated by adjusting predictive model parameters. Each Pareto set contains 100 feasible solutions. The solution selected by SMOLER within each Pareto set was compared to the manually selected optimal solution. Five-fold cross-validation was used to evaluate the optimal solution selection accuracy of SMOLER. The selection accuracies for the five folds were 80.00%, 69.23%, 84.00%, 84.00%, and 80.00%, respectively. Conclusion: An optimal solution selection methodology for the multi-objective radiomics learning model using the evidential reasoning approach (SMOLER) was proposed. Experimental results show that the optimal solution can be found in approximately 80% of cases.
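The selection step can be illustrated without the full evidential reasoning machinery: extract the Pareto front from candidate (sensitivity, specificity) pairs, then score front members with a utility that encodes the preference rules. The weighted-sum utility below is a simplified stand-in for the evidential reasoning calculation, and all solution values are random.

```python
import numpy as np

rng = np.random.default_rng(4)
# Hypothetical (sensitivity, specificity) pairs from a multi-objective run.
solutions = rng.uniform(0.5, 1.0, size=(100, 2))

def pareto_front(points):
    """Indices of solutions not dominated in both objectives."""
    keep = []
    for i, p in enumerate(points):
        dominated = np.any(np.all(points >= p, axis=1) &
                           np.any(points > p, axis=1))
        if not dominated:
            keep.append(i)
    return keep

front = pareto_front(solutions)
# Stand-in utility encoding a preference rule (here: sensitivity weighted
# 0.6, specificity 0.4 -- arbitrary illustrative weights).
utilities = solutions[front] @ np.array([0.6, 0.4])
best = front[int(np.argmax(utilities))]
print(best)   # index of the selected "optimal" solution
```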
An infrastructure to mine molecular descriptors for ligand selection on virtual screening.
Seus, Vinicius Rosa; Perazzo, Giovanni Xavier; Winck, Ana T; Werhli, Adriano V; Machado, Karina S
2014-01-01
The receptor-ligand interaction evaluation is one important step in rational drug design. The databases that provide the structures of the ligands are growing on a daily basis, which makes it impossible to test all the ligands against a target receptor. Hence, ligand selection prior to testing is needed. One possible approach is to evaluate a set of molecular descriptors. With the aim of describing the characteristics of promising compounds for a specific receptor, we introduce a data warehouse-based infrastructure to mine molecular descriptors for virtual screening (VS). We performed experiments that consider the receptor HIV-1 protease as target and different compounds for this protein. A set of 9 molecular descriptors are taken as the predictive attributes and the free energy of binding (FEB) is taken as the target attribute. By applying the J48 algorithm over the data we obtain decision tree models that achieved up to 84% accuracy. The models indicate which molecular descriptors, and which of their values, are relevant to good FEB results. Using their rules we performed ligand selection on the ZINC database. Our results show an important reduction in the number of ligands to be tested in VS experiments; for instance, the best selection model picked only 0.21% of the total amount of drug-like ligands.
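The descriptor-mining step can be sketched with scikit-learn's CART trees as a stand-in for Weka's J48 (C4.5). The descriptors, the labeling rule linking descriptors to good binding, and the tree depth below are all hypothetical.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(5)
n = 400
# Hypothetical values of 9 molecular descriptors per compound.
X = rng.normal(size=(n, 9))
# Hypothetical rule: good free energy of binding when descriptors 0 and 3
# are both below a threshold.
good_feb = ((X[:, 0] < 0.2) & (X[:, 3] < 0.5)).astype(int)

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, good_feb)
acc = tree.score(X, good_feb)
print(round(acc, 2))
```

A fitted tree of this kind yields explicit descriptor-threshold rules, which is what makes it usable as a fast pre-filter over a large ligand database before docking.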
A behavioural and neural evaluation of prospective decision-making under risk
Symmonds, Mkael; Bossaerts, Peter; Dolan, Raymond J.
2010-01-01
Making the best choice when faced with a chain of decisions requires a person to judge both anticipated outcomes and future actions. Although economic decision-making models account for both risk and reward in single choice contexts, there is a dearth of similar knowledge about sequential choice. Classical utility-based models assume that decision-makers select and follow an optimal pre-determined strategy, irrespective of the particular order in which options are presented. An alternative model involves continuously re-evaluating decision utilities, without prescribing a specific future set of choices. Here, using behavioral and functional magnetic resonance imaging (fMRI) data, we studied human subjects in a sequential choice task and used these data to compare alternative decision models of valuation and strategy selection. We provide evidence that subjects adopt a model of re-evaluating decision utilities, where available strategies are continuously updated and combined in assessing action values. We validate this model by using simultaneously-acquired fMRI data to show that sequential choice evokes a pattern of neural response consistent with a tracking of the anticipated distribution of future reward, as expected in such a model. Thus, brain activity evoked at each decision point reflects the expected mean, variance and skewness of possible payoffs, consistent with the idea that sequential choice evokes a prospective evaluation of both available strategies and possible outcomes. PMID:20980595
Aymerich, I; Rieger, L; Sobhani, R; Rosso, D; Corominas, Ll
2015-09-15
The objective of this paper is to demonstrate the importance of incorporating more realistic energy cost models (based on current energy tariff structures) into existing water resource recovery facilities (WRRFs) process models when evaluating technologies and cost-saving control strategies. In this paper, we first introduce a systematic framework to model energy usage at WRRFs and a generalized structure to describe energy tariffs including the most common billing terms. Secondly, this paper introduces a detailed energy cost model based on a Spanish energy tariff structure coupled with a WRRF process model to evaluate several control strategies and provide insights into the selection of the contracted power structure. The results for a 1-year evaluation on a 115,000 population-equivalent WRRF showed monthly cost differences ranging from 7 to 30% when comparing the detailed energy cost model to an average energy price. The evaluation of different aeration control strategies also showed that using average energy prices and neglecting energy tariff structures may lead to biased conclusions when selecting operating strategies or comparing technologies or equipment. The proposed framework demonstrated that for cost minimization, control strategies should be paired with a specific optimal contracted power. Hence, the design of operational and control strategies must take into account the local energy tariff. Copyright © 2015 Elsevier Ltd. All rights reserved.
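The core comparison (tariff-aware cost versus an average-price shortcut) can be sketched in a few lines. The load profile and the three-period time-of-use tariff below are invented illustrations, not the Spanish tariff structure from the paper.

```python
import numpy as np

hours = np.arange(24 * 30)                            # one month, hourly
# Hypothetical plant electric load (kW) with a daily cycle.
load_kw = 800 + 200 * np.sin(2 * np.pi * hours / 24)

# Hypothetical three-period time-of-use tariff (currency/kWh):
# peak 10:00-16:00, valley 00:00-08:00, flat otherwise.
peak = (hours % 24 >= 10) & (hours % 24 < 16)
valley = (hours % 24 < 8)
price = np.where(peak, 0.15, np.where(valley, 0.05, 0.10))

tou_cost = float(np.sum(load_kw * price))             # tariff-aware energy cost
flat_cost = float(np.sum(load_kw) * price.mean())     # average-price shortcut
print(round(100 * (tou_cost - flat_cost) / flat_cost, 1))
```

Because the load here happens to peak during cheap valley hours, the average-price shortcut overstates the bill by a few percent; with a different load-tariff alignment the sign and size of the bias change, which is the paper's point about comparing control strategies under realistic tariffs.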
Graham, Mark J; Lee, Richard G; Bell, Thomas A; Fu, Wuxia; Mullick, Adam E; Alexander, Veronica J; Singleton, Walter; Viney, Nick; Geary, Richard; Su, John; Baker, Brenda F; Burkey, Jennifer; Crooke, Stanley T; Crooke, Rosanne M
2013-05-24
Elevated plasma triglyceride levels have been recognized as a risk factor for the development of coronary heart disease. Apolipoprotein C-III (apoC-III) represents both an independent risk factor and a key regulatory factor of plasma triglyceride concentrations. Furthermore, elevated apoC-III levels have been associated with metabolic syndrome and type 2 diabetes mellitus. To date, no selective apoC-III therapeutic agent has been evaluated in the clinic. To test the hypothesis that selective inhibition of apoC-III with antisense drugs in preclinical models and in healthy volunteers would reduce plasma apoC-III and triglyceride levels. Rodent- and human-specific second-generation antisense oligonucleotides were identified and evaluated in preclinical models, including rats, mice, human apoC-III transgenic mice, and nonhuman primates. We demonstrated the selective reduction of both apoC-III and triglyceride in all preclinical pharmacological evaluations. We also showed that inhibition of apoC-III was well tolerated and not associated with increased liver triglyceride deposition or hepatotoxicity. A double-blind, placebo-controlled, phase I clinical study was performed in healthy subjects. Administration of the human apoC-III antisense drug resulted in dose-dependent reductions in plasma apoC-III, concomitant lowering of triglyceride levels, and produced no clinically meaningful signals in the safety evaluations. Antisense inhibition of apoC-III in preclinical models and in a phase I clinical trial with healthy subjects produced potent, selective reductions in plasma apoC-III and triglyceride, 2 known risk factors for cardiovascular disease. This compelling pharmacological profile supports further clinical investigations in hypertriglyceridemic subjects.
Discrete choice modeling of shovelnose sturgeon habitat selection in the Lower Missouri River
Bonnot, T.W.; Wildhaber, M.L.; Millspaugh, J.J.; DeLonay, A.J.; Jacobson, R.B.; Bryan, J.L.
2011-01-01
Substantive changes to physical habitat in the Lower Missouri River, resulting from intensive management, have been implicated in the decline of pallid (Scaphirhynchus albus) and shovelnose (S. platorynchus) sturgeon. To aid in habitat rehabilitation efforts, we evaluated habitat selection of gravid, female shovelnose sturgeon during the spawning season in two sections (lower and upper) of the Lower Missouri River in 2005 and in the upper section in 2007. We fit discrete choice models within an information theoretic framework to identify selection of means and variability in three components of physical habitat. Characterizing habitat within divisions around fish better explained selection than habitat values at the fish locations. In general, female shovelnose sturgeon were negatively associated with mean velocity between them and the bank and positively associated with variability in surrounding depths. For example, in the upper section in 2005, a 0.5 m s⁻¹ decrease in velocity within 10 m in the bank direction increased the relative probability of selection by 70%. In the upper section fish also selected sites with surrounding structure in depth (e.g., change in relief). Differences in models between sections and years, which are reinforced by validation rates, suggest that changes in habitat due to geomorphology, hydrology, and their interactions over time need to be addressed when evaluating habitat selection. Because of the importance of variability in surrounding depths, these results support an emphasis on restoring channel complexity as an objective of habitat restoration for shovelnose sturgeon in the Lower Missouri River.
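Discrete choice (conditional logit) selection probabilities take a softmax form over habitat utilities. A sketch with hypothetical attributes and coefficients whose signs match the abstract's findings (negative for velocity toward the bank, positive for depth variability):

```python
import numpy as np

# Hypothetical attributes for one choice set: each row is a candidate
# location (velocity toward bank in m/s, SD of surrounding depth in m).
X = np.array([[0.8, 0.2],
              [0.3, 0.6],
              [0.5, 0.4]])
# Hypothetical coefficients with the signs reported in the abstract.
beta = np.array([-2.0, 3.0])

def choice_probs(X, beta):
    """Conditional-logit (discrete choice) selection probabilities."""
    u = X @ beta
    e = np.exp(u - u.max())          # numerically stabilized softmax
    return e / e.sum()

print(choice_probs(X, beta).round(3))   # prints [0.075 0.676 0.249]
```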
Models Used to Select Strategic Planning Experts for High Technology Productions
NASA Astrophysics Data System (ADS)
Zakharova, Alexandra A.; Grigorjeva, Antonina A.; Tseplit, Anna P.; Ozgogov, Evgenij V.
2016-04-01
The article deals with the problems and specific aspects of organizing the work of experts involved in the assessment of companies that manufacture complex high-technology products. A model is presented for evaluating the competences of experts in individual functional areas of expertise. Experts are selected to form a group on the basis of tables used to determine a competence level. An expert selection model based on fuzzy logic is proposed, in which additional requirements for the composition of the expert group, such as the required quality and the competence-related preferences of decision-makers, can be taken into account. A Web-based information system model is developed for the interaction between experts and decision-makers when carrying out online examinations.
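A fuzzy-logic competence screen of the kind described can be sketched with a trapezoidal membership function; the scores, the fuzzy set boundaries, and the 0.5 cut-off below are all assumptions for illustration.

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal fuzzy membership: 0 outside (a, d), 1 on [b, c],
    linear on the shoulders."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

# Hypothetical competence scores (0-10) of candidate experts in one
# functional area, and a fuzzy set for "highly competent".
experts = {"E1": 9.2, "E2": 6.8, "E3": 4.1}
membership = {name: trapezoid(s, 5, 8, 10, 11) for name, s in experts.items()}
selected = [name for name, m in membership.items() if m >= 0.5]
print(selected)   # prints ['E1', 'E2']
```

In a fuller model, memberships from several functional areas would be aggregated (e.g., with fuzzy AND/OR operators) before applying the group-composition requirements.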
33 CFR 385.33 - Revisions to models and analytical tools.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Management District, and other non-Federal sponsors shall rely on the best available science including models..., and assessment of projects. The selection of models and analytical tools shall be done in consultation... system-wide simulation models and analytical tools used in the evaluation and assessment of projects, and...
USING MM5 VERSION 2 WITH CMAQ AND MODELS-3, A USER'S GUIDE AND TUTORIAL
Meteorological data are important in many of the processes simulated in the Community Multi-Scale Air Quality (CMAQ) model and the Models-3 framework. The first meteorology model that has been selected and evaluated with CMAQ is the Fifth-Generation Pennsylvania State University...
Protein construct storage: Bayesian variable selection and prediction with mixtures.
Clyde, M A; Parmigiani, G
1998-07-01
Determining optimal conditions for protein storage while maintaining a high level of protein activity is an important question in pharmaceutical research. A designed experiment based on a space-filling design was conducted to understand the effects of factors affecting protein storage and to establish optimal storage conditions. Different model-selection strategies to identify important factors may lead to very different answers about optimal conditions. Uncertainty about which factors are important, or model uncertainty, can be a critical issue in decision-making. We use Bayesian variable selection methods for linear models to identify important variables in the protein storage data, while accounting for model uncertainty. We also use the Bayesian framework to build predictions based on a large family of models, rather than an individual model, and to evaluate the probability that certain candidate storage conditions are optimal.
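One standard way to account for model uncertainty in this setting is Bayesian model averaging with a BIC approximation to the model evidence: enumerate variable subsets, weight each model by its approximate posterior probability, and average predictions across models rather than relying on one selected model. The sketch below uses simulated data, not the protein storage experiment, and BIC as a rough stand-in for the full Bayesian computation.

```python
from itertools import combinations
import numpy as np

rng = np.random.default_rng(7)
n, p = 50, 3
X = rng.normal(size=(n, p))
y = 1.5 * X[:, 0] + rng.normal(scale=1.0, size=n)   # only factor 0 matters

x_new = np.array([1.0, -0.5, 0.2])                  # candidate condition
models, log_ev, preds = [], [], []
for k in range(p + 1):
    for subset in combinations(range(p), k):
        cols = list(subset)
        Xs = np.column_stack([np.ones(n)] + [X[:, c] for c in cols])
        beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
        rss = float(np.sum((y - Xs @ beta) ** 2))
        # BIC as a rough approximation to the log model evidence.
        bic = n * np.log(rss / n) + (k + 1) * np.log(n)
        models.append(subset)
        log_ev.append(-0.5 * bic)
        preds.append(beta[0] + sum(b * x_new[c] for b, c in zip(beta[1:], cols)))

log_ev = np.asarray(log_ev)
w = np.exp(log_ev - log_ev.max())
w /= w.sum()                              # posterior model probabilities
bma_pred = float(np.dot(w, preds))        # model-averaged prediction
print(models[int(np.argmax(w))], round(bma_pred, 2))
```

The averaged prediction hedges across all 2^p candidate models in proportion to their support, which is the behavior the abstract argues for when deciding among candidate storage conditions.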
FTM-West Model Results for Selected Fuel Treatment Scenarios
Andrew D. Kramp; Peter J. Ince
2006-01-01
This paper evaluated the potential forest product market impacts, in the U.S. West, of increases in the supply of wood from thinnings intended to reduce fire hazard. Evaluations are done using the Fuel Treatment Market-West model for a set of hypothetical fuel treatment scenarios, which include stand-density-index (SDI) and thin-from-below (TFB) treatment regimes at alternative...
ERIC Educational Resources Information Center
Mackey, Eleanor Race; La Greca, Annette M.
2008-01-01
Based on the Theory of Reasoned Action, this study evaluated a "socialization" model linking girls' peer crowd affiliations (e.g., Jocks, Populars) with their own weight concern, perceived peer weight norms, and weight control behaviors. An alternative "selection" model was also evaluated. Girls (N = 236; M age = 15.95 years) from diverse ethnic…
ERIC Educational Resources Information Center
Darch, Craig; And Others
Several evaluation formats were used to examine the impact that the Direct Instruction Model had on 600 selected students in Williamsburg County, South Carolina, over a 7-year period. The performance of students in the Direct Instruction Model was contrasted with the performance of similar students (on the basis of family income, ethnicity,…
Selecting Growth Measures for School and Teacher Evaluations. Working Paper 80
ERIC Educational Resources Information Center
Ehlert, Mark; Koedel, Cory; Parsons, Eric; Podgursky, Michael
2012-01-01
The specifics of how growth models should be constructed and used to evaluate schools and teachers is a topic of lively policy debate in states and school districts nationwide. In this paper we take up the question of model choice and examine three competing approaches. The first approach, reflected in the popular student growth percentiles (SGPs)…
Rapid performance modeling and parameter regression of geodynamic models
NASA Astrophysics Data System (ADS)
Brown, J.; Duplyakin, D.
2016-12-01
Geodynamic models run in a parallel environment have many parameters with complicated effects on performance and scientifically relevant functionals. Manually choosing an efficient machine configuration and mapping out the parameter space requires a great deal of expert knowledge and time-consuming experiments. We propose an active learning technique based on Gaussian process regression to automatically select experiments to map out the performance landscape with respect to scientific and machine parameters. The resulting performance model is then used to select optimal experiments for improving the accuracy of a reduced order model per unit of computational cost. We present the framework and evaluate its quality and capability using popular lithospheric dynamics models.
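The active-learning loop described above, fit a Gaussian process to the experiments run so far, then run the experiment where the model is most uncertain, can be sketched as follows. The one-parameter "performance landscape", the kernel, and the noise level are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

def rbf(A, B, length=0.5):
    """Squared-exponential kernel."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-0.5 * d2 / length**2)

def gp_posterior(X_tr, y_tr, X_cand, noise=1e-4):
    """GP posterior mean and (clamped) variance at candidate points."""
    K = rbf(X_tr, X_tr) + noise * np.eye(len(X_tr))
    Ks = rbf(X_tr, X_cand)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_tr))
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.maximum(1.0 - np.sum(v**2, axis=0), 0.0)  # k(x, x) = 1 for this kernel
    return mean, var

# Hypothetical performance landscape: runtime vs. one machine parameter.
f = lambda x: np.sin(3 * x[:, 0]) + 0.1 * x[:, 0]

X_train = rng.uniform(0, 2, size=(3, 1))
y_train = f(X_train)
X_cand = np.linspace(0, 2, 50)[:, None]

_, var0 = gp_posterior(X_train, y_train, X_cand)

# Active learning: repeatedly run the experiment with the largest posterior variance.
for _ in range(5):
    _, var = gp_posterior(X_train, y_train, X_cand)
    pick = int(np.argmax(var))
    X_train = np.vstack([X_train, X_cand[pick]])
    y_train = f(X_train)

mean, var = gp_posterior(X_train, y_train, X_cand)
print("max posterior std after 5 added runs:", round(float(np.sqrt(var.max())), 4))
```

Each added run shrinks the posterior uncertainty over the candidate grid, which is the mechanism that lets the technique map a performance landscape with far fewer experiments than a dense sweep.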
Appropriate antibiotic therapy improves Ureaplasma sepsis outcome in the neonatal mouse.
Weisman, Leonard E; Leeming, Angela H; Kong, Lingkun
2012-11-01
Ureaplasma causes sepsis in human neonates. Although erythromycin has been the standard treatment, it is not always effective. No published reports have evaluated Ureaplasma sepsis in a neonatal model. We hypothesized that appropriate antibiotic treatment improves Ureaplasma sepsis in a neonatal mouse model. Two ATCC strains and two clinical strains of Ureaplasma were evaluated in vitro for antibiotic minimum inhibitory concentration (MIC). In addition, FVB albino mouse pups infected with Ureaplasma were randomly assigned to saline, erythromycin, or azithromycin therapy; survival, quantitative blood culture, and growth were then evaluated. MICs ranged from 0.125 to 62.5 µg/ml for erythromycin and from 0.25 to 1.0 µg/ml for azithromycin. The infecting strain and the antibiotic selected for treatment appeared to affect survival and bacteremia, but only the infecting strain affected growth. Azithromycin improved survival and bacteremia against each strain, whereas erythromycin was effective against only one of four strains. We have established a neonatal model of Ureaplasma sepsis and observed that treatment outcome is related to the infecting strain and the antibiotic treatment. We speculate that appropriate antibiotic selection and dosing are required for effective treatment of Ureaplasma sepsis in neonates, and this model could be used to further evaluate these relationships.
Monofluorophosphate is a selective inhibitor of respiratory sulfate-reducing microorganisms.
Carlson, Hans K; Stoeva, Magdalena K; Justice, Nicholas B; Sczesnak, Andrew; Mullan, Mark R; Mosqueda, Lorraine A; Kuehl, Jennifer V; Deutschbauer, Adam M; Arkin, Adam P; Coates, John D
2015-03-17
Despite the environmental and economic cost of microbial sulfidogenesis in industrial operations, few compounds are known to be selective inhibitors of respiratory sulfate-reducing microorganisms (SRM), and no study has systematically and quantitatively evaluated the selectivity and potency of SRM inhibitors. Using general, high-throughput assays to quantitatively evaluate inhibitor potency and selectivity in a model sulfate-reducing microbial ecosystem, as well as inhibitor specificity for the sulfate reduction pathway in a model SRM, we screened a panel of inorganic oxyanions. We identified several SRM-selective inhibitors, including selenate, selenite, tellurate, tellurite, nitrate, nitrite, perchlorate, chlorate, monofluorophosphate, vanadate, molybdate, and tungstate. Monofluorophosphate (MFP) was not previously known as a selective SRM inhibitor, but has promising characteristics including low toxicity to eukaryotic organisms, high stability at circumneutral pH, utility as an abiotic corrosion inhibitor, and low cost. MFP remains a potent inhibitor of SRM growing by fermentation, and MFP is tolerated by nitrate- and perchlorate-reducing microorganisms. For SRM inhibition, MFP is synergistic with nitrite and chlorite, and could enhance the efficacy of nitrate or perchlorate treatments. Finally, MFP inhibition is multifaceted: both inhibition of the central sulfate reduction pathway and release of cytoplasmic fluoride ion are implicated in the mechanism of MFP toxicity.
Cross-validation pitfalls when selecting and assessing regression and classification models.
Krstajic, Damjan; Buturovic, Ljubomir J; Leahy, David E; Thomas, Simon
2014-03-29
We address the problem of selecting and assessing classification and regression models using cross-validation. Current state-of-the-art methods can yield models with high variance, rendering them unsuitable for a number of practical applications including QSAR. In this paper we describe and evaluate best practices which improve reliability and increase confidence in selected models. A key operational component of the proposed methods is cloud computing which enables routine use of previously infeasible approaches. We describe in detail an algorithm for repeated grid-search V-fold cross-validation for parameter tuning in classification and regression, and we define a repeated nested cross-validation algorithm for model assessment. As regards variable selection and parameter tuning we define two algorithms (repeated grid-search cross-validation and double cross-validation), and provide arguments for using the repeated grid-search in the general case. We show results of our algorithms on seven QSAR datasets. The variation of the prediction performance, which is the result of choosing different splits of the dataset in V-fold cross-validation, needs to be taken into account when selecting and assessing classification and regression models. We demonstrate the importance of repeating cross-validation when selecting an optimal model, as well as the importance of repeating nested cross-validation when assessing a prediction error.
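The repeated grid-search V-fold cross-validation the authors recommend can be illustrated with a toy ridge-regression tuner. The learner, the penalty grid, and the data below are placeholders; the point is that the fold split is redrawn R times and the scores averaged before any model is selected.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy data standing in for a QSAR set: 80 compounds, 10 descriptors.
n, p = 80, 10
X = rng.normal(size=(n, p))
y = X[:, 0] - 2 * X[:, 1] + rng.normal(scale=0.5, size=n)

def ridge_fit(X, y, alpha):
    """Closed-form ridge regression coefficients."""
    return np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]), X.T @ y)

def cv_mse(X, y, alpha, v, rng):
    """One V-fold cross-validation pass for a given ridge penalty."""
    folds = np.array_split(rng.permutation(len(y)), v)
    errs = []
    for i in range(v):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(v) if j != i])
        beta = ridge_fit(X[train], y[train], alpha)
        errs.append(np.mean((y[test] - X[test] @ beta) ** 2))
    return float(np.mean(errs))

alphas = [0.01, 0.1, 1.0, 10.0]

# Repeated grid-search V-fold CV: redraw the split R times and average the
# scores, so the selected alpha does not hinge on one particular partition.
R, V = 10, 5
mean_mse = {a: float(np.mean([cv_mse(X, y, a, V, rng) for _ in range(R)])) for a in alphas}
best_alpha = min(mean_mse, key=mean_mse.get)
print("selected alpha:", best_alpha)
```

The same idea nests one level deeper for model assessment: an outer CV loop estimates the prediction error of the whole tuning procedure, and that outer loop is itself repeated, which is the repeated nested cross-validation the abstract defines.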
Computer aided design of Langasite resonant cantilevers: analytical models and simulations
NASA Astrophysics Data System (ADS)
Tellier, C. R.; Leblois, T. G.; Durand, S.
2010-05-01
Analytical models for the piezoelectric excitation and for the wet micromachining of resonant cantilevers are proposed. Firstly, computations of the metrological performances of micro-resonators allow us to select special cuts and special alignments of the cantilevers. Secondly, the self-elaborated simulator TENSOSIM, based on the kinematic and tensorial model, furnishes the etching shapes of the cantilevers. As a result, the number of selected cuts is reduced. Finally, the simulator COMSOL® is used to evaluate the influence of the final etching shape on metrological performances and especially on the resonance frequency. Changes in frequency are evaluated, and the deviating behaviours of structures with less favourable built-ins are tested, showing that the X cut is the best cut for LGS resonant cantilevers vibrating in flexural modes (type 1 and type 2) or in torsion mode.
Aboushanab, Tamer; AlSanad, Saud
2018-06-08
Cupping therapy is a popular treatment in various countries and regions, including Saudi Arabia. Cupping therapy is regulated in Saudi Arabia by the National Center for Complementary and Alternative Medicine (NCCAM), Ministry of Health. The authors recommend that this quality model for selecting patients in cupping clinics - first version (QMSPCC-1) - be used routinely as part of clinical practice and quality management in cupping clinics. The aim of the quality model is to ensure the safety of patients and to introduce and facilitate quality and auditing processes in cupping therapy clinics. Clinical evaluation of this tool is recommended. Continued development, re-evaluation and reassessment of this tool are important. Copyright © 2018. Published by Elsevier B.V.
Evaluation of brightness temperature from a forward model of ground-based microwave radiometer
NASA Astrophysics Data System (ADS)
Rambabu, S.; Pillai, J. S.; Agarwal, A.; Pandithurai, G.
2014-06-01
Ground-based microwave radiometers have received great attention in recent years due to their capability to profile temperature and humidity at high temporal and vertical resolution in the lower troposphere. Retrieving these parameters from measurements of radiometric brightness temperature (T_B) involves an inversion algorithm, which uses background information from a forward model. In the present study, the development and evaluation of such a forward model for a ground-based microwave radiometer, being developed by the Society for Applied Microwave Electronics Engineering and Research (SAMEER) of India, is presented. Initially, absorption coefficients and weighting functions at different frequencies were analysed to select the channels. The range of variation of T_B for these selected channels over the year 2011, for the two stations Mumbai and Delhi, is then discussed. Finally, forward-model simulated T_Bs are compared with radiometer-measured T_Bs at Mahabaleshwar (73.66°E, 17.93°N) to evaluate the model. There is good agreement between model simulations and radiometer observations, which suggests that these forward-model simulations can be used as background for inversion models for retrieving temperature and humidity profiles.
Neural Underpinnings of Decision Strategy Selection: A Review and a Theoretical Model.
Wichary, Szymon; Smolen, Tomasz
2016-01-01
In multi-attribute choice, decision makers use decision strategies to arrive at the final choice. What are the neural mechanisms underlying decision strategy selection? The first goal of this paper is to provide a literature review on the neural underpinnings and cognitive models of decision strategy selection and thus set the stage for a neurocognitive model of this process. The second goal is to outline such a unifying, mechanistic model that can explain the impact of noncognitive factors (e.g., affect, stress) on strategy selection. To this end, we review the evidence for the factors influencing strategy selection, the neural basis of strategy use and the cognitive models of this process. We also present the Bottom-Up Model of Strategy Selection (BUMSS). The model assumes that the use of the rational Weighted Additive strategy and the boundedly rational heuristic Take The Best can be explained by one unifying, neurophysiologically plausible mechanism, based on the interaction of the frontoparietal network, orbitofrontal cortex, anterior cingulate cortex and the brainstem nucleus locus coeruleus. According to BUMSS, there are three processes that form the bottom-up mechanism of decision strategy selection and lead to the final choice: (1) cue weight computation, (2) gain modulation, and (3) weighted additive evaluation of alternatives. We discuss how these processes might be implemented in the brain, and how this knowledge allows us to formulate novel predictions linking strategy use and neural signals.
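The two strategies BUMSS is built around can be stated in a few lines of code: the Weighted Additive strategy (WADD) integrates every cue weighted by its validity, while the Take The Best heuristic (TTB) stops at the first cue that discriminates between options. The cue values and validities below are invented to show a case where the two strategies disagree.

```python
# Two options described by binary cues; weights are cue validities (invented).
weights = [0.8, 0.7, 0.6]
option_a = [1, 0, 0]
option_b = [0, 1, 1]

def weighted_additive(cues, weights):
    """WADD: integrate every cue, weighted by its validity."""
    return sum(w * c for w, c in zip(weights, cues))

def take_the_best(a, b, weights):
    """TTB: inspect cues in order of validity; decide on the first that discriminates."""
    for w, ca, cb in sorted(zip(weights, a, b), reverse=True):
        if ca != cb:
            return "A" if ca > cb else "B"
    return "tie"

wadd_choice = "A" if weighted_additive(option_a, weights) > weighted_additive(option_b, weights) else "B"
ttb_choice = take_the_best(option_a, option_b, weights)
print("WADD chooses", wadd_choice, "| TTB chooses", ttb_choice)
```

Here WADD prefers B (two moderately valid cues outweigh one strong cue) while TTB prefers A (the most valid cue decides alone), which is exactly why strategy selection, the process BUMSS models, can change the final choice.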
Li, Juan; Jiang, Yue; Fan, Qi; Chen, Yang; Wu, Ruanqi
2014-05-05
This paper establishes a high-throughput and highly selective method to determine the oxidized glutathione (GSSG) impurity and the radial tensile strength (RTS) of reduced glutathione (GSH) tablets based on near infrared (NIR) spectroscopy and partial least squares (PLS). To build and evaluate the calibration models, the NIR diffuse reflectance spectra (DRS) and transmittance spectra (TS) of 330 GSH tablets were accurately measured using optimized parameter values. For analyzing the GSSG or RTS of GSH tablets, the NIR-DRS or NIR-TS were selected, subdivided into calibration and prediction sets, and processed with appropriate chemometric techniques. After selecting spectral sub-ranges and excluding spectral outliers, the PLS calibration models were built and the numbers of factors were optimized. The PLS models were then evaluated by the root mean square errors of calibration (RMSEC), cross-validation (RMSECV) and prediction (RMSEP), and by the correlation coefficients of calibration (R(c)) and prediction (R(p)). The results indicate that the proposed models perform well. NIR-PLS can thus simultaneously, selectively, nondestructively and rapidly analyze the GSSG and RTS of GSH tablets, even though the content of the GSSG impurity is quite low while that of the GSH active pharmaceutical ingredient (API) is quite high. This strategy can be an important complement to the common NIR methods used in the on-line analysis of APIs in pharmaceutical preparations, and this work expands NIR applications in high-throughput and highly selective analysis. Copyright © 2014 Elsevier B.V. All rights reserved.
A Developed Meta-model for Selection of Cotton Fabrics Using Design of Experiments and TOPSIS Method
NASA Astrophysics Data System (ADS)
Chakraborty, Shankar; Chatterjee, Prasenjit
2017-12-01
Selection of cotton fabrics for optimal clothing comfort is often treated as a multi-criteria decision making problem, with an array of candidate alternatives to be evaluated on several conflicting properties. In this paper, design of experiments and the technique for order preference by similarity to ideal solution (TOPSIS) are integrated to develop regression meta-models for identifying the most suitable cotton fabrics with respect to the computed TOPSIS scores. The applicability of the adopted method is demonstrated using two real-life examples. The developed models can also identify the statistically significant fabric properties and their interactions affecting the measured TOPSIS scores and the final selection decisions. There is a good degree of congruence between the ranking patterns derived from these meta-models and those from the existing methods for cotton fabric ranking and selection.
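The TOPSIS scoring step this paper builds its meta-models on follows a standard recipe: normalize the decision matrix, weight it, measure each alternative's distance to the ideal and anti-ideal solutions, and rank by relative closeness. A minimal sketch follows; the fabric data, criteria weights, and benefit/cost directions are invented for illustration.

```python
import numpy as np

# Hypothetical decision matrix: 4 fabrics x 3 comfort-related properties.
D = np.array([
    [0.82, 105.0, 3.1],
    [0.75, 150.0, 2.8],
    [0.90, 110.0, 3.5],
    [0.70, 140.0, 2.6],
])
weights = np.array([0.5, 0.3, 0.2])
benefit = np.array([True, False, True])   # higher-is-better vs. lower-is-better criteria

# 1. Vector-normalize each criterion column.
R = D / np.linalg.norm(D, axis=0)
# 2. Apply the criteria weights.
V = R * weights
# 3. Ideal and anti-ideal solutions, respecting each criterion's direction.
ideal = np.where(benefit, V.max(0), V.min(0))
anti = np.where(benefit, V.min(0), V.max(0))
# 4. Euclidean distances and the closeness coefficient in [0, 1].
d_plus = np.linalg.norm(V - ideal, axis=1)
d_minus = np.linalg.norm(V - anti, axis=1)
closeness = d_minus / (d_plus + d_minus)
ranking = np.argsort(-closeness)
print("TOPSIS scores:", np.round(closeness, 3), "| best fabric index:", int(ranking[0]))
```

The closeness scores are exactly the kind of response the paper's regression meta-models are fitted to, so a designed experiment over fabric properties plus this scoring step reproduces the pipeline the abstract describes.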
Zhou, Yuan; Shi, Tie-Mao; Hu, Yuan-Man; Gao, Chang; Liu, Miao; Song, Lin-Qi
2011-12-01
Based on geographic information system (GIS) technology and a multi-objective location-allocation (LA) model, and considering four relatively independent objective factors (population density, air pollution level, urban heat island effect, and urban land use pattern), an optimized location selection for urban parks within the Third Ring of Shenyang was conducted, and the selection results were compared with the spatial distribution of existing parks in order to evaluate the rationality of the spatial distribution of urban green spaces. In the location selection of urban green spaces in the study area, the air pollution factor was the most important, and, compared with any single objective factor, the weighted analysis of multiple objective factors provided an optimized spatial location selection for new urban green spaces. The combination of GIS technology with the LA model offers a new approach to the spatial optimization of urban green spaces.
NASA Astrophysics Data System (ADS)
Peng, Hong-Gang; Wang, Jian-Qiang
2017-11-01
In recent years, sustainable energy crop has become an important energy development strategy topic in many countries. Selecting the most sustainable energy crop is a significant problem that must be addressed during any biofuel production process. The focus of this study is the development of an innovative multi-criteria decision-making (MCDM) method to handle sustainable energy crop selection problems. Given that various uncertain data are encountered in the evaluation of sustainable energy crops, linguistic intuitionistic fuzzy numbers (LIFNs) are introduced to present the information necessary to the evaluation process. Processing qualitative concepts requires the effective support of reliable tools; then, a cloud model can be used to deal with linguistic intuitionistic information. First, LIFNs are converted and a novel concept of linguistic intuitionistic cloud (LIC) is proposed. The operations, score function and similarity measurement of the LICs are defined. Subsequently, the linguistic intuitionistic cloud density-prioritised weighted Heronian mean operator is developed, which served as the basis for the construction of an applicable MCDM model for sustainable energy crop selection. Finally, an illustrative example is provided to demonstrate the proposed method, and its feasibility and validity are further verified by comparing it with other existing methods.
Evaluating the 239Pu prompt fission neutron spectrum induced by thermal to 30 MeV neutrons
Neudecker, Denise; Talou, Patrick; Kawano, Toshihiko; ...
2016-03-15
We present a new evaluation of the 239Pu prompt fission neutron spectrum (PFNS) induced by thermal to 30 MeV neutrons. Compared to the ENDF/B-VII.1 evaluation, this one includes recently published experimental data as well as an improved and extended model description to predict the PFNS. For instance, the pre-equilibrium neutron emission component of the PFNS is considered and the incident-energy dependence of model parameters is parametrized more realistically. Experimental and model parameter uncertainties and covariances are estimated in detail. Also, evaluated covariances are provided between all PFNS at different incident neutron energies. In conclusion, selected evaluation results and first benchmark calculations using this evaluation are briefly discussed.
Demirarslan, K Onur; Korucu, M Kemal; Karademir, Aykan
2016-08-01
Ecological problems arising after the construction and operation of a waste incineration plant generally originate from incorrect decisions made when selecting the plant's location. The main objective of this study is to investigate how the method for selecting the location of a new municipal waste incineration plant can be improved by using a dispersion modelling approach supported by geographical information systems and multi-criteria decision analysis. With this aim, the appropriateness of the current location of an existing plant was assessed by applying a pollution dispersion model. A site ranking for 90 candidate locations plus the site of the existing incinerator was first determined by a location selection procedure that did not use modelling approaches, and the current location of the plant was evaluated by ANOVA and Tukey tests. This ranking was then re-evaluated based on CALPUFF modelling of various variables, including pollutant concentrations, population and population density, demography, the temporality of meteorological data, pollutant type and risk formation type, and the sites were re-ranked. The findings clearly indicate the impropriety of the current plant's location: the pollution dispersion model showed that it was the fourth-worst choice among the 91 possibilities. It was concluded that location selection procedures for waste incinerators should benefit from the improvements obtained by articulating pollution dispersion studies with population density data to obtain the most suitable location. © The Author(s) 2016.
Dempsey, Steven J; Gese, Eric M; Kluever, Bryan M; Lonsinger, Robert C; Waits, Lisette P
2015-01-01
Development and evaluation of noninvasive methods for monitoring species distribution and abundance is a growing area of ecological research. While noninvasive methods have the advantage of reducing the risks associated with capture, comparisons with more traditional invasive sampling methods are lacking. Historically, kit foxes (Vulpes macrotis) occupied the desert and semi-arid regions of southwestern North America. Once the most abundant carnivore in the Great Basin Desert of Utah, the species is now considered rare. In recent decades, attempts have been made to model the environmental variables influencing kit fox distribution. Using noninvasive scat deposition surveys to determine kit fox presence, we modeled resource selection functions to predict kit fox distribution using three popular techniques (Maxent, fixed-effects, and mixed-effects generalized linear models) and compared these with similar models developed from invasive sampling (telemetry locations from radio-collared foxes). Resource selection functions were developed using a combination of landscape variables including elevation, slope, aspect, vegetation height, and soil type. All models were tested against subsequent scat collections as a method of model validation. We demonstrate the importance of comparing multiple model types when developing resource selection functions used to predict a species' distribution, and of evaluating the importance of environmental variables on species distribution. All models we examined showed a large effect of elevation on kit fox presence, followed by slope and vegetation height. However, the invasive sampling method (i.e., radio-telemetry) appeared to be better at determining resource selection, and therefore may be more robust in predicting kit fox distribution.
In contrast, the distribution maps created from the noninvasive sampling (i.e., scat transects) were significantly different from those of the invasive method; scat transects may thus be appropriate when used in an occupancy framework to predict species distribution. We concluded that while scat deposition transects may be useful for monitoring kit fox abundance and possibly occupancy, they do not appear to be appropriate for determining resource selection. On our study area, scat transects were biased toward roadways, while data collected using radio-telemetry were dictated by the movements of the kit foxes themselves. We recommend that future studies applying noninvasive scat sampling consider a more robust random sampling design across the landscape (e.g., random transects or more complete road coverage), which would provide a more accurate and unbiased depiction of resource selection useful for predicting kit fox distribution.
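A resource selection function of the kind compared in this study can be sketched as a logistic regression of used versus available points on landscape covariates. The simulated covariates, coefficients, and fitting loop below are illustrative assumptions, not the authors' data or code.

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulated used vs. available points with three standardized covariates
# (elevation, slope, vegetation height); values and coefficients are made up.
n = 500
Z = rng.normal(size=(n, 3))
true_beta = np.array([-1.5, -0.5, 0.3])          # presence favors low elevation/slope
p_use = 1 / (1 + np.exp(-(Z @ true_beta)))
used = (rng.random(n) < p_use).astype(float)

# Fit the RSF as a logistic regression by plain gradient ascent on the log-likelihood.
X = np.column_stack([np.ones(n), Z])             # intercept + covariates
beta = np.zeros(4)
for _ in range(2000):
    p_hat = 1 / (1 + np.exp(-(X @ beta)))
    beta += 0.5 * X.T @ (used - p_hat) / n

print("fitted RSF coefficients (intercept, elev, slope, veg):", np.round(beta, 2))
```

The fitted coefficients recover the dominant negative effect of elevation; the study's fixed-effects GLMs work the same way, while its mixed-effects models add per-animal random effects and Maxent replaces the used/available likelihood with a maximum-entropy one.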
Kwon, Jung-Hwan; Katz, Lynn E; Liljestrand, Howard M
2006-12-01
A parallel artificial lipid membrane system was developed to mimic passive mass transfer of hydrophobic organic chemicals in fish. In this physical model system, a membrane filter-supported lipid bilayer separates two aqueous phases that represent the external and internal aqueous environments of fish. To predict bioconcentration kinetics in small fish with this system, literature absorption and elimination rates were analyzed with an allometric diffusion model to quantify the mass transfer resistances in the aqueous and lipid phases of fish. The effect of the aqueous phase mass transfer resistance was controlled by adjusting stirring intensity to mimic bioconcentration rates in small fish. Twenty-three simple aromatic hydrocarbons were chosen as model compounds for purposes of evaluation. For most of the selected chemicals, literature absorption/elimination rates fall into the range predicted from measured membrane permeabilities and elimination rates of the selected chemicals determined by the diffusion model system.
NASA Astrophysics Data System (ADS)
Shan, Jiajia; Wang, Xue; Zhou, Hao; Han, Shuqing; Riza, Dimas Firmanda Al; Kondo, Naoshi
2018-04-01
Synchronous fluorescence spectra combined with multivariate analysis were used to predict the flavonoid content of green tea rapidly and nondestructively. This paper presents a new and efficient spectral interval selection method called clustering-based partial least squares (CL-PLS), which selects informative wavelengths by combining the clustering concept with partial least squares (PLS) to improve model performance on synchronous fluorescence spectra. The fluorescence spectra of tea samples were obtained, and k-means and Kohonen self-organizing map clustering algorithms were applied to group the full spectra into several clusters, with a sub-PLS regression model developed on each cluster. Finally, CL-PLS models consisting of gradually selected clusters were built. The correlation coefficient (R) was used to evaluate the prediction performance of the PLS models. In addition, variable influence on projection PLS (VIP-PLS), selectivity ratio PLS (SR-PLS) and interval PLS (iPLS) models, as well as the full-spectrum PLS model, were investigated and the results compared. The results showed that CL-PLS gave the best flavonoid predictions from synchronous fluorescence spectra.
A multicriteria decision making model for assessment and selection of an ERP in a logistics context
NASA Astrophysics Data System (ADS)
Pereira, Teresa; Ferreira, Fernanda A.
2017-07-01
The aim of this work is to apply a decision support methodology based on a multicriteria decision analysis (MCDA) model that allows the assessment and selection of an Enterprise Resource Planning (ERP) system in a Portuguese logistics company by a group of decision makers (GDM). A decision support system (DSS) implementing an MCDA model - the Multicriteria Methodology for the Assessment and Selection of Information Systems / Information Technologies (MMASSI/IT) - is used, chosen for its features and the ease of changing and adapting the model to a given scope. Using this DSS, the information system best suited to the decision context was obtained, and the result was evaluated through a sensitivity and robustness analysis.
Program Planners’ Perspectives of Promotora Roles, Recruitment, and Selection
Koskan, Alexis; Hilfinger Messias, DeAnne K.; Friedman, Daniela B.; Brandt, Heather M.; Walsemann, Katrina M.
2013-01-01
Objective Program planners work with promotoras (the Spanish term for female community health workers) to reduce health disparities among underserved populations. Based on the Role-Outcomes Linkage Evaluation Model for Community Health Workers (ROLES) conceptual model, we explored how program planners conceptualized the promotora role and the approaches and strategies they used to recruit, select, and sustain promotoras. Design We conducted semi-structured, in-depth interviews with a purposive convenience sample of 24 program planners, program coordinators, promotora recruiters, research principal investigators, and other individuals who worked closely with promotoras on United States-based health programs for Hispanic women (ages 18 and older). Results Planners conceptualized the promotora role based on their personal experiences and their understanding of the underlying philosophical tenets of the promotora approach. Recruitment and selection methods reflected planners’ conceptualizations and experiences of promotoras as paid staff or volunteers. Participants described a variety of program planning and implementation methods. They focused on sustainability of the programs, the intended health behavior changes or activities, and the individual promotoras. Conclusion To strengthen health programs employing the promotora delivery model, job descriptions should delineate role expectations and boundaries and better guide promotora evaluations. We suggest including additional components such as information on funding sources, program type and delivery, and sustainability outcomes to enhance the ROLES conceptual model. The expanded model can be used to guide program planners in the planning, implementing, and evaluating of promotora health programs. PMID:23039847
Green material selection for sustainability: A hybrid MCDM approach.
Zhang, Honghao; Peng, Yong; Tian, Guangdong; Wang, Danqi; Xie, Pengpeng
2017-01-01
Green material selection is a crucial step for the material industry to comprehensively improve material properties and promote sustainable development. However, because of the subjectivity and conflicting evaluation criteria in its process, green material selection, as a multi-criteria decision making (MCDM) problem, has been a widespread concern to the relevant experts. Thus, this study proposes a hybrid MCDM approach that combines decision making and evaluation laboratory (DEMATEL), analytical network process (ANP), grey relational analysis (GRA) and technique for order performance by similarity to ideal solution (TOPSIS) to select the optimal green material for sustainability based on the product's needs. A nonlinear programming model with constraints was proposed to obtain the integrated closeness index. Subsequently, an empirical application of rubbish bins was used to illustrate the proposed method. In addition, a sensitivity analysis and a comparison with existing methods were employed to validate the accuracy and stability of the obtained final results. We found that this method provides a more accurate and effective decision support tool for alternative evaluation or strategy selection.
[Evaluation on effectiveness of comprehensive control model for soil-transmitted nematodiasis].
Hong-Chun, Tian; Meng, Tang; Hong, Xie; Han-Gang, Li; Xiao-Ke, Zhou; Chang-Hua, Liu; De-Fu, Zheng; Zhong-Jiu, Tang; Ming-Hui, Li; Cheng-Yu, Wu; Yi-Zhu, Ren
2011-10-01
To evaluate the effect of a comprehensive control model for soil-transmitted nematodiasis. Danling County was selected as a demonstration county carrying out the comprehensive prevention model centering on health education, nematode deworming, and improvement of drinking water and lavatories. Hejiang County was selected as the control. The effects were evaluated by comparing indicators such as the infection rates of soil-transmitted nematodiasis. The infection rates of soil-transmitted nematodiasis declined markedly from 2006 to 2009 in the demonstration county: the infection rates of Ascaris lumbricoides, hookworms, and Trichuris trichiura decreased by 91.14%, 81.65% and 65.77%, respectively. In the control county, those rates showed no downward trend. In 2006, the rates in the demonstration county were higher than those in the control county, but by 2009 they were lower. Through the three-year comprehensive prevention, the infection rates of soil-transmitted nematodiasis declined markedly in the demonstration county. The epidemic of soil-transmitted nematodiasis can be controlled effectively by the comprehensive prevention model.
Shirk, Andrew J; Landguth, Erin L; Cushman, Samuel A
2018-01-01
Anthropogenic migration barriers fragment many populations and limit the ability of species to respond to climate-induced biome shifts. Conservation actions designed to conserve habitat connectivity and mitigate barriers are needed to unite fragmented populations into larger, more viable metapopulations, and to allow species to track their climate envelope over time. Landscape genetic analysis provides an empirical means to infer landscape factors influencing gene flow and thereby inform such conservation actions. However, there are currently many methods available for model selection in landscape genetics, and considerable uncertainty as to which provide the greatest accuracy in identifying the true landscape model influencing gene flow among competing alternative hypotheses. In this study, we used population genetic simulations to evaluate the performance of seven regression-based model selection methods on a broad array of landscapes that varied by the number and type of variables contributing to resistance, the magnitude and cohesion of resistance, as well as the functional relationship between variables and resistance. We also assessed the effect of transformations designed to linearize the relationship between genetic and landscape distances. We found that linear mixed effects models had the highest accuracy in every way we evaluated model performance; however, other methods also performed well in many circumstances, particularly when landscape resistance was high and the correlation among competing hypotheses was limited. Our results provide guidance for which regression-based model selection methods provide the most accurate inferences in landscape genetic analysis and thereby best inform connectivity conservation actions. Published 2017. This article is a U.S. Government work and is in the public domain in the USA.
Liu, Xiang; Peng, Yingwei; Tu, Dongsheng; Liang, Hua
2012-10-30
Survival data with a sizable cure fraction are commonly encountered in cancer research. The semiparametric proportional hazards cure model has been recently used to analyze such data. As seen in the analysis of data from a breast cancer study, a variable selection approach is needed to identify important factors in predicting the cure status and risk of breast cancer recurrence. However, no specific variable selection method for the cure model is available. In this paper, we present a variable selection approach with penalized likelihood for the cure model. The estimation can be implemented easily by combining the computational methods for penalized logistic regression and the penalized Cox proportional hazards models with the expectation-maximization algorithm. We illustrate the proposed approach on data from a breast cancer study. We conducted Monte Carlo simulations to evaluate the performance of the proposed method. We used and compared different penalty functions in the simulation studies. Copyright © 2012 John Wiley & Sons, Ltd.
Near-optimal experimental design for model selection in systems biology.
Busetto, Alberto Giovanni; Hauser, Alain; Krummenacher, Gabriel; Sunnåker, Mikael; Dimopoulos, Sotiris; Ong, Cheng Soon; Stelling, Jörg; Buhmann, Joachim M
2013-10-15
Biological systems are understood through iterations of modeling and experimentation. Not all experiments, however, are equally valuable for predictive modeling. This study introduces an efficient method for experimental design aimed at selecting dynamical models from data. Motivated by biological applications, the method enables the design of crucial experiments: it determines a highly informative selection of measurement readouts and time points. We demonstrate formal guarantees of design efficiency on the basis of previous results. By reducing our task to the setting of graphical models, we prove that the method finds a near-optimal design selection with a polynomial number of evaluations. Moreover, the method exhibits the best polynomial-complexity constant approximation factor, unless P = NP. We measure the performance of the method in comparison with established alternatives, such as ensemble non-centrality, on example models of different complexity. Efficient design accelerates the loop between modeling and experimentation: it enables the inference of complex mechanisms, such as those controlling central metabolic operation. Toolbox 'NearOED' available with source code under GPL on the Machine Learning Open Source Software Web site (mloss.org).
The response of numerical weather prediction analysis systems to FGGE 2b data
NASA Technical Reports Server (NTRS)
Hollingsworth, A.; Lorenc, A.; Tracton, S.; Arpe, K.; Cats, G.; Uppala, S.; Kallberg, P.
1985-01-01
An intercomparison of analyses of the main FGGE Level IIb data set is presented with three advanced analysis systems. The aims of the work are to estimate the extent and magnitude of the differences between the analyses, to identify the reasons for the differences, and finally to estimate the significance of the differences. Extratropical analyses only are considered. Objective evaluations of analysis quality, such as fit to observations, statistics of analysis differences, and mean fields are discussed. In addition, substantial emphasis is placed on subjective evaluation of a series of case studies that were selected to illustrate the importance of different aspects of the analysis procedures, such as quality control, data selection, resolution, dynamical balance, and the role of the assimilating forecast model. In some cases, the forecast models are used as selective amplifiers of analysis differences to assist in deciding which analysis was more nearly correct in the treatment of particular data.
ERIC Educational Resources Information Center
Smith, T. C., Jr.
The purpose of the 1968-69 investigation was to determine the applicability of a curriculum evaluation model to investigate high school students' achievement in three physics courses (traditional physics, Physical Science Study Curriculum, and Harvard Project Physics). Three tests were used to measure student progress: The Dunning-Abeles Physics…
Precipitation-runoff modeling system; user's manual
Leavesley, G.H.; Lichty, R.W.; Troutman, B.M.; Saindon, L.G.
1983-01-01
The concepts, structure, theoretical development, and data requirements of the precipitation-runoff modeling system (PRMS) are described. The precipitation-runoff modeling system is a modular-design, deterministic, distributed-parameter modeling system developed to evaluate the impacts of various combinations of precipitation, climate, and land use on streamflow, sediment yields, and general basin hydrology. Basin response to normal and extreme rainfall and snowmelt can be simulated to evaluate changes in water balance relationships, flow regimes, flood peaks and volumes, soil-water relationships, sediment yields, and groundwater recharge. Parameter-optimization and sensitivity analysis capabilities are provided to fit selected model parameters and evaluate their individual and joint effects on model output. The modular design provides a flexible framework for continued model system enhancement and hydrologic modeling research and development. (Author's abstract)
NASA Technical Reports Server (NTRS)
Oken, S.; Skoumal, D. E.; Straayer, J. W.
1974-01-01
The development of metal structures reinforced with filamentary composites as a weight saving feature of the space shuttle components is discussed. A frame was selected for study that was representative of the type of construction used in the bulk frames of the orbiter vehicle. Theoretical and experimental investigations were conducted. Component tests were performed to evaluate the critical details used in the designs and to provide credibility to the weight saving results. A model frame was constructed of the reinforced metal material to provide a final evaluation of the construction under realistic load conditions.
Site selection model for new metro stations based on land use
NASA Astrophysics Data System (ADS)
Zhang, Nan; Chen, Xuewu
2015-12-01
Since the construction of a metro system generally lags behind the development of urban land use, sites of metro stations should adapt to their surrounding situations, an issue rarely discussed by previous research on station layout. This paper proposes a new site selection model to find the best location for a metro station, establishing an indicator system based on land use and combining AHP with the entropy weight method to rank candidate schemes. The feasibility and efficiency of this model have been validated by evaluating Nanjing Shengtai Road station and other potential sites.
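The entropy weight step combined with AHP in this model can be sketched as follows. This is a minimal illustration only: the site scores below are hypothetical, and the AHP side (pairwise comparisons of criteria) is omitted.

```python
import math

def entropy_weights(matrix):
    """Entropy weight method: criteria whose scores are more dispersed
    across the alternatives carry more information and get larger weights."""
    m = len(matrix)          # number of alternatives (candidate sites)
    n = len(matrix[0])       # number of criteria
    # Normalize each criterion column to proportions.
    col_sums = [sum(row[j] for row in matrix) for j in range(n)]
    p = [[matrix[i][j] / col_sums[j] for j in range(n)] for i in range(m)]
    k = 1.0 / math.log(m)
    # Shannon entropy of each criterion (0 * log 0 treated as 0).
    e = [-k * sum(p[i][j] * math.log(p[i][j]) for i in range(m) if p[i][j] > 0)
         for j in range(n)]
    d = [1.0 - ej for ej in e]   # degree of divergence
    total = sum(d)
    return [dj / total for dj in d]

# Hypothetical scores for three candidate sites on two land-use criteria;
# the first criterion is far more dispersed, so it should dominate.
scores = [[0.9, 0.4], [0.5, 0.5], [0.1, 0.6]]
w = entropy_weights(scores)
```

In a hybrid scheme of this kind, the entropy weights are typically blended with the subjective AHP weights before ranking the schemes.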
Uncertain programming models for portfolio selection with uncertain returns
NASA Astrophysics Data System (ADS)
Zhang, Bo; Peng, Jin; Li, Shengguo
2015-10-01
In an indeterminate economic environment, experts' knowledge about the returns of securities involves much uncertainty rather than randomness. This paper discusses the portfolio selection problem in an uncertain environment in which security returns cannot be well reflected by historical data but can be evaluated by experts. In the paper, returns of securities are assumed to be given by uncertain variables. According to various decision criteria, the portfolio selection problem in an uncertain environment is formulated as an expected-variance-chance model and a chance-expected-variance model by using uncertain programming. Within the framework of uncertainty theory, for the convenience of solving the models, some crisp equivalents are discussed under different conditions. In addition, a hybrid intelligent algorithm is designed to provide a general method for solving the new models in general cases. Finally, two numerical examples are provided to show the performance and applications of the models and algorithm.
NASA Astrophysics Data System (ADS)
Ben Abdessalem, Anis; Dervilis, Nikolaos; Wagg, David; Worden, Keith
2018-01-01
This paper will introduce the use of the approximate Bayesian computation (ABC) algorithm for model selection and parameter estimation in structural dynamics. ABC is a likelihood-free method typically used when the likelihood function is either intractable or cannot be approached in a closed form. To circumvent the evaluation of the likelihood function, simulation from a forward model is at the core of the ABC algorithm. The algorithm offers the possibility to use different metrics and summary statistics representative of the data to carry out Bayesian inference. The efficacy of the algorithm in structural dynamics is demonstrated through three different illustrative examples of nonlinear system identification: cubic and cubic-quintic models, the Bouc-Wen model and the Duffing oscillator. The obtained results suggest that ABC is a promising alternative to deal with model selection and parameter estimation issues, specifically for systems with complex behaviours.
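The likelihood-free idea described above can be sketched with a minimal ABC rejection sampler. The Gaussian toy problem, uniform prior range, and tolerance below are illustrative choices, not the paper's structural-dynamics models or its particular ABC variant.

```python
import random

def abc_rejection(observed, simulate, prior_sample, distance, eps, n_draws=20000):
    """Likelihood-free ABC rejection: keep parameter draws whose simulated
    summary statistic falls within eps of the observed one."""
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample()
        if distance(simulate(theta), observed) <= eps:
            accepted.append(theta)
    return accepted

# Toy problem: infer the mean of a Gaussian from its sample mean.
random.seed(0)
observed = sum(random.gauss(2.0, 1.0) for _ in range(100)) / 100

def simulate(theta):
    # Forward model: simulate 100 observations and return their mean.
    return sum(random.gauss(theta, 1.0) for _ in range(100)) / 100

posterior = abc_rejection(observed, simulate,
                          prior_sample=lambda: random.uniform(-5, 5),
                          distance=lambda a, b: abs(a - b), eps=0.1)
estimate = sum(posterior) / len(posterior)
```

The accepted draws approximate the posterior; shrinking eps trades acceptance rate for approximation quality, which is why the choice of metric and summary statistic matters in practice.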
Database Selection: One Size Does Not Fit All.
ERIC Educational Resources Information Center
Allison, DeeAnn; McNeil, Beth; Swanson, Signe
2000-01-01
Describes a strategy for selecting a delivery method for electronic resources based on experiences at the University of Nebraska-Lincoln. Considers local conditions, pricing, feature options, hardware costs, and network availability and presents a model for evaluating the decision based on dollar requirements and local issues. (Author/LRW)
Currently, little justification is provided for nanomaterial testing concentrations in in vitro assays. The in vitro concentrations typically used may be higher than those experienced in exposed humans. Selection of concentration levels for hazard evaluation based on real-world ...
Pereira, Paulo; Westgard, James O; Encarnação, Pedro; Seghatchian, Jerard; de Sousa, Gracinda
2015-02-01
Blood establishments routinely perform screening immunoassays to assess safety of the blood components. As with any other screening test, results have an inherent uncertainty. In blood establishments the major concern is the chance of false negatives, due to its possible impact on patients' health. This article briefly reviews GUM and diagnostic accuracy models for screening immunoassays, recommending a scheme to support the screening laboratories' staffs on the selection of a model considering the intended use of the screening results (i.e., post-transfusion safety). The discussion is grounded on a "risk-based thinking", risk being considered from the blood donor selection to the screening immunoassays. A combination of GUM and diagnostic accuracy models to evaluate measurement uncertainty in blood establishments is recommended. Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Nerguizian, Vahe; Rafaf, Mustapha
2004-08-01
This article describes and provides valuable information for companies and universities with strategies to start fabricating MEMS for RF/Microwave and millimeter wave applications. The present work shows the infrastructure developed for RF/Microwave and millimeter wave MEMS platforms, which helps the identification, evaluation and selection of design tools and fabrication foundries taking into account packaging and testing. The selected and implemented simple infrastructure models, based on surface and bulk micromachining, yield inexpensive and innovative approaches for distributed choices of MEMS operating tools. With different educational or industrial institution needs, these models may be modified for specific resource changes using a carefully analyzed iteration process. The inputs of the project are evaluation selection criteria and information sources such as financial, technical, availability, accessibility, simplicity, versatility and practical considerations. The outputs of the project are the selection of different MEMS design tools or software (solid modeling, electrostatic/electromagnetic and others, compatible with existing standard RF/Microwave design tools) and different MEMS manufacturing foundries. Typical RF/Microwave and millimeter wave MEMS solutions are introduced on the platform during the evaluation and development phases of the project for the validation of realistic results and operational decision-making choices. The challenges encountered during the investigation and development steps are identified and the dynamic behavior of the infrastructure is emphasized. The inputs (resources) and the outputs (demonstrated solutions) are presented in tables and flowchart diagrams.
Liu, Zun-lei; Yuan, Xing-wei; Yang, Lin-lin; Yan, Li-ping; Zhang, Hui; Cheng, Jia-hua
2015-02-01
Multiple hypotheses are available to explain recruitment rate. Model selection methods can be used to identify the best model that supports a particular hypothesis. However, using a single model for estimating recruitment success is often inadequate for overexploited populations because of high model uncertainty. In this study, stock-recruitment data of small yellow croaker in the East China Sea collected from fishery-dependent and independent surveys between 1992 and 2012 were used to examine density-dependent effects on recruitment success. Model selection methods based on frequentist (AIC, maximum adjusted R2 and P-values) and Bayesian (Bayesian model averaging, BMA) methods were applied to identify the relationship between recruitment and environmental conditions. Interannual variability of the East China Sea environment was indicated by sea surface temperature (SST), meridional wind stress (MWS), zonal wind stress (ZWS), sea surface pressure (SPP) and runoff of Changjiang River (RCR). Mean absolute error, mean squared predictive error and continuous ranked probability score were calculated to evaluate the predictive performance of recruitment success. The results showed that model structures were not consistent across the three model selection methods: the predictive variables were spawning abundance and MWS under AIC, spawning abundance alone under P-values, and spawning abundance, MWS and RCR under maximum adjusted R2. The recruitment success decreased linearly with stock abundance (P < 0.01), suggesting that the overcompensation effect in the recruitment success might be due to cannibalism or food competition. Meridional wind intensity showed marginally significant and positive effects on the recruitment success (P = 0.06), while runoff of Changjiang River showed a marginally negative effect (P = 0.07).
Based on mean absolute error and continuous ranked probability score, predictive error associated with models obtained from BMA was the smallest amongst different approaches, while that from models selected based on the P-value of the independent variables was the highest. However, mean squared predictive error from models selected based on the maximum adjusted R2 was highest. We found that BMA method could improve the prediction of recruitment success, derive more accurate prediction interval and quantitatively evaluate model uncertainty.
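The continuous ranked probability score used above to compare predictive performance can be computed empirically from an ensemble of predictions. The two toy ensembles below are hypothetical, chosen only to show that a sharp, well-centered forecast scores lower (better) than a diffuse one.

```python
def crps_ensemble(members, obs):
    """Empirical CRPS for an ensemble forecast:
    CRPS = E|X - y| - 0.5 * E|X - X'|, lower is better."""
    m = len(members)
    spread_to_obs = sum(abs(x - obs) for x in members) / m
    spread_internal = sum(abs(a - b) for a in members for b in members) / (m * m)
    return spread_to_obs - 0.5 * spread_internal

# Hypothetical recruitment-success predictions against one observed outcome.
obs = 1.0
crps_sharp = crps_ensemble([0.9, 1.0, 1.1], obs)    # tight ensemble near obs
crps_diffuse = crps_ensemble([0.0, 1.0, 2.0], obs)  # wide ensemble
```

The second term rewards sharpness, so CRPS penalizes both bias and needless spread, which is what makes it useful for comparing BMA against single selected models.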
Mental health courts and their selection processes: modeling variation for consistency.
Wolff, Nancy; Fabrikant, Nicole; Belenko, Steven
2011-10-01
Admission into mental health courts is based on a complicated and often variable decision-making process that involves multiple parties representing different expertise and interests. To the extent that eligibility criteria of mental health courts are more suggestive than deterministic, selection bias can be expected. Very little research has focused on the selection processes underpinning problem-solving courts even though such processes may dominate the performance of these interventions. This article describes a qualitative study designed to deconstruct the selection and admission processes of mental health courts. In this article, we describe a multi-stage, complex process for screening and admitting clients into mental health courts. The selection filtering model that is described has three eligibility screening stages: initial, assessment, and evaluation. The results of this study suggest that clients selected by mental health courts are shaped by the formal and informal selection criteria, as well as by the local treatment system.
The Limits of Natural Selection in a Nonequilibrium World.
Brandvain, Yaniv; Wright, Stephen I
2016-04-01
Evolutionary theory predicts that factors such as a small population size or low recombination rate can limit the action of natural selection. The emerging field of comparative population genomics offers an opportunity to evaluate these hypotheses. However, classical theoretical predictions assume that populations are at demographic equilibrium. This assumption is likely to be violated in the very populations researchers use to evaluate selection's limits: populations that have experienced a recent shift in population size and/or effective recombination rates. Here we highlight theory and data analyses concerning limitations on the action of natural selection in nonequilibrial populations and argue that substantial care is needed to appropriately test whether species and populations show meaningful differences in selection efficacy. A move toward model-based inferences that explicitly incorporate nonequilibrium dynamics provides a promising approach to more accurately contrast selection efficacy across populations and interpret its significance. Copyright © 2016 Elsevier Ltd. All rights reserved.
Bornhorst, Ellen R; Tang, Juming; Sablani, Shyam S; Barbosa-Cánovas, Gustavo V; Liu, Fang
2017-07-01
Development and selection of model foods is a critical part of microwave thermal process development, simulation validation, and optimization. Previously developed model foods for pasteurization process evaluation utilized Maillard reaction products as the time-temperature integrators, which resulted in similar temperature sensitivity among the models. The aim of this research was to develop additional model foods based on different time-temperature integrators, determine their dielectric properties and color change kinetics, and validate the optimal model food in hot water and microwave-assisted pasteurization processes. Color, quantified using a* value, was selected as the time-temperature indicator for green pea and garlic puree model foods. Results showed 915 MHz microwaves had a greater penetration depth into the green pea model food than the garlic. a* value reaction rates for the green pea model were approximately 4 times slower than in the garlic model food; slower reaction rates were preferred for the application of model food in this study, that is, quality evaluation for a target process of 90 °C for 10 min at the cold spot. Pasteurization validation used the green pea model food and results showed that there were quantifiable differences between the color of the unheated control, hot water pasteurization, and microwave-assisted thermal pasteurization system. Both model foods developed in this research could be utilized for quality assessment and optimization of various thermal pasteurization processes. © 2017 Institute of Food Technologists®.
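Assuming simple first-order kinetics for the a* color change (a common model for such time-temperature integrators, though the study's fitted kinetics may differ), the rate constant follows from two readings at a fixed temperature. The a* values and times below are hypothetical, chosen so that the second rate comes out 4 times slower, mirroring the green pea versus garlic comparison.

```python
import math

def first_order_rate(c0, ct, t):
    """Rate constant k from first-order kinetics c(t) = c0 * exp(-k * t)."""
    return math.log(c0 / ct) / t

# Hypothetical a* readings before and after an isothermal hold.
k_garlic = first_order_rate(20.0, 10.0, 5.0)    # a* halves in 5 min
k_green_pea = first_order_rate(20.0, 10.0, 20.0)  # a* halves in 20 min
```

A slower rate constant leaves more measurable color change after the 90 °C, 10 min target process, which is why the green pea model suited this application.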
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-12
... modeling demonstration should include supporting technical analyses and descriptions of all relevant....5 and NO X . The attainment demonstration includes: Technical analyses that locate, identify, and... modeling analysis is a complex technical evaluation that began with selection of the modeling system. The...
An Evaluation of a Testing Model for Listening Comprehension.
ERIC Educational Resources Information Center
Kangli, Ji
A model for testing listening comprehension in English as a Second Language is discussed and compared with the Test for English Majors (TEM). The model in question incorporates listening for: (1) understanding factual information; (2) comprehension and interpretation; (3) detailed and selective information; (4) global ideas; (5) on-line tasks…
Numerical, mathematical models of water and chemical movement in soils are used as decision aids for determining soil screening levels (SSLs) of radionuclides in the unsaturated zone. Many models require extensive input parameters which include uncertainty due to soil variabil...
A Decision Model for Evaluating Potential Change in Instructional Programs.
ERIC Educational Resources Information Center
Amor, J. P.; Dyer, J. S.
A statistical model designed to assist elementary school principals in the process of selecting educational areas which should receive additional emphasis is presented. For each educational area, the model produces an index number which represents the expected "value" per dollar spent on an instructional program appropriate for strengthening that…
NASA Technical Reports Server (NTRS)
Roller, N. E. G.; Colwell, J. E.; Sellman, A. N.
1985-01-01
A study undertaken in support of NASA's Global Habitability Program is described. A demonstration of geographic information system (GIS) technology for site evaluation and selection is given. The objective was to locate potential fuelwood plantations within a 50 km radius of Nairobi, Kenya. A model was developed to evaluate site potential based on capability and suitability criteria and implemented using the Environmental Research Institute of Michigan's geographic information system.
Selectivity Mechanism of ATP-Competitive Inhibitors for PKB and PKA.
Wu, Ke; Pang, Jingzhi; Song, Dong; Zhu, Ying; Wu, Congwen; Shao, Tianqu; Chen, Haifeng
2015-07-01
Protein kinase B (PKB) acts as a central node in the PI3K kinase pathway. Constitutive activation and overexpression of PKB have been implicated in various cancers. However, protein kinase A (PKA), which shares high homology with PKB, is essential for metabolic regulation. Therefore, specific targeting of PKB is a crucial strategy in antitumor drug design and development. Here, we revealed the selectivity mechanism of PKB inhibitors with molecular dynamics simulation and 3D-QSAR methods. Selective inhibitors of PKB could form more hydrogen bonds and hydrophobic contacts with PKB than with PKA. This could explain why the selective inhibitor M128 is more potent toward PKB than toward PKA. Then, 3D-QSAR models were constructed for these selective inhibitors and evaluated with test set compounds. Comparison of the 3D-QSAR models for PKB and PKA inhibitors reveals possible methods to improve the selectivity of inhibitors. These models can be used to design new chemical entities and make quantitative predictions of specific selective inhibitors before resorting to in vitro and in vivo experiments. © 2014 John Wiley & Sons A/S.
Multimodel Ensemble Methods for Prediction of Wake-Vortex Transport and Decay Originating NASA
NASA Technical Reports Server (NTRS)
Korner, Stephan; Ahmad, Nashat N.; Holzapfel, Frank; VanValkenburg, Randal L.
2017-01-01
Several multimodel ensemble methods are selected and further developed to improve the deterministic and probabilistic prediction skills of individual wake-vortex transport and decay models. The different multimodel ensemble methods are introduced, and their suitability for wake applications is demonstrated. The selected methods include direct ensemble averaging, Bayesian model averaging, and Monte Carlo simulation. The different methodologies are evaluated employing data from wake-vortex field measurement campaigns conducted in the United States and Germany.
Yakan, S D; Focks, A; Klasmeier, J; Okay, O S
2017-01-01
Polycyclic aromatic hydrocarbons (PAHs) are important organic pollutants in the aquatic environment due to their persistence and bioaccumulation potential both in organisms and in sediments. Benzo(a)anthracene (BaA) and phenanthrene (PHE), which are on the priority pollutant list of the U.S. EPA (Environmental Protection Agency), were selected as model compounds of the present study. Bioaccumulation and depuration experiments with the local Mediterranean mussel species Mytilus galloprovincialis formed the basis of the study. Mussels were selected as bioindicator organisms due to their broad geographic distribution, immobility and low enzyme activity. Bioaccumulation and depuration kinetics of the selected PAHs in Mytilus galloprovincialis were described using first-order kinetic equations in a three-compartment model. The compartments were defined as: (1) biota (mussel), (2) surrounding environment (seawater), and (3) algae (Phaeodactylum tricornutum) as the food source of the mussels. Experiments were performed at three different concentrations. The middle concentration was used as the model input in order to represent the high and low concentrations of the selected PAHs. Correlation of the experimental and model data revealed that they are in good agreement. The accumulation and depuration trends of PAHs in mussels, together with their durations, can be estimated effectively with the present model. Thus, this study can serve as a supportive tool for risk assessment in addition to monitoring studies. Copyright © 2016 Elsevier Ltd. All rights reserved.
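A reduced, single-compartment version of the first-order kinetics described above can be sketched as follows. The study itself uses three compartments including an algal food source, and the rate constants and seawater concentration below are hypothetical.

```python
def simulate_uptake(c_water, k_u, k_e, t_end, dt=0.01):
    """First-order uptake/depuration in one compartment:
    dC/dt = k_u * C_water - k_e * C, integrated with forward Euler.
    The analytic steady state is k_u * C_water / k_e."""
    c = 0.0   # tissue concentration, starts clean
    t = 0.0
    while t < t_end:
        c += dt * (k_u * c_water - k_e * c)
        t += dt
    return c

# Hypothetical rate constants (per day) and a fixed seawater PHE level.
c_final = simulate_uptake(c_water=1.0, k_u=100.0, k_e=0.5, t_end=30.0)
steady_state = 100.0 * 1.0 / 0.5
```

Setting c_water to zero after exposure turns the same equation into the depuration phase, with concentration decaying at rate k_e.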
Pashaei, Elnaz; Pashaei, Elham; Aydin, Nizamettin
2018-04-14
In cancer classification, gene selection is an important data preprocessing technique, but it is a difficult task due to the large search space. Accordingly, the objective of this study is to develop a hybrid meta-heuristic Binary Black Hole Algorithm (BBHA) and Binary Particle Swarm Optimization (BPSO) (4-2) model that emphasizes gene selection. In this model, the BBHA is embedded in the BPSO (4-2) algorithm to make the BPSO (4-2) more effective and to facilitate the exploration and exploitation of the BPSO (4-2) algorithm to further improve the performance. This model has been associated with Random Forest Recursive Feature Elimination (RF-RFE) pre-filtering technique. The classifiers which are evaluated in the proposed framework are Sparse Partial Least Squares Discriminant Analysis (SPLSDA); k-nearest neighbor and Naive Bayes. The performance of the proposed method was evaluated on two benchmark and three clinical microarrays. The experimental results and statistical analysis confirm the better performance of the BPSO (4-2)-BBHA compared with the BBHA, the BPSO (4-2) and several state-of-the-art methods in terms of avoiding local minima, convergence rate, accuracy and number of selected genes. The results also show that the BPSO (4-2)-BBHA model can successfully identify known biologically and statistically significant genes from the clinical datasets. Copyright © 2018 Elsevier Inc. All rights reserved.
Selection Index in the Study of Adaptability and Stability in Maize
Lunezzo de Oliveira, Rogério; Garcia Von Pinho, Renzo; Furtado Ferreira, Daniel; Costa Melo, Wagner Mateus
2014-01-01
This paper proposes an alternative method for evaluating the stability and adaptability of maize hybrids using a genotype-ideotype distance index (GIDI) for selection. Data from seven variables were used, obtained through evaluation of 25 maize hybrids at six sites in southern Brazil. The GIDI was estimated by means of the generalized Mahalanobis distance for each plot of the test. We then proceeded to GGE biplot analysis in order to compare the predictive accuracy of the GGE models and the grouping of environments and to select the best five hybrids. The G × E interaction was significant for both variables assessed. The GGE model with two principal components obtained a predictive accuracy (PRECORR) of 0.8913 for the GIDI and 0.8709 for yield (t ha⁻¹). Two groups of environments were obtained upon analyzing the GIDI, whereas all the environments remained in the same group upon analyzing yield. Coincidence occurred in only two hybrids considering evaluation of the two features. The GIDI assessment provided for selection of hybrids that combine adaptability and stability in most of the variables assessed, making its use more highly recommended than analyzing each variable separately. Not all the higher-yielding hybrids were the best in the other variables assessed. PMID:24696641
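The generalized Mahalanobis distance underlying the GIDI can be sketched for the two-trait case. The trait values and covariance matrix below are hypothetical (the study uses seven variables), and with an identity covariance the distance reduces to the Euclidean one.

```python
def mahalanobis_2x2(x, ideotype, cov):
    """Generalized Mahalanobis distance between a genotype's trait vector
    and the ideotype for two traits; cov is [[a, b], [b, d]].
    Unlike Euclidean distance, this accounts for trait scales and covariance."""
    dx, dy = x[0] - ideotype[0], x[1] - ideotype[1]
    a, b, d = cov[0][0], cov[0][1], cov[1][1]
    det = a * d - b * b
    # Quadratic form diff' * inverse(cov) * diff, with the 2x2 inverse inlined.
    q = (d * dx * dx - 2.0 * b * dx * dy + a * dy * dy) / det
    return q ** 0.5

# Hypothetical genotype vs. ideotype; identity covariance gives Euclidean 5.0.
gidi = mahalanobis_2x2([3.0, 4.0], [0.0, 0.0], [[1.0, 0.0], [0.0, 1.0]])
```

Genotypes are then ranked by this distance, with smaller values meaning closer to the ideotype across all traits jointly.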
Model weights and the foundations of multimodel inference
Link, W.A.; Barker, R.J.
2006-01-01
Statistical thinking in wildlife biology and ecology has been profoundly influenced by the introduction of AIC (Akaike's information criterion) as a tool for model selection and as a basis for model averaging. In this paper, we advocate the Bayesian paradigm as a broader framework for multimodel inference, one in which model averaging and model selection are naturally linked, and in which the performance of AIC-based tools is naturally evaluated. Prior model weights implicitly associated with the use of AIC are seen to highly favor complex models: in some cases, all but the most highly parameterized models in the model set are virtually ignored a priori. We suggest the usefulness of the weighted BIC (Bayesian information criterion) as a computationally simple alternative to AIC, based on explicit selection of prior model probabilities rather than acceptance of default priors associated with AIC. We note, however, that both procedures are only approximate to the use of exact Bayes factors. We discuss and illustrate technical difficulties associated with Bayes factors, and suggest approaches to avoiding these difficulties in the context of model selection for a logistic regression. Our example highlights the predisposition of AIC weighting to favor complex models and suggests a need for caution in using the BIC for computing approximate posterior model weights.
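The model weights discussed here follow from criterion differences: w_i is proportional to exp(-Δ_i / 2), where Δ_i is each model's AIC (or BIC) minus the minimum in the set. A sketch with hypothetical criterion values:

```python
import math

def information_weights(criteria):
    """Akaike-style model weights from AIC (or BIC) values:
    w_i proportional to exp(-delta_i / 2), delta_i = crit_i - min(crit),
    normalized to sum to one across the model set."""
    best = min(criteria)
    raw = [math.exp(-(c - best) / 2.0) for c in criteria]
    total = sum(raw)
    return [r / total for r in raw]

# Hypothetical AIC values for three candidate models.
w = information_weights([100.0, 102.0, 110.0])
```

Feeding BIC values through the same formula yields the approximate posterior model weights the abstract refers to; the two criteria differ only in the implied prior over models.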
Neural Underpinnings of Decision Strategy Selection: A Review and a Theoretical Model
Wichary, Szymon; Smolen, Tomasz
2016-01-01
In multi-attribute choice, decision makers use decision strategies to arrive at the final choice. What are the neural mechanisms underlying decision strategy selection? The first goal of this paper is to provide a literature review on the neural underpinnings and cognitive models of decision strategy selection and thus set the stage for a neurocognitive model of this process. The second goal is to outline such a unifying, mechanistic model that can explain the impact of noncognitive factors (e.g., affect, stress) on strategy selection. To this end, we review the evidence for the factors influencing strategy selection, the neural basis of strategy use and the cognitive models of this process. We also present the Bottom-Up Model of Strategy Selection (BUMSS). The model assumes that the use of the rational Weighted Additive strategy and the boundedly rational heuristic Take The Best can be explained by one unifying, neurophysiologically plausible mechanism, based on the interaction of the frontoparietal network, orbitofrontal cortex, anterior cingulate cortex and the brainstem nucleus locus coeruleus. According to BUMSS, there are three processes that form the bottom-up mechanism of decision strategy selection and lead to the final choice: (1) cue weight computation, (2) gain modulation, and (3) weighted additive evaluation of alternatives. We discuss how these processes might be implemented in the brain, and how this knowledge allows us to formulate novel predictions linking strategy use and neural signals. PMID:27877103
Economic evaluation of genomic selection in small ruminants: a sheep meat breeding program.
Shumbusho, F; Raoul, J; Astruc, J M; Palhiere, I; Lemarié, S; Fugeray-Scarbel, A; Elsen, J M
2016-06-01
Recent genomic evaluation studies using real data and predicting genetic gain by modeling breeding programs have reported moderate expected benefits from the replacement of classic selection schemes by genomic selection (GS) in small ruminants. The objectives of this study were to compare the cost, monetary genetic gain and economic efficiency of classic selection and GS schemes in the meat sheep industry. Deterministic methods were used to model selection based on multi-trait indices from a sheep meat breeding program. Decisional variables related to male selection candidates and progeny testing were optimized to maximize the annual monetary genetic gain (AMGG), that is, a weighted sum of the annual genetic gains for meat and maternal traits. For GS, a reference population of 2000 individuals was assumed and genomic information was available for the evaluation of male candidates only. In the classic selection scheme, males' breeding values were estimated from their own and their offspring's phenotypes. In GS, different scenarios were considered, differing by the information used to select males (genomic only, genomic + own performance, genomic + offspring phenotypes). The results showed that all GS scenarios were associated with higher total variable costs than classic selection (assuming a genotyping cost of 123 euros/animal). In terms of AMGG and economic returns, GS scenarios were found to be superior to classic selection only if genomic information was combined with the males' own meat phenotypes (GS-Pheno) or with their progeny test information. The predicted economic efficiency, defined as returns (proportional to the number of expressions of AMGG in the nucleus and commercial flocks) minus total variable costs, showed that the best GS scenario (GS-Pheno) was up to 15% more efficient than classic selection. For all selection scenarios, optimization increased the overall AMGG, returns and economic efficiency.
As a conclusion, our study shows that some forms of GS strategies are more advantageous than classic selection, provided that GS is already initiated (i.e. the initial reference population is available). Optimizing decisional variables of the classic selection scheme could be of greater benefit than including genomic information in optimized designs.
SUPERCRITICAL WATER OXIDATION MODEL DEVELOPMENT FOR SELECTED EPA PRIORITY POLLUTANTS
Supercritical Water Oxidation (SCWO) was evaluated for five compounds: acetic acid, 2,4-dichlorophenol, pentachlorophenol, pyridine, and 2,4-dichlorophenoxyacetic acid (methyl ester). Kinetic models were developed for acetic acid, 2,4-dichlorophenol, and pyridine. The test compounds were e...
NASA Astrophysics Data System (ADS)
Kikuchi, C.; Ferre, P. A.; Vrugt, J. A.
2011-12-01
Hydrologic models are developed, tested, and refined based on the ability of those models to explain available hydrologic data. The optimization of model performance based upon mismatch between model outputs and real world observations has been extensively studied. However, identification of plausible models is sensitive not only to the models themselves - including model structure and model parameters - but also to the location, timing, type, and number of observations used in model calibration. Therefore, careful selection of hydrologic observations has the potential to significantly improve the performance of hydrologic models. In this research, we seek to reduce prediction uncertainty through optimization of the data collection process. A new tool - multiple model analysis with discriminatory data collection (MMA-DDC) - was developed to address this challenge. In this approach, multiple hydrologic models are developed and treated as competing hypotheses. Potential new data are then evaluated on their ability to discriminate between competing hypotheses. MMA-DDC is well-suited for use in recursive mode, in which new observations are continuously used in the optimization of subsequent observations. This new approach was applied to a synthetic solute transport experiment, in which ranges of parameter values constitute the multiple hydrologic models, and model predictions are calculated using likelihood-weighted model averaging. MMA-DDC was used to determine the optimal location, timing, number, and type of new observations. From comparison with an exhaustive search of all possible observation sequences, we find that MMA-DDC consistently selects observations which lead to the highest reduction in model prediction uncertainty. We conclude that using MMA-DDC to evaluate potential observations may significantly improve the performance of hydrologic models while reducing the cost associated with collecting new data.
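A toy version of the discriminatory idea, scoring candidate observations by how strongly the likelihood-weighted models disagree there, might look like the following; this variance-based score is an illustrative proxy, not the exact MMA-DDC criterion:

```python
import numpy as np

def most_discriminating_obs(model_preds, weights):
    """Index of the candidate observation where competing model
    predictions disagree most, measured by the likelihood-weighted
    variance across models.

    model_preds : (n_models, n_candidates) predicted values
    weights     : (n_models,) model likelihood weights
    """
    P = np.asarray(model_preds, dtype=float)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    mean = w @ P                       # weighted mean prediction per candidate
    var = w @ (P - mean) ** 2          # weighted variance per candidate
    return int(np.argmax(var))
```

Used recursively, the selected observation would be collected, model weights updated, and the score recomputed for the remaining candidates.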
Spectroscopic Diagnosis of Arsenic Contamination in Agricultural Soils
Shi, Tiezhu; Liu, Huizeng; Chen, Yiyun; Fei, Teng; Wang, Junjie; Wu, Guofeng
2017-01-01
This study investigated the abilities of pre-processing, feature selection and machine-learning methods for the spectroscopic diagnosis of soil arsenic contamination. The spectral data were pre-processed by using Savitzky-Golay smoothing, first and second derivatives, multiplicative scatter correction, standard normal variate, and mean centering. Principal component analysis (PCA) and the RELIEF algorithm were used to extract spectral features. Machine-learning methods, including random forests (RF), artificial neural network (ANN), radial basis function- and linear function-based support vector machine (RBF- and LF-SVM) were employed for establishing diagnosis models. The model accuracies were evaluated and compared by using overall accuracies (OAs). The statistical significance of the difference between models was evaluated by using McNemar’s test (Z value). The results showed that the OAs varied with the different combinations of pre-processing, feature selection, and classification methods. Feature selection methods could improve the modeling efficiencies and diagnosis accuracies, and RELIEF often outperformed PCA. The optimal models established by RF (OA = 86%), ANN (OA = 89%), RBF- (OA = 89%) and LF-SVM (OA = 87%) had no statistical difference in diagnosis accuracies (Z < 1.96, p < 0.05). These results indicated that it was feasible to diagnose soil arsenic contamination using reflectance spectroscopy. The appropriate combination of multivariate methods was important to improve diagnosis accuracies. PMID:28471412
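The model comparison step uses McNemar's test on paired predictions; a common continuity-corrected form of the Z statistic (assumed here, the study may use a slightly different variant) is easy to sketch:

```python
import math

def mcnemar_z(y_true, pred_a, pred_b):
    """Continuity-corrected McNemar Z for two classifiers on the same samples.
    b: samples A gets right and B gets wrong; c: the reverse.
    |Z| < 1.96 suggests no significant difference at the 5% level."""
    b = sum(t == a and t != p for t, a, p in zip(y_true, pred_a, pred_b))
    c = sum(t != a and t == p for t, a, p in zip(y_true, pred_a, pred_b))
    if b + c == 0:
        return 0.0                      # identical error patterns
    return (abs(b - c) - 1) / math.sqrt(b + c)
```

Only the discordant pairs (b and c) enter the statistic; samples both classifiers get right or both get wrong carry no information about their difference.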
AVN-492, A Novel Highly Selective 5-HT6R Antagonist: Preclinical Evaluation.
Ivachtchenko, Alexandre V; Okun, Ilya; Aladinskiy, Vladimir; Ivanenkov, Yan; Koryakova, Angela; Karapetyan, Ruben; Mitkin, Oleg; Salimov, Ramiz; Ivashchenko, Andrey
2017-01-01
Discovery of 5-HT6 receptor subtype and its exclusive localization within the central nervous system led to extensive investigations of its role in Alzheimer's disease, schizophrenia, and obesity. In the present study, we present preclinical evaluation of a novel highly-potent and highly-selective 5-HT6R antagonist, AVN-492. The affinity of AVN-492 to bind to 5-HT6R (Ki = 91 pM) was more than three orders of magnitude higher than that to bind to the only other target, 5-HT2BR, (Ki = 170 nM). Thus, the compound displayed great 5-HT6R selectivity against all other serotonin receptor subtypes, and is extremely specific against any other receptors such as adrenergic, GABAergic, dopaminergic, histaminergic, etc. AVN-492 demonstrates good in vitro and in vivo ADME profile with high oral bioavailability and good brain permeability in rodents. In behavioral tests, AVN-492 shows anxiolytic effect in elevated plus-maze model, prevents an apomorphine-induced disruption of startle pre-pulse inhibition (the PPI model) and reverses a scopolamine- and MK-801-induced memory deficit in passive avoidance model. No anti-obesity effect of AVN-492 was found in a murine model. The data presented here strongly indicate that due to its high oral bioavailability, extremely high selectivity, and potency to block the 5-HT6 receptor, AVN-492 is a very promising tool for evaluating the role the 5-HT6 receptor might play in cognitive and neurodegenerative impairments. AVN-492 is an excellent drug candidate to be tested for treatment of such diseases, and is currently being tested in Phase I trials.
NASA Astrophysics Data System (ADS)
Feng, Wenjie; Wu, Shenghe; Yin, Yanshu; Zhang, Jiajia; Zhang, Ke
2017-07-01
A training image (TI) can be regarded as a database of spatial structures and their low to higher order statistics used in multiple-point geostatistics (MPS) simulation. Presently, there are a number of methods to construct a series of candidate TIs (CTIs) for MPS simulation based on a modeler's subjective criteria. The spatial structures of TIs are often various, meaning that the compatibilities of different CTIs with the conditioning data are different. Therefore, evaluation and optimal selection of CTIs before MPS simulation is essential. This paper proposes a CTI evaluation and optimal selection method based on minimum data event distance (MDevD). In the proposed method, a set of MDevD properties are established through calculation of the MDevD of conditioning data events in each CTI. Then, CTIs are evaluated and ranked according to the mean value and variance of the MDevD properties. The smaller the mean value and variance of an MDevD property are, the more compatible the corresponding CTI is with the conditioning data. In addition, data events with low compatibility in the conditioning data grid can be located to help modelers select a set of complementary CTIs for MPS simulation. The MDevD property can also help to narrow the range of the distance threshold for MPS simulation. The proposed method was evaluated using three examples: a 2D categorical example, a 2D continuous example, and an actual 3D oil reservoir case study. To illustrate the method, a C++ implementation of the method is attached to the paper.
NASA Technical Reports Server (NTRS)
Kidd, Chris; Chapman, Lee
2012-01-01
Meteorological measurements within urban areas are becoming increasingly important due to the accentuating effects of climate change upon the Urban Heat Island (UHI). However, ensuring that such measurements are representative of the local area is often difficult due to the diversity of the urban environment. The evaluation of sites is important both for new sites and for the relocation of established sites, to ensure that long-term changes in meteorological and climatological conditions continue to be faithfully recorded. Site selection is traditionally carried out in the field using both local knowledge and visual inspection. This paper exploits and assesses the use of lidar-derived digital surface models (DSMs) to quantitatively aid the site selection process. This is achieved by combining the DSM with a solar model, first to generate spatial maps of sky view factors and sun-hour potential and, second, to generate site-specific views of the horizon. The results show that such a technique is a useful first-step approach to identify key sites that may be further evaluated for the location of meteorological stations within urban areas.
Angular selective window systems: Assessment of technical potential for energy savings
Fernandes, Luis L.; Lee, Eleanor S.; McNeil, Andrew; ...
2014-10-16
Static angular selective shading systems block direct sunlight and admit daylight within a specific range of incident solar angles. The objective of this study is to quantify their potential to reduce energy use and peak demand in commercial buildings using state-of-the-art whole-building computer simulation software that allows accurate modeling of the behavior of optically-complex fenestration systems such as angular selective systems. Three commercial systems were evaluated: a micro-perforated screen, a tubular shading structure, and an expanded metal mesh. This evaluation was performed through computer simulation for multiple climates (Chicago, Illinois and Houston, Texas), window-to-wall ratios (0.15-0.60), building codes (ASHRAE 90.1-2004 and 2010) and lighting control configurations (with and without). The modeling of the optical complexity of the systems took advantage of the development of state-of-the-art versions of the EnergyPlus, Radiance and Window simulation tools. Results show significant reductions in perimeter zone energy use; the best system reached 28% and 47% savings, respectively without and with daylighting controls (ASHRAE 90.1-2004, south facade, Chicago, WWR = 0.45). Angular selectivity and thermal conductance of the angle-selective layer, as well as spectral selectivity of low-emissivity coatings, were identified as factors with significant impact on performance.
NASA Astrophysics Data System (ADS)
Siami, Mohammad; Gholamian, Mohammad Reza; Basiri, Javad
2014-10-01
Nowadays, credit scoring is one of the most important topics in the banking sector. Credit scoring models have been widely used to facilitate the process of credit assessment. In this paper, the locally linear model tree algorithm (LOLIMOT) was applied to evaluate its performance in predicting customers' credit status. The algorithm was adapted to the credit scoring domain by means of data fusion and feature selection techniques. Two real-world credit data sets - Australian and German - from the UCI machine learning database were selected to demonstrate the performance of the new classifier. The analytical results indicate that the improved LOLIMOT significantly increased the prediction accuracy.
Three probes for diagnosing photochemical dynamics are presented and applied to specialized ambient surface-level observations and to a numerical photochemical model to better understand rates of production and other process information in the atmosphere and in the model. Howeve...
Neyazi, Narges; Arab, Prof Mohammad; Farzianpour, Freshteh; Mahmoudi Majdabadi, Mahmood
2016-12-01
Evaluation of higher education reflects an increasing demand for information on academic quality, which contributes to accountability among authorities and affects university rankings. In educational institutions, the purpose of education is to produce knowledgeable students and improve the quality of the university system. Among many evaluation models, the CIPP (Context, Input, Process, Product) model is a very beneficial and recommendable method for educational evaluation. This is a descriptive study conducted in 2014 in the undergraduate educational departments of four selected faculties of Tehran University of Medical Sciences (TUMS): Public Health, Nursing and Midwifery, Rehabilitation, and Allied Medical Sciences. This research determined the quality level of undergraduate courses from the viewpoint of students and graduates and identified their weak points. Data were collected through researcher-made questionnaires and then analyzed using descriptive and inferential statistics. Results showed an undesirable situation in the context, process and product areas, and an undesirable situation for input, except for the "interest and understanding of students towards field and labor market" factor, which was relatively desirable. The researchers recommend steps to improve the goals and mission of programs, the allocated budget, the curriculum, and the provision of a system for communication with graduates. Copyright © 2016. Published by Elsevier Ltd.
Error Reduction Program. [combustor performance evaluation codes
NASA Technical Reports Server (NTRS)
Syed, S. A.; Chiappetta, L. M.; Gosman, A. D.
1985-01-01
The details of a study to select, incorporate and evaluate the best available finite difference scheme to reduce numerical error in combustor performance evaluation codes are described. The combustor performance computer programs chosen were the two-dimensional and three-dimensional versions of Pratt & Whitney's TEACH code. The criteria used to select schemes required that the difference equations mirror the properties of the governing differential equation, be more accurate than the current hybrid difference scheme, be stable and economical, be compatible with TEACH codes, use only modest amounts of additional storage, and be relatively simple. The methods of assessment used in the selection process consisted of examination of the difference equation, evaluation of the properties of the coefficient matrix, Taylor series analysis, and performance on model problems. Five schemes from the literature and three schemes developed during the course of the study were evaluated. This effort resulted in the incorporation of a scheme in 3D-TEACH which is usually more accurate than the hybrid differencing method and never less accurate.
Commodity-based Approach for Evaluating the Value of Freight Moving on Texas’ Roadway Network
DOT National Transportation Integrated Search
2017-12-10
The researchers took a commodity-based approach to evaluate the value of a list of selected commodities moved on the Texas freight network. This approach takes advantage of commodity-specific data sources and modeling processes. It provides a unique ...
An evaluation of selected in silico models for the assessment ...
Skin sensitization remains an important endpoint for consumers, manufacturers and regulators. Although the development of alternative approaches to assess skin sensitization potential has been extremely active over many years, the implications of regulations such as REACH and the Cosmetics Directive in the EU have provided a much stronger impetus to turn this research into practical tools for decision making. Thus there has been considerable focus on the development, evaluation, and integration of alternative approaches for skin sensitization hazard and risk assessment. This includes in silico approaches such as (Q)SARs and expert systems. This study aimed to evaluate the predictive performance of a selection of in silico models and then to explore whether combining those models led to an improvement in accuracy. A dataset of 473 substances that had been tested in the local lymph node assay (LLNA) was compiled, comprising 295 sensitizers and 178 non-sensitizers. Four freely available models were identified: two statistical models, VEGA and the MultiCASE model A33 for skin sensitization (MCASE A33) from the Danish National Food Institute, and two mechanistic models, Toxtree’s Skin sensitization Reaction domains (Toxtree SS Rxn domains) and the OASIS v1.3 protein binding alerts for skin sensitization from the OECD Toolbox (OASIS). VEGA and MCASE A33 aim to predict sensitization as a binary score whereas the mechanistic models identified reaction domains or structura
Optimizing Experimental Design for Comparing Models of Brain Function
Daunizeau, Jean; Preuschoff, Kerstin; Friston, Karl; Stephan, Klaas
2011-01-01
This article presents the first attempt to formalize the optimization of experimental design with the aim of comparing models of brain function based on neuroimaging data. We demonstrate our approach in the context of Dynamic Causal Modelling (DCM), which relates experimental manipulations to observed network dynamics (via hidden neuronal states) and provides an inference framework for selecting among candidate models. Here, we show how to optimize the sensitivity of model selection by choosing among experimental designs according to their respective model selection accuracy. Using Bayesian decision theory, we (i) derive the Laplace-Chernoff risk for model selection, (ii) disclose its relationship with classical design optimality criteria and (iii) assess its sensitivity to basic modelling assumptions. We then evaluate the approach when identifying brain networks using DCM. Monte-Carlo simulations and empirical analyses of fMRI data from a simple bimanual motor task in humans serve to demonstrate the relationship between network identification and the optimal experimental design. For example, we show that deciding whether there is a feedback connection requires shorter epoch durations, relative to asking whether there is experimentally induced change in a connection that is known to be present. Finally, we discuss limitations and potential extensions of this work. PMID:22125485
Bommert, Andrea; Rahnenführer, Jörg; Lang, Michel
2017-01-01
Finding a good predictive model for a high-dimensional data set can be challenging. For genetic data, it is not only important to find a model with high predictive accuracy, but it is also important that this model uses only few features and that the selection of these features is stable. This is because, in bioinformatics, the models are used not only for prediction but also for drawing biological conclusions, which makes the interpretability and reliability of the model crucial. We suggest using three target criteria when fitting a predictive model to a high-dimensional data set: the classification accuracy, the stability of the feature selection, and the number of chosen features. As it is unclear which measure is best for evaluating stability, we first compare a variety of stability measures. We conclude that the Pearson correlation has the best theoretical and empirical properties. Also, we find that for assessing stability it is most important that a measure contains a correction for chance or for large numbers of chosen features. Then, we analyse Pareto fronts and conclude that it is possible to find models with a stable selection of few features without losing much predictive accuracy.
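The Pearson-correlation stability measure the authors favour can be computed directly on the 0/1 indicator vectors of two selected feature subsets; a minimal sketch (function name and inputs are illustrative):

```python
import numpy as np

def selection_stability(sel_a, sel_b, n_features):
    """Pearson correlation between the 0/1 indicator vectors of two
    feature subsets; unlike raw overlap, it corrects for chance agreement."""
    a = np.zeros(n_features)
    b = np.zeros(n_features)
    a[list(sel_a)] = 1.0
    b[list(sel_b)] = 1.0
    return float(np.corrcoef(a, b)[0, 1])
```

Identical subsets score 1.0; disjoint subsets score below zero, reflecting worse-than-chance agreement.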
NASA Astrophysics Data System (ADS)
Müller, Aline Lima Hermes; Picoloto, Rochele Sogari; Mello, Paola de Azevedo; Ferrão, Marco Flores; dos Santos, Maria de Fátima Pereira; Guimarães, Regina Célia Lourenço; Müller, Edson Irineu; Flores, Erico Marlon Moraes
2012-04-01
Total sulfur concentration was determined in atmospheric residue (AR) and vacuum residue (VR) samples obtained from the petroleum distillation process by Fourier transform infrared spectroscopy with attenuated total reflectance (FT-IR/ATR) in association with chemometric methods. The calibration and prediction sets consisted of 40 and 20 samples, respectively. Calibration models were developed using two variable selection methods: interval partial least squares (iPLS) and synergy interval partial least squares (siPLS). Different treatments and pre-processing steps were also evaluated for the development of models. Pre-treatment based on multiplicative scatter correction (MSC) and mean-centered data were selected for model construction. The use of siPLS as the variable selection method provided a model with root mean square error of prediction (RMSEP) values significantly better than those obtained by the PLS model using all variables. The best model was obtained using the siPLS algorithm with spectra divided into 20 intervals and combinations of 3 intervals (911-824, 823-736 and 737-650 cm-1). This model produced an RMSECV of 400 mg kg-1 S and an RMSEP of 420 mg kg-1 S, with a correlation coefficient of 0.990.
Modeling of Spacecraft Advanced Chemical Propulsion Systems
NASA Technical Reports Server (NTRS)
Benfield, Michael P. J.; Belcher, Jeremy A.
2004-01-01
This paper outlines the development of the Advanced Chemical Propulsion System (ACPS) model for Earth and Space Storable propellants. This model was developed by the System Technology Operation of SAIC-Huntsville for the NASA MSFC In-Space Propulsion Project Office. Each subsystem of the model is described. Selected model results will also be shown to demonstrate the model's ability to evaluate technology changes in chemical propulsion systems.
Oliveira, Roberta B; Pereira, Aledir S; Tavares, João Manuel R S
2017-10-01
The number of deaths worldwide due to melanoma has risen in recent times, in part because melanoma is the most aggressive type of skin cancer. Computational systems have been developed to assist dermatologists in early diagnosis of skin cancer, or even to monitor skin lesions. However, there still remains a challenge to improve classifiers for the diagnosis of such skin lesions. The main objective of this article is to evaluate different ensemble classification models based on input feature manipulation to diagnose skin lesions. Input feature manipulation processes are based on feature subset selections from shape properties, colour variation and texture analysis to generate diversity for the ensemble models. Three subset selection models are presented here: (1) a subset selection model based on specific feature groups, (2) a correlation-based subset selection model, and (3) a subset selection model based on feature selection algorithms. Each ensemble classification model is generated using an optimum-path forest classifier and integrated with a majority voting strategy. The proposed models were applied on a set of 1104 dermoscopic images using a cross-validation procedure. The best results were obtained by the first ensemble classification model that generates a feature subset ensemble based on specific feature groups. The skin lesion diagnosis computational system achieved 94.3% accuracy, 91.8% sensitivity and 96.7% specificity. The input feature manipulation process based on specific feature subsets generated the greatest diversity for the ensemble classification model with very promising results. Copyright © 2017 Elsevier B.V. All rights reserved.
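The integration step, majority voting over ensemble members that each see a different feature subset, can be sketched as follows (the member classifiers themselves are omitted; labels are assumed to be non-negative integers):

```python
import numpy as np

def majority_vote(predictions):
    """Combine member predictions (n_members, n_samples) by majority vote;
    labels must be non-negative integers."""
    preds = np.asarray(predictions)
    # For each sample (column), pick the most frequent label across members
    return np.array([np.bincount(col).argmax() for col in preds.T])
```

With an odd number of members and binary labels, ties cannot occur; otherwise `bincount(...).argmax()` breaks ties in favour of the smaller label.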
BIBLIO: A Computer System Designed to Support the Near-Library User Model of Information Retrieval.
ERIC Educational Resources Information Center
Belew, Richard K.; Holland, Maurita Peterson
1988-01-01
Description of the development of the Information Exchange Facility, a prototype microcomputer-based personal bibliographic facility, covers software selection, user selection, overview of the system, and evaluation. The plan for an integrated system, BIBLIO, and the future role of libraries are discussed. (eight references) (MES)
On using sample selection methods in estimating the price elasticity of firms' demand for insurance.
Marquis, M Susan; Louis, Thomas A
2002-01-01
We evaluate a technique based on sample selection models that has been used by health economists to estimate the price elasticity of firms' demand for insurance. We demonstrate that this technique produces inflated estimates of the price elasticity. We show that alternative methods lead to valid estimates.
Safari, Parviz; Danyali, Syyedeh Fatemeh; Rahimi, Mehdi
2018-06-02
Drought is the main abiotic stress seriously influencing wheat production. Information about the inheritance of drought tolerance is necessary to determine the most appropriate strategy to develop tolerant cultivars and populations. In this study, generation means analysis to identify the genetic effects controlling grain yield inheritance under water deficit and normal conditions was treated as a model selection problem in a Bayesian framework. Stochastic search variable selection (SSVS) was applied to identify the most important genetic effects and the best-fitting models using different generations obtained from two crosses under two water regimes in two growing seasons. SSVS evaluates the effect of each variable on the dependent variable via posterior variable inclusion probabilities, and the model with the highest posterior probability is selected as the best model. In this study, grain yield was controlled by main effects (additive and non-additive) and by epistatic effects. The results demonstrate that breeding methods such as recurrent selection with a subsequent pedigree method, as well as hybrid production, can be useful for improving grain yield.
Application of GRA for Sustainable Material Selection and Evaluation Using LCA
NASA Astrophysics Data System (ADS)
Jayakrishna, Kandasamy; Vinodh, Sekar; Sakthi Sanghvi, Vijayaselvan; Deepika, Chinadurai
2016-07-01
Material selection is identified as a key parameter in establishing any product as sustainable, considering its end of life (EoL) characteristics. An accurate understanding of expected service conditions and environmental considerations is crucial, since material selection plays a vital role in meeting demanding customer expectations and stringent laws. Therefore, this article presents an integrated approach for sustainable material selection using grey relational analysis (GRA), considering the EoL disposal strategies of an automotive product. GRA, an impact evaluation model, measures the degree of similarity between the comparability (choice of material) sequence and the reference (EoL strategies) sequence based on the relational grade. The ranking results show the outranking relationships in the order ABS-REC > PP-INC > AL-REM > PP-LND > ABS-LND > ABS-INC > PU-LND > AL-REC > AL-LND > PU-INC > AL-INC. The best sustainable material selected was ABS, and recycling was selected as the best EoL strategy, with a grey relational value of 2.43856. The viability of ABS, the best material selected by this approach, was evaluated using life cycle assessment, and the estimated impacts also confirmed the practicability of the selected material, highlighting the focus on the dehumidification step in the manufacturing of the case product using this developed multi-criteria approach.
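The grey relational grade behind the ranking can be sketched as below, using the conventional distinguishing coefficient ζ = 0.5; normalization of the raw criteria is assumed to have been done beforehand:

```python
import numpy as np

def grey_relational_grade(comparability, reference, zeta=0.5):
    """Grey relational grade of each comparability sequence (row)
    against a reference sequence.

    comparability : (n_alternatives, n_criteria) normalized sequences
    reference     : (n_criteria,) normalized reference sequence
    zeta          : distinguishing coefficient, conventionally 0.5
    """
    X = np.asarray(comparability, dtype=float)
    r = np.asarray(reference, dtype=float)
    delta = np.abs(X - r)                          # absolute differences
    dmin, dmax = delta.min(), delta.max()
    coeff = (dmin + zeta * dmax) / (delta + zeta * dmax)
    return coeff.mean(axis=1)                      # grade = mean coefficient
```

A sequence identical to the reference attains the maximal grade; alternatives are then ranked by decreasing grade.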
Lescroart, Mark D.; Stansbury, Dustin E.; Gallant, Jack L.
2015-01-01
Perception of natural visual scenes activates several functional areas in the human brain, including the Parahippocampal Place Area (PPA), Retrosplenial Complex (RSC), and the Occipital Place Area (OPA). It is currently unclear what specific scene-related features are represented in these areas. Previous studies have suggested that PPA, RSC, and/or OPA might represent at least three qualitatively different classes of features: (1) 2D features related to Fourier power; (2) 3D spatial features such as the distance to objects in a scene; or (3) abstract features such as the categories of objects in a scene. To determine which of these hypotheses best describes the visual representation in scene-selective areas, we applied voxel-wise modeling (VM) to BOLD fMRI responses elicited by a set of 1386 images of natural scenes. VM provides an efficient method for testing competing hypotheses by comparing predictions of brain activity based on encoding models that instantiate each hypothesis. Here we evaluated three different encoding models that instantiate each of the three hypotheses listed above. We used linear regression to fit each encoding model to the fMRI data recorded from each voxel, and we evaluated each fit model by estimating the amount of variance it predicted in a withheld portion of the data set. We found that voxel-wise models based on Fourier power or the subjective distance to objects in each scene predicted much of the variance predicted by a model based on object categories. Furthermore, the response variance explained by these three models is largely shared, and the individual models explain little unique variance in responses. Based on an evaluation of previous studies and the data we present here, we conclude that there is currently no good basis to favor any one of the three alternative hypotheses about visual representation in scene-selective areas. We offer suggestions for further studies that may help resolve this issue. PMID:26594164
Estimating skin blood saturation by selecting a subset of hyperspectral imaging data
NASA Astrophysics Data System (ADS)
Ewerlöf, Maria; Salerud, E. Göran; Strömberg, Tomas; Larsson, Marcus
2015-03-01
Skin blood haemoglobin saturation (s_b) can be estimated with hyperspectral imaging using the wavelength (λ) range of 450-700 nm, where haemoglobin absorption displays distinct spectral characteristics. Depending on the image size and photon transport algorithm, computations may be demanding. Therefore, this work aims to evaluate subsets with a reduced number of wavelengths for s_b estimation. White Monte Carlo simulations are performed using a two-layered tissue model with discrete values for epidermal thickness (T_epi) and the reduced scattering coefficient (μ's), mimicking an imaging setup. A detected-intensity look-up table is calculated for a range of model parameter values relevant to human skin, adding absorption effects in the post-processing. Skin model parameters, including absorbers, are: μ's(λ), T_epi, haemoglobin saturation (s_b), tissue fraction blood (f_blood) and tissue fraction melanin (f_mel). The skin model paired with the look-up table allows spectra to be calculated swiftly. Three inverse models with a varying number of free parameters are evaluated: A(s_b, f_blood), B(s_b, f_blood, f_mel) and C (all parameters free). Fourteen wavelength candidates are selected by analysing where the spectral sensitivity to s_b is maximal while the sensitivity to the other model parameters is minimal. All possible combinations of these candidates with three, four and 14 wavelengths, as well as the full spectral range, are evaluated for estimating s_b for 1000 randomly generated evaluation spectra. The results show that the simplified models A and B estimated s_b accurately using four wavelengths (mean error 2.2% for model B). If the number of wavelengths increased, the model complexity needed to be increased to avoid poor estimations.
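The wavelength-subset idea can be illustrated with a toy two-chromophore model: rank wavelengths by their sensitivity to saturation, then run the inverse model on only the selected few. The extinction curves, noise level, and grid-search inverse below are illustrative assumptions, not the paper's Monte Carlo look-up table.

```python
# Hedged sketch: select a small wavelength subset by sensitivity to saturation.
# The extinction curves are illustrative, not real HbO2/Hb tables.
import numpy as np

wavelengths = np.linspace(450.0, 700.0, 26)
eps_oxy = 1.0 + 0.8 * np.sin(wavelengths / 40.0)    # assumed extinction curve
eps_deoxy = 1.0 + 0.8 * np.cos(wavelengths / 40.0)  # assumed extinction curve

def absorbance(s):
    """Two-chromophore mixing model: saturation s weights the two spectra."""
    return s * eps_oxy + (1.0 - s) * eps_deoxy

# Sensitivity of the spectrum to saturation is |eps_oxy - eps_deoxy|;
# keep the k wavelengths where that sensitivity is largest.
k = 4
subset = np.argsort(np.abs(eps_oxy - eps_deoxy))[-k:]

def estimate_saturation(measured, idx):
    """Grid-search inverse model restricted to the selected wavelengths."""
    grid = np.linspace(0.0, 1.0, 501)
    errors = [np.sum((absorbance(s)[idx] - measured[idx]) ** 2) for s in grid]
    return grid[int(np.argmin(errors))]

rng = np.random.default_rng(0)
true_s = 0.7
noisy = absorbance(true_s) + rng.normal(0.0, 0.01, wavelengths.size)
est = estimate_saturation(noisy, subset)
print(round(est, 2))
```

The same logic scales to evaluating every combination of candidate wavelengths, as the paper does for its fourteen candidates.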
Veneri, Giacomo; Federico, Antonio; Rufa, Alessandra
2014-01-01
Attention allows us to selectively process the vast amount of information with which we are confronted, prioritizing some aspects of information and ignoring others by focusing on a certain location or aspect of the visual scene. Selective attention is guided by two cognitive mechanisms: saliency of the image (bottom up) and endogenous mechanisms (top down). These two mechanisms interact to direct attention and plan eye movements; then, the movement profile is sent to the motor system, which must constantly update the command needed to produce the desired eye movement. A new approach is described here to study how the eye motor control could influence this selection mechanism in clinical behavior: two groups of patients (SCA2 and late onset cerebellar ataxia LOCA) with well-known problems of motor control were studied; patients performed a cognitively demanding task; the results were compared to a stochastic model based on Monte Carlo simulations and a group of healthy subjects. The analytical procedure evaluated some energy functions for understanding the process. The implemented model suggested that patients performed an optimal visual search, reducing intrinsic noise sources. Our findings theorize a strict correlation between the "optimal motor system" and the "optimal stimulus encoders."
Eads, David A.; Jachowski, David S.; Biggins, Dean E.; Livieri, Travis M.; Matchett, Marc R.; Millspaugh, Joshua J.
2012-01-01
Wildlife-habitat relationships are often conceptualized as resource selection functions (RSFs)—models increasingly used to estimate species distributions and prioritize habitat conservation. We evaluated the predictive capabilities of 2 black-footed ferret (Mustela nigripes) RSFs developed on a 452-ha colony of black-tailed prairie dogs (Cynomys ludovicianus) in the Conata Basin, South Dakota. We used the RSFs to project the relative probability of occurrence of ferrets throughout an adjacent 227-ha colony. We evaluated performance of the RSFs using ferret space use data collected via postbreeding spotlight surveys June–October 2005–2006. In home ranges and core areas, ferrets selected the predicted "very high" and "high" occurrence categories of both RSFs. Count metrics also suggested selection of these categories; for each model in each year, approximately 81% of ferret locations occurred in areas of very high or high predicted occurrence. These results suggest usefulness of the RSFs in estimating the distribution of ferrets throughout a black-tailed prairie dog colony. The RSFs provide a fine-scale habitat assessment for ferrets that can be used to prioritize releases of ferrets and habitat restoration for prairie dogs and ferrets. A method to quickly inventory the distribution of prairie dog burrow openings would greatly facilitate application of the RSFs.
Kirsch, Florian
2015-01-01
Diabetes is the most expensive chronic disease; therefore, disease management programs (DMPs) were introduced. The aim of this review is to determine whether Markov models are adequate to evaluate the cost-effectiveness of complex interventions such as DMPs. Additionally, the quality of the models was evaluated using the Philips and Caro quality appraisals. The five reviewed models incorporated the DMP into the model differently: two models integrated effectiveness rates derived from one clinical trial/meta-analysis and three models combined interventions from different sources into a DMP. The results range from cost savings and a QALY gain to costs of US$85,087 per QALY. Spearman's rank coefficient indicates no correlation between the two quality appraisals. With restrictions to the data selection process, Markov models are adequate to determine the cost-effectiveness of DMPs; however, to allow prioritization of medical services, more flexibility in the models is necessary to enable the evaluation of single additional interventions.
Feasibility of quasi-random band model in evaluating atmospheric radiance
NASA Technical Reports Server (NTRS)
Tiwari, S. N.; Mirakhur, N.
1980-01-01
The use of the quasi-random band model in evaluating upwelling atmospheric radiation is investigated. The spectral transmittance and total band absorptance are evaluated for selected molecular bands by using the line-by-line model, quasi-random band model, exponential sum fit method, and empirical correlations, and these are compared with the available experimental results. The atmospheric transmittance and upwelling radiance were calculated by using the line-by-line and quasi-random band models and were compared with the results of an existing program called LOWTRAN. The results obtained by the exponential sum fit and empirical relations were not in good agreement with experimental results, and their use cannot be justified for atmospheric studies. The line-by-line model was found to be the best model for atmospheric applications, but it is not practical because of high computational costs. The results of the quasi-random band model compare well with the line-by-line and experimental results. The use of the quasi-random band model is recommended for evaluation of atmospheric radiation.
Variable Selection for Regression Models of Percentile Flows
NASA Astrophysics Data System (ADS)
Fouad, G.
2017-12-01
Percentile flows describe the flow magnitude equaled or exceeded for a given percent of time, and are widely used in water resource management. However, these statistics are normally unavailable since most basins are ungauged. Percentile flows of ungauged basins are often predicted using regression models based on readily observable basin characteristics, such as mean elevation. The number of these independent variables is too large to evaluate all possible models. A subset of models is typically evaluated using automatic procedures, like stepwise regression. This ignores a large variety of methods from the field of feature (variable) selection and physical understanding of percentile flows. A study of 918 basins in the United States was conducted to compare an automatic regression procedure to the following variable selection methods: (1) principal component analysis, (2) correlation analysis, (3) random forests, (4) genetic programming, (5) Bayesian networks, and (6) physical understanding. The automatic regression procedure only performed better than principal component analysis. Poor performance of the regression procedure was due to a commonly used filter for multicollinearity, which rejected the strongest models because they had cross-correlated independent variables. Multicollinearity did not decrease model performance in validation because of a representative set of calibration basins. Variable selection methods based strictly on predictive power (numbers 2-5 from above) performed similarly, likely indicating a limit to the predictive power of the variables. Similar performance was also reached using variables selected based on physical understanding, a finding that substantiates recent calls to emphasize physical understanding in modeling for predictions in ungauged basins. The strongest variables highlighted the importance of geology and land cover, whereas widely used topographic variables were the weakest predictors. 
Variables suffered from a high degree of multicollinearity, possibly illustrating the co-evolution of climatic and physiographic conditions. Given the ineffectiveness of many variables used here, future work should develop new variables that target specific processes associated with percentile flows.
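The multicollinearity finding can be reproduced in miniature: an in-sample best-subset search keeps a pair of cross-correlated predictors that a common |r| > 0.7 filter would reject, so the filter discards the strongest model. The synthetic data and the 0.7 cut-off are assumptions for illustration, not the study's 918-basin dataset.

```python
# Hedged sketch: why a multicollinearity filter can reject the strongest model.
import numpy as np
from itertools import combinations

rng = np.random.default_rng(1)
n = 200
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + 0.6 * rng.normal(size=n)   # cross-correlated with x1 (r ~ 0.8)
x3 = rng.normal(size=n)
y = x1 + x2 + 0.5 * rng.normal(size=n)      # both correlated predictors matter
X = np.column_stack([x1, x2, x3])

def r2(cols):
    """OLS fit on the chosen columns; in-sample R^2."""
    A = np.column_stack([np.ones(n), X[:, list(cols)]])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    return 1.0 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

subsets = [c for k in (1, 2) for c in combinations(range(3), k)]
best = max(subsets, key=r2)

# A common filter: reject any subset containing a pair with |r| > 0.7.
allowed = [c for c in subsets
           if all(abs(np.corrcoef(X[:, a], X[:, b])[0, 1]) <= 0.7
                  for a, b in combinations(c, 2))]
best_filtered = max(allowed, key=r2)
print(best, best_filtered)
```

The unfiltered search selects the correlated pair because each variable explains real, distinct variance; the filtered search is forced into a weaker model.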
Interactive model evaluation tool based on IPython notebook
NASA Astrophysics Data System (ADS)
Balemans, Sophie; Van Hoey, Stijn; Nopens, Ingmar; Seuntjes, Piet
2015-04-01
In hydrological modelling, some kind of parameter optimization is usually performed. This can be the selection of a single best parameter set, a split into behavioural and non-behavioural parameter sets based on a selected threshold, or a posterior parameter distribution derived with a formal Bayesian approach. The selection of the criterion to measure the goodness of fit (likelihood or any objective function) is an essential step in all of these methodologies and will affect the final selected parameter subset. Moreover, the discriminative power of the objective function also depends on the time period used. In practice, the optimization process is an iterative procedure. As such, in the course of the modelling process, an increasing number of simulations is performed. However, the information carried by these simulation outputs is not always fully exploited. In this respect, we developed and present an interactive environment that enables the user to intuitively evaluate model performance. The aim is to explore the parameter space graphically and to visualize the impact of the selected objective function on model behaviour. First, a set of model simulation results is loaded along with the corresponding parameter sets and a data set of the same variable as the model outcome (mostly discharge). The ranges of the loaded parameter sets define the parameter space. The user selects the two parameters to be visualised, an objective function, and a time period of interest. Based on this information, a two-dimensional parameter response surface is created: a scatter plot of the parameter combinations, color-coded by the goodness of fit of each combination. Finally, a slider is available to change the color mapping of the points.
The slider provides a threshold that excludes non-behavioural parameter sets, so that the color scale is attributed only to the remaining parameter sets. As such, by interactively changing the settings and interpreting the graph, the user gains insight into the model's structural behaviour. Moreover, a more deliberate choice of objective function and periods of high information content can be identified. The environment is written in an IPython notebook and uses the interactive functions provided by the IPython community. As such, the power of the IPython notebook as a development environment for scientific computing is illustrated (Shen, 2014).
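A non-graphical sketch of the notebook's core computation follows: evaluate an objective function over a two-parameter grid, then apply a behavioural threshold (the slider's role) to split the parameter sets. The toy linear model, grids, and threshold value are assumptions for illustration.

```python
# Hedged sketch: a minimal parameter response surface with a behavioural
# threshold. A toy linear model stands in for a hydrological model.
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(0.0, 1.0, 50)
observed = 2.0 * x + 1.0 + rng.normal(0.0, 0.05, 50)   # "measured discharge"

def objective(a, b):
    """Goodness-of-fit (RMSE) for one parameter combination."""
    return float(np.sqrt(np.mean((a * x + b - observed) ** 2)))

a_grid = np.linspace(0.0, 4.0, 41)   # first visualised parameter
b_grid = np.linspace(0.0, 2.0, 21)   # second visualised parameter
surface = np.array([[objective(a, b) for b in b_grid] for a in a_grid])

threshold = 0.1    # the slider: a set is behavioural if RMSE <= threshold
behavioural = np.argwhere(surface <= threshold)
print(round(float(surface.min()), 3), len(behavioural))
```

In the notebook this surface is drawn as a color-coded scatter plot and the threshold is moved interactively; here it simply partitions the grid.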
Distributed-parameter watershed models are often utilized for evaluating the effectiveness of sediment and nutrient abatement strategies through the traditional {calibrate → validate → predict} approach. The applicability of the method is limited due to modeling approximations. In ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kaczmarski, Krzysztof; Guiochon, Georges A
The adsorption isotherms of selected compounds are our main source of information on the mechanisms of adsorption processes. Thus, the selection of the methods used to determine adsorption isotherm data and to evaluate the errors made is critical. Three chromatographic methods were evaluated, frontal analysis (FA), frontal analysis by characteristic point (FACP), and the pulse or perturbation method (PM), and their accuracies were compared. Using the equilibrium-dispersive (ED) model of chromatography, breakthrough curves of single components were generated corresponding to three different adsorption isotherm models: the Langmuir, the bi-Langmuir, and the Moreau isotherms. For each breakthrough curve, the best conventional procedures of each method (FA, FACP, PM) were used to calculate the corresponding data point, using typical values of the parameters of each isotherm model, for four different values of the column efficiency (N = 500, 1000, 2000, and 10,000). Then, the data points were fitted to each isotherm model and the corresponding isotherm parameters were compared to those of the initial isotherm model. When isotherm data are derived with a chromatographic method, they may suffer from two types of errors: (1) the errors made in deriving the experimental data points from the chromatographic records; (2) the errors made in selecting an incorrect isotherm model and fitting to it the experimental data. Both errors decrease significantly with increasing column efficiency with FA and FACP, but not with PM.
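The second error source, fitting measured data points to a candidate isotherm model, can be sketched as follows: synthetic Langmuir data are generated with small errors and the parameters recovered by least squares. The parameter values, noise level, and grid search are illustrative assumptions standing in for the paper's FA/FACP/PM procedures and nonlinear fitting.

```python
# Hedged sketch: fit synthetic adsorption data to a Langmuir isotherm and
# check parameter recovery. Values are illustrative, not from the paper.
import numpy as np

qs_true, b_true = 10.0, 0.5          # assumed saturation capacity and constant
C = np.linspace(0.1, 20.0, 15)       # mobile-phase concentrations
rng = np.random.default_rng(3)
# Langmuir isotherm q = qs*b*C/(1+b*C), with 0.5% multiplicative error.
q = qs_true * b_true * C / (1.0 + b_true * C) * (1 + rng.normal(0, 0.005, 15))

def langmuir(params):
    qs, b = params
    return qs * b * C / (1.0 + b * C)

# Simple grid search in place of a full nonlinear least-squares solver.
qs_grid = np.linspace(5.0, 15.0, 201)
b_grid = np.linspace(0.1, 1.0, 181)
best = min(((qs, b) for qs in qs_grid for b in b_grid),
           key=lambda p: float(np.sum((langmuir(p) - q) ** 2)))
print(best)
```

Fitting the same data points to a mismatched model (e.g. bi-Langmuir data to a Langmuir) would illustrate the model-selection error the abstract distinguishes from the measurement error.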
Endometrial cancer risk prediction including serum-based biomarkers: results from the EPIC cohort.
Fortner, Renée T; Hüsing, Anika; Kühn, Tilman; Konar, Meric; Overvad, Kim; Tjønneland, Anne; Hansen, Louise; Boutron-Ruault, Marie-Christine; Severi, Gianluca; Fournier, Agnès; Boeing, Heiner; Trichopoulou, Antonia; Benetou, Vasiliki; Orfanos, Philippos; Masala, Giovanna; Agnoli, Claudia; Mattiello, Amalia; Tumino, Rosario; Sacerdote, Carlotta; Bueno-de-Mesquita, H B As; Peeters, Petra H M; Weiderpass, Elisabete; Gram, Inger T; Gavrilyuk, Oxana; Quirós, J Ramón; Maria Huerta, José; Ardanaz, Eva; Larrañaga, Nerea; Lujan-Barroso, Leila; Sánchez-Cantalejo, Emilio; Butt, Salma Tunå; Borgquist, Signe; Idahl, Annika; Lundin, Eva; Khaw, Kay-Tee; Allen, Naomi E; Rinaldi, Sabina; Dossus, Laure; Gunter, Marc; Merritt, Melissa A; Tzoulaki, Ioanna; Riboli, Elio; Kaaks, Rudolf
2017-03-15
Endometrial cancer risk prediction models including lifestyle, anthropometric and reproductive factors have limited discrimination. Adding biomarker data to these models may improve predictive capacity; to our knowledge, this has not been investigated for endometrial cancer. Using a nested case-control study within the European Prospective Investigation into Cancer and Nutrition (EPIC) cohort, we investigated the improvement in discrimination gained by adding serum biomarker concentrations to risk estimates derived from an existing risk prediction model based on epidemiologic factors. Serum concentrations of sex steroid hormones, metabolic markers, growth factors, adipokines and cytokines were evaluated in a step-wise backward selection process; biomarkers were retained at p < 0.157 indicating improvement in the Akaike information criterion (AIC). Improvement in discrimination was assessed using the C-statistic for all biomarkers alone, and change in C-statistic from addition of biomarkers to preexisting absolute risk estimates. We used internal validation with bootstrapping (1000-fold) to adjust for over-fitting. Adiponectin, estrone, interleukin-1 receptor antagonist, tumor necrosis factor-alpha and triglycerides were selected into the model. After accounting for over-fitting, discrimination was improved by 2.0 percentage points when all evaluated biomarkers were included and 1.7 percentage points in the model including the selected biomarkers. Models including etiologic markers on independent pathways and genetic markers may further improve discrimination. © 2016 UICC.
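The step-wise backward selection used for the biomarkers can be sketched with a linear model and the AIC rule: a variable is retained only if dropping it worsens the AIC, which corresponds roughly to the p < 0.157 criterion. The data, the linear (rather than case-control) model, and the variable count below are illustrative assumptions.

```python
# Hedged sketch: backward elimination that drops a variable only when the
# drop lowers AIC. Synthetic data; OLS stands in for the paper's risk model.
import numpy as np

rng = np.random.default_rng(4)
n, p = 300, 6
X = rng.normal(size=(n, p))
y = 1.5 * X[:, 0] - 1.0 * X[:, 2] + rng.normal(size=n)  # only 0 and 2 matter

def aic(cols):
    """Gaussian-likelihood AIC for an OLS fit on the given columns."""
    A = np.column_stack([np.ones(n)] + [X[:, c] for c in cols])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    rss = float(np.sum((y - A @ beta) ** 2))
    return n * np.log(rss / n) + 2 * (len(cols) + 1)

selected = list(range(p))
improved = True
while improved and selected:
    improved = False
    current = aic(selected)
    # Candidate: the variable whose removal lowers AIC the most.
    drops = sorted(selected, key=lambda c: aic([s for s in selected if s != c]))
    candidate = [s for s in selected if s != drops[0]]
    if aic(candidate) < current:
        selected = candidate
        improved = True
print(sorted(selected))
```

The two informative variables survive because removing either raises the AIC sharply, while uninformative ones are usually pruned.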
How to determine an optimal threshold to classify real-time crash-prone traffic conditions?
Yang, Kui; Yu, Rongjie; Wang, Xuesong; Quddus, Mohammed; Xue, Lifang
2018-08-01
One of the proactive approaches to reducing traffic crashes is to identify hazardous traffic conditions that may lead to a crash, known as real-time crash prediction. Threshold selection is one of the essential steps of real-time crash prediction: once a crash risk evaluation model has estimated the probability of a crash occurring given a specific traffic condition, the threshold provides the cut-off point for that posterior probability, separating potential crash warnings from normal traffic conditions. There is, however, a dearth of research on how to determine an optimal threshold effectively. The few studies that touch on the issue use subjective methods, and only when discussing the predictive performance of the models; subjective methods cannot automatically identify the optimal thresholds under different traffic and weather conditions in real applications. Thus, a theoretical method to select the threshold value is necessary to avoid subjective judgments. The purpose of this study is to provide a theoretical method for automatically identifying the optimal threshold. Considering the random effects of variable factors across all roadway segments, a mixed logit model was utilized to develop the crash risk evaluation model and further evaluate the crash risk. Cross-entropy, between-class variance and other theories were employed and investigated to empirically identify the optimal threshold, and K-fold cross-validation was used to validate the performance of the proposed threshold selection methods against several evaluation criteria. The results indicate that (i) the mixed logit model obtains a good performance; and (ii) the classification performance of the threshold selected by the minimum cross-entropy method outperforms the other methods according to the criteria.
The minimum cross-entropy method can thus automatically identify well-behaved thresholds in crash prediction by minimizing the cross-entropy between the original dataset, with its continuous probability of a crash occurring, and the binarized dataset obtained after using the threshold to separate potential crash warnings from normal traffic conditions. Copyright © 2018 Elsevier Ltd. All rights reserved.
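The minimum cross-entropy idea can be sketched as follows: for each candidate cut-off, binarize the predicted crash probabilities and measure how surprising that binarization is under the probabilities themselves, then keep the cut-off with the smallest cross-entropy. The candidate grid and the synthetic risk-score distribution are assumptions, not the study's mixed-logit outputs.

```python
# Hedged sketch: pick the threshold whose binarisation minimises cross-entropy
# against the model's own continuous probabilities. Synthetic risk scores.
import numpy as np

rng = np.random.default_rng(5)
p = np.clip(rng.beta(2, 8, size=2000), 1e-6, 1 - 1e-6)  # skewed crash risks

def cross_entropy(t):
    b = (p >= t).astype(float)     # binarised: warnings vs normal conditions
    return float(-np.mean(b * np.log(p) + (1 - b) * np.log(1 - p)))

candidates = np.linspace(0.05, 0.95, 91)
best_t = min(candidates, key=cross_entropy)
print(round(float(best_t), 2))
```

As expected, the minimizer settles near the point where the predicted probability crosses 0.5, since each observation individually prefers the label its own probability favours; with real data the binarised labels would come from observed crash/non-crash outcomes instead.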
Žuvela, Petar; Liu, J Jay; Macur, Katarzyna; Bączek, Tomasz
2015-10-06
In this work, performance of five nature-inspired optimization algorithms, genetic algorithm (GA), particle swarm optimization (PSO), artificial bee colony (ABC), firefly algorithm (FA), and flower pollination algorithm (FPA), was compared in molecular descriptor selection for development of quantitative structure-retention relationship (QSRR) models for 83 peptides that originate from eight model proteins. The matrix with 423 descriptors was used as input, and QSRR models based on selected descriptors were built using partial least squares (PLS), whereas root mean square error of prediction (RMSEP) was used as a fitness function for their selection. Three performance criteria, prediction accuracy, computational cost, and the number of selected descriptors, were used to evaluate the developed QSRR models. The results show that all five variable selection methods outperform interval PLS (iPLS), sparse PLS (sPLS), and the full PLS model, whereas GA is superior because of its lowest computational cost and higher accuracy (RMSEP of 5.534%) with a smaller number of variables (nine descriptors). The GA-QSRR model was validated initially through Y-randomization. In addition, it was successfully validated with an external testing set out of 102 peptides originating from Bacillus subtilis proteomes (RMSEP of 22.030%). Its applicability domain was defined, from which it was evident that the developed GA-QSRR exhibited strong robustness. All the sources of the model's error were identified, thus allowing for further application of the developed methodology in proteomics.
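A miniature version of the GA-driven descriptor selection loop is sketched below, with ordinary least squares standing in for PLS and RMSEP on a held-out half as the fitness function. The population size, generation count, mutation rate, and synthetic descriptors are all assumptions for illustration.

```python
# Hedged sketch: genetic-algorithm descriptor selection with a held-out RMSEP
# fitness. OLS stands in for PLS; data are synthetic, not the peptide set.
import numpy as np

rng = np.random.default_rng(8)
n, d = 120, 20
X = rng.normal(size=(n, d))
y = X[:, 0] + 2.0 * X[:, 5] + 0.3 * rng.normal(size=n)  # two true descriptors

def rmsep(mask):
    """Fitness: prediction RMSE on a held-out half (lower is better)."""
    cols = np.flatnonzero(mask)
    if cols.size == 0:
        return 1e9
    A = np.column_stack([np.ones(n), X[:, cols]])
    half = n // 2
    beta, *_ = np.linalg.lstsq(A[:half], y[:half], rcond=None)
    return float(np.sqrt(np.mean((A[half:] @ beta - y[half:]) ** 2)))

pop = rng.random((30, d)) < 0.3                   # initial random subsets
for _ in range(40):
    fit = np.array([rmsep(m) for m in pop])
    parents = pop[np.argsort(fit)[:10]]           # truncation selection
    children = []
    for _ in range(len(pop)):
        a, b = parents[rng.integers(0, 10, 2)]
        cut = rng.integers(1, d)                  # one-point crossover
        child = np.concatenate([a[:cut], b[cut:]])
        flip = rng.random(d) < 0.05               # bit-flip mutation
        children.append(np.where(flip, ~child, child))
    pop = np.array(children)

best = pop[np.argmin([rmsep(m) for m in pop])]
print(sorted(np.flatnonzero(best).tolist()))
```

The selection pressure quickly concentrates the population on subsets that include the informative descriptors, mirroring how RMSEP drove descriptor choice in the QSRR models.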
Yu, Yuncui; Jia, Lulu; Meng, Yao; Hu, Lihua; Liu, Yiwei; Nie, Xiaolu; Zhang, Meng; Zhang, Xuan; Han, Sheng; Peng, Xiaoxia; Wang, Xiaoling
2018-04-01
Establishing a comprehensive clinical evaluation system is critical in enacting national drug policy and promoting rational drug use. In China, the 'Clinical Comprehensive Evaluation System for Pediatric Drugs' (CCES-P) project, which aims to compare drugs based on clinical efficacy and cost effectiveness to help decision makers, was recently proposed; therefore, a systematic and objective method is required to guide the process. An evidence-based multi-criteria decision analysis model that involved an analytic hierarchy process (AHP) was developed, consisting of nine steps: (1) select the drugs to be reviewed; (2) establish the evaluation criterion system; (3) determine the criterion weight based on the AHP; (4) construct the evidence body for each drug under evaluation; (5) select comparative measures and calculate the original utility score; (6) place a common utility scale and calculate the standardized utility score; (7) calculate the comprehensive utility score; (8) rank the drugs; and (9) perform a sensitivity analysis. The model was applied to the evaluation of three different inhaled corticosteroids (ICSs) used for asthma management in children (a total of 16 drugs with different dosage forms and strengths or different manufacturers). By applying the drug analysis model, the 16 ICSs under review were successfully scored and evaluated. Budesonide suspension for inhalation (drug ID number: 7) ranked the highest, with a comprehensive utility score of 80.23, followed by fluticasone propionate inhaled aerosol (drug ID number: 16), with a score of 79.59, and budesonide inhalation powder (drug ID number: 6), with a score of 78.98. In the sensitivity analysis, the ranking of the top five and lowest five drugs remained unchanged, suggesting the model is generally robust. An evidence-based drug evaluation model based on AHP was successfully developed. 
The model incorporates sufficient utility and flexibility for aiding the decision-making process, and can be a useful tool for the CCES-P.
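Step (3), determining criterion weights with the AHP, can be sketched as the classical principal-eigenvector computation on a pairwise comparison matrix, together with Saaty's consistency ratio. The 3×3 judgement matrix below is illustrative only, not the CCES-P criteria.

```python
# Hedged sketch of AHP weighting: principal eigenvector of a pairwise
# comparison matrix, plus Saaty's consistency ratio. Judgements are invented.
import numpy as np

# a_ij = how much more important criterion i is than j (Saaty's 1-9 scale).
A = np.array([[1.0,   3.0, 5.0],
              [1/3.0, 1.0, 2.0],
              [1/5.0, 0.5, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = int(np.argmax(eigvals.real))
w = np.abs(eigvecs[:, k].real)
weights = w / w.sum()                    # normalised criterion weights

# Consistency: CI = (lambda_max - n)/(n - 1); CR = CI/RI, with RI(3) = 0.58.
n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)
cr = ci / 0.58
print(np.round(weights, 3), round(float(cr), 3))
```

A CR below 0.1 is conventionally taken to mean the judgements are consistent enough to use; the weights then multiply the standardized utility scores in steps (5)-(7).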
Application of a Multidimensional Nested Logit Model to Multiple-Choice Test Items
ERIC Educational Resources Information Center
Bolt, Daniel M.; Wollack, James A.; Suh, Youngsuk
2012-01-01
Nested logit models have been presented as an alternative to multinomial logistic models for multiple-choice test items (Suh and Bolt in "Psychometrika" 75:454-473, 2010) and possess a mathematical structure that naturally lends itself to evaluating the incremental information provided by attending to distractor selection in scoring. One potential…
US/Canada wheat and barley crop calendar exploratory experiment implementation plan
NASA Technical Reports Server (NTRS)
1980-01-01
A plan is detailed for a supplemental experiment to evaluate several crop growth stage models and crop starter models. The objective of this experiment is to provide timely information to aid in understanding crop calendars and to provide data that will allow a selection between current crop calendar models.
ERIC Educational Resources Information Center
Hamilton, Erica R.; Rosenberg, Joshua M.; Akcaoglu, Mete
2016-01-01
The Substitution, Augmentation, Modification, and Redefinition (SAMR) model is a four-level, taxonomy-based approach for selecting, using, and evaluating technology in K-12 settings (Puentedura 2006). Despite its increasing popularity among practitioners, the SAMR model is not currently represented in the extant literature. To focus the ongoing…
Acquisition Management for Systems-of-Systems: Exploratory Model Development and Experimentation
2009-04-22
outputs of the Requirements Development and Logical Analysis processes into alternative design solutions and selects a final design solution. Decision Analysis provides the basis for evaluating and selecting alternatives when decisions need to be made. Implementation yields the lowest-level system. (Figure: dependency matrices for a) an example SoS and b) the model structure for the example SoS.)
Thermal sensing of cryogenic wind tunnel model surfaces Evaluation of silicon diodes
NASA Technical Reports Server (NTRS)
Daryabeigi, K.; Ash, R. L.; Dillon-Townes, L. A.
1986-01-01
Different sensors and installation techniques for surface temperature measurement of cryogenic wind tunnel models were investigated. Silicon diodes were selected for further consideration because of their good inherent accuracy. Their average absolute temperature deviation in comparison tests with standard platinum resistance thermometers was found to be 0.2 K in the range from 125 to 273 K. Subsurface temperature measurement was selected as the installation technique in order to minimize aerodynamic interference. Temperature distortion caused by an embedded silicon diode was studied numerically.
A Fuzzy-Based Decision Support Model for Selecting the Best Dialyser Flux in Haemodialysis.
Oztürk, Necla; Tozan, Hakan
2015-01-01
Decision making is an important procedure for every organization, and it is particularly challenging for complicated multi-criteria problems. Selection of dialyser flux is one of the decisions routinely made in haemodialysis treatment provided for chronic kidney failure patients. This study provides a decision support model for selecting the best dialyser flux between high-flux and low-flux dialyser alternatives. The preferences of decision makers were collected via a questionnaire; a total of 45 questionnaires completed by dialysis physicians and nephrologists were assessed. A hybrid fuzzy-based decision support software tool that enables the use of the Analytic Hierarchy Process (AHP), Fuzzy Analytic Hierarchy Process (FAHP), Analytic Network Process (ANP), and Fuzzy Analytic Network Process (FANP) was used to evaluate the flux selection model. In conclusion, the results showed that a high-flux dialyser is the best option for haemodialysis treatment.
2013-01-01
Drugs that selectively activate estrogen receptor β (ERβ) are potentially safer than the nonselective estrogens currently used in hormonal replacement treatments that activate both ERβ and ERα. The selective ERβ agonist AC-186 was evaluated in a rat model of Parkinson’s disease induced through bilateral 6-hydroxydopamine lesions of the substantia nigra. In this model, AC-186 prevented motor, cognitive, and sensorimotor gating deficits and mitigated the loss of dopamine neurons in the substantia nigra, in males, but not in females. Furthermore, in male rats, 17β-estradiol, which activates ERβ and ERα with equal potency, did not show the same neuroprotective benefits as AC-186. Hence, in addition to a beneficial safety profile for use in both males and females, a selective ERβ agonist has a differentiated pharmacological profile compared to 17β-estradiol in males. PMID:23898966
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1995-08-01
The TTI CM/AQ Evaluation Model evaluates potential projects based on the following criteria: eligibility, travel impacts, emission impacts, and cost-effectiveness. To compare independent projects within a region during the decision process for CM/AQ funding, each project evaluated with this model is given an overall score based on the project's effects for the criteria listed above. Training workshops were held by TTI in the first quarter of 1995 to teach metropolitan planning organization, state department of transportation, and regional air quality organization staff how to use this model. Basics of sketch-planning applications were also taught. The DRCOG and TTI CM/AQ Evaluation Models represent significant steps toward the development of analytical methodologies for selecting projects for CM/AQ funding. Because the needs of nonattainment and attainment areas change over time, this model is particularly useful, as key evaluation criteria can be modified to reflect the changing needs of a metropolitan area.
Evaluation of a Nutrition Education Program for Family Practice Residents.
ERIC Educational Resources Information Center
Gray, David S.; And Others
1988-01-01
A nutrition education program at the University of South Alabama Medical Center that was based on the "co-counseling model" as described by Moore and Larsen is described. Patients with one of three problem areas were selected for evaluation: hypertension, diabetes mellitus, and pregnancy. (MLW)
Firefly as a novel swarm intelligence variable selection method in spectroscopy.
Goodarzi, Mohammad; dos Santos Coelho, Leandro
2014-12-10
A critical step in multivariate calibration is wavelength selection, which is used to build models with better prediction performance when applied to spectral data. Many feature selection techniques have been developed to date. Among them, techniques based on swarm intelligence optimization methodologies are particularly interesting, since they simulate collective animal and insect behavior, e.g., finding the shortest path between a food source and the nest. Because the decision is made by a crowd, the resulting model is more robust and less prone to falling into local minima during the optimization cycle. This paper presents a novel feature selection approach for spectroscopic data that leads to more robust calibration models. The performance of the firefly algorithm, a swarm intelligence paradigm, was evaluated and compared with the genetic algorithm and particle swarm optimization. All three techniques were coupled with partial least squares (PLS) and applied to three spectroscopic data sets. They demonstrate improved prediction results in comparison to a PLS model built using all wavelengths. Results show that the firefly algorithm, as a novel swarm paradigm, leads to a lower number of selected wavelengths while the prediction performance of the built PLS model stays the same. Copyright © 2014. Published by Elsevier B.V.
Clark, Steven M.; Dunham, Jason B.; McEnroe, Jeffery R.; Lightcap, Scott W.
2014-01-01
The fitness of female Pacific salmon (Oncorhynchus spp.) with respect to breeding behavior can be partitioned into at least four fitness components: survival to reproduction, competition for breeding sites, success of egg incubation, and suitability of the local environment near breeding sites for early rearing of juveniles. We evaluated the relative influences of habitat features linked to these fitness components with respect to selection of breeding sites by coho salmon (Oncorhynchus kisutch). We also evaluated associations between breeding site selection and additions of large wood, as the latter were introduced into the study system as a means of restoring habitat conditions to benefit coho salmon. We used a model selection approach to organize specific habitat features into groupings reflecting fitness components and influences of large wood. Results of this work suggest that female coho salmon likely select breeding sites based on a wide range of habitat features linked to all four hypothesized fitness components. More specifically, model parameter estimates indicated that breeding site selection was most strongly influenced by proximity to pool-tail crests and deeper water (mean and maximum depths). Linkages between large wood and breeding site selection were less clear. Overall, our findings suggest that breeding site selection by coho salmon is influenced by a suite of fitness components in addition to the egg incubation environment, which has been the emphasis of much work in the past.
Seismic activity prediction using computational intelligence techniques in northern Pakistan
NASA Astrophysics Data System (ADS)
Asim, Khawaja M.; Awais, Muhammad; Martínez-Álvarez, F.; Iqbal, Talat
2017-10-01
An earthquake prediction study is carried out for the region of northern Pakistan. The prediction methodology involves an interdisciplinary interaction of seismology and computational intelligence. Eight seismic parameters are computed from past earthquakes, and their predictive ability is evaluated in terms of information gain, leading to the selection of six parameters for prediction. Multiple computationally intelligent models have been developed for earthquake prediction using the selected seismic parameters, including a feed-forward neural network, recurrent neural network, random forest, multilayer perceptron, radial basis neural network, and support vector machine. The performance of every prediction model is evaluated, and McNemar's statistical test is applied to assess the statistical significance of the computational methodologies. The feed-forward neural network shows statistically significant predictions, with an accuracy of 75% and a positive predictive value of 78% for northern Pakistan.
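The parameter-screening step above ranks features by information gain. A minimal sketch of that criterion for discretized features (the data below are hypothetical, not the paper's seismic parameters):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy H(Y) in bits."""
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def information_gain(feature, labels):
    """IG(Y; X) = H(Y) - sum_v P(X=v) * H(Y | X=v) for a discrete feature."""
    n = len(labels)
    cond = 0.0
    for v in set(feature):
        sub = [y for x, y in zip(feature, labels) if x == v]
        cond += len(sub) / n * entropy(sub)
    return entropy(labels) - cond

# Rank hypothetical discretized parameters by gain and keep the top ones.
labels = [1, 1, 0, 0, 1, 0]  # e.g., "large event within the window?"
params = {"b_value": ["hi", "hi", "lo", "lo", "hi", "lo"],   # predictive
          "noise":   ["a", "b", "a", "b", "a", "b"]}         # uninformative
ranked = sorted(params, key=lambda p: information_gain(params[p], labels),
                reverse=True)
```

Selecting the six highest-gain parameters out of eight corresponds to keeping the top entries of such a ranking.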
Optimal experimental designs for fMRI when the model matrix is uncertain.
Kao, Ming-Hung; Zhou, Lin
2017-07-15
This study concerns optimal designs for functional magnetic resonance imaging (fMRI) experiments when the model matrix of the statistical model depends on both the selected stimulus sequence (fMRI design) and the subject's uncertain feedback (e.g., answer) to each mental stimulus (e.g., question) presented to her/him. While practically important, this design issue is challenging, mainly because the information matrix cannot be fully determined at the design stage, making it difficult to evaluate the quality of the selected designs. To tackle this issue, we propose an easy-to-use optimality criterion for evaluating the quality of designs, and an efficient approach for obtaining designs that optimize this criterion. Compared with a previously proposed method, our approach requires much less computing time to achieve designs with high statistical efficiencies.
NASA Astrophysics Data System (ADS)
Zoraghi, Nima; Amiri, Maghsoud; Talebi, Golnaz; Zowghi, Mahdi
2013-12-01
This paper presents a fuzzy multi-criteria decision-making (FMCDM) model that integrates both subjective and objective weights for ranking and evaluating service quality in hotels. The objective method derives criterion weights through mathematical calculation, while the subjective method uses the judgments of decision makers. We use a combination of weights obtained by both approaches to evaluate service quality in the hotel industry. A real case study ranking five hotels is presented, and examples are shown to illustrate the capabilities of the proposed method.
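The abstract does not name its objective weighting scheme; entropy weighting is one common choice for deriving weights "through mathematical calculation", so the following sketch uses it as an assumption, combined convexly with expert weights:

```python
import math

def entropy_weights(matrix):
    """Objective criterion weights from a decision matrix (rows = alternatives).

    Criteria whose values are spread more unevenly across alternatives carry
    more discriminating information and receive larger weights. Assumes at
    least one criterion is non-uniform (otherwise all divergences are zero).
    """
    m = len(matrix)
    weights = []
    for col in zip(*matrix):
        total = sum(col)
        p = [x / total for x in col]
        e = -sum(pi * math.log(pi) for pi in p if pi > 0) / math.log(m)
        weights.append(1 - e)  # divergence: higher spread -> higher weight
    s = sum(weights)
    return [w / s for w in weights]

def combine(subjective, objective, alpha=0.5):
    """Convex combination of subjective (expert) and objective weights."""
    w = [alpha * s + (1 - alpha) * o for s, o in zip(subjective, objective)]
    total = sum(w)
    return [wi / total for wi in w]
```

Usage: `combine([0.9, 0.1], entropy_weights([[7, 200], [8, 100], [9, 300]]))` yields a normalized weight vector blending both views.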
USDA-ARS?s Scientific Manuscript database
Genomic Selection (GS) is a new breeding method in which genome-wide markers are used to predict the breeding value of individuals in a breeding population. GS has been shown to improve breeding efficiency in dairy cattle and several crop plant species, and here we evaluate for the first time its ef...
An Integrated DEMATEL-VIKOR Method-Based Approach for Cotton Fibre Selection and Evaluation
NASA Astrophysics Data System (ADS)
Chakraborty, Shankar; Chatterjee, Prasenjit; Prasad, Kanika
2018-01-01
Selection of the most appropriate cotton fibre type for yarn manufacturing is often treated as a multi-criteria decision-making (MCDM) problem, as the optimal selection decision needs to be taken in the presence of several conflicting fibre properties. In this paper, two popular MCDM methods, decision making trial and evaluation laboratory (DEMATEL) and VIse Kriterijumska Optimizacija kompromisno Resenje (VIKOR), are integrated to aid the cotton fibre selection decision. The DEMATEL method addresses the interrelationships between various physical properties of cotton fibres while segregating them into cause and effect groups, whereas the VIKOR method ranks all 17 considered cotton fibres from best to worst. The derived ranking of cotton fibre alternatives closely matches that obtained by past researchers. This model can assist spinning industry personnel in the blending process in making accurate fibre selection decisions when cotton fibre properties are numerous and interrelated.
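The VIKOR ranking step named above follows a standard scoring recipe: compute each alternative's group utility S, individual regret R, and compromise index Q, then rank by Q (lower is better). A compact sketch on illustrative data (not the paper's 17 fibres; assumes no criterion is constant across alternatives):

```python
def vikor(matrix, weights, benefit, v=0.5):
    """Rank alternatives with VIKOR; returns (indices best-first, Q scores).

    matrix: rows = alternatives, columns = criteria
    benefit: True where larger-is-better for that criterion
    v: weight of group utility vs. individual regret
    """
    cols = list(zip(*matrix))
    f_best = [max(c) if b else min(c) for c, b in zip(cols, benefit)]
    f_worst = [min(c) if b else max(c) for c, b in zip(cols, benefit)]
    S, R = [], []
    for row in matrix:
        # normalized weighted distance from the ideal value per criterion
        terms = [w * (fb - x) / (fb - fw)
                 for x, w, fb, fw in zip(row, weights, f_best, f_worst)]
        S.append(sum(terms))   # group utility
        R.append(max(terms))   # individual regret
    Q = [v * (s - min(S)) / (max(S) - min(S))
         + (1 - v) * (r - min(R)) / (max(R) - min(R))
         for s, r in zip(S, R)]
    return sorted(range(len(matrix)), key=lambda i: Q[i]), Q
```

Note that `(f_best - x) / (f_best - f_worst)` handles benefit and cost criteria uniformly, since both numerator and denominator flip sign for cost criteria.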
Enhancing the Performance of LibSVM Classifier by Kernel F-Score Feature Selection
NASA Astrophysics Data System (ADS)
Sarojini, Balakrishnan; Ramaraj, Narayanasamy; Nickolas, Savarimuthu
Medical data mining is the search for relationships and patterns within medical datasets that could provide useful knowledge for effective clinical decisions. The inclusion of irrelevant, redundant and noisy features in the process model results in poor predictive accuracy. Much research in data mining has gone into improving the predictive accuracy of classifiers by applying feature selection techniques. Feature selection is valuable in medical data mining because diagnosis can then be made in this patient-care activity with a minimum number of significant features. The objective of this work is to show that selecting the more significant features improves the performance of the classifier. We empirically evaluate the classification effectiveness of the LibSVM classifier on the reduced feature subset of a diabetes dataset. The evaluations suggest that the selected feature subset improves the predictive accuracy of the classifier and reduces false negatives and false positives.
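The paper's kernel F-score builds on the classical F-score, a simple ratio of between-class to within-class scatter for one feature under binary labels. A sketch of the classical (non-kernel) version on made-up data:

```python
def f_score(values, labels):
    """F-score of one feature for binary labels (larger = more discriminative).

    Ratio of the squared deviations of the class means from the overall mean
    to the sum of the within-class sample variances.
    """
    pos = [v for v, y in zip(values, labels) if y == 1]
    neg = [v for v, y in zip(values, labels) if y == 0]
    mean = sum(values) / len(values)
    mp, mn = sum(pos) / len(pos), sum(neg) / len(neg)
    var_p = sum((v - mp) ** 2 for v in pos) / (len(pos) - 1)
    var_n = sum((v - mn) ** 2 for v in neg) / (len(neg) - 1)
    return ((mp - mean) ** 2 + (mn - mean) ** 2) / (var_p + var_n)

# A feature that separates the classes scores far higher than a noisy one.
labels = [0, 0, 0, 1, 1, 1]
separable = [1.0, 1.1, 0.9, 5.0, 5.1, 4.9]
noisy = [1.0, 5.0, 1.0, 5.0, 1.0, 5.0]
```

Ranking features by this score and keeping the top subset is the selection step whose effect on LibSVM accuracy the abstract evaluates.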
2014-10-01
variability with well-trained readers. Figure 7: comparison between the PD (percent density using Cumulus area) and the automatic PD. The… evaluation of outlier correction, comparison of several different software methods, precision measurement, and evaluation of variation by mammography… chart review for selected cases (months 4-6). Comparison of information from the Breast Cancer Database and medical records showed good consistency
Lippman, Sheri A.; Shade, Starley B.; Hubbard, Alan E.
2011-01-01
Background Intervention effects estimated from non-randomized intervention studies are plagued by biases, yet social or structural intervention studies are rarely randomized. There are underutilized statistical methods available to mitigate biases due to self-selection, missing data, and confounding in longitudinal, observational data, permitting estimation of causal effects. We demonstrate the use of Inverse Probability Weighting (IPW) to evaluate the effect of participating in a combined clinical and social STI/HIV prevention intervention on reduction of incident chlamydia and gonorrhea infections among sex workers in Brazil. Methods We demonstrate the step-by-step use of IPW, including presentation of the theoretical background, data set up, model selection for weighting, application of weights, estimation of effects using varied modeling procedures, and discussion of assumptions for use of IPW. Results 420 sex workers contributed data on 840 incident chlamydia and gonorrhea infections. Participants were compared to non-participants following application of inverse probability weights to correct for differences in covariate patterns between exposed and unexposed participants and between those who remained in the intervention and those who were lost to follow-up. Estimators using four model selection procedures provided estimates of intervention effect between odds ratio (OR) 0.43 (95% CI: 0.22-0.85) and 0.53 (95% CI: 0.26-1.1). Conclusions After correcting for selection bias, loss to follow-up, and confounding, our analysis suggests a protective effect of participating in the Encontros intervention. Evaluations of behavioral, social, and multi-level interventions to prevent STI can benefit from the introduction of weighting methods such as IPW. PMID:20375927
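The core IPW mechanic described above, weighting each subject by the inverse of their estimated probability of the exposure they actually received, can be sketched as follows (toy data and a precomputed propensity vector; the paper also fits the propensity model itself and handles loss to follow-up):

```python
def ipw_means(outcome, exposed, propensity):
    """Inverse-probability-weighted outcome means for exposed vs. unexposed.

    propensity[i] = estimated P(exposed | covariates_i); weights are 1/p for
    the exposed and 1/(1-p) for the unexposed, so each group is reweighted
    to resemble the full cohort's covariate distribution.
    """
    w = [1 / p if a else 1 / (1 - p) for a, p in zip(exposed, propensity)]

    def wmean(group):
        num = sum(wi * y for y, a, wi in zip(outcome, exposed, w) if a == group)
        den = sum(wi for a, wi in zip(exposed, w) if a == group)
        return num / den

    return wmean(1), wmean(0)

# Hypothetical cohort: binary infection outcome, exposure flag, propensities.
outcome = [1, 0, 1, 0]
exposed = [1, 1, 0, 0]
propensity = [0.5, 0.5, 0.5, 0.5]
```

The weighted group means (or their odds ratio) give the effect estimate; in practice the propensities come from a logistic model of exposure on baseline covariates.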
Wolc, Anna; Stricker, Chris; Arango, Jesus; Settar, Petek; Fulton, Janet E; O'Sullivan, Neil P; Preisinger, Rudolf; Habier, David; Fernando, Rohan; Garrick, Dorian J; Lamont, Susan J; Dekkers, Jack C M
2011-01-21
Genomic selection involves breeding value estimation of selection candidates based on high-density SNP genotypes. To quantify the potential benefit of genomic selection, accuracies of estimated breeding values (EBV) obtained with different methods using pedigree or high-density SNP genotypes were evaluated and compared in a commercial layer chicken breeding line. The following traits were analyzed: egg production, egg weight, egg color, shell strength, age at sexual maturity, body weight, albumen height, and yolk weight. Predictions appropriate for early or late selection were compared. A total of 2,708 birds were genotyped for 23,356 segregating SNP, including 1,563 females with records. Phenotypes on relatives without genotypes were incorporated in the analysis (in total 13,049 production records). The data were analyzed with a Reduced Animal Model using a relationship matrix based on pedigree data or on marker genotypes and with a Bayesian method using model averaging. Using a validation set that consisted of individuals from the generation following training, these methods were compared by correlating EBV with phenotypes corrected for fixed effects, selecting the top 30 individuals based on EBV and evaluating their mean phenotype, and by regressing phenotypes on EBV. Using high-density SNP genotypes increased accuracies of EBV up to two-fold for selection at an early age and by up to 88% for selection at a later age. Accuracy increases at an early age can be mostly attributed to improved estimates of parental EBV for shell quality and egg production, while for other egg quality traits it is mostly due to improved estimates of Mendelian sampling effects. A relatively small number of markers was sufficient to explain most of the genetic variation for egg weight and body weight.
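The marker-based relationship matrix mentioned above is commonly built with VanRaden's construction; the abstract does not specify the exact formula used, so the following is a hedged sketch of that standard construction on a toy genotype matrix:

```python
def genomic_relationship(M):
    """VanRaden genomic relationship matrix from 0/1/2 allele-count codes.

    M: list of individuals, each a list of allele counts per SNP.
    G = Z Z' / (2 * sum_j p_j (1 - p_j)), where p_j is the allele frequency
    at SNP j and Z = M - 2p centres each SNP column.
    """
    n, m = len(M), len(M[0])
    p = [sum(row[j] for row in M) / (2 * n) for j in range(m)]
    Z = [[row[j] - 2 * p[j] for j in range(m)] for row in M]
    denom = 2 * sum(pj * (1 - pj) for pj in p)
    return [[sum(Z[i][k] * Z[j][k] for k in range(m)) / denom
             for j in range(n)] for i in range(n)]
```

Replacing the pedigree numerator relationship matrix with G in the animal model is what turns pedigree BLUP into genomic BLUP.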
Aesthetic evolution by mate choice: Darwin's really dangerous idea.
Prum, Richard O
2012-08-19
Darwin proposed an explicitly aesthetic theory of sexual selection in which he described mate preferences as a 'taste for the beautiful', an 'aesthetic capacity', etc. These statements were not merely colourful Victorian mannerisms, but explicit expressions of Darwin's hypothesis that mate preferences can evolve for arbitrarily attractive traits that do not provide any additional benefits to mate choice. In his critique of Darwin, A. R. Wallace proposed an entirely modern mechanism of mate preference evolution through the correlation of display traits with male vigour or viability, but he called this mechanism natural selection. Wallace's honest advertisement proposal was stridently anti-Darwinian and anti-aesthetic. Most modern sexual selection research relies on essentially the same Neo-Wallacean theory renamed as sexual selection. I define the process of aesthetic evolution as the evolution of a communication signal through sensory/cognitive evaluation, which is most elaborated through coevolution of the signal and its evaluation. Sensory evaluation includes the possibility that display traits do not encode information that is being assessed, but are merely preferred. A genuinely Darwinian, aesthetic theory of sexual selection requires the incorporation of the Lande-Kirkpatrick null model into sexual selection research, but also encompasses the possibility of sensory bias, good genes and direct benefits mechanisms.
Model Identification of Integrated ARMA Processes
ERIC Educational Resources Information Center
Stadnytska, Tetiana; Braun, Simone; Werner, Joachim
2008-01-01
This article evaluates the Smallest Canonical Correlation Method (SCAN) and the Extended Sample Autocorrelation Function (ESACF), automated methods for the Autoregressive Integrated Moving-Average (ARIMA) model selection commonly available in current versions of SAS for Windows, as identification tools for integrated processes. SCAN and ESACF can…
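Both SCAN and ESACF are built from sample autocorrelations of the (possibly differenced) series. As background, not as SAS's implementation, here is a sketch of the two basic building blocks, the sample ACF and first differencing used to reduce an integrated process toward stationarity:

```python
def sample_acf(x, max_lag):
    """Sample autocorrelation function r_k = c_k / c_0 for lags 0..max_lag."""
    n = len(x)
    mean = sum(x) / n
    c0 = sum((v - mean) ** 2 for v in x) / n
    acf = []
    for k in range(max_lag + 1):
        ck = sum((x[t] - mean) * (x[t + k] - mean) for t in range(n - k)) / n
        acf.append(ck / c0)
    return acf

def difference(x):
    """First difference; integrated series typically need it before ARMA fitting."""
    return [b - a for a, b in zip(x, x[1:])]
```

A slowly decaying ACF near 1 at low lags, as for a trending series, is the classic signature of an integrated process that identification procedures must contend with.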
Bacteriophage: A Model System for Active Learning.
ERIC Educational Resources Information Center
Luciano, Carl S.; Young, Matthew W.; Patterson, Robin R.
2002-01-01
Describes a student-centered laboratory course in which student teams select phage from sewage samples and characterize the phage in a semester-long project that models real-life scientific research. Results of student evaluations indicate a high level of satisfaction with the course. (Author/MM)
The Multiple Component Alternative for Gifted Education.
ERIC Educational Resources Information Center
Swassing, Ray
1984-01-01
The Multiple Component Model (MCM) of gifted education includes instruction which may overlap in literature, history, art, enrichment, languages, science, physics, math, music, and dance. The model rests on multifactored identification and requires systematic development and selection of components with ongoing feedback and evaluation. (CL)
Watershed scale response to climate change--Yampa River Basin, Colorado
Hay, Lauren E.; Battaglin, William A.; Markstrom, Steven L.
2012-01-01
General Circulation Model simulations of future climate through 2099 project a wide range of possible scenarios. To determine the sensitivity and potential effect of long-term climate change on the freshwater resources of the United States, the U.S. Geological Survey Global Change study, "An integrated watershed scale response to global change in selected basins across the United States" was started in 2008. The long-term goal of this national study is to provide the foundation for hydrologically based climate change studies across the nation. Fourteen basins for which the Precipitation Runoff Modeling System has been calibrated and evaluated were selected as study sites. Precipitation Runoff Modeling System is a deterministic, distributed parameter watershed model developed to evaluate the effects of various combinations of precipitation, temperature, and land use on streamflow and general basin hydrology. Output from five General Circulation Model simulations and four emission scenarios were used to develop an ensemble of climate-change scenarios for each basin. These ensembles were simulated with the corresponding Precipitation Runoff Modeling System model. This fact sheet summarizes the hydrologic effect and sensitivity of the Precipitation Runoff Modeling System simulations to climate change for the Yampa River Basin at Steamboat Springs, Colorado.
NASA Astrophysics Data System (ADS)
Seo, Jongmin; Schiavazzi, Daniele; Marsden, Alison
2017-11-01
Cardiovascular simulations are increasingly used in clinical decision making, surgical planning, and disease diagnostics. Patient-specific modeling and simulation typically proceeds through a pipeline from anatomic model construction using medical image data to blood flow simulation and analysis. To provide confidence intervals on simulation predictions, we use an uncertainty quantification (UQ) framework to analyze the effects of numerous uncertainties that stem from clinical data acquisition, modeling, material properties, and boundary condition selection. However, UQ poses a computational challenge requiring multiple evaluations of the Navier-Stokes equations in complex 3-D models. To achieve efficiency in UQ problems with many function evaluations, we implement and compare a range of iterative linear solver and preconditioning techniques in our flow solver. We then discuss applications to patient-specific cardiovascular simulation and how the problem/boundary condition formulation in the solver affects the selection of the most efficient linear solver. Finally, we discuss performance improvements in the context of uncertainty propagation. Support from National Institute of Health (R01 EB018302) is greatly appreciated.
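The abstract compares iterative linear solvers without naming a specific one; the conjugate gradient method is the textbook iterative solver for symmetric positive-definite systems, so a minimal dense sketch of it (not the flow solver's actual implementation) illustrates the class of methods being benchmarked:

```python
def conjugate_gradient(A, b, tol=1e-10, max_iter=100):
    """Solve A x = b for symmetric positive-definite A (dense lists)."""
    n = len(b)
    x = [0.0] * n
    r = b[:]          # residual b - A x with x = 0
    p = r[:]          # initial search direction
    rs_old = sum(ri * ri for ri in r)
    for _ in range(max_iter):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rs_old / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new < tol:
            break
        beta = rs_new / rs_old
        p = [r[i] + beta * p[i] for i in range(n)]
        rs_old = rs_new
    return x
```

In a UQ setting with many nearly identical solves, the per-solve iteration count, and hence the preconditioner, dominates total cost, which is why solver/preconditioner choice matters here.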
Constructing optimal ensemble projections for predictive environmental modelling in Northern Eurasia
NASA Astrophysics Data System (ADS)
Anisimov, Oleg; Kokorev, Vasily
2013-04-01
Large uncertainties in climate impact modelling are associated with the forcing climate data. This study is targeted at the evaluation of the quality of GCM-based climatic projections in the specific context of predictive environmental modelling in Northern Eurasia. To accomplish this task, we used the output from 36 CMIP5 GCMs from the IPCC AR-5 data base for the control period 1975-2005 and calculated several climatic characteristics and indexes that are most often used in the impact models, i.e. the summer warmth index, duration of the vegetation growth period, precipitation sums, dryness index, thawing degree-day sums, and the annual temperature amplitude. We used data from 744 weather stations in Russia and neighbouring countries to analyze the spatial patterns of modern climatic change and to delineate 17 large regions with coherent temperature changes in the past few decades. GCM results and observational data were averaged over the coherent regions and compared with each other. Ultimately, we evaluated the skills of individual models, ranked them in the context of regional impact modelling, and identified top-end GCMs that reproduce modern regional changes of the selected meteorological parameters and climatic indexes "better than average". Selected top-end GCMs were used to compose several ensembles, each combining results from a different number of models. Ensembles were ranked using the same algorithm and outliers eliminated. We then used data from top-end ensembles for the 2000-2100 period to construct climatic projections that are likely to be "better than average" in predicting the climatic parameters that govern the state of the environment in Northern Eurasia. The ultimate conclusions of our study are the following.
• High-end GCMs that demonstrate excellent skills in conventional atmospheric model intercomparison experiments are not necessarily the best at replicating climatic characteristics that govern the state of the environment in Northern Eurasia; independent model evaluation at the regional level is necessary to identify "better than average" GCMs.
• Each of the ensembles combining results from several "better than average" models replicates the selected meteorological parameters and climatic indexes better than any single GCM. Ensemble skills are parameter-specific and depend on the models included; the best results are not necessarily those based on the ensemble comprising all "better than average" models.
• Comprehensive evaluation of climatic scenarios using specific criteria narrows the range of uncertainties in environmental projections.
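The select-and-ensemble workflow described above, rank models by control-period skill, keep the "better than average" ones, and average them, can be sketched as follows (hypothetical series; the study's actual skill metric is not specified in the abstract, so RMSE is used as an assumed stand-in):

```python
def rmse(a, b):
    """Root-mean-square error between two equal-length series."""
    return (sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)) ** 0.5

def rank_models(models, obs):
    """Rank candidate model series by RMSE against observations, best first."""
    return sorted(models, key=lambda name: rmse(models[name], obs))

def ensemble_mean(series_list):
    """Point-wise mean of equally weighted ensemble member series."""
    return [sum(vals) / len(vals) for vals in zip(*series_list)]

# Hypothetical control-period comparison: keep the top-2 models and
# average them to form the projection ensemble.
obs = [1.0, 2.0, 3.0, 4.0]
models = {"gcm_a": [1.1, 2.1, 3.1, 4.1],
          "gcm_b": [1.5, 2.5, 3.5, 4.5],
          "gcm_c": [4.0, 5.0, 6.0, 7.0]}
best = rank_models(models, obs)[:2]
proj = ensemble_mean([models[m] for m in best])
```

As the study's conclusions note, the ensemble size (the `[:2]` cutoff here) is itself a tuning choice: adding every above-average model does not necessarily yield the best ensemble.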
Watershed scale response to climate change--Trout Lake Basin, Wisconsin
Walker, John F.; Hunt, Randall J.; Hay, Lauren E.; Markstrom, Steven L.
2012-01-01
Fourteen basins for which the Precipitation Runoff Modeling System has been calibrated and evaluated were selected as study sites. Precipitation Runoff Modeling System is a deterministic, distributed parameter watershed model developed to evaluate the effects of various combinations of precipitation, temperature, and land use on streamflow and general basin hydrology. Output from five General Circulation Model simulations and four emission scenarios were used to develop an ensemble of climate-change scenarios for each basin. These ensembles were simulated with the corresponding Precipitation Runoff Modeling System model. This fact sheet summarizes the hydrologic effect and sensitivity of the Precipitation Runoff Modeling System simulations to climate change for the Trout River Basin at Trout Lake in northern Wisconsin.
Watershed scale response to climate change--Clear Creek Basin, Iowa
Christiansen, Daniel E.; Hay, Lauren E.; Markstrom, Steven L.
2012-01-01
Fourteen basins for which the Precipitation Runoff Modeling System has been calibrated and evaluated were selected as study sites. Precipitation Runoff Modeling System is a deterministic, distributed parameter watershed model developed to evaluate the effects of various combinations of precipitation, temperature, and land use on streamflow and general basin hydrology. Output from five General Circulation Model simulations and four emission scenarios were used to develop an ensemble of climate-change scenarios for each basin. These ensembles were simulated with the corresponding Precipitation Runoff Modeling System model. This fact sheet summarizes the hydrologic effect and sensitivity of the Precipitation Runoff Modeling System simulations to climate change for the Clear Creek Basin, near Coralville, Iowa.
Watershed scale response to climate change--Feather River Basin, California
Koczot, Kathryn M.; Markstrom, Steven L.; Hay, Lauren E.
2012-01-01
Fourteen basins for which the Precipitation Runoff Modeling System has been calibrated and evaluated were selected as study sites. Precipitation Runoff Modeling System is a deterministic, distributed parameter watershed model developed to evaluate the effects of various combinations of precipitation, temperature, and land use on streamflow and general basin hydrology. Output from five General Circulation Model simulations and four emission scenarios were used to develop an ensemble of climate-change scenarios for each basin. These ensembles were simulated with the corresponding Precipitation Runoff Modeling System model. This fact sheet summarizes the hydrologic effect and sensitivity of the Precipitation Runoff Modeling System simulations to climate change for the Feather River Basin, California.
Watershed scale response to climate change--South Fork Flathead River Basin, Montana
Chase, Katherine J.; Hay, Lauren E.; Markstrom, Steven L.
2012-01-01
Fourteen basins for which the Precipitation Runoff Modeling System has been calibrated and evaluated were selected as study sites. Precipitation Runoff Modeling System is a deterministic, distributed parameter watershed model developed to evaluate the effects of various combinations of precipitation, temperature, and land use on streamflow and general basin hydrology. Output from five General Circulation Model simulations and four emission scenarios were used to develop an ensemble of climate-change scenarios for each basin. These ensembles were simulated with the corresponding Precipitation Runoff Modeling System model. This fact sheet summarizes the hydrologic effect and sensitivity of the Precipitation Runoff Modeling System simulations to climate change for the South Fork Flathead River Basin, Montana.
Watershed scale response to climate change--Cathance Stream Basin, Maine
Dudley, Robert W.; Hay, Lauren E.; Markstrom, Steven L.; Hodgkins, Glenn A.
2012-01-01
Fourteen basins for which the Precipitation Runoff Modeling System has been calibrated and evaluated were selected as study sites. Precipitation Runoff Modeling System is a deterministic, distributed parameter watershed model developed to evaluate the effects of various combinations of precipitation, temperature, and land use on streamflow and general basin hydrology. Output from five General Circulation Model simulations and four emission scenarios were used to develop an ensemble of climate-change scenarios for each basin. These ensembles were simulated with the corresponding Precipitation Runoff Modeling System model. This fact sheet summarizes the hydrologic effect and sensitivity of the Precipitation Runoff Modeling System simulations to climate change for the Cathance Stream Basin, Maine.
Watershed scale response to climate change--Pomperaug River Watershed, Connecticut
Bjerklie, David M.; Hay, Lauren E.; Markstrom, Steven L.
2012-01-01
Fourteen basins for which the Precipitation Runoff Modeling System has been calibrated and evaluated were selected as study sites. Precipitation Runoff Modeling System is a deterministic, distributed parameter watershed model developed to evaluate the effects of various combinations of precipitation, temperature, and land use on streamflow and general basin hydrology. Output from five General Circulation Model simulations and four emission scenarios were used to develop an ensemble of climate-change scenarios for each basin. These ensembles were simulated with the corresponding Precipitation Runoff Modeling System model. This fact sheet summarizes the hydrologic effect and sensitivity of the Precipitation Runoff Modeling System simulations to climate change for the Pomperaug River Basin at Southbury, Connecticut.
Watershed scale response to climate change--Starkweather Coulee Basin, North Dakota
Vining, Kevin C.; Hay, Lauren E.; Markstrom, Steven L.
2012-01-01
Fourteen basins for which the Precipitation Runoff Modeling System has been calibrated and evaluated were selected as study sites. Precipitation Runoff Modeling System is a deterministic, distributed parameter watershed model developed to evaluate the effects of various combinations of precipitation, temperature, and land use on streamflow and general basin hydrology. Output from five General Circulation Model simulations and four emission scenarios were used to develop an ensemble of climate-change scenarios for each basin. These ensembles were simulated with the corresponding Precipitation Runoff Modeling System model. This fact sheet summarizes the hydrologic effect and sensitivity of the Precipitation Runoff Modeling System simulations to climate change for the Starkweather Coulee Basin near Webster, North Dakota.
Watershed scale response to climate change--Sagehen Creek Basin, California
Markstrom, Steven L.; Hay, Lauren E.; Regan, R. Steven
2012-01-01
Fourteen basins for which the Precipitation Runoff Modeling System has been calibrated and evaluated were selected as study sites. Precipitation Runoff Modeling System is a deterministic, distributed parameter watershed model developed to evaluate the effects of various combinations of precipitation, temperature, and land use on streamflow and general basin hydrology. Output from five General Circulation Model simulations and four emission scenarios were used to develop an ensemble of climate-change scenarios for each basin. These ensembles were simulated with the corresponding Precipitation Runoff Modeling System model. This fact sheet summarizes the hydrologic effect and sensitivity of the Precipitation Runoff Modeling System simulations to climate change for the Sagehen Creek Basin near Truckee, California.
Watershed scale response to climate change--Sprague River Basin, Oregon
Risley, John; Hay, Lauren E.; Markstrom, Steven L.
2012-01-01
Fourteen basins for which the Precipitation Runoff Modeling System has been calibrated and evaluated were selected as study sites. Precipitation Runoff Modeling System is a deterministic, distributed parameter watershed model developed to evaluate the effects of various combinations of precipitation, temperature, and land use on streamflow and general basin hydrology. Output from five General Circulation Model simulations and four emission scenarios were used to develop an ensemble of climate-change scenarios for each basin. These ensembles were simulated with the corresponding Precipitation Runoff Modeling System model. This fact sheet summarizes the hydrologic effect and sensitivity of the Precipitation Runoff Modeling System simulations to climate change for the Sprague River Basin near Chiloquin, Oregon.
Watershed scale response to climate change--Black Earth Creek Basin, Wisconsin
Hunt, Randall J.; Walker, John F.; Westenbroek, Steven M.; Hay, Lauren E.; Markstrom, Steven L.
2012-01-01
Fourteen basins for which the Precipitation Runoff Modeling System has been calibrated and evaluated were selected as study sites. Precipitation Runoff Modeling System is a deterministic, distributed parameter watershed model developed to evaluate the effects of various combinations of precipitation, temperature, and land use on streamflow and general basin hydrology. Output from five General Circulation Model simulations and four emission scenarios were used to develop an ensemble of climate-change scenarios for each basin. These ensembles were simulated with the corresponding Precipitation Runoff Modeling System model. This fact sheet summarizes the hydrologic effect and sensitivity of the Precipitation Runoff Modeling System simulations to climate change for the Black Earth Creek Basin, Wisconsin.
Watershed scale response to climate change--East River Basin, Colorado
Battaglin, William A.; Hay, Lauren E.; Markstrom, Steven L.
2012-01-01
Fourteen basins for which the Precipitation Runoff Modeling System has been calibrated and evaluated were selected as study sites. Precipitation Runoff Modeling System is a deterministic, distributed parameter watershed model developed to evaluate the effects of various combinations of precipitation, temperature, and land use on streamflow and general basin hydrology. Output from five General Circulation Model simulations and four emission scenarios were used to develop an ensemble of climate-change scenarios for each basin. These ensembles were simulated with the corresponding Precipitation Runoff Modeling System model. This fact sheet summarizes the hydrologic effect and sensitivity of the Precipitation Runoff Modeling System simulations to climate change for the East River Basin, Colorado.
Watershed scale response to climate change--Naches River Basin, Washington
Mastin, Mark C.; Hay, Lauren E.; Markstrom, Steven L.
2012-01-01
Fourteen basins for which the Precipitation Runoff Modeling System has been calibrated and evaluated were selected as study sites. Precipitation Runoff Modeling System is a deterministic, distributed parameter watershed model developed to evaluate the effects of various combinations of precipitation, temperature, and land use on streamflow and general basin hydrology. Output from five General Circulation Model simulations and four emission scenarios were used to develop an ensemble of climate-change scenarios for each basin. These ensembles were simulated with the corresponding Precipitation Runoff Modeling System model. This fact sheet summarizes the hydrologic effect and sensitivity of the Precipitation Runoff Modeling System simulations to climate change for the Naches River Basin below Tieton River in Washington.
Watershed scale response to climate change--Flint River Basin, Georgia
Hay, Lauren E.; Markstrom, Steven L.
2012-01-01
Fourteen basins for which the Precipitation Runoff Modeling System has been calibrated and evaluated were selected as study sites. Precipitation Runoff Modeling System is a deterministic, distributed parameter watershed model developed to evaluate the effects of various combinations of precipitation, temperature, and land use on streamflow and general basin hydrology. Output from five General Circulation Model simulations and four emission scenarios were used to develop an ensemble of climate-change scenarios for each basin. These ensembles were simulated with the corresponding Precipitation Runoff Modeling System model. This fact sheet summarizes the hydrologic effect and sensitivity of the Precipitation Runoff Modeling System simulations to climate change for the Flint River Basin at Montezuma, Georgia.
Evaluation of the effects of a freeze/thaw environment on cellular glass
NASA Technical Reports Server (NTRS)
Frickland, P.; Cleland, E.; Hasegawa, T.
1981-01-01
Using the evaluation criteria of water vapor permeability and conformability, a protective butyl rubber/silicone conformal coating system was selected for use on Foamglas substrates in a freeze/thaw environment. The selection of a specific freeze/thaw cycle that closely models field conditions is discussed. A sampling plan is described which allows independent evaluation of the effects of conformal coatings, cycle number, and location within the environmental chamber. The results of visual examination and measurement of density, modulus of rupture, and Young's modulus are reported. Based upon statistical evaluation of the experimental results, it is concluded that no degradation in mechanical properties of either coated or uncoated Foamglas occurred within the duration of the test (53 freeze/thaw cycles).
Conducting field studies for testing pesticide leaching models
Smith, Charles N.; Parrish, Rudolph S.; Brown, David S.
1990-01-01
A variety of predictive models are being applied to evaluate the transport and transformation of pesticides in the environment. These include well-known models such as the Pesticide Root Zone Model (PRZM), the Risk of Unsaturated-Saturated Transport and Transformation Interactions for Chemical Concentrations Model (RUSTIC), and the Groundwater Loading Effects of Agricultural Management Systems Model (GLEAMS). The potentially large impacts of using these models as tools for developing pesticide management strategies and regulatory decisions necessitate the development of sound model-validation protocols. This paper offers guidance on many of the theoretical and practical problems encountered in the design and implementation of field-scale model validation studies. Recommendations are provided for site selection and characterization, test compound selection, data needs, measurement techniques, statistical design considerations, and sampling techniques. A strategy is provided for quantitatively testing models using field measurements.
NASA Astrophysics Data System (ADS)
Krishnan, Govindarajapuram Subramaniam
1997-12-01
Missions of the National Aeronautics and Space Administration (NASA), the European Space Agency (ESA), and the Canadian Space Agency (CSA) involve the performance of scientific experiments in space. Instruments used in such experiments are fabricated using electronic parts such as microcircuits, inductors, capacitors, diodes, and transistors. For instruments to perform reliably, the selection of commercial parts must be monitored and strictly controlled, which is currently achieved through manual review and approval of every part used to build the instrument. This manual system for selecting and approving parts for space applications is inefficient, inconsistent, slow, tedious, and very costly. In this dissertation, a computer-based decision-support model for this process is developed using artificial-intelligence concepts grounded in current expert knowledge. Such a model would yield greater consistency, accuracy, and timeliness of evaluation. This study presents the development methodology and features of the model, and an analysis of data on the model's performance in the field. The model was evaluated for three different part types by experts from three different space agencies. The results show that the model was more consistent than manual evaluation for all part types considered. The study concludes with a cost-benefit analysis showing that implementing the model would result in significant cost savings. Other implementation details are highlighted.
ERIC Educational Resources Information Center
Haider, Zubair; Latif, Farah; Akhtar, Samina; Mushtaq, Maria
2012-01-01
Validity, reliability, and item analysis are critical to the process of evaluating the quality of an educational measurement. The present study evaluates the quality of an assessment constructed to measure elementary school students' achievement in English. In this study, the survey model of descriptive research was used as the research method.…
Li, Zhidong; Marinova, Dora; Guo, Xiumei; Gao, Yuan
2015-01-01
Many steel-based cities in China were established between the 1950s and 1960s. After more than half a century of development and boom, these cities are starting to decline, and industrial transformation is urgently needed. This paper focuses on evaluating the transformation capability of resource-based cities by building an evaluation model. Using Text Mining and the Document Explorer technique as a way of extracting text features, the 200 most frequently used words are derived from 100 publications related to steel- and other resource-based cities. The Expert Evaluation Method (EEM) and Analytic Hierarchy Process (AHP) techniques are then applied to select 53 indicators, determine their weights, and establish an index system for evaluating the transformation capability of the pillar industry of China's steel-based cities. Using real data and expert reviews, the improved Fuzzy Relation Matrix (FRM) method is applied to two case studies in China, namely Panzhihua and Daye, and the evaluation model is developed using Fuzzy Comprehensive Evaluation (FCE). The cities' abilities to carry out industrial transformation are evaluated, with concerns expressed for the case of Daye. The findings have policy implications for the potential and required industrial transformation in the two selected cities and other resource-based towns. PMID:26422266
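The weighted-synthesis step at the heart of Fuzzy Comprehensive Evaluation can be sketched in a few lines. The indicator weights, membership matrix, and grade labels below are invented for illustration; they are not the 53-indicator system or data from the study.

```python
def fuzzy_comprehensive(weights, membership):
    """Fuzzy Comprehensive Evaluation, weighted-average synthesis b = w . R.

    weights    -- indicator weights summing to 1 (e.g. from AHP)
    membership -- rows = indicators, columns = membership in each grade
    """
    grades = len(membership[0])
    return [sum(w * row[g] for w, row in zip(weights, membership))
            for g in range(grades)]

# Hypothetical weights and membership matrix over three grades
weights = [0.5, 0.3, 0.2]
R = [[0.2, 0.5, 0.3],   # indicator 1
     [0.4, 0.4, 0.2],   # indicator 2
     [0.1, 0.3, 0.6]]   # indicator 3
b = fuzzy_comprehensive(weights, R)                      # grade memberships
verdict = ("strong", "medium", "weak")[b.index(max(b))]  # maximum-membership rule
```

The maximum-membership rule used for `verdict` is one common defuzzification choice; the study may use another.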
Endothelialized ePTFE Graft by Nanobiotechnology
2013-11-29
The apparatus for processing the tubular graft modification will be designed and evaluated. The on-site capturing of endothelial (progenitor) cells by peptide-mediated selective adhesion will also be elucidated in vitro and in vivo. The patency rate of ITRI-made artificial blood vessels will be evaluated in a porcine animal model.
Chung, Eun-Sung; Lee, Kil Seong
2009-03-01
The objective of this study is to develop an alternative evaluation index (AEI) to determine the priorities of a range of alternatives using both the Hydrological Simulation Program-FORTRAN (HSPF) and multicriteria decision making (MCDM) techniques. To formulate the HSPF model, sensitivity analyses of water quantity (peak discharge and total volume) and quality (BOD peak concentrations and total loads) are conducted, and a number of critical parameters are selected. To achieve a more precise simulation, the study watershed is divided into four regions for calibration and verification according to land use, location, slope, and climate data. All evaluation criteria are selected using the Driver-Pressure-State-Impact-Response (DPSIR) model, a sustainability evaluation concept. The Analytic Hierarchy Process is used to estimate the weights of the criteria, and the effects on water quantity and quality are quantified by HSPF simulation. In addition, AEIs that reflect residents' preferences for management objectives are proposed in order to induce stakeholders to participate in the decision-making process.
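The AHP weighting step described above can be approximated with the row geometric-mean method. A minimal sketch follows; the 3x3 pairwise comparison matrix is hypothetical and does not represent the study's criteria.

```python
import math

def ahp_weights(pairwise):
    """Approximate AHP priority weights via the row geometric-mean method."""
    n = len(pairwise)
    gm = [math.prod(row) ** (1.0 / n) for row in pairwise]  # row geometric means
    total = sum(gm)
    return [g / total for g in gm]                          # normalize to sum 1

# Hypothetical pairwise comparison of three criteria on the Saaty 1-9 scale
A = [[1,     3,     5],
     [1 / 3, 1,     2],
     [1 / 5, 1 / 2, 1]]
w = ahp_weights(A)   # one priority weight per criterion
```

The geometric-mean method closely approximates the principal-eigenvector weights for nearly consistent matrices; a full implementation would also report a consistency ratio.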
NASA Astrophysics Data System (ADS)
Wentworth, Mami Tonoe
Uncertainty quantification plays an important role when making predictive estimates of model responses. In this context, uncertainty quantification means quantifying and reducing uncertainties in parameters, models, and measurements, and propagating those uncertainties through the model, so that one can make a predictive estimate with quantified uncertainties. Two aspects of uncertainty quantification that must be performed prior to propagating uncertainties are model calibration and parameter selection. There are several efficient techniques for these processes; however, the accuracy of these methods is often not verified. This is the motivation for our work, and in this dissertation, we present and illustrate verification frameworks for model calibration and parameter selection in the context of biological and physical models. First, HIV models, developed and improved by [2, 3, 8], describe the viral infection dynamics of HIV disease. These are also used to make predictive estimates of viral loads and T-cell counts and to construct an optimal control for drug therapy. Estimating input parameters is an essential step prior to uncertainty quantification. However, not all the parameters are identifiable, implying that they cannot be uniquely determined from the observations. These unidentifiable parameters can be partially removed by performing parameter selection, a process in which parameters that have minimal impact on the model response are determined. We provide verification techniques for Bayesian model calibration and parameter selection for an HIV model. As an example of a physical model, we employ a heat model with experimental measurements presented in [10]. A steady-state heat model represents prototypical behavior for the heat conduction and diffusion processes involved in a thermal-hydraulic model, which is part of a nuclear reactor model.
We employ this simple heat model to illustrate verification techniques for model calibration. For Bayesian model calibration, we employ adaptive Metropolis algorithms to construct densities for input parameters in the heat model and the HIV model. To quantify the uncertainty in the parameters, we employ two MCMC algorithms: Delayed Rejection Adaptive Metropolis (DRAM) [33] and Differential Evolution Adaptive Metropolis (DREAM) [66, 68]. The densities obtained using these methods are compared to those obtained through direct numerical evaluation of Bayes' formula. We also combine uncertainties in input parameters and measurement errors to construct predictive estimates for a model response. A significant emphasis is on the development and illustration of techniques to verify the accuracy of sampling-based Metropolis algorithms. We verify the accuracy of DRAM and DREAM by comparing chains, densities, and correlations obtained using DRAM, DREAM, and the direct evaluation of Bayes' formula. We also perform similar analyses for credible and prediction intervals for responses. Once the parameters are estimated, we employ the energy statistics test [63, 64] to compare the densities obtained by different methods for the HIV model; the energy statistics test assesses the equality of distributions. We also consider parameter selection and verification techniques for models having one or more parameters that are noninfluential in the sense that they minimally impact model outputs. We illustrate these techniques for a dynamic HIV model but note that the parameter selection and verification framework is applicable to a wide range of biological and physical models.
To accommodate the nonlinear input to output relations, which are typical for such models, we focus on global sensitivity analysis techniques, including those based on partial correlations, Sobol indices based on second-order model representations, and Morris indices, as well as a parameter selection technique based on standard errors. A significant objective is to provide verification strategies to assess the accuracy of those techniques, which we illustrate in the context of the HIV model. Finally, we examine active subspace methods as an alternative to parameter subset selection techniques. The objective of active subspace methods is to determine the subspace of inputs that most strongly affect the model response, and to reduce the dimension of the input space. The major difference between active subspace methods and parameter selection techniques is that parameter selection identifies influential parameters whereas subspace selection identifies a linear combination of parameters that impacts the model responses significantly. We employ active subspace methods discussed in [22] for the HIV model and present a verification that the active subspace successfully reduces the input dimensions.
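The sampling-based Metropolis algorithms discussed above (DRAM, DREAM) build on the basic random-walk Metropolis kernel. A minimal sketch, targeting a standard normal density whose moments are known, illustrates the kind of verification-by-comparison the dissertation describes; this is a simplified stand-in, not the adaptive samplers themselves.

```python
import math
import random

def metropolis(log_target, x0, n_steps, step=1.0, seed=1):
    """Random-walk Metropolis: Gaussian proposals, accept with prob min(1, ratio)."""
    rng = random.Random(seed)
    x, lp = x0, log_target(x0)
    chain = []
    for _ in range(n_steps):
        prop = x + rng.gauss(0.0, step)
        lp_prop = log_target(prop)
        if math.log(rng.random()) < lp_prop - lp:   # accept/reject in log space
            x, lp = prop, lp_prop
        chain.append(x)
    return chain

# Standard normal target: log density is -x^2/2 up to an additive constant
chain = metropolis(lambda x: -0.5 * x * x, 0.0, 20000)
```

Verification here amounts to checking that the chain's sample moments match the known target, the same idea as comparing DRAM/DREAM densities against direct evaluation of Bayes' formula.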
Brown, Andrew D; Marotta, Thomas R
2017-02-01
Incorrect imaging protocol selection can contribute to increased healthcare cost and waste. To help healthcare providers improve the quality and safety of medical imaging services, we developed and evaluated three natural language processing (NLP) models to determine whether NLP techniques could be employed to aid in clinical decision support for protocoling and prioritization of magnetic resonance imaging (MRI) brain examinations. To test the feasibility of using an NLP model to support clinical decision making for MRI brain examinations, we designed three different medical imaging prediction tasks, each with a unique outcome: selecting an examination protocol, evaluating the need for contrast administration, and determining priority. We created three models for each prediction task, each using a different classification algorithm (random forest, support vector machine, or k-nearest neighbor) to predict outcomes based on the narrative clinical indications and demographic data associated with 13,982 MRI brain examinations performed from January 1, 2013 to June 30, 2015. Test datasets were used to calculate the accuracy, sensitivity and specificity, predictive values, and the area under the curve. Our optimal results show an accuracy of 82.9%, 83.0%, and 88.2% for the protocol selection, contrast administration, and prioritization tasks, respectively, demonstrating that predictive algorithms can be used to aid in clinical decision support for examination protocoling. NLP models developed from the narrative clinical information provided by referring clinicians and demographic data are feasible methods to predict the protocol and priority of MRI brain examinations. Copyright © 2017 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
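One of the three algorithms named above, k-nearest neighbor over text features, can be sketched with a bag-of-words representation and cosine similarity. The indications and protocol labels below are hypothetical and are not drawn from the study's 13,982 examinations.

```python
import math
from collections import Counter

def bow(text):
    """Bag-of-words vector as a Counter of lowercased tokens."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(c * b[t] for t, c in a.items())   # Counter returns 0 for misses
    na = math.sqrt(sum(c * c for c in a.values()))
    nb = math.sqrt(sum(c * c for c in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def knn_predict(train, text, k=3):
    """Majority vote among the k training indications most similar to `text`."""
    ranked = sorted(train, key=lambda ex: cosine(bow(ex[0]), bow(text)),
                    reverse=True)
    votes = Counter(label for _, label in ranked[:k])
    return votes.most_common(1)[0][0]

# Hypothetical clinical indications and protocol labels (illustrative only)
train = [
    ("new onset seizure rule out mass", "tumor protocol with contrast"),
    ("seizure disorder follow up mass", "tumor protocol with contrast"),
    ("acute stroke symptoms left sided weakness", "stroke protocol no contrast"),
    ("sudden weakness suspected stroke", "stroke protocol no contrast"),
]
pred = knn_predict(train, "first seizure query mass lesion")
```

A production system, as the paper notes, would also fold in demographic features and evaluate against held-out test data.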
A data-driven multi-model methodology with deep feature selection for short-term wind forecasting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Feng, Cong; Cui, Mingjian; Hodge, Bri-Mathias
With growing wind penetration into power systems worldwide, improving wind power forecasting accuracy is becoming increasingly important to ensure continued economic and reliable power system operations. In this paper, a data-driven multi-model wind forecasting methodology is developed with a two-layer ensemble machine learning technique. The first layer is composed of multiple machine learning models that generate individual forecasts. A deep feature selection framework is developed to determine the most suitable inputs to the first-layer machine learning models. Then, a blending algorithm is applied in the second layer to create an ensemble of the forecasts produced by the first-layer models and generate both deterministic and probabilistic forecasts. This two-layer model seeks to utilize the statistically different characteristics of each machine learning algorithm. A number of machine learning algorithms are selected and compared in both layers. The developed multi-model wind forecasting methodology is compared to several benchmarks. The effectiveness of the proposed methodology is evaluated for 1-hour-ahead wind speed forecasting at seven locations of the Surface Radiation network. Numerical results show that, compared with single-algorithm models, the developed multi-model framework with the deep feature selection procedure improves forecasting accuracy by up to 30%.
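The second-layer blending idea can be illustrated in its simplest possible form: a least-squares weight for combining two first-layer forecasts on a validation window. The forecasts and observations below are invented for illustration; the paper's actual blending algorithm and feature selection are more elaborate.

```python
def blend_weight(f1, f2, actual):
    """Least-squares alpha for the ensemble alpha*f1 + (1 - alpha)*f2."""
    num = sum((a - b) * (y - b) for a, b, y in zip(f1, f2, actual))
    den = sum((a - b) ** 2 for a, b in zip(f1, f2))
    return num / den if den else 0.5   # identical forecasts: any weight works

# Hypothetical first-layer wind speed forecasts (m/s) on a validation window
model_a = [5.0, 6.2, 7.1, 6.8]
model_b = [5.6, 5.9, 7.5, 7.0]
observed = [5.3, 6.1, 7.3, 6.9]
alpha = blend_weight(model_a, model_b, observed)
ensemble = [alpha * a + (1 - alpha) * b for a, b in zip(model_a, model_b)]
```

Because alpha minimizes squared error over all blends of the two forecasts, the ensemble can do no worse on the fitting window than either model alone; that is the statistical rationale for the second layer.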
Evaluation of Mass Filtered, Time Dilated, Time-of-Flight Mass Spectrometry
2010-01-01
[Indexed full-text excerpt; only fragments are recoverable: Figure 4.4, mass resolution dependence on field for selected actinides and surrogates; Figure 4.7, mass resolution dependence on field for selected actinides and actinide surrogates, modeled with no initial …. The excerpt also notes that somewhat better mass resolution would need to be achieved to separate hydride molecules in the actinide region.]
A comparison of two modeling approaches for evaluating wildlife--habitat relationships
Ryan A. Long; Jonathan D. Muir; Janet L. Rachlow; John G. Kie
2009-01-01
Studies of resource selection form the basis for much of our understanding of wildlife habitat requirements, and resource selection functions (RSFs), which predict relative probability of use, have been proposed as a unifying concept for analysis and interpretation of wildlife habitat data. Logistic regression that contrasts used and available or unused resource units...
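A resource selection function of the kind described, logistic regression contrasting used and available units, can be sketched with plain gradient descent. The single covariate and the data below are hypothetical; real RSF analyses involve multiple covariates and careful sampling of availability.

```python
import math

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Logistic regression (used = 1, available = 0) by plain gradient descent."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = b + sum(wj * xj for wj, xj in zip(w, xi))
            p = 1.0 / (1.0 + math.exp(-z))      # predicted relative prob. of use
            err = p - yi
            b -= lr * err
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
    return w, b

# Hypothetical standardized covariate (say, distance to cover):
# used locations cluster at low values, available locations at high values
X = [[-1.5], [-1.0], [-0.8], [0.9], [1.1], [1.4]]
y = [1, 1, 1, 0, 0, 0]
w, b = fit_logistic(X, y)
```

The fitted coefficient's sign is the ecological signal here: a negative weight means relative probability of use declines as the covariate increases.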
ERIC Educational Resources Information Center
Lang, Russell; Regester, April; Mulloy, Austin; Rispoli, Mandy; Botout, Amanda
2011-01-01
We evaluated a behavioral intervention for a 9-year-old girl with selective mutism. The intervention consisted of role play and video self-modeling. The frequency of spoken initiations, responses to questions, and communication breakdowns was measured during three social situations (i.e., ordering in a restaurant, meeting new adults, and playing…
ERIC Educational Resources Information Center
Hansen, Henrik; Klejnstrup, Ninja Ritter; Andersen, Ole Winckler
2013-01-01
There is a long-standing debate as to whether nonexperimental estimators of causal effects of social programs can overcome selection bias. Most existing reviews either are inconclusive or point to significant selection biases in nonexperimental studies. However, many of the reviews, the so-called "between-studies," do not make direct…
NASA Astrophysics Data System (ADS)
Sitek, P.; Vilenius, E.; Mall, U.
2008-01-01
We describe the performance evaluation of a sample of InGaAs detectors from which the best unit had to be selected for the flight model of the SIR-2 NIR-spectrometer to be flown on the Chandrayaan-1 mission in 2008.
ERIC Educational Resources Information Center
Myer, Teresa A.
This study examined four teacher in-service environmental education programs to: (1) suggest a workable evaluative model for such programs; (2) assess their content with respect to stated activities and objectives; and (3) determine whether or not the experiences correlated with changes in selected teaching behaviors. The research design included…
Bayesian Covariate Selection in Mixed-Effects Models For Longitudinal Shape Analysis
Muralidharan, Prasanna; Fishbaugh, James; Kim, Eun Young; Johnson, Hans J.; Paulsen, Jane S.; Gerig, Guido; Fletcher, P. Thomas
2016-01-01
The goal of longitudinal shape analysis is to understand how anatomical shape changes over time in response to biological processes, including growth, aging, or disease. In many imaging studies, it is also critical to understand how these shape changes are affected by other factors, such as sex, disease diagnosis, IQ, etc. Current approaches to longitudinal shape analysis have focused on modeling age-related shape changes, but have not included the ability to handle covariates. In this paper, we present a novel Bayesian mixed-effects shape model that incorporates simultaneous relationships between longitudinal shape data and multiple predictors or covariates. Moreover, we place an Automatic Relevance Determination (ARD) prior on the parameters, which lets us automatically select which covariates are most relevant to the model based on the observed data. We evaluate our proposed model and inference procedure on a longitudinal study of Huntington's disease from PREDICT-HD. We first show the utility of the ARD prior for model selection in a univariate model of striatal volume, and next we apply the full high-dimensional longitudinal shape model to putamen shapes. PMID:28090246
Gale, T C E; Roberts, M J; Sice, P J; Langton, J A; Patterson, F C; Carr, A S; Anderson, I R; Lam, W H; Davies, P R F
2010-11-01
Assessment centres are an accepted method of recruitment in industry and are gaining popularity within medicine. We describe the development and validation of a selection centre for recruitment to speciality training in anaesthesia based on an assessment centre model incorporating the rating of candidates' non-technical skills. Expert consensus identified non-technical skills suitable for assessment at the point of selection. Four stations (structured interview, portfolio review, presentation, and simulation) were developed, the latter two being realistic scenarios of work-related tasks. Evaluation of the selection centre focused on applicant and assessor feedback ratings, inter-rater agreement, and internal consistency reliability coefficients. Predictive validity was sought via correlations of selection centre scores with subsequent workplace-based ratings of appointed trainees. Two hundred and twenty-four candidates were assessed over two consecutive annual recruitment rounds; 68 were appointed and followed up during training. Candidates and assessors demonstrated strong approval of the selection centre, with more than 70% of ratings 'good' or 'excellent'. Mean inter-rater agreement coefficients ranged from 0.62 to 0.77, and internal consistency reliability of the selection centre score was high (Cronbach's α=0.88-0.91). The overall selection centre score was a good predictor of workplace performance during the first year of appointment. An assessment centre model based on the rating of non-technical skills can produce a reliable and valid selection tool for recruitment to speciality training in anaesthesia. Early results on predictive validity are encouraging and justify further development and evaluation.
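The internal-consistency statistic reported above, Cronbach's α, is straightforward to compute from station-level scores. The candidate scores below are invented for illustration; they are not the study's data.

```python
def cronbach_alpha(items):
    """Cronbach's alpha from item-score rows: items[i][j] = item i, person j."""
    k = len(items)
    n = len(items[0])

    def var(xs):                         # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[j] for item in items) for j in range(n)]
    return k / (k - 1) * (1 - sum(var(item) for item in items) / var(totals))

# Hypothetical scores of five candidates on four selection-centre stations
stations = [
    [4, 3, 5, 2, 4],   # structured interview
    [4, 2, 5, 2, 3],   # portfolio review
    [5, 3, 4, 1, 4],   # presentation
    [4, 3, 5, 2, 5],   # simulation
]
alpha = cronbach_alpha(stations)
```

Values in the 0.88-0.91 range reported by the study indicate the four stations measure a coherent underlying construct.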
Evidence-based selection process to the Master of Public Health program at Medical University.
Panczyk, Mariusz; Juszczyk, Grzegorz; Zarzeka, Aleksander; Samoliński, Łukasz; Belowska, Jarosława; Cieślak, Ilona; Gotlib, Joanna
2017-09-11
Evaluation of the predictive validity of selected sociodemographic factors and admission criteria for Master's studies in Public Health at the Faculty of Health Sciences, Medical University of Warsaw (MUW). For evaluation purposes, recruitment data and learning results of students enrolled between 2008 and 2012 were used (N = 605, average age 22.9 ± 3.01). The predictive analysis was performed using multiple linear regression. In the proposed regression model, 12 predictors were selected, including sex, age, professional degree (BA), the Bachelor's studies grade point average (GPA), and the total score of the preliminary examination broken down into five thematic areas. Depending on the tested model, one of two dependent variables was used: first-year GPA or cumulative GPA in the Master's program. The two regression models fit the data to differing degrees (adjusted R² 0.413 and 0.476), with the model based on cumulative Master's GPA matched to the data better than the model based on first-year GPA. The Bachelor's studies GPA and each of the five subtests comprising the entrance examination were significant predictors of the success achieved by a student both after the first year and at the end of the course of studies. Admission criteria combining the total MCQ examination score with the Bachelor's studies GPA can be successfully used to select candidates for Master's degree studies in Public Health. The high predictive validity of the recruitment system confirms the soundness of the admission policy adopted at MUW.
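The predictive-validity check described, regressing a later outcome on an admission predictor and reading off R², can be sketched for a single predictor. The GPA values below are hypothetical and are not the study's data.

```python
def simple_regression(x, y):
    """Ordinary least squares for y = a + b*x; returns (a, b, r_squared)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    b = sxy / sxx
    a = my - b * mx
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return a, b, 1 - ss_res / ss_tot

# Hypothetical Bachelor's GPAs and final Master's GPAs (illustrative only)
bachelor = [3.1, 3.5, 3.8, 4.2, 4.5, 4.8]
master = [3.3, 3.6, 3.7, 4.1, 4.4, 4.6]
a, b, r2 = simple_regression(bachelor, master)
```

The study's full model uses 12 predictors and adjusted R², which additionally penalizes for the number of predictors; this sketch shows only the plain single-predictor case.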
An improved swarm optimization for parameter estimation and biological model selection.
Abdullah, Afnizanfaizal; Deris, Safaai; Mohamad, Mohd Saberi; Anwar, Sohail
2013-01-01
One of the key aspects of computational systems biology is the investigation of dynamic biological processes within cells. Computational models are often required to elucidate the mechanisms and principles driving these processes because of their nonlinearity and complexity. The models usually incorporate a set of parameters that signify the physical properties of the actual biological systems. In most cases, these parameters are estimated by fitting the model outputs to the corresponding experimental data. However, this is a challenging task because the available experimental data are frequently noisy and incomplete. In this paper, a new hybrid optimization method is proposed to estimate these parameters from noisy and incomplete experimental data. The proposed method, called Swarm-based Chemical Reaction Optimization, integrates the evolutionary search strategy employed by Chemical Reaction Optimization into the neighbourhood search strategy of the Firefly Algorithm. The effectiveness of the method was evaluated using a simulated nonlinear model and two biological models: synthetic transcriptional oscillators and extracellular protease production models. The results showed that the accuracy and computational speed of the proposed method were better than those of the existing Differential Evolution, Firefly Algorithm, and Chemical Reaction Optimization methods. The reliability of the estimated parameters was statistically validated, which suggests that the model outputs produced by these parameters were valid even when noisy and incomplete experimental data were used. Additionally, the Akaike Information Criterion was employed to evaluate model selection, highlighting the capability of the proposed method in choosing a plausible model based on the experimental data. In conclusion, this paper presents the effectiveness of the proposed method for parameter estimation and model selection problems using noisy and incomplete experimental data. This study is intended to provide new insight into developing more accurate and reliable biological models from limited and low-quality experimental data.
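The Akaike Information Criterion used for model selection above has a simple least-squares form, AIC = n·ln(SSE/n) + 2k up to an additive constant, where k counts estimated parameters. The SSE values below are hypothetical, chosen to show the criterion penalizing a barely-better but more complex model.

```python
import math

def aic_least_squares(sse, n, k):
    """AIC for a Gaussian least-squares fit: n*ln(SSE/n) + 2k (constants dropped)."""
    return n * math.log(sse / n) + 2 * k

# Hypothetical fits: the complex model adds a parameter but barely lowers SSE
n = 50
aic_simple = aic_least_squares(sse=12.0, n=n, k=3)
aic_complex = aic_least_squares(sse=11.8, n=n, k=4)
preferred = "simple" if aic_simple < aic_complex else "complex"
```

The model with the lower AIC is preferred; here the 2k penalty outweighs the marginal SSE improvement, so the simpler model wins.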
Web-video-mining-supported workflow modeling for laparoscopic surgeries.
Liu, Rui; Zhang, Xiaoli; Zhang, Hao
2016-11-01
As quality assurance is of strong concern in advanced surgeries, intelligent surgical systems are expected to have knowledge, such as knowledge of the surgical workflow model (SWM), to support intuitive cooperation with surgeons. Generating a robust and reliable SWM requires a large amount of training data. However, training data collected by physically recording surgical operations are often limited, and data collection is time-consuming and labor-intensive, severely limiting the knowledge scalability of surgical systems. The objective of this research is to solve the knowledge-scalability problem in surgical workflow modeling in a low-cost, labor-efficient way. A novel web-video-mining-supported surgical workflow modeling (webSWM) method is developed. A novel video quality analysis method based on topic analysis and sentiment analysis techniques is developed to select high-quality videos from abundant and noisy web videos. A statistical learning method is then used to build the workflow model based on the selected videos. To test the effectiveness of the webSWM method, 250 web videos were mined to generate a surgical workflow for robotic cholecystectomy. The generated workflow was evaluated on 4 web-retrieved videos and 4 operating-room-recorded videos, respectively. The evaluation results (video selection consistency n-index ≥0.60; surgical workflow matching degree ≥0.84) proved the effectiveness of the webSWM method in generating robust and reliable SWM knowledge by mining web videos. With the webSWM method, abundant web videos were selected and a reliable SWM was modeled in a short time at low labor cost. Satisfactory performance in mining web videos and learning surgery-related knowledge shows that the webSWM method is promising for scaling knowledge in intelligent surgical systems. Copyright © 2016 Elsevier B.V. All rights reserved.
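A minimal stand-in for the statistical workflow-learning step is a first-order Markov model estimated from step sequences. The surgical step sequences below are invented, and the paper's actual learning method may differ; this only illustrates turning observed sequences into transition probabilities.

```python
from collections import defaultdict

def learn_transitions(sequences):
    """Estimate a first-order Markov workflow model from observed step sequences."""
    counts = defaultdict(lambda: defaultdict(int))
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):   # consecutive step pairs
            counts[a][b] += 1
    return {step: {nxt: c / sum(outs.values()) for nxt, c in outs.items()}
            for step, outs in counts.items()}

# Hypothetical step sequences extracted from training videos (illustrative only)
videos = [
    ["port placement", "dissection", "clipping", "removal"],
    ["port placement", "dissection", "clipping", "removal"],
    ["port placement", "clipping", "dissection", "clipping", "removal"],
]
workflow = learn_transitions(videos)
```

Each entry of `workflow` gives the empirical probability of the next step given the current one, which is the kind of knowledge an intelligent surgical system could use to anticipate the surgeon's next action.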
Ortíz, Miguel A; Felizzola, Heriberto A; Nieto Isaza, Santiago
2015-01-01
The project selection process is a crucial step for healthcare organizations when implementing six sigma programs in both administrative and care processes. However, six sigma project selection is often a decision-making process with interaction and feedback between criteria, so it is necessary to explore different methods to help healthcare companies determine the six sigma projects that provide the maximum benefits. This paper describes the application of both ANP (Analytic Network Process) and DEMATEL (Decision Making Trial and Evaluation Laboratory)-ANP in a public medical centre to establish the most suitable six sigma project; finally, these methods were compared to evaluate their performance in the decision-making process. ANP and DEMATEL-ANP were used to evaluate 6 six sigma project alternatives under an evaluation model composed of 3 strategies, 4 criteria, and 15 sub-criteria. Judgement matrices were completed by the six sigma team, whose participants worked in different departments of the medical centre. Improving the care opportunity for obstetric outpatients was selected as the most suitable six sigma project, with a score of 0.117 as its contribution to the organization's goals. DEMATEL-ANP performed better in the decision-making process because it reduced the error probability due to interactions and feedback. ANP and DEMATEL-ANP effectively supported six sigma project selection, helping to create a complete framework that guarantees the prioritization of projects that provide maximum benefits to healthcare organizations. As DEMATEL-ANP performed better, it should be used by practitioners involved in decisions related to the implementation of six sigma programs in the healthcare sector, accompanied by adequate identification of the evaluation criteria that support the decision-making model. Thus, this comparative study contributes to choosing more effective approaches in this field. Suggestions for further work are also proposed so that these methods can be applied more adequately to six sigma project selection in healthcare.
2015-01-01
Background The project selection process is a crucial step for healthcare organizations at the moment of implementing six sigma programs in both administrative and caring processes. However, six-sigma project selection is often defined as a decision making process with interaction and feedback between criteria; so that it is necessary to explore different methods to help healthcare companies to determine the Six-sigma projects that provide the maximum benefits. This paper describes the application of both ANP (Analytic Network process) and DEMATEL (Decision Making trial and evaluation laboratory)-ANP in a public medical centre to establish the most suitable six sigma project and finally, these methods were compared to evaluate their performance in the decision making process. Methods ANP and DEMATEL-ANP were used to evaluate 6 six sigma project alternatives under an evaluation model composed by 3 strategies, 4 criteria and 15 sub-criteria. Judgement matrixes were completed by the six sigma team whose participants worked in different departments of the medical centre. Results The improving of care opportunity in obstetric outpatients was elected as the most suitable six sigma project with a score of 0,117 as contribution to the organization goals. DEMATEL-ANP performed better at decision making process since it reduced the error probability due to interactions and feedback. Conclusions ANP and DEMATEL-ANP effectively supported six sigma project selection processes, helping to create a complete framework that guarantees the prioritization of projects that provide maximum benefits to healthcare organizations. As DEMATEL- ANP performed better, it should be used by practitioners involved in decisions related to the implementation of six sigma programs in healthcare sector accompanied by the adequate identification of the evaluation criteria that support the decision making model. 
Thus, this comparative study contributes to choosing more effective approaches in this field. Suggestions for further work are also proposed so that these methods can be applied more adequately in six sigma project selection processes in healthcare. PMID:26391445
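ANP, like AHP, derives local priorities from pairwise-comparison (judgement) matrices via the principal eigenvector. A minimal power-iteration sketch of that core step follows; the 3x3 matrix is hypothetical, not taken from the medical-centre study.

```python
import numpy as np

def priority_vector(A, iters=100):
    """Principal eigenvector of a pairwise-comparison matrix via power
    iteration, normalized to sum to 1 (the standard AHP/ANP priority step)."""
    w = np.ones(A.shape[0]) / A.shape[0]
    for _ in range(iters):
        w = A @ w
        w /= w.sum()
    return w

# Hypothetical 3x3 judgement matrix on Saaty's 1-9 scale (criterion i vs j):
A = np.array([[1.0,   3.0,   5.0],
              [1/3.0, 1.0,   3.0],
              [1/5.0, 1/3.0, 1.0]])
w = priority_vector(A)  # priorities in descending order for this matrix
```

In a full ANP model, such local priority vectors are assembled into a supermatrix to capture the interactions and feedback among criteria that the abstract refers to.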
Uehlinger, F D; Johnston, A C; Bollinger, T K; Waldner, C L
2016-08-22
Chronic wasting disease (CWD) is a contagious, fatal prion disease affecting cervids in a growing number of regions across North America. Projected deer population declines and concern about potential spread of CWD to other species warrant strategies to manage this disease. Control efforts to date have been largely unsuccessful, resulting in continuing spread and increasing prevalence. This systematic review summarizes peer-reviewed published reports describing field-applicable CWD control strategies in wild deer populations in North America. Ten databases were searched for peer-reviewed literature. Following deduplication, relevance screening, full-text appraisal, subject matter expert review and qualitative data extraction, nine references were included describing four distinct management strategies. Six of the nine studies used predictive modeling to evaluate control strategies. All six demonstrated one or more interventions to be effective, but results were dependent on the parameters and assumptions used in the model. Three found preferential removal of CWD-infected deer to be effective in reducing CWD prevalence; one model evaluated a test-and-slaughter strategy, another the selective removal of infected deer by predators, and the third an increased harvest of the sex with the highest prevalence (males). Three models evaluated non-selective harvest of deer. Only three reports examined primary data collected as part of observational studies. Two of these studies supported the effectiveness of intensive non-selective culling; the third found no difference between areas subjected to culling and those that were not. Seven of the nine studies were conducted in the United States. This review highlights the paucity of evaluated, field-applicable control strategies for CWD in wild deer populations.
Knowledge gaps in the complex epidemiology of CWD and the intricacies inherent to prion diseases currently pose significant challenges to effective control of this disease in wild deer in North America.
Evaluation of the 29-km Eta Model. Part 1: Objective Verification at Three Selected Stations
NASA Technical Reports Server (NTRS)
Nutter, Paul A.; Manobianco, John; Merceret, Francis J. (Technical Monitor)
1998-01-01
This paper describes an objective verification of the National Centers for Environmental Prediction (NCEP) 29-km eta model from May 1996 through January 1998. The evaluation was designed to assess the model's surface and upper-air point forecast accuracy at three selected locations during separate warm (May - August) and cool (October - January) season periods. In order to enhance the sample sizes available for statistical calculations, the objective verification includes two consecutive warm and cool season periods. Systematic model deficiencies comprise the larger portion of the total error in most of the surface forecast variables that were evaluated. The error characteristics for both surface and upper-air forecasts vary widely by parameter, season, and station location. At upper levels, a few characteristic biases are identified. Overall, however, the upper-level errors are more nonsystematic in nature and could be explained partly by observational measurement uncertainty. With a few exceptions, the upper-air results also indicate that 24-h model error growth is not statistically significant. In February and August 1997, NCEP implemented upgrades to the eta model's physical parameterizations that were designed to change some of the model's error characteristics near the surface. The results shown in this paper indicate that these upgrades led to identifiable and statistically significant changes in forecast accuracy for selected surface parameters. While some of the changes were expected, others were not consistent with the intent of the model updates and further emphasize the need for ongoing sensitivity studies and localized statistical verification efforts. Objective verification of point forecasts is a stringent measure of model performance but, when used alone, is not enough to quantify the overall value that model guidance may add to the forecast process.
Therefore, results from a subjective verification of the meso-eta model over the Florida peninsula are discussed in the companion paper by Manobianco and Nutter. Overall verification results presented here and in part two should establish a reasonable benchmark from which model users and developers may pursue the ongoing eta model verification strategies in the future.
Development of a Value Inquiry Model in Biology Education.
ERIC Educational Resources Information Center
Jeong, Eun-Young; Kim, Young-Soo
2000-01-01
Points out the rapid advances in biology, increasing bioethical issues, and how students need to make rational decisions. Introduces a value inquiry model development that includes identifying and clarifying value problems; understanding biological knowledge related to conflict situations; considering, selecting, and evaluating each alternative;…
USDA-ARS?s Scientific Manuscript database
Nonpoint source pollution from agriculture and the impacts of mitigating best management practices are commonly evaluated based on hydrologic boundaries using watershed models. However, management practice effectiveness is impacted by which of the feasible practices are actually selected, implemente...
CAT Model with Personalized Algorithm for Evaluation of Estimated Student Knowledge
ERIC Educational Resources Information Center
Andjelic, Svetlana; Cekerevac, Zoran
2014-01-01
This article presents the original model of the computer adaptive testing and grade formation, based on scientifically recognized theories. The base of the model is a personalized algorithm for selection of questions depending on the accuracy of the answer to the previous question. The test is divided into three basic levels of difficulty, and the…
ERIC Educational Resources Information Center
Battisti, Bryce Thomas; Hanegan, Nikki; Sudweeks, Richard; Cates, Rex
2010-01-01
Concept inventories are often used to assess current student understanding although conceptual change models are problematic. Due to controversies with conceptual change models and the realities of student assessment, it is important that concept inventories are evaluated using a variety of theoretical models to improve quality. This study used a…
Bao, Le; Gu, Hong; Dunn, Katherine A; Bielawski, Joseph P
2007-02-08
Models of codon evolution have proven useful for investigating the strength and direction of natural selection. In some cases, a priori biological knowledge has been used successfully to model heterogeneous evolutionary dynamics among codon sites. These are called fixed-effect models, and they require that all codon sites be assigned to one of several partitions, each permitted to have independent parameters for selection pressure, evolutionary rate, transition/transversion ratio, or codon frequencies. For single-gene analysis, partitions might be defined according to protein tertiary structure; for multiple-gene analysis, according to a gene's functional category. Given a set of related fixed-effect models, the task of selecting the model that best fits the data is not trivial. In this study, we implement a set of fixed-effect codon models which allow for different levels of heterogeneity among partitions in the substitution process. We describe strategies for selecting among these models by a backward elimination procedure, Akaike information criterion (AIC) or a corrected Akaike information criterion (AICc). We evaluate the performance of these model selection methods via a simulation study, and make several recommendations for real data analysis. Our simulation study indicates that the backward elimination procedure can provide a reliable method for model selection in this setting. We also demonstrate the utility of these models by application to a single-gene dataset partitioned according to tertiary structure (abalone sperm lysin), and a multi-gene dataset partitioned according to the functional category of the gene (flagellar-related proteins of Listeria). Fixed-effect models have advantages and disadvantages. They are desirable when data partitions are known to exhibit significant heterogeneity or when a statistical test of such heterogeneity is desired.
They have the disadvantage of requiring a priori knowledge for partitioning sites. We recommend: (i) selecting models by backward elimination rather than by AIC or AICc; (ii) using a stringent cut-off, e.g., p = 0.0001; and (iii) conducting a sensitivity analysis of the results. With thoughtful application, fixed-effect codon models should provide a useful tool for large-scale multi-gene analyses.
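For reference, AIC and its small-sample correction AICc are simple functions of a model's maximized log-likelihood and parameter count. The sketch below compares two hypothetical nested fixed-effect models; the log-likelihood values, parameter counts and model names are invented for illustration, not taken from the study.

```python
def aic(loglik, k):
    """Akaike information criterion: 2k - 2 * log-likelihood."""
    return 2 * k - 2 * loglik

def aicc(loglik, k, n):
    """Small-sample corrected AIC; n is the number of observations (sites)."""
    return aic(loglik, k) + (2 * k * (k + 1)) / (n - k - 1)

# Hypothetical fits: M0 shares selection pressure across partitions,
# M1 gives each partition its own omega (two extra parameters).
models = {"M0_shared": (-1520.4, 4), "M1_free_omega": (-1512.9, 6)}
n_sites = 300
scores = {name: aicc(ll, k, n_sites) for name, (ll, k) in models.items()}
best = min(scores, key=scores.get)  # lower AICc is better
```

Here the 7.5-unit log-likelihood gain of M1 outweighs its two extra parameters, so AICc selects it; a backward elimination procedure would instead test whether freeing the parameters yields a significant likelihood-ratio improvement.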
A polynomial based model for cell fate prediction in human diseases.
Ma, Lichun; Zheng, Jie
2017-12-21
Cell fate regulation directly affects tissue homeostasis and human health. Research on cell fate decisions sheds light on key regulators, facilitates understanding of the mechanisms, and suggests novel strategies to treat human diseases that are related to abnormal cell development. In this study, we proposed a polynomial based model to predict cell fate. This model was derived from the Taylor series. As a case study, gene expression data of pancreatic cells were adopted to test and verify the model. As numerous features (genes) are available, we employed two kinds of feature selection methods, i.e., correlation-based and apoptosis-pathway-based. Then polynomials of different degrees were used to refine the cell fate prediction function. 10-fold cross-validation was carried out to evaluate the performance of our model. In addition, we analyzed the stability of the resultant cell fate prediction model by evaluating the ranges of the parameters, as well as assessing the variances of the predicted values at randomly selected points. Results show that, for both of the considered gene selection methods, the prediction accuracies of polynomials of different degrees show little difference. Interestingly, the linear polynomial (degree 1) is more stable than the others. When comparing the linear polynomials based on the two gene selection methods, although the accuracy of the one using the correlation analysis outcomes is slightly higher (86.62%), the one based on genes of the apoptosis pathway is much more stable. Considering both the prediction accuracy and the stability of polynomial models of different degrees, the linear model is the preferred choice for cell fate prediction with gene expression data of pancreatic cells. The presented cell fate prediction model can be extended to other cells, which may be important for basic research as well as for clinical studies of cell development related diseases.
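The degree-selection step can be sketched as choosing the polynomial degree with the lowest k-fold cross-validated error. Everything below is a synthetic stand-in (random data in place of pancreatic-cell gene expression), not the authors' pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in: one expression feature x vs. a cell-fate score y,
# with a truly linear relationship plus noise.
x = rng.uniform(-1, 1, 120)
y = 2.0 * x + 0.3 + rng.normal(0, 0.1, x.size)

def cv_mse(x, y, degree, folds=10):
    """Mean squared prediction error of a degree-d polynomial under k-fold CV."""
    idx = np.arange(x.size)
    fold_errs = []
    for f in range(folds):
        test = idx % folds == f
        coef = np.polyfit(x[~test], y[~test], degree)
        pred = np.polyval(coef, x[test])
        fold_errs.append(np.mean((y[test] - pred) ** 2))
    return float(np.mean(fold_errs))

scores = {d: cv_mse(x, y, d) for d in (1, 2, 3, 4)}
best_degree = min(scores, key=scores.get)
```

With a genuinely linear relationship, the higher-degree fits buy little or nothing in cross-validated error, mirroring the abstract's finding that the linear polynomial is preferable.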
Crossa, José; Campos, Gustavo de Los; Pérez, Paulino; Gianola, Daniel; Burgueño, Juan; Araus, José Luis; Makumbi, Dan; Singh, Ravi P; Dreisigacker, Susanne; Yan, Jianbing; Arief, Vivi; Banziger, Marianne; Braun, Hans-Joachim
2010-10-01
The availability of dense molecular markers has made possible the use of genomic selection (GS) for plant breeding. However, the evaluation of models for GS in real plant populations is very limited. This article evaluates the performance of parametric and semiparametric models for GS using wheat (Triticum aestivum L.) and maize (Zea mays) data in which different traits were measured in several environmental conditions. The findings, based on extensive cross-validations, indicate that models including marker information had higher predictive ability than pedigree-based models. In the wheat data set, and relative to a pedigree model, gains in predictive ability due to inclusion of markers ranged from 7.7 to 35.7%. Correlation between observed and predicted values in the maize data set achieved values up to 0.79. Estimates of marker effects were different across environmental conditions, indicating that genotype × environment interaction is an important component of genetic variability. These results indicate that GS in plant breeding can be an effective strategy for selecting among lines whose phenotypes have yet to be observed.
Climatic Models Ensemble-based Mid-21st Century Runoff Projections: A Bayesian Framework
NASA Astrophysics Data System (ADS)
Achieng, K. O.; Zhu, J.
2017-12-01
A number of North American Regional Climate Change Assessment Program (NARCCAP) climate models have been used to project surface runoff in the mid-21st century. Statistical model selection techniques are often used to select the model that best fits the data; however, different selection techniques often lead to different conclusions. In this study, ten models are averaged in a Bayesian paradigm to project runoff. Bayesian Model Averaging (BMA) is used to project runoff and to identify the effect of model uncertainty on future runoff projections. Baseflow separation, using a two-parameter recursive digital filter (the Eckhardt filter), splits USGS streamflow (total runoff) into two components: baseflow and surface runoff. We use this surface runoff as the a priori runoff when conducting BMA of the runoff simulated by the ten RCM models. The primary objective of this study is to evaluate how well RCM multi-model ensembles simulate surface runoff in a Bayesian framework. Specifically, we investigate and discuss the following questions: How well does the ten-model RCM ensemble jointly simulate surface runoff when averaged over all models using BMA, given the a priori surface runoff? What are the effects of model uncertainty on surface runoff simulation?
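The Eckhardt filter mentioned above is a two-parameter recursive digital filter. A minimal sketch follows; the parameter values are common literature defaults and the hydrograph is a toy series, neither taken from the study.

```python
def eckhardt_baseflow(q, alpha=0.98, bfi_max=0.80):
    """Two-parameter Eckhardt filter: split a total-streamflow series q into
    baseflow and surface runoff. alpha is the recession constant and
    bfi_max the maximum baseflow index (typical default values)."""
    b = [q[0] * bfi_max]  # initialize baseflow from the first flow value
    for t in range(1, len(q)):
        bt = ((1 - bfi_max) * alpha * b[-1] + (1 - alpha) * bfi_max * q[t]) \
             / (1 - alpha * bfi_max)
        b.append(min(bt, q[t]))  # baseflow can never exceed total flow
    surface = [qt - bt for qt, bt in zip(q, b)]
    return b, surface

# Toy hydrograph: a storm peak decaying back toward base conditions.
baseflow, runoff = eckhardt_baseflow([5.0, 20.0, 15.0, 10.0, 8.0, 6.0, 5.0])
```

The surface-runoff component returned here is what would serve as the a priori runoff series in the BMA step.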
NASA Astrophysics Data System (ADS)
Fogel, Gary B.; Cheung, Mars; Pittman, Eric; Hecht, David
2008-01-01
Modeling studies were performed on known inhibitors of the quadruple-mutant Plasmodium falciparum dihydrofolate reductase (DHFR). GOLD was used to dock 32 pyrimethamine derivatives into the active site of DHFR obtained from the x-ray crystal structure 1J3K.pdb. Several scoring functions were evaluated, and the Molegro Protein-Ligand Interaction Score was determined to have one of the best correlations with experimental pKi. In conjunction with Protein-Ligand Interaction scores, predicted binding modes and key protein-ligand interactions were evaluated and analyzed in order to develop criteria for selecting compounds having a greater chance of activity against resistant strains of Plasmodium falciparum. This methodology will be used in future studies for the selection of compounds for focused screening libraries.
Color model comparative analysis for breast cancer diagnosis using H and E stained images
NASA Astrophysics Data System (ADS)
Li, Xingyu; Plataniotis, Konstantinos N.
2015-03-01
Digital cancer diagnosis is a research realm where signal processing techniques are used to analyze and classify color histopathology images. Unlike grayscale image analysis of magnetic resonance imaging or X-ray, colors in histopathology images convey a large amount of histological information and thus play a significant role in cancer diagnosis. Though color information is widely used in histopathology work, to date there are few studies on color model selection for feature extraction in cancer diagnosis schemes. This paper addresses the problem of color space selection for digital cancer classification using H and E stained images, and investigates the effectiveness of various color models (RGB, HSV, CIE L*a*b*, and a stain-dependent H and E decomposition model) in breast cancer diagnosis. In particular, we build a diagnosis framework as a comparison benchmark and take specific concerns of medical decision systems into account in the evaluation. The evaluation methodologies include feature discriminative power evaluation and final diagnosis performance comparison. Experimentation on a publicly accessible histopathology image set suggests that the H and E decomposition model outperforms the other assessed color spaces. As for the reasons behind the varying performance of the color spaces, our analysis via mutual information estimation demonstrates that the color components in the H and E model are less dependent, and thus most of the discriminative power is collected in one channel instead of spreading out among channels as in the other color spaces.
Initial proposition of kinematics model for selected karate actions analysis
NASA Astrophysics Data System (ADS)
Hachaj, Tomasz; Koptyra, Katarzyna; Ogiela, Marek R.
2017-03-01
The motivation for this paper is to initially propose and evaluate two new kinematics models developed to describe motion capture (MoCap) data of karate techniques. We developed this novel proposition to create a model capable of handling action descriptions from both multimedia and professional MoCap hardware. For the evaluation we used 25-joint data of karate technique recordings acquired with Kinect version 2, consisting of MoCap recordings of two professional sport (black belt) instructors and masters of Oyama Karate. We selected the following actions for initial analysis: left-handed furi-uchi punch, right-leg hiza-geri kick, right-leg yoko-geri kick and left-handed jodan-uke block. Based on our evaluation, we conclude that both proposed kinematics models appear to be a convenient method for describing karate actions. Of the two proposed variable models, the global one seems more useful for further work: in the case of the considered punches its variables appear less correlated, and they may also be easier to interpret because of the single reference coordinate system. Principal components analysis also proved to be a reliable way to examine the quality of the kinematics models, and plots of the variables in principal component space nicely present the dependences between variables.
Vafaee Sharbaf, Fatemeh; Mosafer, Sara; Moattar, Mohammad Hossein
2016-06-01
This paper proposes an approach for gene selection in microarray data. The proposed approach consists of a primary filter stage using the Fisher criterion, which reduces the initial genes and hence the search space and time complexity. Then, a wrapper approach based on cellular learning automata (CLA) optimized with the ant colony method (ACO) is used to find the set of features that improve the classification accuracy. CLA is applied due to its capability to learn and model complicated relationships. The features selected in the last phase are evaluated using the ROC curve, and the most effective yet smallest feature subset is determined. The classifiers evaluated in the proposed framework are K-nearest neighbor, support vector machine and naïve Bayes. The proposed approach is evaluated on 4 microarray datasets. The evaluations confirm that the proposed approach can find the smallest subset of genes while approaching the maximum accuracy. Copyright © 2016 Elsevier Inc. All rights reserved.
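The primary filter stage can be illustrated with the per-gene Fisher criterion for a two-class problem. The data below are synthetic; the class shift and gene count are arbitrary choices for the sketch.

```python
import numpy as np

def fisher_score(X, y):
    """Fisher criterion per feature for two classes:
    (mean1 - mean0)^2 / (var0 + var1). Higher = more discriminative."""
    X0, X1 = X[y == 0], X[y == 1]
    num = (X1.mean(axis=0) - X0.mean(axis=0)) ** 2
    den = X0.var(axis=0) + X1.var(axis=0)
    return num / den

rng = np.random.default_rng(3)
labels = np.repeat([0, 1], 50)
expr = rng.normal(size=(100, 5))   # 5 synthetic "genes", 100 samples
expr[labels == 1, 0] += 3.0        # gene 0 is made strongly differential
scores = fisher_score(expr, labels)
top_gene = int(np.argmax(scores))
```

Ranking genes by this score and keeping only the top fraction is what shrinks the search space before the CLA/ACO wrapper stage takes over.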
NASA Astrophysics Data System (ADS)
Verfaillie, Deborah; Déqué, Michel; Morin, Samuel; Lafaysse, Matthieu
2017-11-01
We introduce the method ADAMONT v1.0 to adjust and disaggregate daily climate projections from a regional climate model (RCM) using an observational dataset at hourly time resolution. The method uses a refined quantile mapping approach for statistical adjustment and an analogous method for sub-daily disaggregation. The method ultimately produces adjusted hourly time series of temperature, precipitation, wind speed, humidity, and short- and longwave radiation, which can in turn be used to force any energy balance land surface model. While the method is generic and can be employed for any appropriate observation time series, here we focus on the description and evaluation of the method in the French mountainous regions. The observational dataset used here is the SAFRAN meteorological reanalysis, which covers the entire French Alps split into 23 massifs, within which meteorological conditions are provided for several 300 m elevation bands. In order to evaluate the skills of the method itself, it is applied to the ALADIN-Climate v5 RCM using the ERA-Interim reanalysis as boundary conditions, for the time period from 1980 to 2010. Results of the ADAMONT method are compared to the SAFRAN reanalysis itself. Various evaluation criteria are used for temperature and precipitation but also snow depth, which is computed by the SURFEX/ISBA-Crocus model using the meteorological driving data from either the adjusted RCM data or the SAFRAN reanalysis itself. The evaluation addresses in particular the time transferability of the method (using various learning/application time periods), the impact of the RCM grid point selection procedure for each massif/altitude band configuration, and the intervariable consistency of the adjusted meteorological data generated by the method. Results show that the performance of the method is satisfactory, with similar or even better evaluation metrics than alternative methods. 
However, results for air temperature are generally better than for precipitation. Results in terms of snow depth are satisfactory, which can be viewed as indicating a reasonably good intervariable consistency of the meteorological data produced by the method. In terms of temporal transferability (evaluated over time periods of 15 years only), results depend on the learning period. In terms of grid point selection, a complex technique that takes into account horizontal as well as altitudinal proximity to SAFRAN massif centre point/altitude couples generally degrades evaluation metrics at high altitudes compared with a simpler selection method based on horizontal distance alone.
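The quantile-mapping adjustment at the heart of such methods (ADAMONT uses a refined variant) can be sketched empirically: each model value is replaced by the observed value at the same empirical quantile of the calibration period. All data below are synthetic; a model series with a constant +2 bias is mapped back onto the observed distribution.

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_fut):
    """Empirical quantile mapping: replace each future model value by the
    observed value at the same empirical quantile of the calibration period."""
    q = np.searchsorted(np.sort(model_hist), model_fut, side="right")
    q = np.clip(q / len(model_hist), 0.0, 1.0)
    return np.quantile(obs_hist, q)

# Toy check: the model runs a constant +2 warm bias over the calibration period.
obs_hist = np.linspace(10.0, 20.0, 101)   # "observed" temperatures
model_hist = obs_hist + 2.0               # biased model, same distribution shape
adjusted = quantile_map(model_hist, obs_hist, np.array([14.5, 17.0]))
# adjusted lies close to [12.5, 15.0]: the constant bias is removed
```

A refined implementation would, among other things, map each variable and season separately and handle values outside the calibration range; this sketch only shows the core transfer.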
Evaluating the Impact of Aerosols on Numerical Weather Prediction
NASA Astrophysics Data System (ADS)
Freitas, Saulo; Silva, Arlindo; Benedetti, Angela; Grell, Georg; Members, Wgne; Zarzur, Mauricio
2015-04-01
The Working Group on Numerical Experimentation (WMO, http://www.wmo.int/pages/about/sec/rescrosscut/resdept_wgne.html) has organized an exercise to evaluate the impact of aerosols on NWP. This exercise involves regional and global models currently used for weather forecasting by operational centers worldwide and aims at addressing the following questions: a) How important are aerosols for predicting the physical system (NWP, seasonal, climate) as distinct from predicting the aerosols themselves? b) How important is atmospheric model quality for air quality forecasting? c) What are the current capabilities of NWP models to simulate aerosol impacts on weather prediction? Toward this goal we have selected 3 strong or persistent events of aerosol pollution worldwide that could be fairly represented in current NWP models and that allowed for an evaluation of the aerosol impact on weather prediction. The selected events include a strong dust storm that blew off the coast of Libya and over the Mediterranean, an extremely severe episode of air pollution in Beijing and surrounding areas, and an extreme case of biomass burning smoke in Brazil. The experimental design calls for simulations with and without explicitly accounting for aerosol feedbacks in the cloud and radiation parameterizations. In this presentation we summarize the results of this study, focusing on the evaluation of model performance in terms of its ability to faithfully simulate aerosol optical depth, and on the assessment of the aerosol impact on predictions of near-surface wind, temperature, humidity, rainfall and the surface energy budget.
The effect of mis-specification on mean and selection between the Weibull and lognormal models
NASA Astrophysics Data System (ADS)
Jia, Xiang; Nadarajah, Saralees; Guo, Bo
2018-02-01
The lognormal and Weibull models are commonly used to analyse data. Although selection procedures have been extensively studied, it is possible that the lognormal model is selected when the true model is Weibull, or vice versa. As the mean is important in applications, we focus on the effect of mis-specification on the mean. The effect on the lognormal mean is first considered when a lognormal sample is wrongly fitted by a Weibull model. The maximum likelihood estimate (MLE) and quasi-MLE (QMLE) of the lognormal mean are obtained based on the lognormal and Weibull models, respectively. Then, the impact is evaluated by computing the ratio of biases and the ratio of mean squared errors (MSEs) between the MLE and QMLE. For completeness, the theoretical results are demonstrated by simulation studies. Next, the effect of the reverse mis-specification on the Weibull mean is discussed. It is found that the ratio of biases and the ratio of MSEs are independent of the location and scale parameters of the lognormal and Weibull models. The influence can be ignored if certain special conditions hold. Finally, a model selection method is proposed by comparing the ratios concerning biases and MSEs. We also present published data to illustrate the study.
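The two estimators of the mean can be sketched for a lognormal sample: the MLE from the correct lognormal fit, exp(mu + sigma^2/2), and the QMLE from a Weibull fit, lambda * Gamma(1 + 1/k), whose shape k is found here by a damped version of the usual fixed-point iteration. The sample size, seed and damping factor are illustrative choices, not the paper's settings.

```python
import math
import random

def lognormal_mean_mle(x):
    """MLE of the lognormal mean exp(mu + sigma^2 / 2)."""
    logs = [math.log(v) for v in x]
    mu = sum(logs) / len(logs)
    s2 = sum((l - mu) ** 2 for l in logs) / len(logs)
    return math.exp(mu + s2 / 2)

def weibull_qmle_mean(x, iters=200):
    """Quasi-MLE of the mean under a (possibly wrong) Weibull fit:
    shape k by damped fixed-point iteration, then mean = lam * Gamma(1 + 1/k)."""
    lx = [math.log(v) for v in x]
    mean_lx = sum(lx) / len(lx)
    k = 1.0
    for _ in range(iters):
        xk = [v ** k for v in x]
        a = sum(w * l for w, l in zip(xk, lx)) / sum(xk)
        k = 0.5 * k + 0.5 / (a - mean_lx)  # damped update for stability
    lam = (sum(v ** k for v in x) / len(x)) ** (1.0 / k)
    return lam * math.gamma(1.0 + 1.0 / k)

random.seed(1)
sample = [random.lognormvariate(0.0, 0.5) for _ in range(2000)]
true_mean = math.exp(0.5 ** 2 / 2)  # exact lognormal(0, 0.5) mean, about 1.133
```

Comparing the two estimates against the true mean over repeated samples is what yields the ratio-of-biases and ratio-of-MSEs quantities studied in the paper.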
Müller, Aline Lima Hermes; Picoloto, Rochele Sogari; de Azevedo Mello, Paola; Ferrão, Marco Flores; de Fátima Pereira dos Santos, Maria; Guimarães, Regina Célia Lourenço; Müller, Edson Irineu; Flores, Erico Marlon Moraes
2012-04-01
Total sulfur concentration was determined in atmospheric residue (AR) and vacuum residue (VR) samples obtained from the petroleum distillation process by Fourier transform infrared spectroscopy with attenuated total reflectance (FT-IR/ATR) in association with chemometric methods. The calibration and prediction sets consisted of 40 and 20 samples, respectively. Calibration models were developed using two variable selection methods: interval partial least squares (iPLS) and synergy interval partial least squares (siPLS). Different treatments and pre-processing steps were also evaluated for the development of the models. Pre-treatment based on multiplicative scatter correction (MSC) and mean-centered data were selected for model construction. The use of siPLS as the variable selection method provided a model with root mean square error of prediction (RMSEP) values significantly better than those obtained by a PLS model using all variables. The best model was obtained using the siPLS algorithm with spectra divided into 20 intervals and combinations of 3 intervals (911-824, 823-736 and 737-650 cm(-1)). This model produced an RMSECV of 400 mg kg(-1) S and an RMSEP of 420 mg kg(-1) S, with a correlation coefficient of 0.990. Copyright © 2011 Elsevier B.V. All rights reserved.
Zhang, Ying-Ying; Zhou, Xiao-Bin; Wang, Qiu-Zhen; Zhu, Xiao-Yan
2017-05-01
Multivariable logistic regression (MLR) has been increasingly used in Chinese clinical medical research during the past few years. However, few evaluations of the quality of the reporting strategies in these studies are available. Our aims were to evaluate the reporting quality and model accuracy of MLR used in published work, and to offer related advice for authors, readers, reviewers, and editors. A total of 316 articles published in 5 leading Chinese clinical medical journals with high impact factors from January 2010 to July 2015 were selected for evaluation. Articles were evaluated according to 12 established criteria for proper use and reporting of MLR models. Among the articles, the highest quality score was 9, the lowest 1, and the median 5 (4-5). A total of 85.1% of the articles scored below 6. No significant differences were found among the journals with respect to quality score (χ² = 6.706, P = .15). More than 50% of the articles met the following 5 criteria: complete identification of the statistical software application that was used (97.2%), calculation of the odds ratio and its confidence interval (86.4%), description of sufficient events (>10) per variable, selection of variables, and fitting procedure (78.2%, 69.3%, and 58.5%, respectively). Fewer than 35% of the articles reported the coding of variables (18.7%). The remaining 5 criteria were not satisfied by a sufficient number of articles: goodness-of-fit (10.1%), interactions (3.8%), checking for outliers (3.2%), collinearity (1.9%), and participation of statisticians and epidemiologists (0.3%). The criterion of conformity with linear gradients was applicable to 186 articles; however, only 7 (3.8%) mentioned or tested it. The reporting quality and model accuracy of MLR in the selected articles were not satisfactory; in fact, severe deficiencies were noted, and only 1 article scored 9.
We recommend authors, readers, reviewers, and editors to consider MLR models more carefully and cooperate more closely with statisticians and epidemiologists. Journals should develop statistical reporting guidelines concerning MLR.
A randomised approach for NARX model identification based on a multivariate Bernoulli distribution
NASA Astrophysics Data System (ADS)
Bianchi, F.; Falsone, A.; Prandini, M.; Piroddi, L.
2017-04-01
The identification of polynomial NARX models is typically performed by incremental model building techniques. These methods assess the importance of each regressor based on the evaluation of partial individual models, which may ultimately lead to erroneous model selections. A more robust assessment of the significance of a specific model term can be obtained by considering ensembles of models, as done by the RaMSS algorithm. In that context, the identification task is formulated in a probabilistic fashion and a Bernoulli distribution is employed to represent the probability that a regressor belongs to the target model. Samples of the model distribution are then collected to gather reliable information to update it, until convergence to a specific model. The basic RaMSS algorithm employs multiple independent univariate Bernoulli distributions associated with the different candidate model terms, thus overlooking the correlations between terms, which are typically important in the selection process. Here, a multivariate Bernoulli distribution is employed, in which the sampling of a given term is conditioned on the sampling of the others. The added complexity inherent in considering the regressor correlation properties is more than compensated by the achievable improvements in the accuracy of the model selection process.
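A much-simplified sketch of the underlying idea follows: independent Bernoulli inclusion probabilities are updated from fit-weighted samples of candidate models. A plain linear regression stands in for the NARX structure, and a crude size penalty stands in for RaMSS's performance index; the real algorithm and its multivariate extension use different update rules, so treat this only as an illustration of the sampling loop.

```python
import numpy as np

rng = np.random.default_rng(42)
n, terms = 200, 6
X = rng.normal(size=(n, terms))                      # candidate regressors
y = 1.5 * X[:, 0] - 2.0 * X[:, 3] + rng.normal(0, 0.1, n)  # true terms: 0 and 3

p = np.full(terms, 0.5)  # Bernoulli inclusion probabilities, one per term
for _ in range(60):
    masks, scores = [], []
    for _ in range(30):
        mask = rng.random(terms) < p                 # sample a candidate model
        if not mask.any():
            continue
        beta, *_ = np.linalg.lstsq(X[:, mask], y, rcond=None)
        mse = np.mean((y - X[:, mask] @ beta) ** 2)
        masks.append(mask)
        scores.append(np.exp(-(mse + 0.5 * mask.sum())))  # fit + size penalty
    w = np.array(scores) / np.sum(scores)
    freq = np.array(masks, dtype=float).T @ w        # fit-weighted inclusion rate
    p = 0.9 * p + 0.1 * freq                         # smoothed probability update
selected = np.flatnonzero(p > 0.5)
```

Because models containing both true terms fit far better, their fit-weighted inclusion frequencies pull those probabilities toward one while the penalized spurious terms decay, which is the convergence behavior the abstract describes.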
Brenn, T; Arnesen, E
1985-01-01
For comparative evaluation, discriminant analysis, logistic regression and Cox's model were used to select risk factors for total and coronary deaths among 6595 men aged 20-49 followed for 9 years. Groups with mortality between 5 and 93 per 1000 were considered. Discriminant analysis selected variable sets only marginally different from those of the logistic and Cox methods, which always selected the same sets. A time-saving option, offered for both the logistic and Cox selection, showed no advantage over discriminant analysis. When analysing more than 3800 subjects, the logistic and Cox methods consumed, respectively, 80 and 10 times more computer time than discriminant analysis. When the same set of variables was included in non-stepwise analyses, all methods estimated coefficients that were in most cases almost identical. In conclusion, discriminant analysis is advocated for preliminary or stepwise analysis; otherwise Cox's method should be used.
NASA Astrophysics Data System (ADS)
Wang, Quanchao; Yu, Yang; Li, Fuhua; Zhang, Xiaojun; Xiang, Jianhai
2017-09-01
Genomic selection (GS) can be used to accelerate genetic improvement by shortening the selection interval. The successful application of GS depends largely on the accuracy of the prediction of the genomic estimated breeding value (GEBV). This study is a first attempt to assess the practicality of GS in Litopenaeus vannamei and aims to evaluate models for GS on growth traits. The performance of GS models in L. vannamei was evaluated in a population of 205 individuals, which were genotyped for 6359 single nucleotide polymorphism (SNP) markers by specific length amplified fragment sequencing (SLAF-seq) and phenotyped for body length and body weight. Three GS models (RR-BLUP, BayesA, and Bayesian LASSO) were used to obtain the GEBVs, and their predictive ability was assessed by the reliability of the GEBVs and the bias of the predicted phenotypes. The mean reliability of the GEBVs for body length and body weight predicted by the different models was 0.296 and 0.411, respectively. For each trait, the performances of the three models were very similar to each other with respect to predictive ability. The regression coefficients estimated by the three models were close to one, suggesting near-zero bias for the predictions. Therefore, when GS was applied in a L. vannamei population under the studied scenarios, all three models appeared practicable. Further analyses suggested that improved genomic prediction could be realized by increasing the size of the training population as well as the density of SNPs.
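Of the three models, RR-BLUP has the simplest closed form: the marker effects solve a ridge system on centred genotypes, and GEBVs are the resulting genomic values. The sketch below uses simulated genotypes; the dimensions, seed and shrinkage parameter are illustrative, not those of the shrimp dataset.

```python
import numpy as np

rng = np.random.default_rng(7)
n_ind, n_snp = 150, 400
M = rng.integers(0, 3, size=(n_ind, n_snp)).astype(float)  # 0/1/2 genotypes
u_true = np.zeros(n_snp)
u_true[:20] = rng.normal(0.0, 0.2, 20)                     # 20 causal SNPs
y = M @ u_true + rng.normal(0.0, 1.0, n_ind)               # simulated phenotype

def rrblup(M, y, lam):
    """RR-BLUP marker effects: u = (Mc'Mc + lam*I)^-1 Mc'yc on centred data."""
    Mc = M - M.mean(axis=0)
    yc = y - y.mean()
    return np.linalg.solve(Mc.T @ Mc + lam * np.eye(M.shape[1]), Mc.T @ yc)

train = np.arange(n_ind) < 120                             # 120 train / 30 test
u_hat = rrblup(M[train], y[train], lam=100.0)
gebv_test = (M[~train] - M[train].mean(axis=0)) @ u_hat    # GEBVs for test set
accuracy = np.corrcoef(gebv_test, y[~train])[0, 1]
```

In practice the shrinkage parameter is tied to the ratio of residual to marker variance rather than fixed, and reliability is estimated by cross-validation over many such train/test splits.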
POST-PROCESSING ANALYSIS FOR THC SEEPAGE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Y. SUN
This report describes the selection of water compositions for the total system performance assessment (TSPA) model of results from the thermal-hydrological-chemical (THC) seepage model documented in ''Drift-Scale THC Seepage Model'' (BSC 2004 [DIRS 169856]). The selection has been conducted in accordance with ''Technical Work Plan for: Near-Field Environment and Transport: Coupled Processes (Mountain-Scale TH/THC/THM, Drift-Scale THC Seepage, and Post-Processing Analysis for THC Seepage) Report Integration'' (BSC 2004 [DIRS 171334]). This technical work plan (TWP) was prepared in accordance with AP-2.27Q, ''Planning for Science Activities''. Section 1.2.3 of the TWP describes planning information pertaining to the technical scope, content, and management of this report. The post-processing analysis for THC seepage (THC-PPA) documented in this report provides a methodology for evaluating the near-field compositions of water and gas around a typical waste emplacement drift as these relate to the chemistry of seepage, if any, into the drift. The THC-PPA inherits the conceptual basis of the THC seepage model, but is an independently developed process. The relationship between the post-processing analysis and other closely related models, together with their main functions in providing seepage chemistry information for the Total System Performance Assessment for the License Application (TSPA-LA), are illustrated in Figure 1-1. The THC-PPA provides a data selection concept and direct input to the physical and chemical environment (P&CE) report that supports the TSPA model. The purpose of the THC-PPA is further discussed in Section 1.2. The data selection methodology of the post-processing analysis (Section 6.2.1) was initially applied to results of the THC seepage model as presented in ''Drift-Scale THC Seepage Model'' (BSC 2004 [DIRS 169856]).
Other outputs from the THC seepage model (DTN: LB0302DSCPTHCS.002 [DIRS 161976]) used in the P&CE (BSC 2004 [DIRS 169860], Section 6.6) were also subjected to the same initial selection. The present report serves as a full documentation of this selection and also provides additional analyses in support of the choice of waters selected for further evaluation in ''Engineered Barrier System: Physical and Chemical Environment'' (BSC 2004 [DIRS 169860], Section 6.6). The work scope for the studies presented in this report is described in the TWP (BSC 2004 [DIRS 171334]) and other documents cited above and can be used to estimate water and gas compositions near waste emplacement drifts. Results presented in this report were submitted to the Technical Data Management System (TDMS) under specific data tracking numbers (DTNs) as listed in Appendix A. The major change from previous selection of results from the THC seepage model is that the THC-PPA now considers data selection in space around the modeled waste emplacement drift, tracking the evolution of pore-water and gas-phase composition at the edge of the dryout zone around the drift. This post-processing analysis provides a scientific background for the selection of potential seepage water compositions.
Kwon, Tae-Rin; Choi, Eun Ja; Oh, Chang Taek; Bak, Dong-Ho; Im, Song-I; Ko, Eun Jung; Hong, Hyuck Ki; Choi, Yeon Shik; Seok, Joon; Choi, Sun Young; Ahn, Gun Young; Kim, Beom Joon
2017-04-01
Many studies have investigated the application of micro-insulated needles with radio frequency (RF) to treat acne in humans; however, the use of a micro-insulated needle RF applicator has not yet been studied in an animal model. The purpose of this study was to evaluate the effectiveness of a micro-insulated needle RF applicator in a rabbit ear acne (REA) model. In this study, we investigated the effect of selectively destroying the sebaceous glands using a micro-insulated needle RF applicator on the formation of comedones induced by application of 50% oleic acid and intradermal injection of P. acnes in the orifices of the external auditory canals of rabbits. The effects of the micro-insulated needle RF applicator treatment were evaluated using regular digital photography in addition to 3D Primos imaging evaluation, Skin Visio Meter microscopic photography, and histologic analyses. Use of the micro-insulated needle RF applicator resulted in successful selective destruction of the sebaceous glands and attenuated TNF-alpha release in an REA model. The mechanisms by which micro-insulated needles with RF at 1 MHz exert their effects may involve inhibition of comedone formation, triggering of the wound healing process, and destruction of the sebaceous glands and papules. The use of micro-insulated needle RF applicators provides a safe and effective method for improving the appearance of symptoms in an REA model. The current in vivo study confirms that the micro-insulated needle RF applicator selectively destroys the sebaceous glands. Lasers Surg. Med. 49:395-401, 2017. © 2016 Wiley Periodicals, Inc.
Chaikh, Abdulhamid; Calugaru, Valentin; Bondiau, Pierre-Yves; Thariat, Juliette; Balosso, Jacques
2018-06-07
The aim of this study is to evaluate the impact of normal tissue complication probability (NTCP)-based radiobiological models on the estimated risk of late radiation lung damage. The second goal is to propose a medical decision-making approach to select eligible patients for particle therapy. 14 pediatric patients undergoing cranio-spinal irradiation were evaluated. For each patient, two treatment plans were generated using photon and proton therapy with the same dose prescriptions. Late radiation damage to the lung was estimated using three NTCP concepts: the Lyman-Kutcher-Burman (LKB) model, the equivalent uniform dose (EUD), and the mean lung dose according to the quantitative analysis of normal tissue effects in the clinic (QUANTEC) review. The Wilcoxon paired test was used to calculate p-values. Proton therapy achieved lower lung EUD (Gy). The average NTCP values were significantly lower with proton plans, p < 0.05, using all three NTCP concepts. However, applying the same TD50/5 using radiobiological models to compare NTCP from proton and photon therapy, the ΔNTCP was not a convincing measure of the potential benefit of proton therapy. Late radiation pneumonitis estimated from the mean lung dose model correlated better with QUANTEC data. Treatment effectiveness assessed on NTCP reduction depends on radiobiological predictions and the parameters used as inputs for in silico evaluation. Since estimates of absolute NTCP values from LKB and GN models are imprecise because EUD ≪ TD50/5, a reduction of the EUD value with proton plans would better predict a reduction of dose/toxicity. The EUD concept appears to be a robust radiobiological surrogate of the dose distribution for selecting the optimal patient plan.
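The EUD and LKB concepts used above can be sketched as follows. This is a generic textbook formulation, not the study's code: EUD is the generalized mean of the dose-volume histogram with tissue parameter a, and the LKB NTCP is the cumulative normal of t = (EUD - TD50)/(m·TD50). The DVH bins and parameter values in any call would be illustrative assumptions.

```python
import math

def eud(dose_bins, a):
    """Equivalent uniform dose from a DVH given as (fractional_volume, dose_Gy) pairs."""
    return sum(v * d ** a for v, d in dose_bins) ** (1.0 / a)

def lkb_ntcp(eud_gy, td50, m):
    """Lyman-Kutcher-Burman NTCP: probit (cumulative normal) of the scaled EUD deficit."""
    t = (eud_gy - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))
```

With a = 1 the EUD reduces to the mean lung dose, which connects the LKB and mean-dose concepts compared in the abstract; when EUD ≪ TD50/5, t is strongly negative and the absolute NTCP sits on the flat tail of the probit curve, which is why small NTCP differences there are imprecise.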
An Integrated Model for Supplier Selection for a High-Tech Manufacturer
NASA Astrophysics Data System (ADS)
Lee, Amy H. I.; Kang, He-Yau; Lin, Chun-Yu
2011-11-01
Global competitiveness has become the biggest concern of manufacturing companies, especially in high-tech industries. Improving competitive edges in an environment with rapidly changing technological innovations and dynamic customer needs is essential for a firm to survive and to acquire a decent profit. Thus, the introduction of successful new products is a source of new sales and profits and is a necessity in the intensely competitive international market. After a product is developed, a firm needs the cooperation of upstream suppliers to provide satisfactory components and parts for manufacturing final products. Therefore, the selection of suitable suppliers has also become a very important decision. In this study, an analytical approach is proposed to select the most appropriate critical-part suppliers in order to maintain a high reliability of the supply chain. A fuzzy analytic network process (FANP) model, which incorporates the benefits, opportunities, costs and risks (BOCR) concept, is constructed to evaluate various aspects of suppliers. The proposed model is adopted in a TFT-LCD manufacturer in Taiwan in evaluating the expected performance of suppliers with respect to each important factor, and an overall ranking of the suppliers can be generated as a result.
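FANP builds on classical AHP-style pairwise comparisons. As a minimal sketch of that crisp building block only (not the fuzzy network model or BOCR structure of the paper), the following derives a priority vector via the common geometric-mean approximation and checks judgment consistency; the random-index values are the standard Saaty table.

```python
import math

# Saaty's random consistency index by matrix size (standard published values)
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}

def ahp_weights(A):
    """Priority vector of a pairwise-comparison matrix (geometric-mean approximation)."""
    n = len(A)
    gm = [math.prod(row) ** (1.0 / n) for row in A]
    total = sum(gm)
    return [g / total for g in gm]

def consistency_ratio(A):
    """CR = CI / RI, with lambda_max estimated from the weighted row sums."""
    n = len(A)
    w = ahp_weights(A)
    lam = sum(sum(A[i][j] * w[j] for j in range(n)) / w[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    return ci / RI[n]
```

A CR below about 0.1 is conventionally taken as acceptably consistent before the weights are used to rank suppliers.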
Effects of baseline conditions on the simulated hydrologic response to projected climate change
Koczot, Kathryn M.; Markstrom, Steven L.; Hay, Lauren E.
2011-01-01
Changes in temperature and precipitation projected from five general circulation models, using one late-twentieth-century and three twenty-first-century emission scenarios, were downscaled to three different baseline conditions. Baseline conditions are periods of measured temperature and precipitation data selected to represent twentieth-century climate. The hydrologic effects of the climate projections are evaluated using the Precipitation-Runoff Modeling System (PRMS), a watershed hydrology simulation model. The Almanor Catchment in the North Fork of the Feather River basin, California, is used as a case study. Differences and similarities between PRMS simulations of hydrologic components (i.e., snowpack formation and melt, evapotranspiration, and streamflow) are examined, and results indicate that the selection of a specific time period used for baseline conditions has a substantial effect on some, but not all, hydrologic variables. This effect appears to be amplified in hydrologic variables that accumulate over time, such as soil-moisture content. Results also indicate that uncertainty related to the selection of baseline conditions should be evaluated using a range of different baseline conditions. This is particularly important for studies in basins with highly variable climate, such as the Almanor Catchment.
Enhanced Fuzzy-OWA model for municipal solid waste landfill site selection
NASA Astrophysics Data System (ADS)
Ahmad, Siti Zubaidah; Ahamad, Mohd Sanusi S.; Yusoff, Mohd Suffian; Abujayyab, Sohaib K. M.
2017-10-01
In Malaysia, the municipal solid waste landfill is an essential facility whose demand is steadily increasing, and candidate sites must be carefully evaluated. Growing waste generation forces the government to provide appropriate sites for waste disposal. However, selecting new landfill sites is a difficult task given land scarcity and the time the process consumes, and it becomes more complicated when numerous criteria must be considered. Therefore, this paper demonstrates the significance of the fuzzy logic-ordered weighted average (Fuzzy-OWA) model for landfill site suitability analysis. The model was developed to generalize the multi-criteria combination and was extended to GIS applications as part of the decision support module. OWA can implement different combination operators through the selection of appropriate order weights, which makes it possible to change the form of aggregation among minimum, intermediate and maximum types of combination. OWA yields six forms of aggregation results, each with a specific significance, which together evaluate the environmental, physical and socio-economic (EPSE) criteria. Nevertheless, one of the aggregated results was similar to that of the weighted linear combination (WLC) method.
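The OWA operator at the heart of the model above can be sketched in a few lines. This is the standard definition, not the paper's GIS implementation: order weights are applied to the criteria scores after ranking them, so the same weight vector behaves as a minimum, maximum, or averaging operator depending on where its mass sits.

```python
def owa(scores, order_weights):
    """Ordered weighted average: weights apply to ranked scores, not to criteria."""
    ranked = sorted(scores, reverse=True)
    return sum(w * s for w, s in zip(order_weights, ranked))

# Hypothetical suitability scores for one candidate cell on three criteria
scores = [0.2, 0.9, 0.5]
optimistic  = owa(scores, [1.0, 0.0, 0.0])  # acts like max
pessimistic = owa(scores, [0.0, 0.0, 1.0])  # acts like min
neutral     = owa(scores, [1/3, 1/3, 1/3])  # acts like the plain mean (WLC-like)
```

The equal-weight case coinciding with a plain average illustrates why one of the six aggregation forms in the study resembled the WLC result.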
The Modular Modeling System (MMS): User's Manual
Leavesley, G.H.; Restrepo, Pedro J.; Markstrom, S.L.; Dixon, M.; Stannard, L.G.
1996-01-01
The Modular Modeling System (MMS) is an integrated system of computer software that has been developed to provide the research and operational framework needed to support development, testing, and evaluation of physical-process algorithms and to facilitate integration of user-selected sets of algorithms into operational physical-process models. MMS uses a module library that contains modules for simulating a variety of water, energy, and biogeochemical processes. A model is created by selectively coupling the most appropriate modules from the library to create a 'suitable' model for the desired application. Where existing modules do not provide appropriate process algorithms, new modules can be developed. The MMS user's manual provides installation instructions and a detailed discussion of system concepts, module development, and model development and application using the MMS graphical user interface.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Szoka de Valladares, M.R.; Mack, S.
The DOE Hydrogen Program needs to develop criteria as part of a systematic evaluation process for proposal identification, evaluation and selection. The H Scan component of this process provides a framework in which project proposers can fully describe their candidate technology system and its components. The H Scan complements traditional methods of capturing cost and technical information. It consists of a special set of survey forms designed to elicit information so expert reviewers can assess the proposal relative to DOE-specified selection criteria. The Analytic Hierarchy Process (AHP) component of the decision process assembles the management-defined evaluation and selection criteria into a coherent multi-level decision construct by which projects can be evaluated in pair-wise comparisons. The AHP model will reflect management's objectives, and it will assist in the ranking of individual projects based on the extent to which each contributes to those objectives. This paper contains a detailed description of the products and activities associated with the planning and evaluation process: the objectives or criteria; the H Scan; and the Analytic Hierarchy Process (AHP).
Two-Stage Modeling of Formaldehyde-Induced Tumor Incidence in the Rat—analysis of Uncertainties
This work extends the 2-stage cancer modeling of tumor incidence in formaldehyde-exposed rats carried out at the CIIT Centers for Health Research. We modify key assumptions, evaluate the effect of selected uncertainties, and develop confidence bounds on parameter estimates. Th...
One approach to predictive modeling of biological contamination of recreational waters and drinking water sources involves applying process-based models that consider microbial sources, hydrodynamic transport, and microbial fate. Fecal indicator bacteria such as enterococci have ...
A Statistical Decision Model for Periodical Selection for a Specialized Information Center
ERIC Educational Resources Information Center
Dym, Eleanor D.; Shirey, Donald L.
1973-01-01
An experiment is described which attempts to define a quantitative methodology for the identification and evaluation of all possibly relevant periodical titles containing toxicological-biological information. A statistical decision model was designed and employed, along with yes/no criteria questions, a training technique and a quality control…
Andrzejewska, Anna; Kaczmarski, Krzysztof; Guiochon, Georges
2009-02-13
The adsorption isotherms of selected compounds are our main source of information on the mechanisms of adsorption processes. Thus, the selection of the methods used to determine adsorption isotherm data and to evaluate the errors made is critical. Three chromatographic methods were evaluated, frontal analysis (FA), frontal analysis by characteristic point (FACP), and the pulse or perturbation method (PM), and their accuracies were compared. Using the equilibrium-dispersive (ED) model of chromatography, breakthrough curves of single components were generated corresponding to three different adsorption isotherm models: the Langmuir, the bi-Langmuir, and the Moreau isotherms. For each breakthrough curve, the best conventional procedures of each method (FA, FACP, PM) were used to calculate the corresponding data point, using typical values of the parameters of each isotherm model, for four different values of the column efficiency (N=500, 1000, 2000, and 10,000). Then, the data points were fitted to each isotherm model and the corresponding isotherm parameters were compared to those of the initial isotherm model. When isotherm data are derived with a chromatographic method, they may suffer from two types of errors: (1) the errors made in deriving the experimental data points from the chromatographic records; (2) the errors made in selecting an incorrect isotherm model and fitting to it the experimental data. Both errors decrease significantly with increasing column efficiency with FA and FACP, but not with PM.
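The isotherm models compared above can be written out directly. This is a minimal sketch (not the ED simulation framework of the paper): the Langmuir and bi-Langmuir equations, plus a sum-of-squared-errors criterion of the kind used when fitting data points to a candidate model; all parameter values in the example are illustrative.

```python
def langmuir(c, qs, b):
    """Langmuir isotherm: q = qs*b*C / (1 + b*C)."""
    return qs * b * c / (1.0 + b * c)

def bilangmuir(c, qs1, b1, qs2, b2):
    """Bi-Langmuir isotherm: sum of two independent Langmuir terms."""
    return langmuir(c, qs1, b1) + langmuir(c, qs2, b2)

def sse(model, params, data):
    """Sum of squared errors of a candidate isotherm against (C, q) data points."""
    return sum((q - model(c, *params)) ** 2 for c, q in data)
```

Comparing the SSE of several candidate models on the same data points mirrors the second error source discussed in the abstract: a wrong model can fit moderately well, and the parameter estimates then absorb the model error.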
NASA Astrophysics Data System (ADS)
Wöhling, T.; Schöniger, A.; Geiges, A.; Nowak, W.; Gayler, S.
2013-12-01
The objective selection of appropriate models for realistic simulations of coupled soil-plant processes is a challenging task since the processes are complex, not fully understood at larger scales, and highly non-linear. Also, comprehensive data sets are scarce, and measurements are uncertain. In the past decades, a variety of different models have been developed that exhibit a wide range of complexity regarding their approximation of processes in the coupled model compartments. We present a method for evaluating experimental design for maximum confidence in the model selection task. The method considers uncertainty in parameters, measurements and model structures. Advancing the ideas behind Bayesian Model Averaging (BMA), we analyze the changes in posterior model weights and posterior model choice uncertainty when more data are made available. This allows assessing the power of different data types, data densities and data locations in identifying the best model structure from among a suite of plausible models. The models considered in this study are the crop models CERES, SUCROS, GECROS and SPASS, which are coupled to identical routines for simulating soil processes within the modelling framework Expert-N. The four models considerably differ in the degree of detail at which crop growth and root water uptake are represented. Monte-Carlo simulations were conducted for each of these models considering their uncertainty in soil hydraulic properties and selected crop model parameters. Using a Bootstrap Filter (BF), the models were then conditioned on field measurements of soil moisture, matric potential, leaf-area index, and evapotranspiration rates (from eddy-covariance measurements) during a vegetation period of winter wheat at a field site at the Swabian Alb in Southwestern Germany. Following our new method, we derived model weights when using all data or different subsets thereof. 
We discuss to which degree the posterior mean outperforms the prior mean and all individual posterior models, how informative the data types were for reducing prediction uncertainty of evapotranspiration and deep drainage, and how well the model structure can be identified based on the different data types and subsets. We further analyze the impact of measurement uncertainty and systematic model errors on the effective sample size of the BF and the resulting model weights.
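The posterior model weights at the core of the BMA analysis can be sketched generically: each model's weight is proportional to its (marginal) likelihood times its prior, normalized over the model set. The log-space computation below is a numerically stable simplification and does not reproduce the paper's bootstrap-filter conditioning; the inputs are assumed per-model log-likelihoods.

```python
import math

def bma_weights(log_likelihoods, priors=None):
    """Posterior model weights w_k ∝ p(D|M_k) * p(M_k), computed stably in log space."""
    n = len(log_likelihoods)
    priors = priors or [1.0 / n] * n
    logs = [ll + math.log(p) for ll, p in zip(log_likelihoods, priors)]
    m = max(logs)                      # subtract the max to avoid underflow
    unnorm = [math.exp(l - m) for l in logs]
    total = sum(unnorm)
    return [u / total for u in unnorm]
```

Recomputing the weights as successive data subsets are added is exactly the kind of "which data identify the best model" analysis the abstract describes: weights concentrating on one model indicate low model-choice uncertainty.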
Evaluation of Tsunami-HySEA for tsunami forecasting at selected locations in the U.S.
NASA Astrophysics Data System (ADS)
Gonzalez Vida, J. M., Sr.; Ortega, S.; Castro, M. J.; de la Asuncion, M.; Arcas, D.
2017-12-01
The GPU-based Tsunami-HySEA model (Macias, J. et al., Pure and Applied Geophysics, 1-37, 2017; Lynett, P. et al., Ocean Modelling, 114, 2017) is used to test four tsunami events: the January 13, 2007 earthquake in the Kuril Islands (Mw 8.1), the September 29, 2009 earthquake in Samoa (Mw 8.3), the February 27, 2010 earthquake in Chile (Mw 8.8) and the March 11, 2011 earthquake in Tohoku (Mw 9.0). Initial conditions were provided by the NOAA Center for Tsunami Research (NCTR), obtained from DART inversion results. All simulations have been performed using a global 4 arc-min grid of the Pacific Ocean and three nested-mesh levels around the selected locations. Wave amplitude time series have been computed at selected tide gauges at each location, and maximum amplitudes were compared with both MOST model results and observations where available. In addition, inundation has been computed at selected U.S. locations for the 2011 Tohoku and 2009 Samoa events under the assumption of a steady mean high water level. Finally, computational time is also evaluated in order to study the operational capabilities of Tsunami-HySEA for this kind of event. Acknowledgements: This work has been funded by the WE133R16SE1418 contract between PMEL (NOAA) and the Universidad de Málaga (Spain).
Using a knowledge-based planning solution to select patients for proton therapy.
Delaney, Alexander R; Dahele, Max; Tol, Jim P; Kuijper, Ingrid T; Slotman, Ben J; Verbakel, Wilko F A R
2017-08-01
Patient selection for proton therapy by comparing proton/photon treatment plans is time-consuming and prone to bias. RapidPlan™, a knowledge-based-planning solution, uses plan libraries to model and predict organ-at-risk (OAR) dose-volume histograms (DVHs). We investigated whether RapidPlan, utilizing an algorithm based only on photon beam characteristics, could generate proton DVH predictions and whether these could correctly identify patients for proton therapy. ModelPROT and ModelPHOT comprised 30 head-and-neck cancer proton and photon plans, respectively. Proton and photon knowledge-based plans (KBPs) were made for ten evaluation patients. DVH-prediction accuracy was analyzed by comparing predicted-vs-achieved mean OAR doses. KBPs and manual plans were compared using salivary gland and swallowing muscle mean doses. For illustration, patients were selected for protons if predicted ModelPHOT mean dose minus predicted ModelPROT mean dose (ΔPrediction) for combined OARs was ≥6 Gy, and benchmarked using achieved KBP doses. The R² between achieved and predicted ModelPROT/ModelPHOT mean dose was 0.95/0.98. Generally, achieved mean dose for ModelPHOT/ModelPROT KBPs was respectively lower/higher than predicted. Comparing ModelPROT/ModelPHOT KBPs with manual plans, salivary and swallowing mean doses increased/decreased by <2 Gy, on average. ΔPrediction ≥6 Gy correctly selected 4 of 5 patients for protons. Knowledge-based DVH predictions can provide efficient, patient-specific selection for protons. A proton-specific RapidPlan solution could improve results. Copyright © 2017 Elsevier B.V. All rights reserved.
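The ΔPrediction selection rule can be expressed in a few lines. This sketch assumes "combined OARs" means averaging the per-OAR predicted mean doses, which is an interpretation for illustration only; the dose values and the 6 Gy threshold follow the abstract's example.

```python
def delta_prediction(photon_pred_means, proton_pred_means):
    """Predicted photon-minus-proton mean OAR dose (Gy), averaged over OARs."""
    photon = sum(photon_pred_means) / len(photon_pred_means)
    proton = sum(proton_pred_means) / len(proton_pred_means)
    return photon - proton

def select_for_protons(photon_pred_means, proton_pred_means, threshold_gy=6.0):
    """Flag a patient for proton therapy when the predicted dose gain is large enough."""
    return delta_prediction(photon_pred_means, proton_pred_means) >= threshold_gy
```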
Chattoraj, Sayantan; Bhugra, Chandan; Li, Zheng Jane; Sun, Changquan Calvin
2014-12-01
The nonisothermal crystallization kinetics of amorphous materials is routinely analyzed by statistically fitting the crystallization data to kinetic models. In this work, we systematically evaluate how the model-dependent crystallization kinetics is impacted by variations in the heating rate and the selection of the kinetic model, two key factors that can lead to significant differences in the crystallization activation energy (Ea) of an amorphous material. Using amorphous felodipine, we show that Ea decreases with increasing heating rate, irrespective of the kinetic model evaluated in this work. The model that best describes the crystallization phenomenon cannot be identified readily through the statistical fitting approach because several kinetic models yield comparable R². Here, we propose an alternate paired model-fitting model-free (PMFMF) approach for identifying the most suitable kinetic model, where Ea obtained from model-dependent kinetics is compared with that obtained from model-free kinetics. The most suitable kinetic model is identified as the one that yields Ea values comparable with the model-free kinetics. Through this PMFMF approach, nucleation and growth is identified as the main mechanism controlling the crystallization kinetics of felodipine. Using this PMFMF approach, we further demonstrate that the crystallization mechanism from the amorphous phase varies with heating rate. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.
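The abstract does not name the specific model-free method used, so as a generic illustration only, the Kissinger method is a common model-free route to Ea for nonisothermal data: ln(β/Tp²) plotted against 1/Tp is linear with slope -Ea/R, where β is the heating rate and Tp the crystallization peak temperature.

```python
import math

R_GAS = 8.314  # gas constant, J/(mol*K)

def kissinger_ea(heating_rates, peak_temps_K):
    """Model-free activation energy (J/mol) from the slope of ln(beta/Tp^2) vs 1/Tp."""
    x = [1.0 / tp for tp in peak_temps_K]
    y = [math.log(b / tp ** 2) for b, tp in zip(heating_rates, peak_temps_K)]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return -slope * R_GAS
```

In a PMFMF-style comparison, an Ea from model fitting that agrees with this model-free value would support the fitted model as the more plausible mechanism.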
NASA Astrophysics Data System (ADS)
Li, Li-Na; Ma, Chang-Ming; Chang, Ming; Zhang, Ren-Cheng
2017-12-01
A novel method based on SIMPLe-to-use Interactive Self-modeling Mixture Analysis (SIMPLISMA) and Kernel Partial Least Squares (KPLS), named SIMPLISMA-KPLS, is proposed in this paper for the simultaneous selection of outlier samples and informative samples. It is a quick algorithm for model standardization (also called model transfer) in near infrared (NIR) spectroscopy. NIR data of corn samples for analysis of protein content are used to evaluate the proposed method. Piecewise direct standardization (PDS) is employed in model transfer, and SIMPLISMA-PDS-KPLS and KS-PDS-KPLS are compared in terms of the prediction accuracy of protein content and the calculation speed of each algorithm. The conclusions are that SIMPLISMA-KPLS can be utilized as an alternative sample selection method for model transfer. Although its accuracy is similar to that of Kennard-Stone (KS), it differs from KS in that it employs concentration information in the selection program. This ensures that analyte information is involved in the analysis and that the spectra (X) of the selected samples are interrelated with the concentration (y). It can also be used for outlier elimination simultaneously through validation of the calibration. According to the statistics of running time, the sample selection process is more rapid when using KPLS. The quick SIMPLISMA-KPLS algorithm is beneficial for improving the speed of online measurement using NIR spectroscopy.
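The Kennard-Stone procedure against which SIMPLISMA-KPLS is benchmarked is a simple maximin algorithm on the spectra (X) alone, with no concentration information, which is exactly the difference the abstract highlights. A minimal sketch (Euclidean distances; real implementations often work on preprocessed or score-space data):

```python
def kennard_stone(X, k):
    """Indices of k samples chosen by the Kennard-Stone (maximin) procedure."""
    def d2(a, b):
        return sum((p - q) ** 2 for p, q in zip(a, b))

    n = len(X)
    # start from the most mutually distant pair of samples
    i0, j0 = max(((i, j) for i in range(n) for j in range(i + 1, n)),
                 key=lambda ij: d2(X[ij[0]], X[ij[1]]))
    selected = [i0, j0]
    while len(selected) < k:
        rest = [i for i in range(n) if i not in selected]
        # add the sample farthest from its nearest already-selected neighbour
        nxt = max(rest, key=lambda i: min(d2(X[i], X[s]) for s in selected))
        selected.append(nxt)
    return selected
```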
The Evaluation of Teachers' Job Performance Based on Total Quality Management (TQM)
ERIC Educational Resources Information Center
Shahmohammadi, Nayereh
2017-01-01
This study aimed to evaluate teachers' job performance based on total quality management (TQM) model. This was a descriptive survey study. The target population consisted of all primary school teachers in Karaj (N = 2917). Using Cochran formula and simple random sampling, 340 participants were selected as sample. A total quality management…
Evaluation of electrical fields inside a biological structure.
Drago, G. P.; Ridella, S.
1982-01-01
A digital computer simulation has been carried out of the exposure of a cell, modelled as a multilayered spherical structure, to an alternating electrical field. Electrical quantities of possible biological interest can be evaluated everywhere inside the cell. A strongly frequency-selective behaviour in the range 0-10 MHz has been obtained. PMID:6279135
Variation simulation for compliant sheet metal assemblies with applications
NASA Astrophysics Data System (ADS)
Long, Yufeng
Sheet metals are widely used in discrete products, such as automobiles, aircraft, furniture and electronics appliances, due to their good manufacturability and low cost. A typical automotive body assembly consists of more than 300 parts welded together in more than 200 assembly fixture stations. Such an assembly system is usually quite complex and takes a long time to develop. As automotive customers demand products of increasing quality in shorter times, engineers in the automotive industry turn to computer-aided engineering (CAE) tools for help. Computers are an invaluable resource for engineers, not only to simplify and automate the design process, but also to share design specifications with manufacturing groups so that production systems can be tooled up quickly and efficiently. Therefore, it is beneficial to develop computerized simulation and evaluation tools for the development of automotive body assembly systems. It is a well-known fact that assembly architectures (joints, fixtures, and assembly lines) have a profound impact on the dimensional quality of compliant sheet metal assemblies. To evaluate sheet metal assembly architectures, a special dimensional analysis tool needs to be developed for predicting dimensional variation of the assembly. Then, the corresponding systematic tools can be established to help engineers select the assembly architectures. In this dissertation, a unified variation model is developed to predict variation in compliant sheet metal assemblies by considering fixture-induced rigid-body motion, deformation and springback. Based on the unified variation model, variation propagation models in multiple assembly stations with various configurations are established. To evaluate the dimensional capability of assembly architectures, quantitative indices are proposed based on the sensitivity matrix, which are independent of the variation level of the process.
Examples are given to demonstrate their applications in selecting robust assembly architectures, and some useful guidelines for the selection of assembly architectures are summarized. In addition, to enhance fault diagnosis, a systematic methodology is proposed for the selection of measurement configurations. Specifically, principles involved in selecting measurements are generalized first; then, the corresponding quantitative indices are developed to evaluate the measurement configurations; and finally, examples are presented.
The Error Prone Model and the Basic Grants Validation Selection System. Draft Final Report.
ERIC Educational Resources Information Center
System Development Corp., Falls Church, VA.
An evaluation of existing and proposed mechanisms to ensure data accuracy for the Pell Grant program is reported, and recommendations for efficient detection of fraud and error in the program are offered. One study objective was to examine the existing system of pre-established criteria (PEC), which are validation criteria that select students on…
Ion exchange of H+, Na+, Mg2+, Ca2+, Mn2+, and Ba2+ on wood pulp
Alan W. Rudie; Alan Ball; Narendra Patel
2006-01-01
Ion exchange selectivity coefficients were measured for the partition of metals between solution and pulp fibers. The method accurately models the ion exchange isotherms for all cation pairs evaluated, up to approximately 0.05 molar concentration. Selectivity coefficients were determined for calcium and magnesium with each other and with hydrogen....
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-22
.... This model will enable the Sub-Adviser to evaluate, rank, and select the appropriate mix of investments... becoming, risky. The Sub-Adviser will use a quantitative metric to rank and select the appropriate mix of... imposes a duty of due diligence on its Equity Trading Permit Holders to learn the essential facts relating...
Content Validity of Temporal Bone Models Printed Via Inexpensive Methods and Materials.
Bone, T Michael; Mowry, Sarah E
2016-09-01
Computed tomographic (CT) scans of the 3-D printed temporal bone models will be within 15% accuracy of the CT scans of the cadaveric temporal bones. Previous studies have evaluated the face validity of 3-D-printed temporal bone models designed to train otolaryngology residents. The purpose of the study was to determine the content validity of temporal bone models printed using inexpensive printers and materials. Four cadaveric temporal bones were randomly selected and clinical temporal bone CT scans were obtained. Models were generated using previously described methods in acrylonitrile butadiene styrene (ABS) plastic using the Makerbot Replicator 2× and Hyrel printers. Models were radiographically scanned using the same protocol as the cadaveric bones. Four images from each cadaveric CT series and four corresponding images from the model CT series were selected, and voxel values were normalized to black or white. Scan slices were compared using PixelDiff software. Gross anatomic structures were evaluated in the model scans by four board certified otolaryngologists on a 4-point scale. Mean pixel difference between the cadaver and model scans was 14.25 ± 2.30% at the four selected CT slices. Mean cortical bone width difference and mean external auditory canal width difference were 0.58 ± 0.66 mm and 0.55 ± 0.46 mm, respectively. Expert raters felt the mastoid air cells were well represented (2.5 ± 0.5), while middle ear and otic capsule structures were not accurately rendered (all averaged <1.8). These results suggest that these models would be sufficient adjuncts to cadaver temporal bones for training residents in cortical mastoidectomies, but less effective for middle ear procedures.
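A PixelDiff-style comparison of the normalized CT slices can be sketched as follows. PixelDiff itself is a third-party tool whose internals are not described in the abstract, so the binarization threshold and the percent-difference definition here are assumptions for illustration; images are plain nested lists of grayscale values.

```python
def binarize(img, threshold=128):
    """Normalize grayscale pixel values to 0 (black) or 1 (white)."""
    return [[1 if px >= threshold else 0 for px in row] for row in img]

def mean_pixel_difference(img_a, img_b, threshold=128):
    """Percent of pixels that differ between two images after binarization."""
    a, b = binarize(img_a, threshold), binarize(img_b, threshold)
    total = sum(len(row) for row in a)
    diff = sum(pa != pb
               for ra, rb in zip(a, b)
               for pa, pb in zip(ra, rb))
    return 100.0 * diff / total
```

On this scale, the study's mean difference of 14.25% between cadaver and model scans would correspond to roughly one in seven binarized pixels disagreeing.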
Cheng, Tiejun; Li, Qingliang; Wang, Yanli; Bryant, Stephen H
2011-02-28
Aqueous solubility is recognized as a critical parameter in both early- and late-stage drug discovery. Therefore, in silico modeling of solubility has attracted extensive interest in recent years. Most previous studies have been limited to relatively small data sets with limited diversity, which in turn limits the predictability of the derived models. In this work, we present a support vector machine model for the binary classification of solubility, taking advantage of the largest known public data set, which contains over 46 000 compounds with experimental solubility. Our model was optimized in combination with a reduction-and-recombination feature selection strategy. The best model demonstrated robust performance in both cross-validation and prediction of two independent test sets, indicating it could be a practical tool to select soluble compounds for screening, purchasing, and synthesizing. Moreover, our work may serve as a basis for the comparative evaluation of solubility classification studies, owing to its use of completely public resources.
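The paper's classifier was an optimized SVM trained on tens of thousands of compounds; as a self-contained sketch of the underlying idea, the margin perceptron below (an unregularized cousin of the linear SVM) separates a toy set of two-descriptor "compounds" with made-up values.

```python
def train_linear_classifier(X, y, lr=0.1, max_epochs=1000):
    """Margin-perceptron sketch: update on any point whose functional
    margin y*(w.x + b) is below 1, until every point clears the margin."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(max_epochs):
        updated = False
        for xi, yi in zip(X, y):
            if yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b) < 1:
                w = [wj + lr * yi * xj for wj, xj in zip(w, xi)]
                b += lr * yi
                updated = True
        if not updated:  # all points satisfy the margin: converged
            break
    return w, b

def predict(w, b, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1

# Hypothetical scaled descriptors; +1 = soluble, -1 = insoluble
X = [[0.0, 0.2], [0.1, 0.0], [1.0, 0.9], [0.9, 1.1]]
y = [1, 1, -1, -1]
w, b = train_linear_classifier(X, y)
print([predict(w, b, x) for x in X])  # [1, 1, -1, -1]
```

A production SVM would add regularization and kernels; the margin condition and the sign-based prediction are the parts shared with this sketch.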
Application of Support Vector Machine to Forex Monitoring
NASA Astrophysics Data System (ADS)
Kamruzzaman, Joarder; Sarker, Ruhul A.
Previous studies have demonstrated superior performance of artificial neural network (ANN) based forex forecasting models over traditional regression models. This paper applies support vector machines to build a forecasting model from the historical data using six simple technical indicators and presents a comparison with an ANN based model trained by scaled conjugate gradient (SCG) learning algorithm. The models are evaluated and compared on the basis of five commonly used performance metrics that measure closeness of prediction as well as correctness in directional change. Forecasting results of six different currencies against Australian dollar reveal superior performance of SVM model using simple linear kernel over ANN-SCG model in terms of all the evaluation metrics. The effect of SVM parameter selection on prediction performance is also investigated and analyzed.
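The five performance metrics are not named in the abstract, but metrics of the two kinds it describes (closeness of prediction, and correctness in directional change) can be sketched as follows; the series values are invented.

```python
import math

def forecast_metrics(actual, predicted):
    """Closeness (RMSE, MAE) and directional symmetry (DS, in %) for a
    forecast series, the kind of measures used to compare forex models."""
    n = len(actual)
    rmse = math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n)
    mae = sum(abs(a - p) for a, p in zip(actual, predicted)) / n
    # directional symmetry: did the forecast move the same way as the series?
    hits = sum(
        1 for i in range(1, n)
        if (actual[i] - actual[i - 1]) * (predicted[i] - predicted[i - 1]) > 0
    )
    ds = 100.0 * hits / (n - 1)
    return rmse, mae, ds

actual = [1.00, 1.02, 1.01, 1.05]     # e.g. AUD exchange rates (invented)
predicted = [1.00, 1.03, 1.00, 1.04]
rmse, mae, ds = forecast_metrics(actual, predicted)
print(round(rmse, 4), round(mae, 4), round(ds, 1))  # 0.0087 0.0075 100.0
```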
Measuring cognition in teams: a cross-domain review.
Wildman, Jessica L; Salas, Eduardo; Scott, Charles P R
2014-08-01
The purpose of this article is twofold: to provide a critical cross-domain evaluation of team cognition measurement options and to provide novice researchers with practical guidance when selecting a measurement method. A vast selection of measurement approaches exist for measuring team cognition constructs including team mental models, transactive memory systems, team situation awareness, strategic consensus, and cognitive processes. Empirical studies and theoretical articles were reviewed to identify all of the existing approaches for measuring team cognition. These approaches were evaluated based on theoretical perspective assumed, constructs studied, resources required, level of obtrusiveness, internal consistency reliability, and predictive validity. The evaluations suggest that all existing methods are viable options from the point of view of reliability and validity, and that there are potential opportunities for cross-domain use. For example, methods traditionally used only to measure mental models may be useful for examining transactive memory and situation awareness. The selection of team cognition measures requires researchers to answer several key questions regarding the theoretical nature of team cognition and the practical feasibility of each method. We provide novice researchers with guidance regarding how to begin the search for a team cognition measure and suggest several new ideas regarding future measurement research. We provide (1) a broad overview and evaluation of existing team cognition measurement methods, (2) suggestions for new uses of those methods across research domains, and (3) critical guidance for novice researchers looking to measure team cognition.
Kowalski, K G; Olson, S; Remmers, A E; Hutmacher, M M
2008-06-01
Pharmacokinetic/pharmacodynamic (PK/PD) models were developed and clinical trial simulations were conducted to recommend a study design to test the hypothesis that a dose of SC-75416, a selective cyclooxygenase-2 inhibitor, can be identified that achieves superior pain relief (PR) compared to 400 mg ibuprofen in a post-oral surgery pain model. PK/PD models were developed for SC-75416, rofecoxib, valdecoxib, and ibuprofen relating plasma concentrations to PR scores using a nonlinear logistic-normal model. Clinical trial simulations conducted using these models suggested that 360 mg SC-75416 could achieve superior PR compared to 400 mg ibuprofen. A placebo- and positive-controlled parallel-group post-oral surgery pain study was conducted evaluating placebo, 60, 180, and 360 mg SC-75416 oral solution, and 400 mg ibuprofen. The study results confirmed the hypothesis that 360 mg SC-75416 achieved superior PR relative to 400 mg ibuprofen (ΔTOTPAR6 = 3.3, P < 0.05) and demonstrated the predictive performance of the PK/PD models.
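The fitted logistic-normal model itself is not given in the abstract; a minimal sketch of the general idea, an Emax concentration-response term on the logit scale driving the probability of pain relief, looks like this. All parameter values are purely illustrative, not SC-75416 estimates.

```python
import math

def pr_probability(conc, e0=-1.5, emax=3.0, ec50=2.0):
    """Hypothetical logistic concentration-response sketch: probability of
    a pain-relief response at a given plasma concentration. e0, emax and
    ec50 are invented values chosen only to show the model shape."""
    logit = e0 + emax * conc / (ec50 + conc)  # Emax model on the logit scale
    return 1.0 / (1.0 + math.exp(-logit))

# Response probability rises with concentration and saturates
for c in (0.0, 2.0, 20.0):
    print(round(pr_probability(c), 3))  # 0.182, 0.5, 0.773
```

A logistic-normal model additionally places a random effect on the logit to capture between-subject variability; that layer is omitted here.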
Evaluation of Anomaly Detection Capability for Ground-Based Pre-Launch Shuttle Operations. Chapter 8
NASA Technical Reports Server (NTRS)
Martin, Rodney Alexander
2010-01-01
This chapter will provide a thorough end-to-end description of the process for evaluating three different data-driven algorithms for anomaly detection, in order to select the best candidate for deployment as part of a suite of IVHM (Integrated Vehicle Health Management) technologies. These algorithms were deemed sufficiently mature to be considered viable candidates for deployment in support of the maiden launch of Ares I-X, the successor to the Space Shuttle for NASA's Constellation program. Data-driven algorithms are just one of three different types being deployed. The other two types are a "rule-based" expert system and a "model-based" system. Within these two categories, the deployable candidates have already been selected based upon qualitative factors such as flight heritage. For the rule-based system, SHINE (Spacecraft High-speed Inference Engine) has been selected for deployment; it is a component of BEAM (Beacon-based Exception Analysis for Multimissions), a patented technology developed at NASA's JPL (Jet Propulsion Laboratory), and serves to aid in the management and identification of operational modes. For the "model-based" system, a commercially available package developed by QSI (Qualtech Systems, Inc.), TEAMS (Testability Engineering and Maintenance System), has been selected for deployment to aid in diagnosis. In the context of this particular deployment, distinctions among the use of the terms "data-driven," "rule-based," and "model-based" can be found in. Although there are three different categories of algorithms that have been selected for deployment, our main focus in this chapter will be on the evaluation of three candidates for data-driven anomaly detection. These algorithms will be evaluated on their capability for robustly detecting incipient faults or failures in the ground-based phase of pre-launch space shuttle operations, rather than based on heritage as performed in previous studies.
Robust detection will allow for the achievement of pre-specified minimum false alarm and/or missed detection rates in the selection of alert thresholds. All algorithms will also be optimized with respect to an aggregation of these same criteria. Our study relies upon the use of Shuttle data to act as a proxy for, and in preparation for application to, Ares I-X data, which uses a very similar hardware platform for the subsystems being targeted (TVC - Thrust Vector Control subsystem for the SRB (Solid Rocket Booster)).
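Threshold selection against a pre-specified false alarm rate, as described above, amounts to taking an empirical quantile of the anomaly scores observed under nominal operation. A minimal sketch (the scores are invented):

```python
def alert_threshold(nominal_scores, max_false_alarm_rate):
    """Pick an alert threshold so that at most `max_false_alarm_rate` of
    the nominal (fault-free) samples would trigger an alert: an
    empirical-quantile sketch of the threshold-selection step."""
    ranked = sorted(nominal_scores)
    # keep the (1 - rate) fraction of nominal scores below the threshold
    idx = min(len(ranked) - 1, int((1.0 - max_false_alarm_rate) * len(ranked)))
    return ranked[idx]

# Hypothetical anomaly scores from nominal pre-launch operation
scores = [0.1, 0.2, 0.15, 0.05, 0.3, 0.25, 0.12, 0.18, 0.22, 0.9]
thr = alert_threshold(scores, 0.10)
print(thr)  # 0.9: only the top 10% of nominal scores would alert
false_alarms = sum(s >= thr for s in scores) / len(scores)
print(false_alarms)  # 0.1
```

Trading this rate off against the missed detection rate (measured on data with seeded faults) gives the aggregated criterion the chapter optimizes.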
Goodarzi, Mohammad; Jensen, Richard; Vander Heyden, Yvan
2012-12-01
A Quantitative Structure-Retention Relationship (QSRR) is proposed to estimate the chromatographic retention of 83 diverse drugs on a Unisphere polybutadiene (PBD) column, using isocratic elutions at pH 11.7. Previous work has generated QSRR models for these drugs using Classification And Regression Trees (CART). In this work, Ant Colony Optimization (ACO) is used as a feature selection method to find the best molecular descriptors from a large pool. In addition, several other selection methods have been applied, such as Genetic Algorithms, Stepwise Regression and the Relief method, not only to evaluate Ant Colony Optimization as a feature selection method but also to investigate its ability to find the important descriptors in QSRR. Multiple Linear Regression (MLR) and Support Vector Machines (SVMs) were applied as linear and nonlinear regression methods, respectively, giving excellent correlation between the experimental logarithms of the retention factors of the drugs (log k(w)), i.e. extrapolated to a mobile phase consisting of pure water, and the predicted values. The overall best model was the SVM built using descriptors selected by ACO. Copyright © 2012 Elsevier B.V. All rights reserved.
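A minimal sketch of ACO-style descriptor selection: each feature carries a pheromone level that biases its inclusion, and subsets that score well reinforce the pheromone of their members. The fitness function and the "informative" features here are toys, not the paper's descriptors or retention data.

```python
import random

def aco_feature_selection(n_features, score_subset, n_ants=10, n_iter=30,
                          evaporation=0.2, seed=1):
    """Toy Ant Colony Optimization for feature selection. `score_subset`
    is any fitness function of a tuple of feature indices."""
    random.seed(seed)
    pheromone = [1.0] * n_features
    best_subset, best_score = None, float("-inf")
    for _ in range(n_iter):
        for _ant in range(n_ants):
            # each ant includes feature i with probability rising in pheromone
            subset = tuple(i for i in range(n_features)
                           if random.random() < pheromone[i] / (pheromone[i] + 1.0))
            if not subset:
                continue
            score = score_subset(subset)
            if score > best_score:
                best_subset, best_score = subset, score
            for i in subset:  # deposit pheromone in proportion to fitness
                pheromone[i] += max(score, 0.0)
        pheromone = [(1 - evaporation) * p for p in pheromone]
    return best_subset, best_score

# Toy fitness: features 0 and 2 are "informative"; extra features cost 0.3
def fitness(subset):
    return sum(1.0 for i in subset if i in (0, 2)) - 0.3 * len(subset)

best, score = aco_feature_selection(6, fitness)
print(sorted(best))
```

Genetic Algorithms or Stepwise Regression plug into the same interface by swapping the subset-generation strategy while keeping the fitness function.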
ERIC Educational Resources Information Center
Public Impact, 2012
2012-01-01
This toolkit is a companion to the school models provided on OpportunityCulture.org. The school models use job redesign and technology to extend the reach of excellent teachers to more students, for more pay, within budget. Most of these school models create new roles and collaborative teams, enabling all teachers and staff to develop and…
A System for Integrated Reliability and Safety Analyses
NASA Technical Reports Server (NTRS)
Kostiuk, Peter; Shapiro, Gerald; Hanson, Dave; Kolitz, Stephan; Leong, Frank; Rosch, Gene; Coumeri, Marc; Scheidler, Peter, Jr.; Bonesteel, Charles
1999-01-01
We present an integrated reliability and aviation safety analysis tool. The reliability models for selected infrastructure components of the air traffic control system are described. The results of this model are used to evaluate the likelihood of seeing outcomes predicted by simulations with failures injected. We discuss the design of the simulation model, and the user interface to the integrated toolset.
A model to predict stream water temperature across the conterminous USA
Catalina Segura; Peter Caldwell; Ge Sun; Steve McNulty; Yang Zhang
2014-01-01
Stream water temperature (ts) is a critical water quality parameter for aquatic ecosystems. However, ts records are sparse or nonexistent in many river systems. In this work, we present an empirical model to predict ts at the site scale across the USA. The model, derived using data from 171 reference sites selected from the Geospatial Attributes of Gages for Evaluating...
ERIC Educational Resources Information Center
Stallings, Jane
The purpose of the Follow Through Classroom Observation Evaluation was to assess the implementation of seven Follow Through sponsor models included in the study and to examine the relationships between classroom instructional processes and child outcomes. The seven programs selected for study include two behavioristic models, an open school model…
Burkholder, Timothy P; Cunningham, Brian E; Clayton, Joshua R; Lander, Peter A; Brown, Matthew L; Doti, Robert A; Durst, Gregory L; Montrose-Rafizadeh, Chahrzad; King, Constance; Osborne, Harold E; Amos, Robert M; Zink, Richard W; Stramm, Lawrence E; Burris, Thomas P; Cardona, Guemalli; Konkol, Debra L; Reidy, Charles; Christe, Michael E; Genin, Michael J
2015-04-01
The design, synthesis, and structure-activity relationships for a novel series of indoles as potent, selective thyroid hormone receptor β (TRβ) agonists are described. Compounds with >50× binding selectivity for TRβ over TRα were generated, and evaluation of compound 1c from this series in a model of dyslipidemia demonstrated positive effects on plasma lipid endpoints in vivo. Copyright © 2015 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Wang, Li; Li, Chuanghong
2018-02-01
As a sustainable form of construction, green building is increasingly advocated and has attracted widespread attention in society. In the survey and design phase of a construction project, evaluating and selecting the green building design scheme against a scientific and reasonable evaluation index system can largely and effectively improve the ecological benefits of green building projects. Based on the new Green Building Evaluation Standard, which came into effect on January 1, 2015, an evaluation index system for green building design schemes is constructed, taking into account the evaluation contents related to the design scheme. Experts experienced in construction scheme optimization were organized to score and determine the weight of each evaluation index through the AHP method. The correlation degree between each candidate scheme and the ideal scheme was then calculated using a multilevel gray relational analysis model, and the optimal scheme was determined. The feasibility and practicability of the evaluation method are verified with examples.
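The scheme-ranking step (gray relational degrees of each design scheme against the ideal scheme, weighted by AHP-derived index weights) can be sketched as below. The weights and scheme scores are illustrative, not the paper's survey data.

```python
def gray_relational_degrees(schemes, weights, rho=0.5):
    """Gray relational analysis sketch: each scheme is a list of normalized
    index scores (higher is better), `weights` are AHP-derived index
    weights. The ideal scheme takes the best value of every index; the
    scheme with the highest weighted relational degree is preferred."""
    n = len(schemes[0])
    ideal = [max(s[j] for s in schemes) for j in range(n)]
    # absolute deviations from the ideal scheme for every index
    dev = [[abs(s[j] - ideal[j]) for j in range(n)] for s in schemes]
    dmin = min(min(row) for row in dev)
    dmax = max(max(row) for row in dev)

    def coeff(d):  # gray relational coefficient, rho = resolution factor
        return (dmin + rho * dmax) / (d + rho * dmax)

    return [sum(w * coeff(d) for w, d in zip(weights, row)) for row in dev]

weights = [0.5, 0.3, 0.2]            # illustrative AHP weights, sum to 1
schemes = [[0.9, 0.6, 0.8],          # invented normalized index scores
           [0.7, 0.9, 0.7],
           [0.8, 0.8, 0.9]]
degrees = gray_relational_degrees(schemes, weights)
best = degrees.index(max(degrees))
print(best)  # 0
```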
A study of undue pain and surfing: using hierarchical criteria to assess website quality.
Lorence, Daniel; Abraham, Joanna
2008-09-01
In studies of web-based consumer health information, scant attention has been paid to the selective development of differential methodologies for website quality evaluation, or to selective grouping and analysis of specific 'domains of uncertainty' in healthcare. Our objective is to introduce a more refined model for website evaluation, and illustrate its application using assessment of websites within an area of ongoing medical uncertainty, back pain. In this exploratory technology assessment, we suggest a model for assessing these 'domains of uncertainty' within healthcare, using qualitative assessment of websites and hierarchical concepts. Using such a hierarchy of quality criteria, we review medical information provided by the most frequently accessed websites related to back pain. Websites are evaluated using standardized criteria, with results rated from the viewpoint of the consumer. Results show that standardization of quality rating across subjective content, and between commercial and niche search results, can provide a consumer-friendly dimension to health information.
Review of the socket design and interface pressure measurement for transtibial prosthesis.
Pirouzi, Gh; Abu Osman, N A; Eshraghi, A; Ali, S; Gholizadeh, H; Wan Abas, W A B
2014-01-01
The socket is an important part of every prosthetic limb, serving as the interface between the residual limb and the prosthetic components. The biomechanics of the socket-residual limb interface, especially the pressure and force distribution, affect patient satisfaction and function. This paper aimed to review and evaluate studies conducted in recent decades on socket design, in-socket interface pressure measurement, and socket biomechanics. The literature was searched for keywords related to transtibial amputation, socket-residual limb interface, socket measurement, socket design, modeling, computational modeling, and suspension systems. In accordance with the selection criteria, 19 articles were selected for further analysis. It was revealed that pressure and stress have been studied in recent decades, but quantitative evaluations remain inapplicable in clinical settings. This study also illustrates prevailing systems, which may facilitate improvements in socket design for improved quality of life for individuals ambulating with a transtibial prosthesis. It is hoped that the review will facilitate understanding and help determine the clinical relevance of quantitative evaluations.
Numerical Model Studies of the Martian Mesoscale Circulations
NASA Technical Reports Server (NTRS)
Segal, Moti; Arritt, Raymond W.
1997-01-01
The study objectives were to evaluate by numerical modeling various possible mesoscale circulations on Mars and related atmospheric boundary layer processes. The study was conducted in collaboration with J. Tillman of the University of Washington (who supported the study observationally). Interaction has been made with J. Prusa of Iowa State University in a numerical modeling investigation of the dynamical effects of topographically-influenced flow. Modeling simulations included evaluations of the effects of surface physical characteristics on (i) the Martian atmospheric boundary layer and (ii) thermally and dynamically forced mesoscale flows. Special model evaluations were made in support of the selection of the Pathfinder landing sites. J. Tillman's finding of a VL-2 inter-annual temperature difference was followed by model simulations attempting to identify the forcing for this feature. Publication of the results in the reviewed literature is pending completion of the manuscripts in preparation, as indicated later.
Using logic models in a community-based agricultural injury prevention project.
Helitzer, Deborah; Willging, Cathleen; Hathorn, Gary; Benally, Jeannie
2009-01-01
The National Institute for Occupational Safety and Health has long promoted the logic model as a useful tool in an evaluator's portfolio. Because a logic model supports a systematic approach to designing interventions, it is equally useful for program planners. Undertaken with community stakeholders, a logic model process articulates the underlying foundations of a particular programmatic effort and enhances program design and evaluation. Most often presented as sequenced diagrams or flow charts, logic models demonstrate relationships among the following components: statement of a problem, various causal and mitigating factors related to that problem, available resources to address the problem, theoretical foundations of the selected intervention, intervention goals and planned activities, and anticipated short- and long-term outcomes. This article describes a case example of how a logic model process was used to help community stakeholders on the Navajo Nation conceive, design, implement, and evaluate agricultural injury prevention projects.
Sexually antagonistic polymorphism in simultaneous hermaphrodites
Jordan, Crispin Y.; Connallon, Tim
2015-01-01
In hermaphrodites, pleiotropic genetic tradeoffs between female and male reproductive functions can lead to sexually antagonistic (SA) selection, where individual alleles have conflicting fitness effects on each sex function. While an extensive theory of SA selection exists for dioecious species, these results have not been generalized to hermaphrodites. We develop population genetic models of SA selection in simultaneous hermaphrodites, and evaluate effects of dominance, selection on each sex function, self-fertilization, and population size, on the maintenance of polymorphism. Under obligate outcrossing, hermaphrodite model predictions converge exactly with those of dioecious populations. Self-fertilization in hermaphrodites generates three points of divergence with dioecious theory. First, opportunities for stable polymorphism decline sharply and become less sensitive to dominance with increased selfing. Second, selfing introduces an asymmetry in the relative importance of selection through male versus female reproductive functions, expands the parameter space favorable for the evolutionary invasion of female-beneficial alleles, and restricts invasion criteria for male-beneficial alleles. Finally, contrary to models of unconditionally beneficial alleles, selfing decreases genetic hitchhiking effects of invading SA alleles, and should therefore decrease these population genetic signals of SA polymorphisms. We discuss implications of SA selection in hermaphrodites, including its potential role in the evolution of “selfing syndromes”. PMID:25311368
Aesthetic evolution by mate choice: Darwin's really dangerous idea
Prum, Richard O.
2012-01-01
Darwin proposed an explicitly aesthetic theory of sexual selection in which he described mate preferences as a ‘taste for the beautiful’, an ‘aesthetic capacity’, etc. These statements were not merely colourful Victorian mannerisms, but explicit expressions of Darwin's hypothesis that mate preferences can evolve for arbitrarily attractive traits that do not provide any additional benefits to mate choice. In his critique of Darwin, A. R. Wallace proposed an entirely modern mechanism of mate preference evolution through the correlation of display traits with male vigour or viability, but he called this mechanism natural selection. Wallace's honest advertisement proposal was stridently anti-Darwinian and anti-aesthetic. Most modern sexual selection research relies on essentially the same Neo-Wallacean theory renamed as sexual selection. I define the process of aesthetic evolution as the evolution of a communication signal through sensory/cognitive evaluation, which is most elaborated through coevolution of the signal and its evaluation. Sensory evaluation includes the possibility that display traits do not encode information that is being assessed, but are merely preferred. A genuinely Darwinian, aesthetic theory of sexual selection requires the incorporation of the Lande–Kirkpatrick null model into sexual selection research, but also encompasses the possibility of sensory bias, good genes and direct benefits mechanisms. PMID:22777014
NASA Astrophysics Data System (ADS)
Zhang, Zhifen; Chen, Huabin; Xu, Yanling; Zhong, Jiyong; Lv, Na; Chen, Shanben
2015-08-01
Multisensory data fusion-based online welding quality monitoring has gained increasing attention in intelligent welding processes. This paper mainly focuses on the automatic detection of typical welding defects for Al alloy in gas tungsten arc welding (GTAW) by means of analyzing the arc spectrum, sound, and voltage signals. Based on the developed algorithms in the time and frequency domains, 41 feature parameters were successively extracted from these signals to characterize the welding process and seam quality. Then, the proposed feature selection approach, i.e., a hybrid Fisher-based filter and wrapper, was successfully utilized to evaluate the sensitivity of each feature and reduce the feature dimensions. Finally, the optimal feature subset with 19 features was selected to obtain the highest accuracy, i.e., 94.72%, using the established classification model. This study provides a guideline for feature extraction, selection, and dynamic modeling based on heterogeneous multisensory data to achieve a reliable online defect detection system in arc welding.
Aguirre-Gutiérrez, Jesús; Carvalheiro, Luísa G; Polce, Chiara; van Loon, E Emiel; Raes, Niels; Reemer, Menno; Biesmeijer, Jacobus C
2013-01-01
Understanding species distributions and the factors limiting them is an important topic in ecology and conservation, including in nature reserve selection and predicting climate change impacts. While Species Distribution Models (SDM) are the main tool used for these purposes, choosing the best SDM algorithm is not straightforward as these are plentiful and can be applied in many different ways. SDM are used mainly to gain insight into 1) overall species distributions, 2) their past-present-future probability of occurrence and/or 3) to understand their ecological niche limits (also referred to as ecological niche modelling). The fact that these three aims may require different models and outputs is, however, rarely considered and has not been evaluated consistently. Here we use data from a systematically sampled set of species occurrences to specifically test the performance of Species Distribution Models across several commonly used algorithms. Species range in distribution patterns from rare to common and from local to widespread. We compare overall model fit (representing species distribution), the accuracy of the predictions at multiple spatial scales, and the consistency in selection of environmental correlations all across multiple modelling runs. As expected, the choice of modelling algorithm determines model outcome. However, model quality depends not only on the algorithm, but also on the measure of model fit used and the scale at which it is used. Although model fit was higher for the consensus approach and Maxent, Maxent and GAM models were more consistent in estimating local occurrence, while RF and GBM showed higher consistency in environmental variables selection. Model outcomes diverged more for narrowly distributed species than for widespread species.
We suggest that matching study aims with modelling approach is essential in Species Distribution Models, and provide suggestions how to do this for different modelling aims and species' data characteristics (i.e. sample size, spatial distribution).
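Occurrence-level model fit in SDM studies is commonly summarized by the area under the ROC curve (AUC), which this sketch computes via the rank identity: the probability that a random presence site receives a higher suitability score than a random absence site. The labels and scores are invented.

```python
def auc(labels, scores):
    """AUC via the Mann-Whitney rank identity: fraction of
    presence/absence pairs ranked correctly (ties count half)."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# presence (1) / absence (0) records with predicted habitat suitability
labels = [1, 1, 1, 0, 0, 0, 0]
scores = [0.9, 0.7, 0.4, 0.5, 0.3, 0.2, 0.1]
print(round(auc(labels, scores), 3))  # 0.917
```

Because AUC is threshold-free, comparing it across algorithms sidesteps the choice of a presence/absence cutoff, one reason it is the default fit measure in SDM comparisons.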
NASA Astrophysics Data System (ADS)
Kostrzewa, Daniel; Josiński, Henryk
2016-06-01
The expanded Invasive Weed Optimization algorithm (exIWO) is an optimization metaheuristic modelled on the original IWO version inspired by dynamic growth of weeds colony. The authors of the present paper have modified the exIWO algorithm introducing a set of both deterministic and non-deterministic strategies of individuals' selection. The goal of the project was to evaluate the modified exIWO by testing its usefulness for multidimensional numerical functions optimization. The optimized functions: Griewank, Rastrigin, and Rosenbrock are frequently used as benchmarks because of their characteristics.
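A minimal sketch of the basic IWO loop applied to the Rastrigin benchmark named above: fitter weeds scatter more seeds, the seed-dispersion radius shrinks over the run, and competitive exclusion truncates the colony. Parameter values are illustrative, and the exIWO selection strategies discussed in the paper are not reproduced here.

```python
import math
import random

def rastrigin(x):
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

def iwo_minimize(f, dim=2, pop=8, max_pop=20, iters=60,
                 seed_max=5, seed_min=1, sigma_init=1.0, sigma_final=0.01,
                 bounds=(-5.12, 5.12), seed=3):
    """Basic Invasive Weed Optimization sketch (minimization)."""
    random.seed(seed)
    lo, hi = bounds
    colony = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop)]
    for t in range(iters):
        # dispersion radius decays nonlinearly over the run
        sigma = sigma_final + (sigma_init - sigma_final) * (1 - t / iters) ** 2
        fits = [f(w) for w in colony]
        worst, best = max(fits), min(fits)
        offspring = []
        for w, fit in zip(colony, fits):
            # fitter weeds (lower f) produce more seeds
            ratio = (worst - fit) / (worst - best) if worst > best else 1.0
            n_seeds = int(seed_min + (seed_max - seed_min) * ratio)
            for _ in range(n_seeds):
                child = [min(hi, max(lo, xi + random.gauss(0, sigma))) for xi in w]
                offspring.append(child)
        colony += offspring
        colony.sort(key=f)            # competitive exclusion
        colony = colony[:max_pop]
    return colony[0], f(colony[0])

best_x, best_f = iwo_minimize(rastrigin)
print(round(best_f, 3))
```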
Improved numerical methods for turbulent viscous flows aerothermal modeling program, phase 2
NASA Technical Reports Server (NTRS)
Karki, K. C.; Patankar, S. V.; Runchal, A. K.; Mongia, H. C.
1988-01-01
The details of a study to develop accurate and efficient numerical schemes to predict complex flows are described. In this program, several discretization schemes were evaluated using simple test cases. This assessment led to the selection of three schemes for an in-depth evaluation based on two-dimensional flows. The scheme with the superior overall performance was incorporated in a computer program for three-dimensional flows. To improve the computational efficiency, the selected discretization scheme was combined with a direct solution approach in which the fluid flow equations are solved simultaneously rather than sequentially.
Phosphorus component in AnnAGNPS
Yuan, Y.; Bingner, R.L.; Theurer, F.D.; Rebich, R.A.; Moore, P.A.
2005-01-01
The USDA Annualized Agricultural Non-Point Source Pollution model (AnnAGNPS) has been developed to aid in evaluation of watershed response to agricultural management practices. Previous studies have demonstrated the capability of the model to simulate runoff and sediment, but not phosphorus (P). The main purpose of this article is to evaluate the performance of AnnAGNPS on P simulation using comparisons with measurements from the Deep Hollow watershed of the Mississippi Delta Management Systems Evaluation Area (MDMSEA) project. A sensitivity analysis was performed to identify input parameters whose impact is the greatest on P yields. Sensitivity analysis results indicate that the most sensitive variables of those selected are initial soil P contents, P application rate, and plant P uptake. AnnAGNPS simulations of dissolved P yield do not agree well with observed dissolved P yield (Nash-Sutcliffe coefficient of efficiency of 0.34, R2 of 0.51, and slope of 0.24); however, AnnAGNPS simulations of total P yield agree well with observed total P yield (Nash-Sutcliffe coefficient of efficiency of 0.85, R2 of 0.88, and slope of 0.83). The difference in dissolved P yield may be attributed to limitations in model simulation of P processes. Uncertainties in input parameter selections also affect the model's performance.
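A one-at-a-time sensitivity screen of the kind described, perturbing each input about its baseline and ranking inputs by the relative output response, can be sketched as below; the toy "P yield" model and its coefficients are hypothetical, not AnnAGNPS equations.

```python
def one_at_a_time_sensitivity(model, baseline, delta=0.10):
    """Local one-at-a-time sensitivity sketch: perturb each input by
    +/- 10% around the baseline and report the elasticity of the model
    output, the kind of screening used to rank influential inputs."""
    base_out = model(baseline)
    sens = {}
    for name, value in baseline.items():
        up = dict(baseline, **{name: value * (1 + delta)})
        down = dict(baseline, **{name: value * (1 - delta)})
        # relative output change per relative input change
        sens[name] = (model(up) - model(down)) / (2 * delta * base_out)
    return sens

# Hypothetical toy "P yield" model: linear in initial soil P and
# application rate, reduced by plant uptake; coefficients are invented.
def p_yield(p):
    return 0.8 * p["soil_P"] + 0.5 * p["app_rate"] - 0.3 * p["uptake"]

baseline = {"soil_P": 40.0, "app_rate": 20.0, "uptake": 10.0}
print(one_at_a_time_sensitivity(p_yield, baseline))
```

In this toy case initial soil P dominates, mirroring the qualitative ranking (soil P content, application rate, plant uptake) reported above.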
Modeling listeners' emotional response to music.
Eerola, Tuomas
2012-10-01
An overview of the computational prediction of emotional responses to music is presented. Communication of emotions by music has received a great deal of attention in recent years, and a large number of empirical studies have described the role of individual features (tempo, mode, articulation, timbre) in predicting the emotions suggested or invoked by the music. However, unlike the present work, relatively few studies have attempted to model continua of expressed emotions using a variety of musical features from audio-based representations in a correlation design. The construction of the computational model is divided into four separate phases, with a different focus for evaluation. These phases include the theoretical selection of relevant features, empirical assessment of feature validity, actual feature selection, and overall evaluation of the model. Existing research on music and emotions and extraction of musical features is reviewed in terms of these criteria. Examples drawn from recent studies of emotions within the context of film soundtracks are used to demonstrate each phase in the construction of the model. These models are able to explain the dominant part of the listeners' self-reports of the emotions expressed by music, and the models show potential to generalize over different genres within Western music. Possible applications of the computational models of emotions are discussed. Copyright © 2012 Cognitive Science Society, Inc.
Torfs, Elena; Balemans, Sophie; Locatelli, Florent; Diehl, Stefan; Bürger, Raimund; Laurent, Julien; François, Pierre; Nopens, Ingmar
2017-03-01
Advanced 1-D models for Secondary Settling Tanks (SSTs) explicitly account for several phenomena that influence the settling process (such as hindered settling and compression settling). For each of these phenomena a valid mathematical expression needs to be selected and its parameters calibrated to obtain a model that can be used for operation and control. This is, however, a challenging task as these phenomena may occur simultaneously. Therefore, the presented work evaluates several available expressions for hindered settling based on long-term batch settling data. Specific attention is paid to the behaviour of these hindered settling functions in the compression region in order to evaluate how the modelling of sludge compression is influenced by the choice of a certain hindered settling function. The analysis shows that the exponential hindered settling forms, which are most commonly used in traditional SST models, not only account for hindered settling but partly lump other phenomena (compression) as well. This makes them unsuitable for advanced 1-D models that explicitly include each phenomenon in a modular way. A power-law function is shown to be more appropriate to describe the hindered settling velocity in advanced 1-D SST models. Copyright © 2016 Elsevier Ltd. All rights reserved.
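The contrast drawn above can be illustrated by comparing an exponential (Vesilind-type) hindered settling function with a power-law form: the power law decays more gently at high concentration, leaving compression to be modelled as a separate, explicit term. Parameter values are illustrative, not calibrated.

```python
import math

def vesilind(X, v0=7.0, rh=0.4):
    """Exponential (Vesilind-type) hindered settling velocity, m/h,
    at sludge concentration X (g/L). Parameters are illustrative."""
    return v0 * math.exp(-rh * X)

def power_law(X, v0=7.0, Xc=0.5, q=2.0):
    """Power-law hindered settling sketch: decays more slowly at high
    concentration, so compression is not lumped into this term."""
    return v0 / (1.0 + (X / Xc) ** q)

for X in (1.0, 4.0, 8.0):            # sludge concentration, g/L
    print(round(vesilind(X), 3), round(power_law(X), 3))
```

In a modular 1-D SST model, a compression function would then be added on top of the power-law term rather than being hidden inside the settling velocity expression.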
NASA Astrophysics Data System (ADS)
He, Zhibin; Wen, Xiaohu; Liu, Hu; Du, Jun
2014-02-01
Data driven models are very useful for river flow forecasting when the underlying physical relationships are not fully understood, but it is not clear whether these models still perform well in the small river basins of semiarid mountain regions, which have complicated topography. In this study, the potential of three different data driven methods, artificial neural network (ANN), adaptive neuro fuzzy inference system (ANFIS) and support vector machine (SVM), was evaluated for forecasting river flow in the semiarid mountain region of northwestern China. The models analyzed different combinations of antecedent river flow values, and the appropriate input vector was selected based on the analysis of residuals. The performance of the ANN, ANFIS and SVM models on the training and validation sets was compared with the observed data. The model using three antecedent values of flow was selected as the best fit model for river flow forecasting. To obtain a more accurate evaluation of the results of the ANN, ANFIS and SVM models, four standard quantitative statistical performance measures, the coefficient of correlation (R), root mean squared error (RMSE), Nash-Sutcliffe efficiency coefficient (NS) and mean absolute relative error (MARE), were employed to evaluate the performances of the various models developed. The results indicate that the performance obtained by ANN, ANFIS and SVM in terms of the different evaluation criteria does not vary substantially between the training and validation periods; the performance of the ANN, ANFIS and SVM models in river flow forecasting was satisfactory. A detailed comparison of the overall performance indicated that the SVM model performed better than ANN and ANFIS in river flow forecasting for the validation data sets. The results also suggest that the ANN, ANFIS and SVM methods can be successfully applied to establish river flow forecasting models in semiarid mountain regions with complicated topography.
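The four goodness-of-fit measures named in this abstract (R, RMSE, NS and MARE) have standard definitions; a minimal pure-Python sketch of their computation, on hypothetical flow values rather than the study's data, could look like this:

```python
import math

def flow_metrics(obs, sim):
    """Four common goodness-of-fit measures for flow forecasts:
    coefficient of correlation (R), root mean squared error (RMSE),
    Nash-Sutcliffe efficiency (NS) and mean absolute relative error (MARE)."""
    n = len(obs)
    mo = sum(obs) / n
    ms = sum(sim) / n
    cov = sum((o - mo) * (s - ms) for o, s in zip(obs, sim))
    so = math.sqrt(sum((o - mo) ** 2 for o in obs))
    ss = math.sqrt(sum((s - ms) ** 2 for s in sim))
    r = cov / (so * ss)
    sq_err = sum((o - s) ** 2 for o, s in zip(obs, sim))
    rmse = math.sqrt(sq_err / n)
    ns = 1 - sq_err / sum((o - mo) ** 2 for o in obs)
    mare = sum(abs(o - s) / o for o, s in zip(obs, sim)) / n
    return r, rmse, ns, mare

# toy example: simulated flows close to the observed ones
obs = [10.0, 12.0, 9.0, 15.0, 11.0]
sim = [10.5, 11.5, 9.5, 14.0, 11.5]
r, rmse, ns, mare = flow_metrics(obs, sim)
```

A perfect forecast yields R = 1, RMSE = 0, NS = 1 and MARE = 0, which is a convenient sanity check for any implementation.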
Butson, Christopher R.; Tamm, Georg; Jain, Sanket; Fogal, Thomas; Krüger, Jens
2012-01-01
In recent years there has been significant growth in the use of patient-specific models to predict the effects of neuromodulation therapies such as deep brain stimulation (DBS). However, translating these models from a research environment to the everyday clinical workflow has been a challenge, primarily due to the complexity of the models and the expertise required in specialized visualization software. In this paper, we deploy the interactive visualization system ImageVis3D Mobile, which has been designed for mobile computing devices such as the iPhone or iPad, in an evaluation environment to visualize models of Parkinson’s disease patients who received DBS therapy. Selection of DBS settings is a significant clinical challenge that requires repeated revisions to achieve optimal therapeutic response, and is often performed without any visual representation of the stimulation system in the patient. We used ImageVis3D Mobile to provide models to movement disorders clinicians and asked them to use the software to determine: 1) which of the four DBS electrode contacts they would select for therapy; and 2) what stimulation settings they would choose. We compared the stimulation protocol chosen from the software versus the stimulation protocol that was chosen via clinical practice (independently of the study). Lastly, we compared the amount of time required to reach these settings using the software versus the time required through standard practice. We found that the stimulation settings chosen using ImageVis3D Mobile were similar to those used in standard of care, but were selected in drastically less time. We show how our visualization system, available directly at the point of care on a device familiar to the clinician, can be used to guide clinical decision making for selection of DBS settings. In our view, the positive impact of the system could also translate to areas other than DBS. PMID:22450824
Selecting a digital camera for telemedicine.
Patricoski, Chris; Ferguson, A Stewart
2009-06-01
The digital camera is an essential component of store-and-forward telemedicine (electronic consultation). There are numerous makes and models of digital cameras on the market, and selecting a suitable consumer-grade camera can be complicated. Evaluation of digital cameras includes investigating the features and analyzing image quality. Important features include the camera settings, ease of use, macro capabilities, method of image transfer, and power recharging. Consideration needs to be given to image quality, especially as it relates to color (skin tones) and detail. It is important to know the level of the photographer and the intended application. The goal is to match the characteristics of the camera with the telemedicine program requirements. In the end, selecting a digital camera is a combination of qualitative (subjective) and quantitative (objective) analysis. For the telemedicine program in Alaska in 2008, the camera evaluation and decision process resulted in a specific selection based on the criteria developed for our environment.
Selected aspects of modelling monetary transmission mechanism by BVAR model
NASA Astrophysics Data System (ADS)
Vaněk, Tomáš; Dobešová, Anna; Hampel, David
2013-10-01
In this paper we use the BVAR model with a specifically defined prior to evaluate data including high-lag dependencies. The results are compared to both a restricted and a common VAR model. The data describe the monetary transmission mechanism in the Czech Republic and Slovakia from January 2002 to February 2013. The results point to the inadequacy of the common VAR model. The restricted VAR model and the BVAR model appear similar in the sense of impulse responses.
Hit identification of novel heparanase inhibitors by structure- and ligand-based approaches.
Gozalbes, Rafael; Mosulén, Silvia; Ortí, Leticia; Rodríguez-Díaz, Jesús; Carbajo, Rodrigo J; Melnyk, Patricia; Pineda-Lucena, Antonio
2013-04-01
Heparanase is a key enzyme involved in the dissemination of metastatic cancer cells. In this study a combination of in silico techniques and experimental methods was used to identify new potential inhibitors against this target. A 3D model of heparanase was built from sequence homology and applied to the virtual screening of a library composed of 27 known heparanase inhibitors and a commercial collection of drugs and drug-like compounds. The docking results from this campaign were combined with those obtained from a recently published pharmacophore model based on the same set of chemicals. Compounds were then ranked according to their theoretical binding affinity, and the top-rated commercial drugs were selected for further experimental evaluation. Biophysical methods (NMR and SPR) were applied to assess experimentally the interaction of the selected compounds with heparanase. The binding site was evaluated via competition experiments, using a known inhibitor of heparanase. Three of the selected drugs were found to bind to the active site of the protein and their KD values were determined. Among them, the antimalarial drug amodiaquine presented affinity towards the protein in the low-micromolar range, and was singled out for a SAR study based on its chemical scaffold. A subset of fourteen 4-arylaminoquinolines from a global set of 249 analogues of amodiaquine was selected based on the application of in silico models, a QSAR solubility prediction model and a chemical diversity analysis. Some of these compounds displayed binding affinities in the micromolar range. Copyright © 2013 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Mohammed, Habiba Ibrahim; Majid, Zulkepli; Yusof, Norhakim Bin; Bello Yamusa, Yamusa
2018-03-01
Landfilling remains the most common systematic technique of solid waste disposal in most developed and developing countries. Finding a suitable site for a landfill is a very challenging task. The landfill site selection process aims to identify suitable areas that will protect the environment and public health from pollution and hazards. Therefore, various environmental, physical, socio-economic, and geological criteria must be considered before siting any landfill. This makes the site selection process rigorous and tedious, because it involves processing a large amount of spatial data, rules and regulations from different agencies, and policy from decision makers. Multi-criteria evaluation (MCE) allows the incorporation of conflicting objectives and decision maker preferences into spatial decision models. This paper analyzes MCE methods for landfill site selection for solid waste management by means of literature reviews and surveys. The study will help decision makers and waste management authorities to choose the most effective method when considering landfill site selection.
Evaluating the risk of water distribution system failure: A shared frailty model
NASA Astrophysics Data System (ADS)
Clark, Robert M.; Thurnau, Robert C.
2011-12-01
Condition assessment (CA) modeling is drawing increasing interest as a technique that can assist in managing drinking water infrastructure. This paper develops a model based on the application of a Cox proportional hazard (PH)/shared frailty model and applies it to evaluating the risk of failure in drinking water networks, using data from the Laramie Water Utility (located in Laramie, Wyoming, USA). Using the risk model, a cost/benefit analysis incorporating the inspection value method (IVM) is used to assist in making improved repair, replacement and rehabilitation decisions for selected drinking water distribution system pipes. A separate model is developed to predict failures in prestressed concrete cylinder pipe (PCCP). Various currently available inspection technologies are presented and discussed.
NASA Astrophysics Data System (ADS)
Sahin, E. K.; Colkesen, I.; Kavzoglu, T.
2017-12-01
Identification of localities prone to landslides plays an important role in emergency planning, disaster management and recovery planning. Due to its great importance for disaster management, producing accurate and up-to-date landslide susceptibility maps is essential for hazard mitigation and regional planning. The main objective of the present study was to apply a multi-collinearity based model selection approach to the production of a landslide susceptibility map of the Ulus district of Karabuk, Turkey. When factors are highly correlated with each other, the data do not contain enough information to describe the problem under consideration; in such cases, choosing a subset of the original features will often lead to better performance. This paper presents a multi-collinearity based model selection approach to deal with high correlation within the dataset. Two collinearity diagnostics, Tolerance (TOL) and the Variance Inflation Factor (VIF), are commonly used to identify multi-collinearity; since TOL is the reciprocal of VIF, VIF values that exceed 10.0 and TOL values less than 0.1 are often regarded as indicating multi-collinearity. Five causative factors (slope length, curvature, plan curvature, profile curvature and topographical roughness index) were found to be highly correlated with each other among the 15 factors available for the study area. As a result, the five correlated factors were removed from model estimation, and the performance of a model including the remaining 10 factors (aspect, drainage density, elevation, lithology, land use/land cover, NDVI, slope, sediment transport index, topographical position index and topographical wetness index) was evaluated using logistic regression. The performance of the prediction model constructed with 10 factors was compared to that of the 15-factor model. The prediction performance of the two susceptibility maps was evaluated by overall accuracy and the area under the ROC curve (AUC).
Results showed that overall accuracy and AUC were 77.15% and 96.62%, respectively, for the model with the 10 selected factors, compared with 73.45% and 89.45% for the model with all 15 factors. It is clear that the multi-collinearity based model outperformed the conventional model in mapping landslide susceptibility.
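The VIF diagnostic used above has a compact formulation: the VIF of each factor is the corresponding diagonal entry of the inverse of the factors' correlation matrix, which equals 1/(1 - R_j^2) for factor j regressed on the others. A minimal pure-Python sketch on hypothetical data (not the study's 15 causative factors) could look like this:

```python
import math

def correlation_matrix(cols):
    """Pearson correlation matrix of a list of equal-length data columns."""
    n = len(cols[0])
    means = [sum(c) / n for c in cols]
    sds = [math.sqrt(sum((x - m) ** 2 for x in c)) for c, m in zip(cols, means)]
    k = len(cols)
    return [[sum((cols[i][t] - means[i]) * (cols[j][t] - means[j]) for t in range(n))
             / (sds[i] * sds[j]) for j in range(k)] for i in range(k)]

def invert(m):
    """Gauss-Jordan matrix inversion with partial pivoting."""
    k = len(m)
    a = [row[:] + [1.0 if i == j else 0.0 for j in range(k)] for i, row in enumerate(m)]
    for col in range(k):
        pivot = max(range(col, k), key=lambda r: abs(a[r][col]))
        a[col], a[pivot] = a[pivot], a[col]
        p = a[col][col]
        a[col] = [x / p for x in a[col]]
        for r in range(k):
            if r != col:
                f = a[r][col]
                a[r] = [x - f * y for x, y in zip(a[r], a[col])]
    return [row[k:] for row in a]

def vif(cols):
    """VIF_j = 1 / (1 - R_j^2): the diagonal of the inverse correlation matrix."""
    inv = invert(correlation_matrix(cols))
    return [inv[j][j] for j in range(len(cols))]

# x3 is nearly a copy of x1, so those two factors should show severe collinearity
x1 = [1, 2, 3, 4, 5, 6, 7, 8]
x2 = [1, -1, 1, -1, 1, -1, 1, -1]
x3 = [1.05, 1.98, 3.11, 4.08, 4.90, 6.03, 6.93, 8.02]
vifs = vif([x1, x2, x3])
```

Under the thresholds quoted in the abstract, the two near-duplicate factors would be flagged (VIF well above 10) and one of them removed before fitting the logistic regression.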
Furlanello, Cesare; Serafini, Maria; Merler, Stefano; Jurman, Giuseppe
2003-11-06
We describe the E-RFE method for gene ranking, which is useful for the identification of markers in the predictive classification of array data. The method supports a practical modeling scheme designed to avoid building classification rules on overly small gene subsets (an effect known as selection bias, in which the estimated predictive errors are too optimistic because of testing on samples already considered in the feature selection process). With E-RFE, we speed up recursive feature elimination (RFE) with SVM classifiers by eliminating chunks of uninteresting genes using an entropy measure of the SVM weight distribution. An optimal subset of genes is selected according to a two-strata model evaluation procedure: modeling is replicated by an external stratified-partition resampling scheme and, within each run, an internal K-fold cross-validation is used for E-RFE ranking. The optimal number of genes can also be estimated from the saturation of Zipf's law profiles. Without a decrease in classification accuracy, E-RFE allows a speed-up factor of 100 with respect to standard RFE, while improving on alternative parametric RFE reduction strategies. A process for gene selection and error estimation is thus made practical, ensuring control of the selection bias and providing additional diagnostic indicators of gene importance.
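The core idea, eliminating larger chunks when the SVM weight distribution is nearly uniform, can be illustrated with a short sketch. This is not the authors' exact E-RFE rule, only an assumed illustration of how a weight-entropy measure might size the elimination chunk:

```python
import math

def entropy_chunk_size(weights):
    """Size the next RFE elimination chunk from the Shannon entropy of the
    normalized absolute SVM weights: a near-uniform distribution (high
    entropy) means many genes look equally uninteresting, so a large chunk
    can be dropped; a peaked distribution calls for cautious elimination."""
    mags = [abs(w) for w in weights]
    total = sum(mags)
    probs = [m / total for m in mags if m > 0]
    h = -sum(p * math.log2(p) for p in probs)
    h_max = math.log2(len(mags))
    fraction = h / h_max if h_max > 0 else 0.0
    return max(1, int(fraction * len(mags) / 2))

uniform = [0.5] * 8                  # no gene stands out: drop a big chunk
peaked = [100, 1, 1, 1, 1, 1, 1, 1]  # one dominant gene: drop one at a time
```

Standard RFE corresponds to the peaked case (always eliminating one feature per retraining), which is what makes it roughly 100 times slower on wide array data.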
A Methodology for Evaluating Artifacts Produced by a Formal Verification Process
NASA Technical Reports Server (NTRS)
Siminiceanu, Radu I.; Miner, Paul S.; Person, Suzette
2011-01-01
The goal of this study is to produce a methodology for evaluating the claims and arguments employed in, and the evidence produced by, formal verification activities. To illustrate the process, we conduct a full assessment of a representative case study for the Enabling Technology Development and Demonstration (ETDD) program. We assess the model checking and satisfiability solving techniques as applied to a suite of abstract models of fault tolerant algorithms which were selected to be deployed in Orion, namely the TTEthernet startup services specified and verified in the Symbolic Analysis Laboratory (SAL) by TTTech. To this end, we introduce the Modeling and Verification Evaluation Score (MVES), a metric that is intended to estimate the amount of trust that can be placed on the evidence that is obtained. The results of the evaluation process and the MVES can then be used by non-experts and evaluators in assessing the credibility of the verification results.
Nozari, Nazbanou; Hepner, Christopher R
2018-06-05
Competitive accounts of lexical selection propose that the activation of competitors slows down the selection of the target. Non-competitive accounts, on the other hand, posit that target response latencies are independent of the activation of competing items. In this paper, we propose a signal detection framework for lexical selection and show how a flexible selection criterion affects claims of competitive selection. Specifically, we review evidence from neurotypical and brain-damaged speakers and demonstrate that task goals and the state of the production system determine whether a competitive or a non-competitive selection profile arises. We end by arguing that there is conclusive evidence for a flexible criterion in lexical selection, and that integrating criterion shifts into models of language production is critical for evaluating theoretical claims regarding (non-)competitive selection.
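In a signal detection framework of the kind proposed here, sensitivity and criterion placement are separable quantities. A minimal sketch of the classic equal-variance measures, using illustrative hit and false-alarm rates rather than data from the paper:

```python
from statistics import NormalDist

def sdt_measures(hit_rate, false_alarm_rate):
    """Equal-variance signal detection measures: sensitivity
    d' = z(H) - z(F) and response criterion c = -(z(H) + z(F)) / 2.
    A liberal criterion (c < 0) raises both hits and false alarms;
    a conservative one (c > 0) suppresses both, at constant d'."""
    z = NormalDist().inv_cdf
    d_prime = z(hit_rate) - z(false_alarm_rate)
    c = -(z(hit_rate) + z(false_alarm_rate)) / 2
    return d_prime, c

# comparable sensitivity, different criterion placements
d1, c1 = sdt_measures(0.80, 0.20)   # neutral criterion
d2, c2 = sdt_measures(0.90, 0.35)   # more liberal responding
```

The point of the framework is visible in the numbers: a speaker can shift from the first profile to the second (more selections, more errors) without any change in the underlying activation dynamics, which is why criterion shifts must be modeled before claims of competitive versus non-competitive selection can be evaluated.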
Plurality of Type A evaluations of uncertainty
NASA Astrophysics Data System (ADS)
Possolo, Antonio; Pintar, Adam L.
2017-10-01
The evaluations of measurement uncertainty involving the application of statistical methods to measurement data (Type A evaluations as specified in the Guide to the Expression of Uncertainty in Measurement, GUM) comprise the following three main steps: (i) developing a statistical model that captures the pattern of dispersion or variability in the experimental data, and that relates the data either to the measurand directly or to some intermediate quantity (input quantity) that the measurand depends on; (ii) selecting a procedure for data reduction that is consistent with this model and that is fit for the purpose that the results are intended to serve; (iii) producing estimates of the model parameters, or predictions based on the fitted model, and evaluations of uncertainty that qualify either those estimates or these predictions, and that are suitable for use in subsequent uncertainty propagation exercises. We illustrate these steps in uncertainty evaluations related to the measurement of the mass fraction of vanadium in a bituminous coal reference material, including the assessment of the homogeneity of the material, and to the calibration and measurement of the amount-of-substance fraction of a hydrochlorofluorocarbon in air, and of the age of a meteorite. Our goal is to expose the plurality of choices that can reasonably be made when taking each of the three steps outlined above, and to show that different choices typically lead to different estimates of the quantities of interest, and to different evaluations of the associated uncertainty. In all the examples, the several alternatives considered represent choices that comparably competent statisticians might make, but who differ in the assumptions that they are prepared to rely on, and in their selection of approach to statistical inference. They represent also alternative treatments that the same statistician might give to the same data when the results are intended for different purposes.
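The simplest instance of steps (i)-(iii) above, assuming independent, identically distributed indications, takes the mean as the estimate and the experimental standard deviation of the mean as its Type A standard uncertainty; this is only one of the plural choices the authors discuss, but it anchors the terminology:

```python
import math

def type_a_uncertainty(readings):
    """Minimal GUM-style Type A evaluation: the best estimate is the
    arithmetic mean, and its standard uncertainty is the experimental
    standard deviation of the mean, u = s / sqrt(n), where s is the
    sample standard deviation of the n repeated indications."""
    n = len(readings)
    mean = sum(readings) / n
    s = math.sqrt(sum((x - mean) ** 2 for x in readings) / (n - 1))
    return mean, s / math.sqrt(n)

# five repeated indications of a hypothetical mass-fraction measurement
mean, u = type_a_uncertainty([10.1, 10.3, 9.9, 10.2, 10.0])
```

A statistician who instead adopted, say, a random-effects model across material bottles, as in the vanadium homogeneity example, would obtain a different (typically larger) uncertainty from the same data, which is precisely the plurality the paper aims to expose.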
ERIC Educational Resources Information Center
Sagalakova, Olga A.; Truevtsev, Dmitry V.; Sagalakov, Anatoly M.
2016-01-01
This article analyzes modern theoretical and conceptual models of social anxiety disorder (SAD) (cognitive, metacognitive, psychopathological) with a view to determine specific features of psychological mechanisms of disorders studied in various approaches, to identify similarities and differences in conceptual SAD models, their heuristic…
In the evaluation of emissions standards, OAQPS frequently uses one or more computer-based models to estimate the number of people who will be exposed to the air pollution levels that are expected to occur under various air quality scenarios.
Power Hardware-in-the-Loop Evaluation of PV Inverter Grid Support on Hawaiian Electric Feeders
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson, Austin A; Prabakar, Kumaraguru; Nagarajan, Adarsh
As more grid-connected photovoltaic (PV) inverters become compliant with evolving interconnection requirements, there is increased interest from utilities in understanding how to best deploy advanced grid-support functions (GSF) in the field. One efficient and cost-effective method to examine such deployment options is to leverage power hardware-in-the-loop (PHIL) testing methods, which combine the fidelity of hardware tests with the flexibility of computer simulation. This paper summarizes a study wherein two Hawaiian Electric feeder models were converted to real-time models using an OPAL-RT real-time digital testing platform, and integrated with models of GSF-capable PV inverters based on characterization test data. The integrated model was subsequently used in PHIL testing to evaluate the effects of different fixed power factor and volt-watt control settings on voltage regulation of the selected feeders using physical inverters. Selected results are presented in this paper, and complete results of this study were provided as inputs for field deployment and technical interconnection requirements for grid-connected PV inverters on the Hawaiian Islands.
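A volt-watt control setting of the kind evaluated here is typically a piecewise-linear curve mapping terminal voltage to an active-power limit. The sketch below uses illustrative setpoints, not the actual settings tested on the Hawaiian Electric feeders:

```python
def volt_watt_limit(v_pu, v_start=1.06, v_end=1.10):
    """Piecewise-linear volt-watt curve: full active power up to v_start,
    linear curtailment between v_start and v_end, and zero output above
    v_end. Voltages are in per-unit; the return value is the allowed
    active-power fraction (0..1). Setpoints are illustrative only."""
    if v_pu <= v_start:
        return 1.0
    if v_pu >= v_end:
        return 0.0
    return (v_end - v_pu) / (v_end - v_start)
```

Shifting v_start and v_end changes how aggressively the inverter curtails during high-voltage conditions, which is exactly the trade-off a PHIL setup can evaluate against a real-time feeder model before field deployment.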
NASA Technical Reports Server (NTRS)
Hylton, L. D.; Mihelc, M. S.; Turner, E. R.; Nealy, D. A.; York, R. E.
1983-01-01
Three airfoil data sets were selected for use in evaluating currently available analytical models for predicting airfoil surface heat transfer distributions in a 2-D flow field. Two additional airfoils, representative of highly loaded, low solidity airfoils currently being designed, were selected for cascade testing at simulated engine conditions. Some 2-D analytical methods were examined and a version of the STAN5 boundary layer code was chosen for modification. The final form of the method utilized a time dependent, transonic inviscid cascade code coupled to a modified version of the STAN5 boundary layer code featuring zero order turbulence modeling. The boundary layer code is structured to accommodate a full spectrum of empirical correlations addressing the coupled influences of pressure gradient, airfoil curvature, and free-stream turbulence on airfoil surface heat transfer distribution and boundary layer transitional behavior. Comparison of predictions made with the model to the data base indicates a significant improvement in predictive capability.
Thompson, William K; Rasmussen, Luke V; Pacheco, Jennifer A; Peissig, Peggy L; Denny, Joshua C; Kho, Abel N; Miller, Aaron; Pathak, Jyotishman
2012-01-01
The development of Electronic Health Record (EHR)-based phenotype selection algorithms is a non-trivial and highly iterative process involving domain experts and informaticians. To make it easier to port algorithms across institutions, it is desirable to represent them using an unambiguous formal specification language. For this purpose we evaluated the recently developed National Quality Forum (NQF) information model designed for EHR-based quality measures: the Quality Data Model (QDM). We selected 9 phenotyping algorithms that had been previously developed as part of the eMERGE consortium and translated them into QDM format. Our study concluded that the QDM contains several core elements that make it a promising format for EHR-driven phenotyping algorithms for clinical research. However, we also found areas in which the QDM could be usefully extended, such as representing information extracted from clinical text, and the ability to handle algorithms that do not consist of Boolean combinations of criteria.
DECIDE: a software for computer-assisted evaluation of diagnostic test performance.
Chiecchio, A; Bo, A; Manzone, P; Giglioli, F
1993-05-01
The evaluation of the performance of clinical tests is a complex problem involving different steps and many statistical tools, not always structured in an organic and rational system. This paper presents software that provides an organized system of statistical tools to help evaluate clinical test performance. The program allows (a) the building and organization of a working database, (b) the selection of the minimal set of tests with the maximum information content, (c) the search for the model best fitting the distribution of the test values, (d) the selection of the optimal diagnostic cut-off value of the test for every positive/negative situation, and (e) the evaluation of the performance of combinations of correlated and uncorrelated tests. The uncertainty associated with all the variables involved is evaluated. The program works in an MS-DOS environment with an EGA or higher-performing graphics card.
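Step (d), selecting an optimal diagnostic cut-off, is commonly done by maximizing Youden's J = sensitivity + specificity - 1; the paper does not state its exact criterion, so the following is a hedged sketch of that standard approach on hypothetical test values:

```python
def best_cutoff(negatives, positives):
    """Scan candidate cut-offs and return the one maximizing Youden's
    J = sensitivity + specificity - 1, calling a value 'test positive'
    when it is >= the cutoff."""
    candidates = sorted(set(negatives) | set(positives))
    best = (None, -1.0)
    for c in candidates:
        sens = sum(1 for x in positives if x >= c) / len(positives)
        spec = sum(1 for x in negatives if x < c) / len(negatives)
        j = sens + spec - 1
        if j > best[1]:
            best = (c, j)
    return best

# hypothetical test values for the healthy and diseased groups
cutoff, j = best_cutoff([1, 2, 3, 4, 5], [4, 6, 7, 8, 9])
```

With overlapping distributions, as here, J stays below 1, and the chosen cut-off trades a few missed positives against false alarms; a fully separable pair of groups would reach J = 1.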
Cavallo, Jaime A.; Roma, Andres A.; Jasielec, Mateusz S.; Ousley, Jenny; Creamer, Jennifer; Pichert, Matthew D.; Baalman, Sara; Frisella, Margaret M.; Matthews, Brent D.
2014-01-01
Background The purpose of this study was to evaluate the associations between patient characteristics or surgical site classifications and the histologic remodeling scores of synthetic meshes biopsied from their abdominal wall repair sites in the first attempt to generate a multivariable risk prediction model of non-constructive remodeling. Methods Biopsies of the synthetic meshes were obtained from the abdominal wall repair sites of 51 patients during a subsequent abdominal re-exploration. Biopsies were stained with hematoxylin and eosin, and evaluated according to a semi-quantitative scoring system for remodeling characteristics (cell infiltration, cell types, extracellular matrix deposition, inflammation, fibrous encapsulation, and neovascularization) and a mean composite score (CR). Biopsies were also stained with Sirius Red and Fast Green, and analyzed to determine the collagen I:III ratio. Based on univariate analyses between subject clinical characteristics or surgical site classification and the histologic remodeling scores, cohort variables were selected for multivariable regression models using a threshold p value of ≤0.200. Results The model selection process for the extracellular matrix score yielded two variables: subject age at time of mesh implantation, and mesh classification (c-statistic = 0.842). For CR score, the model selection process yielded two variables: subject age at time of mesh implantation and mesh classification (r2 = 0.464). The model selection process for the collagen III area yielded a model with two variables: subject body mass index at time of mesh explantation and pack-year history (r2 = 0.244). Conclusion Host characteristics and surgical site assessments may predict degree of remodeling for synthetic meshes used to reinforce abdominal wall repair sites. 
These preliminary results constitute the first steps in generating a risk prediction model that predicts the patients and clinical circumstances for which non-constructive remodeling of an abdominal wall repair site with synthetic mesh reinforcement is most likely to occur. PMID:24442681
Evaluation of parallel reduction strategies for fusion of sensory information from a robot team
NASA Astrophysics Data System (ADS)
Lyons, Damian M.; Leroy, Joseph
2015-05-01
The advantage of using a team of robots to search or to map an area is that by navigating the robots to different parts of the area, searching or mapping can be completed more quickly. A crucial aspect of the problem is the combination, or fusion, of data from team members to generate an integrated model of the search/mapping area. In prior work we looked at the issue of removing mutual robot views from an integrated point cloud model built from laser and stereo sensors, leading to a cleaner and more accurate model. This paper addresses a further challenge: even with mutual views removed, the stereo data from a team of robots can quickly swamp a WiFi connection. This paper proposes and evaluates a communication and fusion approach based on the parallel reduction operation, where data is combined in a series of steps over increasing subsets of the team. Eight different strategies for selecting the subsets are evaluated for bandwidth requirements using three robot missions, each carried out with teams of four Pioneer 3-AT robots. Our results indicate that selecting groups to combine based on similar pose but distant location yields the best results.
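The parallel reduction operation referred to here merges partial models pairwise in rounds, so k robots' data are fused in roughly log2(k) communication steps instead of k-1 sequential transfers. A minimal sketch with toy point sets (the grouping strategy, which the paper varies eight ways, is fixed here to simple adjacent pairing):

```python
def pairwise_reduce(parts, combine):
    """Reduce a list of per-robot datasets in a binary tree: each round
    merges adjacent pairs, halving the number of partial models, so k
    inputs are fused in ceil(log2(k)) communication rounds."""
    rounds = 0
    while len(parts) > 1:
        parts = [combine(parts[i], parts[i + 1]) if i + 1 < len(parts) else parts[i]
                 for i in range(0, len(parts), 2)]
        rounds += 1
    return parts[0], rounds

# four robots each contribute a small set of (x, y) points
clouds = [{(0, 0)}, {(1, 0)}, {(0, 1)}, {(1, 1)}]
merged, rounds = pairwise_reduce(clouds, lambda a, b: a | b)
```

The eight strategies evaluated in the paper differ in which partial models are paired at each round (for example by pose similarity or spatial distance), not in the tree-shaped reduction itself.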
Preclinical evaluation of anti-HIV microbicide products: New models and biomarkers.
Doncel, Gustavo F; Clark, Meredith R
2010-12-01
A safe and effective microbicide product designed to prevent sexual transmission of HIV-1 rests on a solid foundation provided by the proper selection and preclinical characterization of both its active pharmaceutical ingredient (API) and formulation. The evaluation of API and formulation physicochemical properties, drug release, specific antiviral activity, cell and tissue toxicity, organ toxicity, pharmacokinetics, and pharmacodynamics and efficacy provides information to understand the product, make go/no go decisions in the critical path of product development and complete a regulatory dossier to file an investigational new drug (IND) with the US Food and Drug Administration. Incorporation of new models, assays and biomarkers has expanded our ability to understand the mechanisms of action underlying microbicide toxicity and efficacy, enabling a more rational selection of drug and formulation candidates. This review presents an overview of the models and endpoints used to comprehensively evaluate an anti-HIV microbicide in preclinical development. This article forms part of a special supplement on presentations covering HIV transmission and microbicides, based on the symposium "Trends in Microbicide Formulations", held on 25 and 26 January 2010, Arlington, VA. Copyright © 2010 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Albertson, C. W.
1982-03-01
A 1/12th scale model of the Curved Surface Test Apparatus (CSTA), which will be used to study aerothermal loads and evaluate Thermal Protection Systems (TPS) on a fuselage-type configuration in the Langley 8-Foot High Temperature Structures Tunnel (8 ft HTST), was tested in the Langley 7-Inch Mach 7 Pilot Tunnel. The purpose of the tests was to study the overall flow characteristics and define an envelope for testing the CSTA in the 8 ft HTST. Wings were tested on the scaled CSTA model to select a wing configuration with the most favorable characteristics for conducting TPS evaluations for curved and intersecting surfaces. The results indicate that the CSTA and selected wing configuration can be tested at angles of attack up to 15.5 and 10.5 degrees, respectively. The base pressure for both models was at the expected low level for most test conditions. Results generally indicate that the CSTA and wing configuration will provide a useful test bed for aerothermal loads and thermal structural concept evaluation over a broad range of flow conditions in the 8 ft HTST.
USDA-ARS?s Scientific Manuscript database
Selection of the composite MARC III population for markers allowed better estimates of effects and inheritance of markers for targeted carcass quality traits (n=254) and nontargeted traits and an evaluation of SNP specific residual variance models for tenderness. Genotypic effects of CAPN1 haplotyp...
Evaluation of cancer mortality in a cohort of workers exposed to low-level radiation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lea, C.S.
1995-12-01
The purpose of this dissertation was to re-analyze existing data to explore methodologic approaches that may determine whether excess cancer mortality in the ORNL cohort can be explained by time-related factors not previously considered; by the grouping of cancer outcomes; by selection bias arising from the method chosen to incorporate an empirical induction period; or by the type of statistical model chosen.
Evaluation of the Science Enrichment Activities (SEA) Program: A Decision Oriented Model.
ERIC Educational Resources Information Center
Linn, Marcia C.
1978-01-01
Three questions guided an evaluation of sixth and eighth grade science enrichment activities: (1) Does a free choice interactive program affect cognitive abilities? (2) Do students in a free choice program make predictable selections of activities based on their age, sex, or ability level? and (3) Are specific student choices associated with…
Design and Empirical Evaluation of Search Software for Legal Professionals on the WWW.
ERIC Educational Resources Information Center
Dempsey, Bert J.; Vreeland, Robert C.; Sumner, Robert G., Jr.; Yang, Kiduk
2000-01-01
Discussion of effective search aids for legal researchers on the World Wide Web focuses on the design and evaluation of two software systems developed to explore models for browsing and searching across a user-selected set of Web sites. Describes crawler-enhanced search engines, filters, distributed full-text searching, and natural language…
Georgia's Pre-K Professional Development Evaluation: Technical Appendix. Publication #2015-02B
ERIC Educational Resources Information Center
Early, Diane M.; Pan, Yi; Maxwell, Kelly L.
2014-01-01
The primary purpose of the accompanying final study was to evaluate the impact of two professional development models on teacher-child interactions in Georgia's Pre-K classrooms. Teachers were randomly selected to participate and were randomly assigned to one of the professional development conditions or to a control group. Because of this…
Software Organization in Student Data Banks for Research and Evaluation: Four Institutional Models.
ERIC Educational Resources Information Center
Friedman, Charles P.
Student data banks for ongoing research and evaluation have been implemented by a number of professional schools. Institutions selecting software designs for the establishment of such systems are often faced with making their choice before all the possible uses of the system are determined. Making software design decisions involves "rational"…
NASA Technical Reports Server (NTRS)
Stalmach, C. J., Jr.
1975-01-01
Several model/instrument concepts employing electroless metallic skin were considered for improvement of surface condition, accuracy, and cost of contoured-geometry convective heat transfer models. A plated semi-infinite slab approach was chosen for development and evaluation in a hypersonic wind tunnel. The plated slab model consists of an epoxy casting containing fine constantan wires accurately placed at specified surface locations. An electroless alloy was deposited on the plastic surface that provides a hard, uniformly thick, seamless skin. The chosen alloy forms a high-output thermocouple junction with each exposed constantan wire, providing means of determining heat transfer during tunnel testing of the model. A selective electroless plating procedure was used to deposit scaled heatshield tiles on the lower surface of a 0.0175-scale shuttle orbiter model. Twenty-five percent of the tiles were randomly selected and plated to a height of 0.001-inch. The purpose was to assess the heating effects of surface roughness simulating misalignment of tiles that may occur during manufacture of the spacecraft.
2014-01-01
Background A higher prevalence of chronic atrophic gastritis (CAG) occurs in younger adults in Asia. We used Stomach Age to examine the different mechanisms of CAG between younger adults and elderly individuals, and established a simple model of cancer risk that can be applied to CAG surveillance. Methods Stomach Age was determined by FISH examination of telomere length in stomach biopsies. Δψm was also determined by flow cytometry. Sixty volunteers were used to confirm the linear relationship between telomere length and age, while 120 subjects were used to build a mathematical model by multivariate analysis. Overall, 146 subjects were used to evaluate the validity of the model, and 1,007 subjects were used to evaluate the relationship between prognosis and Δage (calculated from the mathematical model). ROC curves were used to evaluate the relationship between prognosis and Δage and to determine the cut-off point for Δage. Results We established a tight linear relationship between telomere length and age. Telomere length differed markedly between patients with and without CAG, even at the same age. Δψm decreased in individuals whose Stomach Age was greater than their real age, especially in younger adults. A mathematical model of Stomach Age (real age + Δage) was successfully constructed, which is easy to apply in clinical work. A higher Δage was correlated with a worse outcome. The criterion of Δage >3.11 should be considered as the cut-off to select the subgroup of patients who require endoscopic surveillance. Conclusion Variation in Stomach Age between individuals of the same biological age was confirmed. Attention should be paid to those with a greater Stomach Age, especially younger adults. The Δage in the Simple Model can be used as a criterion to select CAG patients for gastric cancer surveillance. PMID:25057261
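The Stomach Age construction described above reduces to a linear calibration plus a threshold rule. A hedged illustration follows, using synthetic data and invented coefficients (the study's actual multivariate model and telomere units are not reproduced here); only the Δage > 3.11 cut-off comes from the abstract.

```python
# Sketch (synthetic data, invented coefficients): regress age on telomere length,
# then score each patient by delta_age = Stomach Age - chronological age,
# with delta_age > 3.11 flagging endoscopic surveillance.
import random

random.seed(0)

# Synthetic calibration cohort: telomere length shortens roughly linearly with age.
ages = [random.uniform(20, 80) for _ in range(60)]
telomeres = [10.0 - 0.08 * a + random.gauss(0, 0.3) for a in ages]  # arbitrary units

# Closed-form least squares for age ~ telomere length (one predictor).
n = len(ages)
mx, my = sum(telomeres) / n, sum(ages) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(telomeres, ages))
         / sum((x - mx) ** 2 for x in telomeres))
intercept = my - slope * mx

def stomach_age(telomere_length):
    """Predicted 'Stomach Age' from telomere length via the fitted line."""
    return intercept + slope * telomere_length

DELTA_AGE_CUTOFF = 3.11  # cut-off reported in the abstract

def needs_surveillance(telomere_length, real_age):
    return (stomach_age(telomere_length) - real_age) > DELTA_AGE_CUTOFF

# Shorter telomeres than expected for one's chronological age push Stomach Age up.
print(needs_surveillance(telomere_length=4.0, real_age=50))
```

Because telomeres shorten with age, the fitted slope is negative; a patient whose telomeres look "older" than their years gets a positive Δage and is flagged.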
Nankali, Saber; Miandoab, Payam Samadi; Baghizadeh, Amin
2016-01-01
In external-beam radiotherapy, using external markers is one of the most reliable tools for predicting tumor position in clinical applications. The main challenge in this approach is tracking tumor motion with the highest possible accuracy, which depends heavily on the location of the external markers; finding that location is the objective of this study. Four commercially available feature selection algorithms, namely 1) Correlation-based Feature Selection, 2) Classifier, 3) Principal Components, and 4) Relief, were applied to find the optimum location of external markers, in combination with two searching procedures, "Genetic" and "Ranker". The performance of these algorithms was evaluated using a four-dimensional extended cardiac-torso anthropomorphic phantom. Six tumors in the lung, three tumors in the liver, and 49 points on the thorax surface were used to simulate internal and external motions, respectively. The root mean square error of an adaptive neuro-fuzzy inference system (ANFIS) prediction model was used as the metric for quantitatively evaluating the performance of the proposed feature selection algorithms. To do this, the thorax surface was divided into nine smaller segments and the predefined tumor motions were predicted by ANFIS using the external motion data of the markers in each segment separately. Our comparative results showed that all feature selection algorithms can reasonably select specific external markers from the segments where the root mean square error of the ANFIS model is minimum. Moreover, the performance accuracy of the proposed feature selection algorithms was compared separately: each tumor motion was predicted using motion data of the external markers selected by each feature selection algorithm. A Duncan statistical test, followed by an F-test, on the final results showed that all proposed feature selection algorithms have the same performance accuracy for lung tumors, but for liver tumors a correlation-based feature selection algorithm in combination with a genetic search algorithm yielded the best performance accuracy for selecting optimum markers. PMID:26894358
Nankali, Saber; Torshabi, Ahmad Esmaili; Miandoab, Payam Samadi; Baghizadeh, Amin
2016-01-08
In external-beam radiotherapy, using external markers is one of the most reliable tools for predicting tumor position in clinical applications. The main challenge in this approach is tracking tumor motion with the highest possible accuracy, which depends heavily on the location of the external markers; finding that location is the objective of this study. Four commercially available feature selection algorithms, namely 1) Correlation-based Feature Selection, 2) Classifier, 3) Principal Components, and 4) Relief, were applied to find the optimum location of external markers, in combination with two searching procedures, "Genetic" and "Ranker". The performance of these algorithms was evaluated using a four-dimensional extended cardiac-torso anthropomorphic phantom. Six tumors in the lung, three tumors in the liver, and 49 points on the thorax surface were used to simulate internal and external motions, respectively. The root mean square error of an adaptive neuro-fuzzy inference system (ANFIS) prediction model was used as the metric for quantitatively evaluating the performance of the proposed feature selection algorithms. To do this, the thorax surface was divided into nine smaller segments and the predefined tumor motions were predicted by ANFIS using the external motion data of the markers in each segment separately. Our comparative results showed that all feature selection algorithms can reasonably select specific external markers from the segments where the root mean square error of the ANFIS model is minimum. Moreover, the performance accuracy of the proposed feature selection algorithms was compared separately: each tumor motion was predicted using motion data of the external markers selected by each feature selection algorithm. A Duncan statistical test, followed by an F-test, on the final results showed that all proposed feature selection algorithms have the same performance accuracy for lung tumors, but for liver tumors a correlation-based feature selection algorithm in combination with a genetic search algorithm yielded the best performance accuracy for selecting optimum markers.
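The marker-ranking idea in the two records above can be sketched simply: score each candidate surface segment by the prediction error of a model mapping marker motion to tumor motion, and keep the segment with the lowest error. In the hedged sketch below, a plain least-squares fit stands in for the ANFIS predictor, and the breathing signals, segment numbers, and couplings are all invented, so this illustrates only the selection metric, not the study's pipeline.

```python
# Sketch (synthetic signals): rank candidate marker segments by the RMSE of a
# model predicting tumor motion from marker motion. A 1-D least-squares line
# stands in for the ANFIS predictor used in the study.
import math
import random

random.seed(1)

def rmse(pred, true):
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(pred, true)) / len(true))

def fit_predict(marker, tumor):
    """In-sample predictions from a least-squares fit of tumor ~ marker."""
    n = len(marker)
    mx, my = sum(marker) / n, sum(tumor) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(marker, tumor))
         / sum((x - mx) ** 2 for x in marker))
    a = my - b * mx
    return [a + b * x for x in marker]

# Simulated breathing: tumor motion plus marker signals whose coupling to the
# tumor differs by thorax segment (segment 0 tracks well, segment 2 poorly).
t = [i * 0.1 for i in range(100)]
tumor = [math.sin(x) for x in t]
segments = {
    0: [1.5 * math.sin(x) + random.gauss(0, 0.05) for x in t],       # tight coupling
    1: [0.8 * math.sin(x) + random.gauss(0, 0.20) for x in t],       # moderate
    2: [0.3 * math.sin(x + 1.0) + random.gauss(0, 0.30) for x in t], # phase-shifted, noisy
}

scores = {seg: rmse(fit_predict(m, tumor), tumor) for seg, m in segments.items()}
best = min(scores, key=scores.get)
print("best segment:", best)
```

The segment whose markers are most tightly coupled to tumor motion yields the lowest RMSE and is selected, which is the behavior the abstracts report for their feature selection algorithms.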
Elements of complexity in subsurface modeling, exemplified with three case studies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Freedman, Vicky L.; Truex, Michael J.; Rockhold, Mark
2017-04-03
There are complexity elements to consider when applying subsurface flow and transport models to support environmental analyses. Modelers balance the benefits and costs of modeling along the spectrum of complexity, taking into account the attributes of more simple models (e.g., lower cost, faster execution, easier to explain, less mechanistic) and the attributes of more complex models (higher cost, slower execution, harder to explain, more mechanistic and technically defensible). In this paper, modeling complexity is examined with respect to considering this balance. The discussion of modeling complexity is organized into three primary elements: 1) modeling approach, 2) description of process, and 3) description of heterogeneity. Three examples are used to examine these complexity elements. Two of the examples use simulations generated from a complex model to develop simpler models for efficient use in model applications. The first example is designed to support performance evaluation of soil vapor extraction remediation in terms of groundwater protection. The second example investigates the importance of simulating different categories of geochemical reactions for carbon sequestration and selecting appropriate simplifications for use in evaluating sequestration scenarios. In the third example, the modeling history for a uranium-contaminated site demonstrates that conservative parameter estimates were inadequate surrogates for complex, critical processes and there is discussion on the selection of more appropriate model complexity for this application. All three examples highlight how complexity considerations are essential to create scientifically defensible models that achieve a balance between model simplification and complexity.
Elements of complexity in subsurface modeling, exemplified with three case studies
NASA Astrophysics Data System (ADS)
Freedman, Vicky L.; Truex, Michael J.; Rockhold, Mark L.; Bacon, Diana H.; Freshley, Mark D.; Wellman, Dawn M.
2017-09-01
There are complexity elements to consider when applying subsurface flow and transport models to support environmental analyses. Modelers balance the benefits and costs of modeling along the spectrum of complexity, taking into account the attributes of more simple models (e.g., lower cost, faster execution, easier to explain, less mechanistic) and the attributes of more complex models (higher cost, slower execution, harder to explain, more mechanistic and technically defensible). In this report, modeling complexity is examined with respect to considering this balance. The discussion of modeling complexity is organized into three primary elements: (1) modeling approach, (2) description of process, and (3) description of heterogeneity. Three examples are used to examine these complexity elements. Two of the examples use simulations generated from a complex model to develop simpler models for efficient use in model applications. The first example is designed to support performance evaluation of soil-vapor-extraction remediation in terms of groundwater protection. The second example investigates the importance of simulating different categories of geochemical reactions for carbon sequestration and selecting appropriate simplifications for use in evaluating sequestration scenarios. In the third example, the modeling history for a uranium-contaminated site demonstrates that conservative parameter estimates were inadequate surrogates for complex, critical processes and there is discussion on the selection of more appropriate model complexity for this application. All three examples highlight how complexity considerations are essential to create scientifically defensible models that achieve a balance between model simplification and complexity.
Study of Turbofan Engines Designed for Low Energy Consumption
NASA Technical Reports Server (NTRS)
Neitzel, R. E.; Hirschkron, R.; Johnston, R. P.
1976-01-01
Subsonic transport turbofan engine design and technology features which have promise of improving aircraft energy consumption are described. Task I addressed the selection and evaluation of features for the CF6 family of engines in current aircraft, and growth models of these aircraft. Task II involved cycle studies and the evaluation of technology features for advanced technology turbofans, consistent with initial service in 1985. Task III pursued the refined analysis of a specific design of an advanced technology turbofan engine selected as the result of Task II studies. In all of the above, the impact upon aircraft economics, as well as energy consumption, was evaluated. Task IV summarized recommendations for technology developments which would be necessary to achieve the improvements in energy consumption identified.
Mehrabian, Zara; Guo, Yan; Weinreich, Daniel; Bernstein, Steven L
2017-01-01
Optic nerve (ON) damage following nonarteritic anterior ischemic optic neuropathy (NAION) and its models is associated with neurodegenerative inflammation. Minocycline is a tetracycline derivative antibiotic believed to exert a neuroprotective effect by selective alteration and activation of the neuroinflammatory response. We evaluated minocycline's post-induction ability to modify early and late post-ischemic inflammatory responses and its retinal ganglion cell (RGC)-neuroprotective ability. We used the rodent NAION (rNAION) model in male Sprague-Dawley rats. Animals received either vehicle or minocycline (33 mg/kg) daily intraperitoneally for 28 days. Early (3 days) ON-cytokine responses were evaluated, and oligodendrocyte death was temporally evaluated using terminal deoxynucleotidyl transferase dUTP nick end labeling (TUNEL) analysis. Cellular inflammation was evaluated with immunohistochemistry, and RGC preservation was compared with stereology of Brn3a-positive cells in flat-mounted retinas. Post-rNAION, oligodendrocytes exhibited a delayed pattern of apoptosis extending over a month, with extrinsic monocyte infiltration occurring only in the primary rNAION lesion and progressive distal microglial activation. Post-induction minocycline failed to improve retinal ganglion cell survival compared with vehicle-treated animals (893.14 vs. 920.72; p>0.9). Cytokine analysis of the rNAION lesion 3 days post-induction revealed that minocycline exerted general inflammatory suppression without selective upregulation of cytokines associated with the proposed alternative or neuroprotective M2 inflammatory pathway. The pattern of cytokine release, extended temporal window of oligodendrocyte death, and progressive microglial activation suggest that selective neuroimmunomodulation, rather than general inflammatory suppression, may be required for effective repair strategies in ischemic optic neuropathies.
Mehrabian, Zara; Guo, Yan; Weinreich, Daniel
2017-01-01
Purpose Optic nerve (ON) damage following nonarteritic anterior ischemic optic neuropathy (NAION) and its models is associated with neurodegenerative inflammation. Minocycline is a tetracycline derivative antibiotic believed to exert a neuroprotective effect by selective alteration and activation of the neuroinflammatory response. We evaluated minocycline's post-induction ability to modify early and late post-ischemic inflammatory responses and its retinal ganglion cell (RGC)-neuroprotective ability. Methods We used the rodent NAION (rNAION) model in male Sprague-Dawley rats. Animals received either vehicle or minocycline (33 mg/kg) daily intraperitoneally for 28 days. Early (3 days) ON-cytokine responses were evaluated, and oligodendrocyte death was temporally evaluated using terminal deoxynucleotidyl transferase dUTP nick end labeling (TUNEL) analysis. Cellular inflammation was evaluated with immunohistochemistry, and RGC preservation was compared with stereology of Brn3a-positive cells in flat-mounted retinas. Results Post-rNAION, oligodendrocytes exhibited a delayed pattern of apoptosis extending over a month, with extrinsic monocyte infiltration occurring only in the primary rNAION lesion and progressive distal microglial activation. Post-induction minocycline failed to improve retinal ganglion cell survival compared with vehicle-treated animals (893.14 vs. 920.72; p>0.9). Cytokine analysis of the rNAION lesion 3 days post-induction revealed that minocycline exerted general inflammatory suppression without selective upregulation of cytokines associated with the proposed alternative or neuroprotective M2 inflammatory pathway. Conclusions The pattern of cytokine release, extended temporal window of oligodendrocyte death, and progressive microglial activation suggest that selective neuroimmunomodulation, rather than general inflammatory suppression, may be required for effective repair strategies in ischemic optic neuropathies. PMID:29386871
Kasthurirathne, Suranga N; Dixon, Brian E; Gichoya, Judy; Xu, Huiping; Xia, Yuni; Mamlin, Burke; Grannis, Shaun J
2016-04-01
Increased adoption of electronic health records has resulted in increased availability of free text clinical data for secondary use. A variety of approaches to obtain actionable information from unstructured free text data exist. These approaches are resource intensive, inherently complex, and rely on structured clinical data and dictionary-based approaches. We sought to evaluate the potential to obtain actionable information from free text pathology reports using routinely available tools and approaches that do not depend on dictionary-based approaches. We obtained pathology reports from a large health information exchange and evaluated the capacity to detect cancer cases from these reports using 3 non-dictionary feature selection approaches, 4 feature subset sizes, and 5 clinical decision models: simple logistic regression, naïve Bayes, k-nearest neighbor, random forest, and J48 decision tree. The performance of each decision model was evaluated using sensitivity, specificity, accuracy, positive predictive value, and area under the receiver operating characteristics (ROC) curve. Decision models parameterized using automated, informed, and manual feature selection approaches yielded similar results. Furthermore, non-dictionary classification approaches identified cancer cases present in free text reports with evaluation measures approaching and exceeding 80-90% for most metrics. Our methods are feasible and practical approaches for extracting substantial information value from free text medical data, and the results suggest that these methods can perform on par with, if not better than, existing dictionary-based approaches. Given that public health agencies are often under-resourced and lack the technical capacity for more complex methodologies, these results represent potentially significant value to the public health field.
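A minimal stand-in for the non-dictionary pipeline described above (frequency-based feature selection followed by a naïve Bayes decision model, two of the components the abstract evaluates) can be sketched with toy reports. The reports, labels, and vocabulary size below are invented for illustration and bear no relation to the study's data.

```python
# Sketch (invented toy reports): top-k token frequency feature selection
# (no dictionary) feeding a Laplace-smoothed naive Bayes classifier.
import math
from collections import Counter

reports = [
    ("invasive ductal carcinoma identified in specimen", 1),
    ("malignant cells present consistent with adenocarcinoma", 1),
    ("high grade carcinoma with lymph node involvement", 1),
    ("benign fibrous tissue no malignancy identified", 0),
    ("normal mucosa without evidence of tumor", 0),
    ("chronic inflammation benign findings only", 0),
]

# Feature selection: keep the k most frequent tokens across the corpus.
k = 12
token_counts = Counter(w for text, _ in reports for w in text.split())
vocab = {w for w, _ in token_counts.most_common(k)}

# Per-class token counts for the naive Bayes model.
class_tokens = {0: Counter(), 1: Counter()}
class_docs = Counter()
for text, label in reports:
    class_docs[label] += 1
    class_tokens[label].update(w for w in text.split() if w in vocab)

def predict(text):
    """Return the log-probability-maximizing class (1 = cancer case)."""
    words = [w for w in text.split() if w in vocab]
    scores = {}
    for label in (0, 1):
        total = sum(class_tokens[label].values())
        s = math.log(class_docs[label] / len(reports))
        for w in words:
            s += math.log((class_tokens[label][w] + 1) / (total + len(vocab)))
        scores[label] = s
    return max(scores, key=scores.get)

print(predict("specimen shows carcinoma cells"))
```

Tokens outside the selected vocabulary are simply ignored at prediction time, which is what keeps the approach free of any curated dictionary.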
Czarnecki, John B.
2008-01-01
An existing conjunctive use optimization model of the Mississippi River Valley alluvial aquifer was used to evaluate the effect of selected constraints and model variables on ground-water sustainable yield. Modifications to the optimization model were made to evaluate the effects of varying (1) the upper limit of ground-water withdrawal rates, (2) the streamflow constraint associated with the White River, and (3) the specified stage of the White River. Upper limits of ground-water withdrawal rates were reduced to 75, 50, and 25 percent of the 1997 ground-water withdrawal rates. As the upper limit is reduced, the spatial distribution of sustainable pumping increases, although the total sustainable pumping from the entire model area decreases. In addition, the number of binding constraint points decreases. In a separate analysis, the streamflow constraint associated with the White River was optimized, resulting in an estimate of the maximum sustainable streamflow at DeValls Bluff, Arkansas, the site of potential surface-water withdrawals from the White River for the Grand Prairie Area Demonstration Project. The maximum sustainable streamflow, however, is less than the amount of streamflow allocated in the spring during the paddlefish spawning period. Finally, decreasing the specified stage of the White River was done to evaluate a hypothetical river stage that might result if the White River were to breach the Melinda Head Cut Structure, one of several manmade diversions that prevents the White River from permanently joining the Arkansas River. A reduction in the stage of the White River causes reductions in the sustainable yield of ground water.
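The sustainable-yield question above is a constrained optimization. As a hedged toy version (assumed coefficients, not the study's model): with per-well pumping caps and a single shared streamflow-depletion budget, the linear program reduces to a fractional knapsack solved greedily by depletion cost per unit pumped. This only illustrates the report's qualitative finding that lowering the per-well upper limit spreads pumping across more wells while reducing the total.

```python
# Toy LP (assumed coefficients): maximize total pumping subject to per-well caps
# and one streamflow-depletion budget. With a single shared constraint, a greedy
# allocation by depletion cost per unit pumped is optimal (fractional knapsack).
def sustainable_yield(caps, depletion_per_unit, budget):
    """Allocate pumping to wells, cheapest stream depletion first."""
    total, remaining = 0.0, budget
    for cap, cost in sorted(zip(caps, depletion_per_unit), key=lambda w: w[1]):
        q = min(cap, remaining / cost)  # pump up to the cap or the budget
        total += q
        remaining -= q * cost
        if remaining <= 1e-12:
            break
    return total

full_caps = [10.0, 10.0, 10.0, 10.0]   # hypothetical per-well upper limits
costs = [0.2, 0.4, 0.6, 0.8]           # streamflow depletion per unit pumped
print(sustainable_yield(full_caps, costs, budget=6.0))      # two wells carry it all
print(sustainable_yield([2.5] * 4, costs, budget=6.0))      # caps at 25% of original
```

At full caps the budget is exhausted by the two cheapest wells; at 25% caps all four wells pump, yet the total that fits under the same depletion budget is smaller, mirroring the report's 75/50/25-percent scenarios.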
Evaluation of Solid Modeling Software for Finite Element Analysis of Woven Ceramic Matrix Composites
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Mital, Subodh; Lang, Jerry
2010-01-01
Three computer programs, used for the purpose of generating 3-D finite element models of the Repeating Unit Cell (RUC) of a textile, were examined for suitability to model woven Ceramic Matrix Composites (CMCs). The programs evaluated were the open-source TexGen, the commercially available WiseTex, and the proprietary Composite Material Evaluator (COMATE). A five-harness-satin (5HS) weave for a melt-infiltrated (MI) silicon carbide matrix and silicon carbide fiber was selected as an example problem and the programs were tested for their ability to generate a finite element model of the RUC. The programs were also evaluated for ease of use and capability, particularly the capability to introduce various defect types such as porosity, ply shifting, and nesting of a laminate. Overall, it was found that TexGen and WiseTex were useful for generating solid models of the tow geometry; however, there was a lack of consistency in generating well-conditioned finite element meshes of the tows and matrix. TexGen and WiseTex were both capable of allowing collective and individual shifting of tows within a ply, and WiseTex also had a ply nesting capability. TexGen and WiseTex were sufficiently user-friendly and both included a Graphical User Interface (GUI). COMATE was satisfactory in generating a 5HS finite element mesh of an idealized weave geometry, but COMATE lacked a GUI and was limited to only 5HS and 8HS weaves, compared with the larger selection of weaves available in TexGen and WiseTex.
A model for the sustainable selection of building envelope assemblies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huedo, Patricia, E-mail: huedo@uji.es; Mulet, Elena, E-mail: emulet@uji.es; López-Mesa, Belinda, E-mail: belinda@unizar.es
2016-02-15
The aim of this article is to define an evaluation model for the environmental impacts of building envelopes to support planners in the early phases of materials selection. The model is intended to estimate environmental impacts for different combinations of building envelope assemblies based on scientifically recognised sustainability indicators. These indicators will increase the amount of information that existing catalogues show to support planners in the selection of building assemblies. To define the model, first the environmental indicators were selected based on the specific aims of the intended sustainability assessment. Then, a simplified LCA methodology was developed to estimate the impacts applicable to three types of dwellings considering different envelope assemblies, building orientations and climate zones. This methodology takes into account the manufacturing, installation, maintenance and use phases of the building. Finally, the model was validated and a matrix in Excel was created as implementation of the model. - Highlights: • Method to assess the envelope impacts based on a simplified LCA • To be used at an earlier phase than the existing methods in a simple way. • It assigns a score by means of known sustainability indicators. • It estimates data about the embodied and operating environmental impacts. • It compares the investment costs with the costs of the consumed energy.
Test of a habitat suitability index for black bears in the southern Appalachians
Mitchell, M.S.; Zimmerman, J.W.; Powell, R.A.
2002-01-01
We present a habitat suitability index (HSI) model for black bears (Ursus americanus) living in the southern Appalachians that was developed a priori from the literature, then tested using location and home range data collected in the Pisgah Bear Sanctuary, North Carolina, over a 12-year period. The HSI was developed and initially tested using habitat and bear data collected over 2 years in the sanctuary. We increased the number of habitat sampling sites, included data collected in areas affected by timber harvest, used more recent Geographic Information System (GIS) technology to create a more accurate depiction of the HSI for the sanctuary, evaluated effects of input variability on HSI values, and duplicated the original tests using more data. We found that the HSI predicted habitat selection by bears at both the population and individual levels, and that the distribution of collared bears was positively correlated with HSI values. We found a stronger relationship between habitat selection by bears and a second-generation HSI. We evaluated our model with criteria suggested by Roloff and Kernohan (1999) for evaluating HSI model reliability and concluded that our model was reliable and robust. The model's strength is that it was developed as an a priori hypothesis directly modeling the relationship between critical resources and fitness of bears and tested with independent data. We present the HSI spatially as a continuous fitness surface where the potential contribution of habitat to the fitness of a bear is depicted at each point in space.
Ru, Sushan; Hardner, Craig; Carter, Patrick A; Evans, Kate; Main, Dorrie; Peace, Cameron
2016-01-01
Seedling selection identifies superior seedlings as candidate cultivars based on predicted genetic potential for traits of interest. Traditionally, genetic potential is determined by phenotypic evaluation. With the availability of DNA tests for some agronomically important traits, breeders have the opportunity to include DNA information in their seedling selection operations, known as marker-assisted seedling selection. A major challenge in deploying marker-assisted seedling selection in clonally propagated crops is a lack of knowledge of the genetic gain achievable from alternative strategies. Existing models based on additive effects in seed-propagated crops are not directly relevant for seedling selection of clonally propagated crops, as clonal propagation captures all genetic effects, not just additive. This study modeled genetic gain from traditional and various marker-based seedling selection strategies on a single-trait basis through analytical derivation and stochastic simulation, based on a generalized seedling selection scheme of clonally propagated crops. Various trait-test scenarios with a range of broad-sense heritability and proportion of genotypic variance explained by DNA markers were simulated for two populations with different segregation patterns. Both derived and simulated results indicated that marker-based strategies tended to achieve higher genetic gain than phenotypic seedling selection for a trait where the proportion of genotypic variance explained by marker information was greater than the broad-sense heritability. Results from this study provide guidance for optimizing genetic gain from seedling selection for single traits where DNA tests providing marker information are available. PMID:27148453
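The phenotypic-versus-marker comparison modeled in this abstract can be sketched with a small stochastic simulation. The parameter values below (broad-sense heritability, marker-explained variance, selection fraction, population size) are arbitrary choices for illustration, not those of the study; the sketch simply reproduces the qualitative result that marker-based selection wins when markers explain more genotypic variance than the heritability of the phenotype.

```python
# Illustrative simulation (arbitrary parameters): genetic gain from selecting the
# top fraction of seedlings on phenotype vs. on a marker score, for one trait.
import random

random.seed(2)

N = 2000          # seedlings in the population
H2 = 0.3          # broad-sense heritability of the phenotype
P_MARKER = 0.6    # proportion of genotypic variance explained by markers

G = [random.gauss(0, 1) for _ in range(N)]            # total genotypic values
env_sd = (1 / H2 - 1) ** 0.5                          # sets Var(G)/Var(P) = H2
pheno = [g + random.gauss(0, env_sd) for g in G]
marker = [P_MARKER ** 0.5 * g + random.gauss(0, (1 - P_MARKER) ** 0.5) for g in G]

def gain(scores, top=0.1):
    """Mean genotypic value of the top fraction of seedlings ranked by scores."""
    kept = sorted(range(N), key=lambda i: scores[i], reverse=True)[: int(top * N)]
    return sum(G[i] for i in kept) / len(kept)

print(f"phenotypic selection gain:   {gain(pheno):.2f}")
print(f"marker-based selection gain: {gain(marker):.2f}")
```

With P_MARKER > H2, the marker score is the better-correlated predictor of genotypic value, so the selected seedlings have a higher mean genotypic value; reversing the inequality reverses the ranking.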
An Exploratory Study of the Role of Human Resource Management in Models of Employee Turnover
ERIC Educational Resources Information Center
Ozolina-Ozola, Iveta
2016-01-01
The purpose of this paper is to present the study results of the human resource management role in the voluntary employee turnover models. The mixed methods design was applied. On the basis of the results of the search and evaluation of publications, the 16 models of employee turnover were selected. Applying the method of content analysis, the…
The Shuttle Cost and Price model
NASA Technical Reports Server (NTRS)
Leary, Katherine; Stone, Barbara
1983-01-01
The Shuttle Cost and Price (SCP) model was developed as a tool to assist in evaluating major aspects of Shuttle operations that have direct and indirect economic consequences. It incorporates the major aspects of NASA Pricing Policy and corresponds to the NASA definition of STS operating costs. An overview of the SCP model is presented and the cost model portion of SCP is described in detail. Selected recent applications of the SCP model to NASA Pricing Policy issues are presented.
Fieberg, John R.; Forester, James D.; Street, Garrett M.; Johnson, Douglas H.; ArchMiller, Althea A.; Matthiopoulos, Jason
2018-01-01
“Species distribution modeling” was recently ranked as one of the top five “research fronts” in ecology and the environmental sciences by ISI's Essential Science Indicators (Renner and Warton 2013), reflecting the importance of predicting how species distributions will respond to anthropogenic change. Unfortunately, species distribution models (SDMs) often perform poorly when applied to novel environments. Compounding this problem is the shortage of methods for evaluating SDMs (hence, we may be getting our predictions wrong and not even know it). Traditional methods for validating SDMs quantify a model's ability to classify locations as used or unused. Instead, we propose to focus on how well SDMs can predict the characteristics of used locations. This subtle shift in viewpoint leads to a more natural and informative evaluation and validation of models across the entire spectrum of SDMs. Through a series of examples, we show how simple graphical methods can help with three fundamental challenges of habitat modeling: identifying missing covariates, non-linearity, and multicollinearity. Identifying habitat characteristics that are not well predicted by the model can provide insights into variables affecting the distribution of species, suggest appropriate model modifications, and ultimately improve the reliability and generality of conservation and management recommendations.
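The proposed shift, validating an SDM by how well it predicts the characteristics of used locations rather than by used/unused classification, can be illustrated with a toy landscape in which the fitted model deliberately omits a covariate. Everything below is synthetic; the covariate names and functional forms are invented.

```python
# Sketch (toy landscape): compare the covariate means an SDM predicts at used
# locations with the means actually observed there. A mismatch on one covariate
# flags it as missing or poorly modeled.
import random

random.seed(3)

# Landscape cells with two covariates; true use depends on both.
cells = [(random.uniform(0, 1), random.uniform(0, 1)) for _ in range(5000)]  # (elev, cover)

def true_use_prob(elev, cover):
    return 0.8 * cover * (1 - abs(elev - 0.5))

used = [c for c in cells if random.random() < true_use_prob(*c)]

def model_weight(elev, cover):
    """Toy 'fitted' SDM that uses elevation only; cover is deliberately missing."""
    return 1 - abs(elev - 0.5)

# Model-expected mean of each covariate at used locations vs. the observed mean.
wsum = sum(model_weight(*c) for c in cells)
results = {}
for name, idx in [("elev", 0), ("cover", 1)]:
    predicted = sum(model_weight(*c) * c[idx] for c in cells) / wsum
    observed = sum(c[idx] for c in used) / len(used)
    results[name] = (predicted, observed)
    print(f"{name}: model-predicted mean {predicted:.2f}, observed at used sites {observed:.2f}")
```

The elevation means agree (the model captures that covariate), while the cover means diverge, exactly the kind of discrepancy the authors' graphical checks are designed to expose.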
Berian, Julia R; Zhou, Lynn; Hornor, Melissa A; Russell, Marcia M; Cohen, Mark E; Finlayson, Emily; Ko, Clifford Y; Robinson, Thomas N; Rosenthal, Ronnie A
2017-12-01
Surgical quality datasets can be better tailored toward older adults. The American College of Surgeons (ACS) NSQIP Geriatric Surgery Pilot collected risk factors and outcomes in 4 geriatric-specific domains: cognition, decision-making, function, and mobility. This study evaluated the contributions of geriatric-specific factors to risk adjustment in modeling 30-day outcomes and geriatric-specific outcomes (postoperative delirium, new mobility aid use, functional decline, and pressure ulcers). Using ACS NSQIP Geriatric Surgery Pilot data (January 2014 to December 2016), 7 geriatric-specific risk factors were evaluated for selection in 14 logistic models (morbidities/mortality) in general-vascular and orthopaedic surgery subgroups. Hierarchical models evaluated 4 geriatric-specific outcomes, adjusting for hospital-level effects and including Bayesian-type shrinkage, to estimate hospital performance. There were 36,399 older adults who underwent operations at 31 hospitals in the ACS NSQIP Geriatric Surgery Pilot. Geriatric-specific risk factors were selected in 10 of 14 models in both general-vascular and orthopaedic surgery subgroups. After risk adjustment, surrogate consent (odds ratio [OR] 1.5; 95% CI 1.3 to 1.8) and use of a mobility aid (OR 1.3; 95% CI 1.1 to 1.4) increased the risk for serious morbidity or mortality in the general-vascular cohort. Geriatric-specific factors were selected in all 4 geriatric-specific outcomes models. Rates of geriatric-specific outcomes were: postoperative delirium in 12.1% (n = 3,650), functional decline in 42.9% (n = 13,000), new mobility aid in 29.7% (n = 9,257), and new or worsened pressure ulcers in 1.7% (n = 527). Geriatric-specific risk factors are important for patient-centered care and contribute to risk adjustment in modeling traditional and geriatric-specific outcomes.
To provide optimal patient care for older adults, surgical datasets should collect measures that address cognition, decision-making, mobility, and function. Copyright © 2017 American College of Surgeons. All rights reserved.
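The Bayesian-type shrinkage used to estimate hospital performance can be illustrated with a minimal empirical-Bayes-style sketch: each hospital's raw event rate is pulled toward the overall mean, with small hospitals pulled hardest. The counts and the pseudo-case prior weight below are invented; the pilot's actual hierarchical logistic model is more elaborate.

```python
# Hypothetical (events, cases) counts per hospital
hospitals = {"A": (3, 40), "B": (30, 400), "C": (1, 10)}

overall_rate = (sum(e for e, n in hospitals.values())
                / sum(n for e, n in hospitals.values()))
PRIOR_WEIGHT = 100  # pseudo-cases pulling low-volume hospitals toward the mean

def shrunken_rate(events, cases):
    """Shrinkage estimate: blend the raw rate with the overall rate,
    weighted by case volume versus the prior pseudo-cases."""
    return (events + PRIOR_WEIGHT * overall_rate) / (cases + PRIOR_WEIGHT)

for name, (e, n) in hospitals.items():
    print(f"{name}: raw {e / n:.3f} -> shrunken {shrunken_rate(e, n):.3f}")
```

Hospital C's raw rate (0.100 on 10 cases) moves much closer to the overall mean than hospital B's (0.075 on 400 cases), which is the behavior that makes shrinkage useful for ranking hospitals of very different sizes.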
Phytomonas: A non-pathogenic trypanosomatid model for functional expression of proteins.
Miranda, Mariana R; Sayé, Melisa; Reigada, Chantal; Carrillo, Carolina; Pereira, Claudio A
2015-10-01
Phytomonas are protozoan parasites from the Trypanosomatidae family which infect a wide variety of plants. Herein, Phytomonas Jma was tested as a model for functional expression of heterologous proteins. Green fluorescent protein (GFP) expression was evaluated in Phytomonas and compared with Trypanosoma cruzi, the etiological agent of Chagas' disease. Phytomonas was able to express GFP at levels similar to T. cruzi, although transgenic selection took longer. It was possible to establish an efficient transfection and selection protocol for protein expression. These results demonstrate that Phytomonas can be a good model for functional expression of proteins from other trypanosomatids, with the advantage of being completely safe for humans. Copyright © 2015 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Kozlovská, Mária; Struková, Zuzana
2013-06-01
Several factors should be considered by the owner and general contractor in the process of contractor and subcontractor selection and evaluation. The paper reviews recent models intended to guide general contractors in the subcontractor selection process and in the evaluation of different contractors during the execution of a project. Moreover, the paper considers the impact of individual contractors' performance on the overall level of occupational health and safety culture on site. It deals with the factors influencing the safety performance of contractors during construction and analyses methods for assessing the safety performance of construction contractors. The results of contractors' safety performance evaluation could be a useful tool for motivating contractors to achieve better safety outcomes, and could inform owners' or general contractors' decisions about a contractor's suitability for future contracting works.
Design and in vivo evaluation of more efficient and selective deep brain stimulation electrodes
NASA Astrophysics Data System (ADS)
Howell, Bryan; Huynh, Brian; Grill, Warren M.
2015-08-01
Objective. Deep brain stimulation (DBS) is an effective treatment for movement disorders and a promising therapy for treating epilepsy and psychiatric disorders. Despite its clinical success, the efficiency and selectivity of DBS can be improved. Our objective was to design electrode geometries that increased the efficiency and selectivity of DBS. Approach. We coupled computational models of electrodes in brain tissue with cable models of axons of passage (AOPs), terminating axons (TAs), and local neurons (LNs); we used engineering optimization to design electrodes for stimulating these neural elements; and the model predictions were tested in vivo. Main results. Compared with the standard electrode used in the Medtronic Model 3387 and 3389 arrays, model-optimized electrodes consumed 45–84% less power. Similar gains in selectivity were evident with the optimized electrodes: 50% of parallel AOPs could be activated while reducing activation of perpendicular AOPs from 44–48% with the standard electrode to 0–14% with bipolar designs; 50% of perpendicular AOPs could be activated while reducing activation of parallel AOPs from 53–55% with the standard electrode to 1–5% with an array of cathodes; and 50% of TAs could be activated while reducing activation of AOPs from 43–100% with the standard electrode to 2–15% with a distal anode. In vivo, both the geometry and polarity of the electrode had a profound impact on the efficiency and selectivity of stimulation. Significance. Model-based design is a powerful tool that can be used to improve the efficiency and selectivity of DBS electrodes.
NASA Astrophysics Data System (ADS)
Cama, Mariaelena; Cristi Nicu, Ionut; Conoscenti, Christian; Quénéhervé, Geraldine; Maerker, Michael
2016-04-01
Landslide susceptibility can be defined as the likelihood of a landslide occurring in a given area on the basis of local terrain conditions. In recent decades, much research has focused on its evaluation by means of stochastic approaches, under the assumption that 'the past is the key to the future': if a model is able to reproduce a known landslide spatial distribution, it should be able to predict the future locations of new (i.e. unknown) slope failures. Among the various stochastic approaches, Binary Logistic Regression (BLR) is one of the most widely used because it expresses susceptibility in probabilistic terms and its results are easily interpretable from a geomorphological point of view. However, multicollinearity assessment is often neglected, even though its effect is that coefficient estimates become unstable, possibly with reversed signs, and therefore difficult to interpret. It should therefore be evaluated every time in order to produce a model whose results are geomorphologically sound. In this study the effects of multicollinearity on the predictive performance and robustness of landslide susceptibility models are analyzed. In particular, multicollinearity is estimated by means of the Variance Inflation Factor (VIF), which is also used as a selection criterion for the independent variables (VIF Stepwise Selection) and compared to the more commonly used AIC Stepwise Selection. The robustness of the results is evaluated through 100 replicates of the dataset. The study area selected for this analysis is the Moldavian Plateau, where landslides are among the most frequent geomorphological processes. This area has an increasing trend of urbanization and very high cultural heritage potential, being the place of discovery of the largest settlement belonging to the Cucuteni Culture in Eastern Europe (which led to the development of the great Cucuteni-Trypillia complex).
Identifying areas susceptible to landslides can therefore help government, local authorities and stakeholders to plan economic activities, minimize damage costs, and protect the environment and cultural heritage. The results show that although the VIF Stepwise selection allows a more stable selection of the controlling factors, the AIC Stepwise selection produces better predictive performance. Moreover, when working with replicates, the effects of multicollinearity are statistically reduced by the application of the AIC stepwise selection and the results are easily interpretable in geomorphological terms.
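The multicollinearity diagnostic at the center of this study, the variance inflation factor, is straightforward to compute: VIF_j = 1 / (1 - R²_j), where R²_j comes from regressing predictor j on all the others. The sketch below uses invented terrain variables (slope, elevation, and a nearly collinear wetness index), not the study's data; values well above roughly 10 would flag a variable for exclusion in a VIF-based stepwise selection.

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of predictor matrix X:
    VIF_j = 1 / (1 - R^2_j), with R^2_j from regressing column j on the
    remaining columns (plus an intercept)."""
    n, p = X.shape
    out = []
    for j in range(p):
        y = X[:, j]
        others = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(others, y, rcond=None)
        resid = y - others @ beta
        r2 = 1 - resid.var() / y.var()
        out.append(1.0 / (1.0 - r2))
    return out

rng = np.random.default_rng(0)
slope = rng.normal(size=200)
elevation = rng.normal(size=200)
# A third factor nearly collinear with the first two
wetness = 0.9 * slope - 0.8 * elevation + 0.05 * rng.normal(size=200)
X = np.column_stack([slope, elevation, wetness])
print([round(v, 1) for v in vif(X)])
```

With the collinear wetness index included, every VIF explodes; dropping it brings the remaining VIFs back near 1, which is the behavior a VIF stepwise selection exploits.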
A social preference valuations set for EQ-5D health states in Flanders, Belgium.
Cleemput, Irina
2010-04-01
This study aimed at deriving a preference valuation set for EQ-5D health states from the general Flemish public in Belgium. A EuroQol valuation instrument with 16 health states to be valued on a visual analogue scale was sent to a random sample of 2,754 adults. The initial response rate was 35%. Eventually, 548 (20%) respondents provided useable valuations for modeling. Valuations for 245 health states were modeled using a random effects model. The selection of the model was based on two criteria: health state valuations must be consistent, and the difference with the directly observed valuations must be small. A model including a value decrement if any health dimension of the EQ-5D is on the worst level was selected to construct the social health state valuation set. A comparison with health state valuations from other countries showed similarities, especially with those from New Zealand. The use of a single preference valuation set across different health economic evaluations within a country is highly preferable to increase their usability for policy makers. This study contributes to the standardization of outcome measurement in economic evaluations in Belgium.
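The selected model form, additive level decrements plus an extra decrement whenever any dimension is at its worst level, can be made concrete with a small sketch. The decrement coefficients below are invented for illustration and are not the Flemish tariff.

```python
# Hypothetical decrements per dimension at levels 2 and 3 (EQ-5D-3L style)
LEVEL2 = {"mobility": 0.05, "selfcare": 0.08, "activities": 0.04,
          "pain": 0.06, "anxiety": 0.07}
LEVEL3 = {"mobility": 0.16, "selfcare": 0.21, "activities": 0.09,
          "pain": 0.19, "anxiety": 0.17}
ANY_LEVEL3 = 0.27  # extra decrement if ANY dimension is at the worst level

def value(state):
    """Predicted valuation for a 5-digit EQ-5D state string like '11223'."""
    dims = ["mobility", "selfcare", "activities", "pain", "anxiety"]
    v = 1.0
    for dim, lvl in zip(dims, state):
        if lvl == "2":
            v -= LEVEL2[dim]
        elif lvl == "3":
            v -= LEVEL3[dim]
    if "3" in state:
        v -= ANY_LEVEL3
    return round(v, 3)

print(value("11111"))                    # full health
print(value("11112"), value("11113"))    # worsening anxiety/depression
```

Note the discontinuity the "any level 3" term introduces: moving one dimension from level 2 to level 3 costs both the larger level decrement and the shared worst-level decrement.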
Bigham-Sadegh, Amin; Oryan, Ahmad
2015-06-01
In vitro assays can be useful in determining biological mechanisms and optimizing scaffold parameters; however, translation of in vitro results to the clinic is generally difficult. Animal experimentation is a better approximation, and the use of animal models is often essential for extrapolating experimental results and translating the information to a human clinical setting. In addition, animal models of fracture healing are useful for answering questions about the most effective way to treat humans. Several factors should be considered when selecting an animal model. These include availability of the animal, cost, ease of handling and care, size of the animal, acceptability to society, tolerance of surgery, resistance to infection and disease, biological properties analogous to humans, bone structure and composition, as well as bone modeling and remodeling characteristics. Animal experiments on bone healing have been conducted on small and large animals, including mice, rats, rabbits, dogs, pigs, goats and sheep. This review also describes the molecular events during the various steps of fracture healing and explains different means of evaluating fracture healing, including biomechanical, histopathological and radiological assessments.
Integrated watershed-scale response to climate change for selected basins across the United States
Markstrom, Steven L.; Hay, Lauren E.; Ward-Garrison, D. Christian; Risley, John C.; Battaglin, William A.; Bjerklie, David M.; Chase, Katherine J.; Christiansen, Daniel E.; Dudley, Robert W.; Hunt, Randall J.; Koczot, Kathryn M.; Mastin, Mark C.; Regan, R. Steven; Viger, Roland J.; Vining, Kevin C.; Walker, John F.
2012-01-01
A study by the U.S. Geological Survey (USGS) evaluated the hydrologic response to different projected carbon emission scenarios of the 21st century using a hydrologic simulation model. This study involved five major steps: (1) set up, calibrate, and evaluate the Precipitation Runoff Modeling System (PRMS) model in 14 basins across the United States, by local USGS personnel; (2) acquire selected simulated carbon emission scenarios from the World Climate Research Programme's Coupled Model Intercomparison Project; (3) statistically downscale these scenarios to create PRMS input files that reflect the future climatic conditions of the scenarios; (4) generate PRMS projections for the carbon emission scenarios for the 14 basins; and (5) analyze the modeled hydrologic response. This report presents an overview of this study, details of the methodology, results from the 14 basin simulations, and interpretation of these results. A key finding is that the hydrologic response of the different geographical regions of the United States to potential climate change may differ, depending on the dominant physical processes of a particular region. Also considered is the substantial uncertainty present in the carbon emission scenarios and how this uncertainty propagates through the hydrologic simulations.
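Step (3), statistical downscaling, is often done in its simplest form with a change-factor (delta) approach: shift each observed value by the GCM-projected change in the corresponding monthly mean. The series and GCM means below are invented, and the report's actual downscaling method may differ; this is only the simplest instance of the idea.

```python
# Hypothetical baseline daily temperatures (deg C) for one month at a basin
observed = [4.0, 5.5, 3.8, 6.1, 5.0]
# Hypothetical GCM monthly means for the baseline and future periods
gcm_baseline_mean = 4.6
gcm_future_mean = 6.4

def delta_downscale(obs, base_mean, future_mean):
    """Change-factor downscaling: shift each observed value by the
    GCM-projected change in the monthly mean."""
    delta = future_mean - base_mean
    return [round(x + delta, 2) for x in obs]

future_input = delta_downscale(observed, gcm_baseline_mean, gcm_future_mean)
print(future_input)
```

The shifted series preserves the observed day-to-day variability while imposing the GCM's mean change, and would then feed PRMS as a future-climate input file.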
Alzheimer's disease: the amyloid hypothesis and the Inverse Warburg effect
Demetrius, Lloyd A.; Magistretti, Pierre J.; Pellerin, Luc
2014-01-01
Epidemiological and biochemical studies show that the sporadic forms of Alzheimer's disease (AD) are characterized by the following hallmarks: (a) An exponential increase with age; (b) Selective neuronal vulnerability; (c) Inverse cancer comorbidity. The present article appeals to these hallmarks to evaluate and contrast two competing models of AD: the amyloid hypothesis (a neuron-centric mechanism) and the Inverse Warburg hypothesis (a neuron-astrocytic mechanism). We show that these three hallmarks of AD conflict with the amyloid hypothesis, but are consistent with the Inverse Warburg hypothesis, a bioenergetic model which postulates that AD is the result of a cascade of three events—mitochondrial dysregulation, metabolic reprogramming (the Inverse Warburg effect), and natural selection. We also provide an explanation for the failures of the clinical trials based on amyloid immunization, and we propose a new class of therapeutic strategies consistent with the neuroenergetic selection model. PMID:25642192
Core Professionalism Education in Surgery: A Systematic Review.
Sarıoğlu Büke, Akile; Karabilgin Öztürkçü, Özlem Sürel; Yılmaz, Yusuf; Sayek, İskender
2018-03-15
Professionalism education is one of the major elements of surgical residency education. The aim was to evaluate studies of core professionalism education programs in surgery. Systematic review. This systematic literature review analyzed core professionalism programs for surgical residency education published in English with at least three of the following features: program developmental model/instructional design method, aims and competencies, methods of teaching, methods of assessment, and program evaluation model or method. A total of 27,083 articles were retrieved using EBSCOHOST, PubMed, Science Direct, Web of Science, and manual search. Eight articles met the selection criteria. The instructional design method was presented in only one article, which described the Analysis, Design, Development, Implementation, and Evaluation model. Six articles were based on the Accreditation Council for Graduate Medical Education criteria, although there was significant variability in content. The most common teaching method was role modeling with scenario- and case-based learning. A wide range of assessment methods for evaluating professionalism education were reported. The Kirkpatrick model was reported in one article as a method for program evaluation. It is suggested that for a core surgical professionalism education program, the developmental/instructional design model, aims and competencies, content, teaching methods, assessment methods, and program evaluation methods/models should be well defined, and the content should be comparable.
Witkin, J M; Smith, J L; Ping, X; Gleason, S D; Poe, M M; Li, G; Jin, X; Hobbs, J; Schkeryantz, J M; McDermott, J S; Alatorre, A I; Siemian, J N; Cramer, J W; Airey, D C; Methuku, K R; Tiruveedhula, V V N P B; Jones, T M; Crawford, J; Krambis, M J; Fisher, J L; Cook, J M; Cerne, R
2018-05-03
HZ-166 has previously been characterized as an α2,3-selective GABA A receptor modulator with anticonvulsant, anxiolytic, and anti-nociceptive properties but reduced motor effects. We discovered a series of ester bioisosteres with reduced metabolic liabilities, leading to improved efficacy as anxiolytic-like compounds in rats. In the present study, we evaluated the anticonvulsant effects of KRM-II-81 across several rodent models. In some models we also evaluated key structural analogs. KRM-II-81 suppressed hyper-excitation in a network of cultured cortical neurons without affecting basal neuronal activity. KRM-II-81 was active against electroshock-induced convulsions in mice and pentylenetetrazole (PTZ)-induced convulsions in rats, elevated PTZ-seizure thresholds, and suppressed amygdala-kindled seizures in rats with efficacies greater than that of diazepam. KRM-II-81 was also active in the 6 Hz seizure model in mice. Structural analogs of KRM-II-81, but not the ester HZ-166, were active in all models in which they were evaluated. We further evaluated KRM-II-81 in human cortical epileptic tissue, where it significantly attenuated picrotoxin- and AP-4-induced increases in firing rate across an electrode array. These molecules generally had a wider margin of separation between the potencies producing anticonvulsant effects and those producing motor impairment on an inverted screen test than did diazepam. Ester bioisosteres of HZ-166 are thus presented as novel agents for the potential treatment of epilepsy, acting via selective positive allosteric amplification of GABA A signaling through α2/α3-containing GABA A receptors. The in vivo data from the present study can serve as a guide to dosing parameters that predict engagement of central GABA A receptors. Copyright © 2018 Elsevier Ltd. All rights reserved.
Theoretical performance and clinical evaluation of transverse tripolar spinal cord stimulation.
Struijk, J J; Holsheimer, J; Spincemaille, G H; Gielen, F L; Hoekema, R
1998-09-01
A new type of spinal cord stimulation electrode, providing contact combinations with a transverse orientation, is presented. Electrodes were implanted in the cervical area (C4-C5) of two chronic pain patients and the stimulation results were subsequently simulated with a computer model consisting of a volume conductor model and active nerve fiber models. For various contact combinations a good match was obtained between the modeling results and the measurement data with respect to load resistance (less than 20% difference), perception thresholds (16% difference), asymmetry of paresthesia (significant correlation) and paresthesia distributions (weak correlation). The transversally oriented combinations provided the possibility to select either a preferential dorsal column stimulation, a preferential dorsal root stimulation or a mixed stimulation. The (a)symmetry of paresthesia could largely be affected in a predictable way by the selection of contact combinations as well. The transverse tripolar combination was shown to give a higher selectivity of paresthesia than monopolar and longitudinal dipolar combinations, at the cost of an increased current (more than twice).
Improved knowledge diffusion model based on the collaboration hypernetwork
NASA Astrophysics Data System (ADS)
Wang, Jiang-Pan; Guo, Qiang; Yang, Guang-Yong; Liu, Jian-Guo
2015-06-01
The process of absorbing knowledge is an essential element of innovation in firms and of adapting to changes in the competitive environment. In this paper, we present an improved knowledge diffusion hypernetwork (IKDH) model based on the idea that knowledge spreads from the target node to all its neighbors in terms of the hyperedge and knowledge stock. We apply the average knowledge stock V(t), the variance σ²(t), and the coefficient of variation c(t) to evaluate the performance of knowledge diffusion. By analyzing different knowledge diffusion modes, ways of selecting highly knowledgeable nodes, hypernetwork sizes, and hypernetwork structures, we show that the diffusion speed of the IKDH model is 3.64 times that of the traditional knowledge diffusion (TKDH) model. Besides, it is three times faster to diffuse knowledge by randomly selecting "expert" nodes than by selecting large-hyperdegree nodes as "expert" nodes. Furthermore, either a tighter network structure or a smaller network size results in faster knowledge diffusion.
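A toy version of hyperedge-based diffusion and the three evaluation statistics can be sketched as follows; the hypergraph, initial stocks, and absorption rate below are invented for illustration and are not the paper's model parameters.

```python
import statistics

# Toy hypergraph: each hyperedge is a team whose members exchange knowledge
hyperedges = [{0, 1, 2}, {2, 3}, {3, 4, 5}, {0, 5}]
stock = [9.0, 1.0, 1.0, 1.0, 1.0, 1.0]  # node 0 is the "expert"
ALPHA = 0.5  # absorption rate: fraction of the gap a member closes per round

def step(stock):
    """One diffusion round: within every hyperedge, each member moves
    ALPHA of the way toward the best-informed member of that edge."""
    new = stock[:]
    for edge in hyperedges:
        best = max(stock[i] for i in edge)
        for i in edge:
            new[i] = max(new[i], stock[i] + ALPHA * (best - stock[i]))
    return new

for t in range(6):
    V = statistics.mean(stock)           # average knowledge stock V(t)
    var = statistics.pvariance(stock)    # sigma^2(t)
    c = var ** 0.5 / V                   # coefficient of variation c(t)
    print(f"t={t}  V={V:.2f}  var={var:.2f}  c={c:.2f}")
    stock = step(stock)
```

As the rounds proceed, V(t) rises toward the expert's stock while σ²(t) and c(t) fall, which is how the paper's statistics register diffusion speed.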
Alternative Methods for Handling Attrition
Foster, E. Michael; Fang, Grace Y.
2009-01-01
Using data from the evaluation of the Fast Track intervention, this article illustrates three methods for handling attrition. Multiple imputation and ignorable maximum likelihood estimation produce estimates that are similar to those based on listwise-deleted data. A panel selection model that allows for selective dropout reveals that highly aggressive boys accumulate in the treatment group over time and produces a larger estimate of treatment effect. In contrast, this model produces a smaller treatment effect for girls. The article's conclusion discusses the strengths and weaknesses of the alternative approaches and outlines ways in which researchers might improve their handling of attrition. PMID:15358906
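The bias that motivates a selection model over listwise deletion can be seen in a small simulation with invented numbers (not Fast Track data): when dropout probability depends on the outcome itself, the complete-case mean drifts away from the true mean, which is the distortion that multiple imputation or a panel selection model attempts to correct.

```python
import random

random.seed(42)

def simulate(n=20000):
    """Simulate an outcome where high scorers are far more likely to drop
    out, so the observed (complete-case) sample under-represents them."""
    all_scores, kept_scores = [], []
    for _ in range(n):
        aggression = random.gauss(0, 1)
        all_scores.append(aggression)
        p_drop = 0.8 if aggression > 1.0 else 0.1  # selective attrition
        if random.random() > p_drop:
            kept_scores.append(aggression)
    return all_scores, kept_scores

full, observed = simulate()
mean_full = sum(full) / len(full)
mean_cc = sum(observed) / len(observed)
print(f"true mean:          {mean_full:+.3f}")
print(f"complete-case mean: {mean_cc:+.3f}")
```

The complete-case mean is noticeably below the true mean of zero because the most aggressive cases have mostly vanished from the observed sample, exactly the accumulation-by-dropout pattern the panel selection model in this article detects.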
Analogue Study of Actinide Transport at Sites in Russia
DOE Office of Scientific and Technical Information (OSTI.GOV)
Novikov, A P; Simmons, A M; Halsey, W G
2003-02-12
The U. S. Department of Energy (DOE) and the Russian Academy of Sciences (RAS) are engaged in a three-year cooperative study to observe the behavior of actinides in the natural environment at selected disposal sites and/or contamination sites in Russia. The purpose is to develop experimental data and models for actinide speciation, mobilization and transport processes in support of geologic repository design, safety and performance analyses. Currently at the mid-point of the study, the accomplishments to date include: evaluation of existing data and data needs, site screening and selection, initial data acquisition, and development of preliminary conceptual models.
Decision Aids for Airborne Intercept Operations in Advanced Aircrafts
NASA Technical Reports Server (NTRS)
Madni, A.; Freedy, A.
1981-01-01
A tactical decision aid (TDA) for the F-14 aircrew, i.e., the naval flight officer and pilot, in conducting a multitarget attack during the performance of a Combat Air Patrol (CAP) role is presented. The TDA employs hierarchical multiattribute utility models for characterizing mission objectives in operationally measurable terms, rule based AI-models for tactical posture selection, and fast time simulation for maneuver consequence prediction. The TDA makes aspect maneuver recommendations, selects and displays the optimum mission posture, evaluates attackable and potentially attackable subsets, and recommends the 'best' attackable subset along with the required course perturbation.
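The rule-based posture selection component can be sketched as an ordered rule base; the situation variables, thresholds, and posture names below are invented for illustration, not the TDA's actual rules.

```python
def select_posture(threat_count, own_fuel, missiles_left):
    """Rule-based posture selection: rules are checked in priority order,
    and the first matching rule determines the recommended posture."""
    if own_fuel < 0.2 or missiles_left == 0:
        return "disengage"
    if threat_count > missiles_left:
        return "defensive"
    if threat_count >= 2:
        return "split attack"
    return "single-target attack"

print(select_posture(threat_count=3, own_fuel=0.6, missiles_left=4))
```

A fuller TDA would then score each feasible posture with the hierarchical multiattribute utility model and run the fast-time simulation to predict maneuver consequences before displaying the recommendation.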
An expert system for the selection of building elements during architectural design
NASA Astrophysics Data System (ADS)
Alibaba, Halil Zafer
This thesis explains the development stages of an expert system for the evaluation and selection of building elements during the early stages of architectural design. The expert system, called BES, was produced after two prototypes had been established. BES was tested with professional architects from both academia and the construction market of Northern Cyprus, and it is intended for use by experienced and inexperienced architects alike. The model covers the selection of all main kinds of building elements, such as retaining walls, foundations, external walls, internal walls, floors, external and internal stairs, roofs, external and internal chimneys, windows, and external and internal doors, together with their sub-types. Selection is achieved via the SMART methodology, based on performance requirements, and the expert system is structured with the shell Exsys Corvid version 1.2.14. Computers are well suited to this task because of their capacity to handle vast amounts of data, and making the model available through the Internet makes it international and a useful design aid for architects. In addition, the decision-making feature of the model provides a suitable selection among numerous alternatives. The thesis also describes the development of BES and the experience gained through its use, and discusses further development of the model.
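The SMART aggregation at the heart of such a selection can be sketched in a few lines: each candidate element receives a weighted sum of its single-criterion scores, and the highest total is recommended. The criteria, weights, and scores below are invented and are not BES's actual knowledge base.

```python
# Hypothetical performance criteria and weights for choosing an external wall
weights = {"thermal": 0.35, "cost": 0.25, "durability": 0.25, "acoustics": 0.15}

# Scores on a 0-100 scale for each candidate element (illustrative values)
candidates = {
    "brick cavity wall":   {"thermal": 70, "cost": 55, "durability": 90, "acoustics": 80},
    "timber frame wall":   {"thermal": 85, "cost": 75, "durability": 60, "acoustics": 55},
    "concrete block wall": {"thermal": 50, "cost": 85, "durability": 85, "acoustics": 70},
}

def smart_score(scores):
    """SMART aggregate: weighted sum of the single-criterion scores."""
    return sum(weights[c] * s for c, s in scores.items())

ranked = sorted(candidates, key=lambda k: smart_score(candidates[k]), reverse=True)
for name in ranked:
    print(f"{smart_score(candidates[name]):5.1f}  {name}")
```

The simplicity of the weighted sum is what makes SMART attractive inside an expert-system shell: the knowledge base supplies the weights per performance requirement, and ranking the alternatives is a one-line computation.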
[Hyperspectral remote sensing image classification based on SVM optimized by clonal selection].
Liu, Qing-Jie; Jing, Lin-Hai; Wang, Meng-Fei; Lin, Qi-Zhong
2013-03-01
Model selection for the support vector machine (SVM), involving selection of the kernel and margin parameter values, is usually time-consuming and greatly affects both the training efficiency of the SVM model and the final classification accuracy of an SVM hyperspectral remote sensing image classifier. First, based on combinatorial optimization theory and the cross-validation method, an artificial immune clonal selection algorithm is introduced for the optimal selection of the SVM (CSSVM) kernel parameter and margin parameter C, to improve the training efficiency of the SVM model. An experiment classifying an AVIRIS image of the Indian Pines site, USA, was then performed to test the novel CSSVM against a traditional SVM classifier tuned by a general grid-search cross-validation method (GSSVM). Evaluation indexes including SVM model training time, overall classification accuracy (OA), and the Kappa index were analyzed quantitatively for both CSSVM and GSSVM. The OA of CSSVM on the test samples and the whole image is 85.1% and 81.58%, respectively, differing from that of GSSVM by less than 0.08% in each case; the Kappa indexes reach 0.8213 and 0.7728, differing from those of GSSVM by less than 0.001; and the model training time of CSSVM is between 1/6 and 1/10 that of GSSVM. Therefore, CSSVM is a fast and accurate algorithm for hyperspectral image classification and is superior to GSSVM.
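The clonal selection idea can be sketched without an actual SVM by replacing the cross-validated accuracy with a stand-in surface over (log2 C, log2 gamma). The surface, population sizes, and mutation scale below are invented; a real run would train and score an SVM at each candidate parameter pair instead of calling the stand-in function.

```python
import random

random.seed(1)

# Stand-in for cross-validated accuracy as a function of log2(C), log2(gamma);
# peak placed at (log2 C, log2 gamma) = (3, -2) for illustration.
def cv_accuracy(log_c, log_g):
    return 0.9 - 0.01 * (log_c - 3) ** 2 - 0.02 * (log_g + 2) ** 2

def clonal_search(generations=30, pop=8, clones=4, sigma=1.0):
    """Clonal selection: clone the fittest candidates with Gaussian mutation,
    then keep the best of parents and clones each generation."""
    population = [(random.uniform(-5, 10), random.uniform(-10, 5))
                  for _ in range(pop)]
    for _ in range(generations):
        population.sort(key=lambda p: cv_accuracy(*p), reverse=True)
        elite = population[: pop // 2]
        offspring = [(c + random.gauss(0, sigma), g + random.gauss(0, sigma))
                     for c, g in elite for _ in range(clones)]
        population = sorted(elite + offspring,
                            key=lambda p: cv_accuracy(*p), reverse=True)[:pop]
    return population[0]

best_c, best_g = clonal_search()
print(f"log2(C) ~ {best_c:.2f}, log2(gamma) ~ {best_g:.2f}")
```

Unlike an exhaustive grid search, the search budget concentrates around promising regions, which is the source of the training-time savings the abstract reports.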
Red-shouldered hawk nesting habitat preference in south Texas
Strobel, Bradley N.; Boal, Clint W.
2010-01-01
We examined nesting habitat preference by red-shouldered hawks Buteo lineatus using conditional logistic regression on characteristics measured at 27 occupied nest sites and 68 unused sites in 2005–2009 in south Texas. We measured vegetation characteristics of individual trees (nest trees and unused trees) and corresponding 0.04-ha plots. We evaluated the importance of tree and plot characteristics to nesting habitat selection by comparing a priori tree-specific and plot-specific models using Akaike's information criterion. Models with only plot variables carried 14% more weight than models with only center tree variables. The model-averaged odds ratios indicated red-shouldered hawks selected to nest in taller trees and in areas with higher average diameter at breast height than randomly available within the forest stand. Relative to randomly selected areas, each 1-m increase in nest tree height and 1-cm increase in the plot average diameter at breast height increased the probability of selection by 85% and 10%, respectively. Our results indicate that red-shouldered hawks select nesting habitat based on vegetation characteristics of individual trees as well as the 0.04-ha area surrounding the tree. Our results indicate forest management practices resulting in tall forest stands with large average diameter at breast height would benefit red-shouldered hawks in south Texas.
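Two calculations implicit in this analysis can be sketched with invented numbers (the log-likelihoods and the coefficient below are illustrative, not the study's estimates): ranking candidate models by AIC and Akaike weights, and reading a logistic coefficient as an odds ratio, e.g. a coefficient of about 0.615 per metre of tree height corresponds to the reported ~85% increase in selection odds.

```python
import math

# Hypothetical log-likelihoods and parameter counts for two candidate models
models = {"tree-only": (-60.2, 3), "plot-only": (-58.9, 4)}

def aic(loglik, k):
    """Akaike's information criterion: 2k - 2 ln(L)."""
    return 2 * k - 2 * loglik

aics = {name: aic(*v) for name, v in models.items()}
best = min(aics.values())
# Akaike weights: relative support for each model given the candidate set
weights = {n: math.exp(-(a - best) / 2) for n, a in aics.items()}
total = sum(weights.values())
weights = {n: w / total for n, w in weights.items()}
print(aics)
print(weights)

# Interpreting a logistic coefficient as an odds ratio
beta_height = 0.615  # assumed coefficient per 1-m increase in tree height
print(f"odds ratio per metre: {math.exp(beta_height):.2f}")
```

Comparing Akaike weights rather than raw AIC values is what makes statements like "plot models carried 14% more weight" meaningful.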
Grizzly bear habitat selection is scale dependent.
Ciarniello, Lana M; Boyce, Mark S; Seip, Dale R; Heard, Douglas C
2007-07-01
The purpose of our study is to show how ecologists' interpretation of habitat selection by grizzly bears (Ursus arctos) is altered by the scale of observation and also how management questions would be best addressed using predetermined scales of analysis. Using resource selection functions (RSF) we examined how variation in the spatial extent of availability affected our interpretation of habitat selection by grizzly bears inhabiting mountain and plateau landscapes. We estimated separate models for females and males using three spatial extents: within the study area, within the home range, and within predetermined movement buffers. We employed two methods for evaluating the effects of scale on our RSF designs. First, we chose a priori six candidate models, estimated at each scale, and ranked them using Akaike Information Criteria. Using this method, results changed among scales for males but not for females. For female bears, models that included the full suite of covariates predicted habitat use best at each scale. For male bears that resided in the mountains, models based on forest successional stages ranked highest at the study-wide and home range extents, whereas models containing covariates based on terrain features ranked highest at the buffer extent. For male bears on the plateau, each scale estimated a different highest-ranked model. Second, we examined differences among model coefficients across the three scales for one candidate model. We found that both the magnitude and direction of coefficients were dependent upon the scale examined; results varied between landscapes, scales, and sexes. Greenness, reflecting lush green vegetation, was a strong predictor of the presence of female bears in both landscapes and males that resided in the mountains. Male bears on the plateau were the only animals to select areas that exposed them to a high risk of mortality by humans. Our results show that grizzly bear habitat selection is scale dependent. 
Further, the selection of resources can be dependent upon the availability of a particular vegetation type on the landscape. From a management perspective, decisions should be based on a hierarchical process of habitat selection, recognizing that selection patterns vary across scales.
Odor composition analysis and odor indicator selection during sewage sludge composting
Zhu, Yan-li; Zheng, Guo-di; Gao, Ding; Chen, Tong-bin; Wu, Fang-kun; Niu, Ming-jie; Zou, Ke-hua
2016-01-01
On the basis of total temperature increase, normal dehydration, and maturity, the odor compositions of surface and internal piles in a well-run sewage sludge compost plant were analyzed using gas chromatography–mass spectrometry with a liquid nitrogen cooling system and a portable odor detector. Approximately 80 types of substances were detected, including 2 volatile inorganic compounds, 4 sulfur organic compounds, 16 benzenes, 27 alkanes, 15 alkenes, and 19 halogenated compounds. Most pollutants were mainly produced in the mesophilic and pre-thermophilic periods. The sulfur volatile organic compounds contributed significantly to odor and should be controlled primarily. Treatment strategies should be based on the properties of sulfur organic compounds. Hydrogen sulfide, methyl mercaptan, dimethyl disulfide, dimethyl sulfide, ammonia, and carbon disulfide were selected as core indicators. Ammonia, hydrogen sulfide, carbon disulfide, dimethyl disulfide, methyl mercaptan, dimethylbenzene, phenylpropane, and isopentane were designated as concentration indicators. Benzene, m-xylene, p-xylene, dimethylbenzene, dichloromethane, toluene, chlorobenzene, trichloromethane, carbon tetrachloride, and ethylbenzene were selected as health indicators. According to the principle of odor pollution indicator selection, dimethyl disulfide was selected as an odor pollution indicator of sewage sludge composting. Monitoring dimethyl disulfide provides a highly scientific method for modeling and evaluating odor pollution from sewage sludge composting facilities. Implications: Composting is one of the most important methods for sewage sludge treatment and improving the low organic matter content of many agricultural soils. However, odors are inevitably produced during the composting process. Understanding the production and emission patterns of odors is important for odor control and treatment.
Core indicators, concentration indicators, and health indicators provide an index system to odor evaluation. An odor pollution indicator provides theoretical support for further modelling and evaluating odor pollution from sewage sludge composting facilities. PMID:27192607
NASA Astrophysics Data System (ADS)
Alipour, M. H.; Kibler, Kelly M.
2018-02-01
A framework methodology is proposed for streamflow prediction in poorly-gauged rivers located within large-scale regions of sparse hydrometeorologic observation. A multi-criteria model evaluation is developed to select models that balance runoff efficiency with selection of accurate parameter values. Sparse observed data are supplemented by uncertain or low-resolution information, incorporated as 'soft' data, to estimate parameter values a priori. Model performance is tested in two catchments within a data-poor region of southwestern China, and results are compared to models selected using alternative calibration methods. While all models perform consistently with respect to runoff efficiency (NSE range of 0.67-0.78), models selected using the proposed multi-objective method may incorporate more representative parameter values than those selected by traditional calibration. Notably, parameter values estimated by the proposed method resonate with direct estimates of catchment subsurface storage capacity (parameter residuals of 20 and 61 mm for maximum soil moisture capacity (Cmax), and 0.91 and 0.48 for soil moisture distribution shape factor (B); where a parameter residual is equal to the centroid of a soft parameter value minus the calibrated parameter value). A model more traditionally calibrated to observed data only (single-objective model) estimates a much lower soil moisture capacity (residuals of Cmax = 475 and 518 mm and B = 1.24 and 0.7). A constrained single-objective model also underestimates maximum soil moisture capacity relative to a priori estimates (residuals of Cmax = 246 and 289 mm). The proposed method may allow managers to more confidently transfer calibrated models to ungauged catchments for streamflow predictions, even in the world's most data-limited regions.
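The runoff-efficiency metric (NSE) and the "parameter residual" defined in the abstract are both simple to compute. The sketch below is a generic implementation, not the authors' code; the function names are illustrative:

```python
import numpy as np

def nse(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - SSE / total variance of observations.
    NSE = 1 means a perfect fit; NSE = 0 means no better than the mean."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    sse = np.sum((observed - simulated) ** 2)
    variance = np.sum((observed - observed.mean()) ** 2)
    return 1.0 - sse / variance

def parameter_residual(soft_centroid, calibrated_value):
    """Residual as defined in the abstract: the centroid of the soft
    (a priori) parameter estimate minus the calibrated value."""
    return soft_centroid - calibrated_value
```

A small residual (e.g., the reported 20 mm for Cmax) indicates that calibration stayed close to the soft a priori estimate; the large residuals of the single-objective models signal physically unrepresentative parameters despite similar NSE.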
Developing Best Practices for Detecting Change at Marine Renewable Energy Sites
NASA Astrophysics Data System (ADS)
Linder, H. L.; Horne, J. K.
2016-02-01
In compliance with the National Environmental Policy Act (NEPA), an evaluation of environmental effects is mandatory for obtaining permits for any Marine Renewable Energy (MRE) project in the US. Evaluation includes an assessment of baseline conditions and ongoing monitoring during operation to determine whether biological conditions change relative to the baseline. Currently, there are no best practices for the analysis of MRE monitoring data. We have developed an approach to evaluate and recommend analytic models used to characterize and detect change in biological monitoring data. The approach includes six steps: review current MRE monitoring practices, identify candidate models to analyze data, fit models to a baseline dataset, develop simulated scenarios of change, evaluate model fit to simulated data, and produce recommendations on the choice of analytic model for monitoring data. An empirical data set from a proposed tidal turbine site at Admiralty Inlet, Puget Sound, Washington was used to conduct the model evaluation. Candidate models evaluated included linear regression, time series, and nonparametric models. The model fit diagnostics root-mean-square error (RMSE) and mean absolute scaled error (MASE) were used to measure the accuracy of predicted values from each model. A power analysis was used to evaluate the ability of each model to measure and detect change from baseline conditions. As many of these models have yet to be applied in MRE monitoring studies, the results of this evaluation will generate comprehensive guidelines on the choice of model to detect change in environmental monitoring data from MRE sites. Standardized guidelines for model selection enable accurate comparison of change between life stages of an MRE project, within life stages to meet real-time regulatory requirements, and of environmental changes among MRE sites.
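The two fit diagnostics named in the abstract can be sketched generically. This is not the authors' implementation; in particular, scaling MASE by the in-sample one-step naive (persistence) forecast is the conventional choice and is assumed here:

```python
import numpy as np

def rmse(actual, predicted):
    """Root-mean-square error of model predictions."""
    a = np.asarray(actual, dtype=float)
    p = np.asarray(predicted, dtype=float)
    return float(np.sqrt(np.mean((a - p) ** 2)))

def mase(actual, predicted):
    """Mean absolute scaled error: MAE of the model divided by the MAE of
    the one-step naive (persistence) forecast on the same series."""
    a = np.asarray(actual, dtype=float)
    p = np.asarray(predicted, dtype=float)
    naive_mae = np.mean(np.abs(np.diff(a)))  # |x_t - x_{t-1}| averaged
    return float(np.mean(np.abs(a - p)) / naive_mae)
```

MASE < 1 means the model beats the naive persistence forecast on average, which makes it a convenient scale-free companion to RMSE when comparing regression, time-series, and nonparametric candidates.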
A dental vision system for accurate 3D tooth modeling.
Zhang, Li; Alemzadeh, K
2006-01-01
This paper describes a reverse engineering approach, based on an active vision system, to extract three-dimensional (3D) geometric information from dental teeth and transfer it into Computer-Aided Design/Computer-Aided Manufacture (CAD/CAM) systems, improving the accuracy of 3D teeth models and, at the same time, the quality of the construction units to support patient care. The vision system involves the development of a dental vision rig, edge detection, boundary tracing, and fast, accurate 3D modeling from a sequence of sliced silhouettes of physical models. The rig is designed using engineering design methods such as a concept selection matrix and a weighted objectives evaluation chart. Reconstruction results and an accuracy evaluation are presented for digitizing different teeth models.
Cost and accuracy of advanced breeding trial designs in apple
Harshman, Julia M; Evans, Kate M; Hardner, Craig M
2016-01-01
Trialing advanced candidates in tree fruit crops is expensive due to the long-term nature of the plantings and the labor-intensive evaluations required to make selection decisions. How closely the trait evaluations approximate the true trait value must be balanced against the cost of the program. Field trial designs for advanced apple candidates with reduced numbers of locations, years, and harvests per year were modeled to investigate the effect on cost and accuracy in an operational breeding program. The aim was to find designs that would allow evaluation of the most additional candidates while sacrificing the least accuracy. Critical percentage difference, response to selection, and correlated response were used to examine changes in the accuracy of trait evaluations. For the quality traits evaluated, accuracy and response to selection were not substantially reduced for most trial designs. Risk management influences the decision to change trial design, and some designs carried greater risk than others. Balancing cost and accuracy against risk yields valuable insight into advanced breeding trial design. The methods outlined in this analysis would be well suited to other horticultural crop breeding programs. PMID:27019717
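Two of the metrics named above, response to selection and correlated response, have standard quantitative-genetics forms. The sketch below shows those textbook formulas only; it is not the paper's actual trial-design computation, and all symbols follow the conventional notation (h² heritability, S selection differential, i selection intensity, r_g genetic correlation):

```python
def response_to_selection(h2, selection_differential):
    """Breeder's equation: R = h^2 * S, the expected genetic gain from
    selecting parents S units above the population mean."""
    return h2 * selection_differential

def correlated_response(i, h_x, h_y, r_g, sigma_p_y):
    """Correlated response in trait y when selection acts on trait x:
    CR_y = i * h_x * h_y * r_g * sigma_P(y)."""
    return i * h_x * h_y * r_g * sigma_p_y
```

Reduced trial designs lower accuracy (effectively the square roots of heritability, h_x and h_y), which is why the paper tracks how response to selection degrades as locations, years, or harvests are dropped.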
An integrated fuzzy approach for strategic alliance partner selection in third-party logistics.
Erkayman, Burak; Gundogar, Emin; Yilmaz, Aysegul
2012-01-01
Outsourcing some logistics activities has become a useful strategy for companies in recent years. It allows firms to concentrate on their core issues and processes while improving logistics performance, reducing costs, and improving quality. Provider selection and evaluation in third-party logistics have therefore become important activities, and making such a strategic decision is both difficult and crucial. In this study we propose a fuzzy multicriteria decision-making (MCDM) approach to select the most appropriate provider. We first identify the provider selection criteria and build the hierarchical structure of the decision model. We then determine the criteria weights using the fuzzy analytical hierarchy process (AHP) and apply the fuzzy technique for order preference by similarity to ideal solution (TOPSIS) to obtain the final provider rankings. An illustrative example demonstrates the effectiveness of the proposed model.
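The TOPSIS ranking step can be sketched in its crisp (non-fuzzy) form. Note this is an illustrative simplification: the paper uses fuzzy numbers for both the AHP weights and the TOPSIS ratings, while the version below takes crisp scores and precomputed weights:

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives with crisp TOPSIS.
    matrix:  alternatives x criteria score matrix
    weights: criterion weights (e.g., from AHP), summing to 1
    benefit: per-criterion flags; True if larger is better
    Returns the closeness coefficient per alternative (higher = better)."""
    m = np.asarray(matrix, dtype=float)
    w = np.asarray(weights, dtype=float)
    norm = m / np.sqrt((m ** 2).sum(axis=0))          # vector normalization
    v = norm * w                                      # weighted normalized matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.sqrt(((v - ideal) ** 2).sum(axis=1))   # distance to ideal point
    d_neg = np.sqrt(((v - anti) ** 2).sum(axis=1))    # distance to anti-ideal
    return d_neg / (d_pos + d_neg)                    # closeness coefficient
```

An alternative that dominates on every criterion receives a closeness coefficient of 1.0; the worst on every criterion receives 0.0, so providers are ranked by descending coefficient.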
Chen, Hui; Lowe, Alan A; de Almeida, Fernanda Riberiro; Wong, Mary; Fleetham, John A; Wang, Bangkang
2008-09-01
The aim of this study was to test a 3-dimensional (3D) computer-assisted dental model analysis system that uses selected landmarks to describe tooth movement during treatment with an oral appliance. Dental casts of 70 patients diagnosed with obstructive sleep apnea and treated with oral appliances for a mean time of 7 years 4 months were evaluated with a 3D digitizer (MicroScribe-3DX, Immersion, San Jose, Calif) compatible with the Rhinoceros modeling program (version 3.0 SR3c, Robert McNeel & Associates, Seattle, Wash). A total of 86 landmarks on each model were digitized, and 156 variables were calculated as either the linear distance between points or the distance from points to reference planes. Four study models for each patient (maxillary baseline, mandibular baseline, maxillary follow-up, and mandibular follow-up) were superimposed on 2 sets of reference points: 3 points on the palatal rugae for maxillary model superimposition, and 3 occlusal contact points for the same set of maxillary and mandibular model superimpositions. The patients were divided into 3 evaluation groups by 5 orthodontists based on the changes between baseline and follow-up study models. Digital dental measurements could be analyzed, including arch width, arch length, curve of Spee, overbite, overjet, and the anteroposterior relationship between the maxillary and mandibular arches. A method error within 0.23 mm in 14 selected variables was found for the 3D system. The statistical differences in the 3 evaluation groups verified the division criteria determined by the orthodontists. The system provides a method to record 3D measurements of study models that permits computer visualization of tooth position and movement from various perspectives.
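The variables described above reduce to two geometric primitives: the linear distance between digitized landmarks, and the distance from a landmark to a reference plane defined by three points (e.g., three rugae points or three occlusal contacts). A minimal sketch with hypothetical helper names, not the actual Rhinoceros/MicroScribe workflow:

```python
import numpy as np

def point_distance(p, q):
    """Linear distance between two digitized 3D landmarks."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.linalg.norm(p - q))

def point_to_plane_distance(p, plane_points):
    """Distance from a landmark to the reference plane through three
    non-collinear points (e.g., the superimposition reference points)."""
    a, b, c = (np.asarray(x, dtype=float) for x in plane_points)
    n = np.cross(b - a, c - a)          # plane normal from two edge vectors
    n = n / np.linalg.norm(n)           # unit normal
    p = np.asarray(p, dtype=float)
    return float(abs(np.dot(p - a, n)))
```

Measurements such as overbite, overjet, and arch width are combinations of these primitives taken after superimposing the baseline and follow-up models on the shared reference points.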
An Improved Swarm Optimization for Parameter Estimation and Biological Model Selection
Abdullah, Afnizanfaizal; Deris, Safaai; Mohamad, Mohd Saberi; Anwar, Sohail
2013-01-01
One of the key aspects of computational systems biology is the investigation on the dynamic biological processes within cells. Computational models are often required to elucidate the mechanisms and principles driving the processes because of the nonlinearity and complexity. The models usually incorporate a set of parameters that signify the physical properties of the actual biological systems. In most cases, these parameters are estimated by fitting the model outputs with the corresponding experimental data. However, this is a challenging task because the available experimental data are frequently noisy and incomplete. In this paper, a new hybrid optimization method is proposed to estimate these parameters from the noisy and incomplete experimental data. The proposed method, called Swarm-based Chemical Reaction Optimization, integrates the evolutionary searching strategy employed by the Chemical Reaction Optimization, into the neighbouring searching strategy of the Firefly Algorithm method. The effectiveness of the method was evaluated using a simulated nonlinear model and two biological models: synthetic transcriptional oscillators, and extracellular protease production models. The results showed that the accuracy and computational speed of the proposed method were better than the existing Differential Evolution, Firefly Algorithm and Chemical Reaction Optimization methods. The reliability of the estimated parameters was statistically validated, which suggests that the model outputs produced by these parameters were valid even when noisy and incomplete experimental data were used. Additionally, Akaike Information Criterion was employed to evaluate the model selection, which highlighted the capability of the proposed method in choosing a plausible model based on the experimental data. In conclusion, this paper presents the effectiveness of the proposed method for parameter estimation and model selection problems using noisy and incomplete experimental data. 
This study is expected to provide new insight into developing more accurate and reliable biological models from limited, low-quality experimental data. PMID:23593445
Software selection based on analysis and forecasting methods, practised in 1C
NASA Astrophysics Data System (ADS)
Vazhdaev, A. N.; Chernysheva, T. Y.; Lisacheva, E. I.
2015-09-01
The research focuses on the built-in data analysis and forecasting mechanisms of the “1C: Enterprise 8” platform. Evaluating and selecting the proper software is important for developing effective customer relationship management strategies in terms of sales, as well as for implementation and further maintenance of the software. The research data allow the creation of new forecast models to plan further software distribution.
Currently, little justification is provided for nanomaterial testing concentrations in in vitro assays. The in vitro concentrations typically used may be higher than those experienced by exposed humans. Selection of concentration levels for hazard evaluation based on real-world e...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neary, Vincent Sinclair; Yang, Zhaoqing; Wang, Taiping
A wave model test bed is established to benchmark, test and evaluate spectral wave models and modeling methodologies (i.e., best practices) for predicting the wave energy resource parameters recommended by the International Electrotechnical Commission, IEC TS 62600-101 Ed. 1.0 (2015). Among other benefits, the model test bed can be used to investigate the suitability of different models, specifically what source terms should be included in spectral wave models under different wave climate conditions and for different classes of resource assessment. The overarching goal is to use these investigations to provide industry guidance for model selection and modeling best practices depending on the wave site conditions and desired class of resource assessment. Modeling best practices are reviewed, and limitations and knowledge gaps in predicting wave energy resource parameters are identified.
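Two of the IEC-recommended resource parameters derive directly from spectral moments, and the omnidirectional wave power has a closed form in deep water. The sketch below assumes the deep-water approximation and a nominal seawater density; it is a generic illustration, not the test bed's code:

```python
import numpy as np

RHO = 1025.0  # assumed seawater density, kg/m^3
G = 9.81      # gravitational acceleration, m/s^2

def _trapz(y, x):
    """Explicit trapezoidal integration (avoids NumPy-version differences)."""
    y = np.asarray(y, dtype=float)
    x = np.asarray(x, dtype=float)
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x) / 2.0))

def spectral_moment(f, s, n):
    """n-th moment m_n of the variance density spectrum S(f)."""
    f = np.asarray(f, dtype=float)
    s = np.asarray(s, dtype=float)
    return _trapz(f ** n * s, f)

def resource_parameters(f, s):
    """Significant wave height Hm0 (m), energy period Te (s), and
    omnidirectional wave power J (kW/m), deep-water approximation."""
    m0 = spectral_moment(f, s, 0)
    hm0 = 4.0 * np.sqrt(m0)                  # Hm0 = 4 sqrt(m0)
    te = spectral_moment(f, s, -1) / m0      # Te = m_{-1} / m0
    j = RHO * G ** 2 * hm0 ** 2 * te / (64.0 * np.pi) / 1000.0
    return hm0, te, j
```

In a spectral-model test bed these parameters are computed from both modeled and measured spectra, and the differences feed the benchmarking of candidate source-term configurations.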
Selecting among competing models of electro-optic, infrared camera system range performance
Nichols, Jonathan M.; Hines, James E.; Nichols, James D.
2013-01-01
Range performance is often the key requirement around which electro-optical and infrared camera systems are designed. This work presents an objective framework for evaluating competing range performance models. Model selection based on Akaike’s Information Criterion (AIC) is presented for the type of data collected during a typical human observer and target identification experiment. These methods are then demonstrated on observer responses to both visible and infrared imagery in which one of three maritime targets was placed at various ranges. We compare the performance of a number of different models, including those appearing previously in the literature. We conclude that our model-based approach offers substantial improvements over the traditional approach to inference, including increased precision and the ability to make predictions at ranges other than those at which experimental trials were conducted.
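AIC-based comparison of competing models reduces to a small amount of arithmetic once each model's maximized log-likelihood is available. This generic sketch (not the authors' code) also computes Akaike weights, which express the relative support for each candidate:

```python
import numpy as np

def aic(log_likelihood, k):
    """Akaike's information criterion: AIC = 2k - 2 ln(L),
    where k is the number of estimated parameters."""
    return 2.0 * k - 2.0 * log_likelihood

def akaike_weights(aic_values):
    """Normalized relative likelihoods of the candidate models.
    The model with the lowest AIC receives the largest weight."""
    a = np.asarray(aic_values, dtype=float)
    delta = a - a.min()                 # AIC differences from the best model
    w = np.exp(-0.5 * delta)            # relative likelihood of each model
    return w / w.sum()
```

Because AIC penalizes parameter count, a more flexible range-performance model is preferred only when its likelihood gain outweighs its added complexity.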
Linear free energy relationships for selected phthalate esters were used to estimate the rate constants for hydrolysis, biolysis, sediment-water partition coefficients, and biosorption required for modeling. The fate and transport behavior of dimethyl, diethyl, di-n-butyl, di-n-o...