Sample records for model assessment method

  1. Massive integration of diverse protein quality assessment methods to improve template based modeling in CASP11

    PubMed Central

    Cao, Renzhi; Bhattacharya, Debswapna; Adhikari, Badri; Li, Jilong; Cheng, Jianlin

    2015-01-01

    Model evaluation and selection is an important step and a big challenge in template-based protein structure prediction. Individual model quality assessment methods designed for recognizing some specific properties of protein structures often fail to consistently select good models from a model pool because of their limitations. Therefore, combining multiple complementary quality assessment methods is useful for improving model ranking and consequently tertiary structure prediction. Here, we report the performance and analysis of our human tertiary structure predictor (MULTICOM) based on the massive integration of 14 diverse complementary quality assessment methods that was successfully benchmarked in the 11th Critical Assessment of Techniques for Protein Structure Prediction (CASP11). The predictions of MULTICOM for 39 template-based domains were rigorously assessed by six scoring metrics covering global topology of Cα trace, local all-atom fitness, side chain quality, and physical reasonableness of the model. The results show that the massive integration of complementary, diverse single-model and multi-model quality assessment methods can effectively leverage the strength of single-model methods in distinguishing quality variation among similar good models and the advantage of multi-model quality assessment methods of identifying reasonable average-quality models. The overall excellent performance of the MULTICOM predictor demonstrates that integrating a large number of model quality assessment methods in conjunction with model clustering is a useful approach to improve the accuracy, diversity, and consequently robustness of template-based protein structure prediction. PMID:26369671

  2. Massive integration of diverse protein quality assessment methods to improve template based modeling in CASP11.

    PubMed

    Cao, Renzhi; Bhattacharya, Debswapna; Adhikari, Badri; Li, Jilong; Cheng, Jianlin

    2016-09-01

    Model evaluation and selection is an important step and a big challenge in template-based protein structure prediction. Individual model quality assessment methods designed for recognizing some specific properties of protein structures often fail to consistently select good models from a model pool because of their limitations. Therefore, combining multiple complementary quality assessment methods is useful for improving model ranking and consequently tertiary structure prediction. Here, we report the performance and analysis of our human tertiary structure predictor (MULTICOM) based on the massive integration of 14 diverse complementary quality assessment methods that was successfully benchmarked in the 11th Critical Assessment of Techniques for Protein Structure Prediction (CASP11). The predictions of MULTICOM for 39 template-based domains were rigorously assessed by six scoring metrics covering global topology of Cα trace, local all-atom fitness, side chain quality, and physical reasonableness of the model. The results show that the massive integration of complementary, diverse single-model and multi-model quality assessment methods can effectively leverage the strength of single-model methods in distinguishing quality variation among similar good models and the advantage of multi-model quality assessment methods of identifying reasonable average-quality models. The overall excellent performance of the MULTICOM predictor demonstrates that integrating a large number of model quality assessment methods in conjunction with model clustering is a useful approach to improve the accuracy, diversity, and consequently robustness of template-based protein structure prediction. Proteins 2016; 84(Suppl 1):247-259. © 2015 Wiley Periodicals, Inc.

  3. [Application of three risk assessment models in occupational health risk assessment of dimethylformamide].

    PubMed

    Wu, Z J; Xu, B; Jiang, H; Zheng, M; Zhang, M; Zhao, W J; Cheng, J

    2016-08-20

    Objective: To investigate the application of the United States Environmental Protection Agency (EPA) inhalation risk assessment model, the Singapore semi-quantitative risk assessment model, and the occupational hazards risk assessment index method to occupational health risk in enterprises using dimethylformamide (DMF) in a certain area in Jiangsu, China, and to put forward related risk control measures. Methods: The industries involving DMF exposure in Jiangsu Province were chosen as the evaluation objects in 2013 and three risk assessment models were used in the evaluation. EPA inhalation risk assessment model: HQ = EC/RfC; Singapore semi-quantitative risk assessment model: Risk = (HR × ER)^(1/2); occupational hazards risk assessment index = 2^(health effect level) × 2^(exposure ratio) × operation condition level. Results: The hazard quotients (HQ > 1) from the EPA inhalation risk assessment model suggested that all the workshops (dry method, wet method and printing) and work positions (pasting, burdening, unreeling, rolling, assisting) were high risk. The Singapore semi-quantitative risk assessment model indicated that the workshop risk levels of dry method, wet method and printing were 3.5 (high), 3.5 (high) and 2.8 (general), and the position risk levels of pasting, burdening, unreeling, rolling and assisting were 4 (high), 4 (high), 2.8 (general), 2.8 (general) and 2.8 (general). The occupational hazards risk assessment index method demonstrated that the position risk indices of pasting, burdening, unreeling, rolling and assisting were 42 (high), 33 (high), 23 (middle), 21 (middle) and 22 (middle). The results of the Singapore semi-quantitative risk assessment model and the occupational hazards risk assessment index method were similar, while the EPA inhalation risk assessment model indicated that all the workshops and positions were high risk. Conclusion: The occupational hazards risk assessment index method fully considers health effects, exposure, and operating conditions and can comprehensively and accurately evaluate the occupational health risk caused by DMF.
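
    The three formulas above are simple enough to compute directly. A minimal sketch in Python; the exposure concentration, reference concentration, ratings and levels below are illustrative placeholders, not values from the study:

```python
# Sketch of the three risk models quoted above; all inputs are illustrative
# placeholders, not data from the study.
import math

def epa_hazard_quotient(ec_mg_m3: float, rfc_mg_m3: float) -> float:
    """EPA inhalation model: HQ = EC / RfC; HQ > 1 flags high risk."""
    return ec_mg_m3 / rfc_mg_m3

def singapore_risk(hr: int, er: int) -> float:
    """Singapore semi-quantitative model: Risk = sqrt(HR * ER)."""
    return math.sqrt(hr * er)

def hazard_risk_index(health_level: int, exposure_ratio: int,
                      operation_level: int) -> float:
    """Index = 2**health_level * 2**exposure_ratio * operation_level."""
    return 2 ** health_level * 2 ** exposure_ratio * operation_level

# Hypothetical example for one work position:
print(epa_hazard_quotient(ec_mg_m3=1.2, rfc_mg_m3=0.3))  # 4.0 -> high risk
print(singapore_risk(hr=4, er=4))                        # 4.0 -> high
print(hazard_risk_index(health_level=3, exposure_ratio=2, operation_level=2))
```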

  4. MQAPRank: improved global protein model quality assessment by learning-to-rank.

    PubMed

    Jing, Xiaoyang; Dong, Qiwen

    2017-05-25

    Protein structure prediction has achieved substantial progress during the last few decades, and a large number of models can now be predicted for a given sequence. Consequently, assessing the quality of predicted protein models is one of the key components of successful protein structure prediction. Over the past years, a number of methods have been developed to address this issue, which can be roughly divided into three categories: single methods, quasi-single methods and clustering (or consensus) methods. Although these methods achieve much success at different levels, accurate protein model quality assessment is still an open problem. Here, we present MQAPRank, a global protein model quality assessment program based on learning-to-rank. MQAPRank first sorts the decoy models using a single-model method based on a learning-to-rank algorithm to indicate their relative qualities for the target protein. It then takes the first five models as references and predicts the qualities of the other models using the average GDT_TS scores between the reference models and the other models. Benchmarked on the CASP11 and 3DRobot datasets, MQAPRank achieved better performance than other leading protein model quality assessment methods. Recently, MQAPRank participated in CASP12 under the group name FDUBio and achieved state-of-the-art performance. MQAPRank provides a convenient and powerful tool for protein model quality assessment, and it is useful for protein structure prediction and model quality assessment.
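
    The reference-based second stage described above lends itself to a short sketch. The single_scores table and gdt_ts() routine below are hypothetical stand-ins; the learning-to-rank scorer itself is not reproduced here:

```python
# Sketch of the consensus stage described above: rank decoys by a single-model
# score, take the top five as references, then score every model by its
# average GDT_TS against those references. `single_scores` and `gdt_ts` are
# hypothetical stand-ins for the paper's components.
from typing import Callable, Dict, List

def reference_consensus(models: List[str],
                        single_scores: Dict[str, float],
                        gdt_ts: Callable[[str, str], float],
                        n_refs: int = 5) -> Dict[str, float]:
    ranked = sorted(models, key=lambda m: single_scores[m], reverse=True)
    refs = ranked[:n_refs]
    return {m: sum(gdt_ts(m, r) for r in refs) / len(refs) for m in models}
```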

  5. Protein model quality assessment prediction by combining fragment comparisons and a consensus Cα contact potential

    PubMed Central

    Zhou, Hongyi; Skolnick, Jeffrey

    2009-01-01

    In this work, we develop a fully automated method for the quality assessment prediction of protein structural models generated by structure prediction approaches such as fold recognition servers, or ab initio methods. The approach is based on fragment comparisons and a consensus Cα contact potential derived from the set of models to be assessed and was tested on CASP7 server models. The average Pearson linear correlation coefficient between predicted quality and model GDT-score per target is 0.83 for the 98 targets which is better than those of other quality assessment methods that participated in CASP7. Our method also outperforms the other methods by about 3% as assessed by the total GDT-score of the selected top models. PMID:18004783

  6. Designing and evaluating the MULTICOM protein local and global model quality prediction methods in the CASP10 experiment

    PubMed Central

    2014-01-01

    Background: Protein model quality assessment is an essential component of generating and using protein structural models. During the Tenth Critical Assessment of Techniques for Protein Structure Prediction (CASP10), we developed and tested four automated methods (MULTICOM-REFINE, MULTICOM-CLUSTER, MULTICOM-NOVEL, and MULTICOM-CONSTRUCT) that predicted both local and global quality of protein structural models. Results: MULTICOM-REFINE was a clustering approach that used the average pairwise structural similarity between models to measure the global quality and the average Euclidean distance between a model and several top ranked models to measure the local quality. MULTICOM-CLUSTER and MULTICOM-NOVEL were two new support vector machine-based methods of predicting both the local and global quality of a single protein model. MULTICOM-CONSTRUCT was a new weighted pairwise model comparison (clustering) method that used the weighted average similarity between models in a pool to measure the global model quality. Our experiments showed that the pairwise model assessment methods worked better when a large portion of models in the pool were of good quality, whereas single-model quality assessment methods performed better on some hard targets when only a small portion of models in the pool were of reasonable quality. Conclusions: Since digging out a few good models from a large pool of low-quality models is a major challenge in protein structure prediction, single model quality assessment methods appear to be poised to make important contributions to protein structure modeling. The other interesting finding was that single-model quality assessment scores could be used to weight the models by the consensus pairwise model comparison method to improve its accuracy. PMID:24731387
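
    The clustering-based global score described for MULTICOM-REFINE (average pairwise structural similarity) reduces to a few lines. A sketch, assuming a hypothetical pairwise similarity routine such as GDT-TS or TM-score:

```python
# Sketch of the pairwise (clustering) global quality measure described above:
# a model's global quality is its average structural similarity to every
# other model in the pool. `similarity` is a hypothetical stand-in for a
# structural comparison routine such as GDT-TS or TM-score.
from typing import Callable, Dict, List

def pairwise_global_quality(pool: List[str],
                            similarity: Callable[[str, str], float]
                            ) -> Dict[str, float]:
    quality = {}
    for m in pool:
        sims = [similarity(m, n) for n in pool if n != m]
        quality[m] = sum(sims) / len(sims)
    return quality
```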

  7. Designing and evaluating the MULTICOM protein local and global model quality prediction methods in the CASP10 experiment.

    PubMed

    Cao, Renzhi; Wang, Zheng; Cheng, Jianlin

    2014-04-15

    Protein model quality assessment is an essential component of generating and using protein structural models. During the Tenth Critical Assessment of Techniques for Protein Structure Prediction (CASP10), we developed and tested four automated methods (MULTICOM-REFINE, MULTICOM-CLUSTER, MULTICOM-NOVEL, and MULTICOM-CONSTRUCT) that predicted both local and global quality of protein structural models. MULTICOM-REFINE was a clustering approach that used the average pairwise structural similarity between models to measure the global quality and the average Euclidean distance between a model and several top ranked models to measure the local quality. MULTICOM-CLUSTER and MULTICOM-NOVEL were two new support vector machine-based methods of predicting both the local and global quality of a single protein model. MULTICOM-CONSTRUCT was a new weighted pairwise model comparison (clustering) method that used the weighted average similarity between models in a pool to measure the global model quality. Our experiments showed that the pairwise model assessment methods worked better when a large portion of models in the pool were of good quality, whereas single-model quality assessment methods performed better on some hard targets when only a small portion of models in the pool were of reasonable quality. Since digging out a few good models from a large pool of low-quality models is a major challenge in protein structure prediction, single model quality assessment methods appear to be poised to make important contributions to protein structure modeling. The other interesting finding was that single-model quality assessment scores could be used to weight the models by the consensus pairwise model comparison method to improve its accuracy.

  8. Development and comparison of Bayesian modularization method in uncertainty assessment of hydrological models

    NASA Astrophysics Data System (ADS)

    Li, L.; Xu, C.-Y.; Engeland, K.

    2012-04-01

    With respect to model calibration, parameter estimation and analysis of uncertainty sources, different approaches have been used in hydrological models. The Bayesian method is one of the most widely used methods for uncertainty assessment of hydrological models; it incorporates different sources of information into a single analysis through Bayes' theorem. However, none of these applications treats the uncertainty in extreme flows of hydrological model simulations well. This study proposes a Bayesian modularization approach for uncertainty assessment of conceptual hydrological models that considers the extreme flows. It includes a comprehensive comparison and evaluation of uncertainty assessments by the new Bayesian modularization approach and traditional Bayesian models using the Metropolis-Hastings (MH) algorithm with the daily hydrological model WASMOD. Three likelihood functions are used in combination with the traditional Bayesian approach: the AR(1) plus Normal and time-period-independent model (Model 1), the AR(1) plus Normal and time-period-dependent model (Model 2) and the AR(1) plus multi-normal model (Model 3). The results reveal that (1) the simulations derived from the Bayesian modularization method are more accurate, with the highest Nash-Sutcliffe efficiency value, and (2) the Bayesian modularization method performs best in uncertainty estimates of entire flows and in terms of application and computational efficiency. The study thus introduces a new approach for reducing the effect of extreme flows on the discharge uncertainty assessment of hydrological models via Bayesian methods. Keywords: extreme flow, uncertainty assessment, Bayesian modularization, hydrological model, WASMOD
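
    To make the machinery concrete, the standard-Bayesian side of this comparison can be sketched as a minimal Metropolis-Hastings sampler with an AR(1)-plus-Normal error likelihood in the spirit of Model 1. The one-parameter runoff model, prior, and data below are toy placeholders, not WASMOD:

```python
# Minimal Metropolis-Hastings sketch with an AR(1)-plus-Normal likelihood,
# in the spirit of "Model 1" above. The one-parameter runoff model, prior,
# and data are toy placeholders, not WASMOD.
import numpy as np

rng = np.random.default_rng(0)
precip = rng.gamma(2.0, 2.0, size=200)           # synthetic forcing
true_k = 0.6
obs = true_k * precip + rng.normal(0, 0.5, 200)  # synthetic "observed" flow

def simulate(k):                 # toy hydrological model: flow = k * precip
    return k * precip

def log_likelihood(k, rho=0.3, sigma=0.5):
    e = obs - simulate(k)                        # residuals
    innov = e[1:] - rho * e[:-1]                 # AR(1) innovations
    return -0.5 * np.sum(innov**2 / sigma**2 + np.log(2 * np.pi * sigma**2))

def log_prior(k):                                # flat prior on (0, 2)
    return 0.0 if 0.0 < k < 2.0 else -np.inf

samples, k = [], 1.0
for _ in range(5000):
    prop = k + rng.normal(0, 0.05)               # random-walk proposal
    log_alpha = (log_likelihood(prop) + log_prior(prop)
                 - log_likelihood(k) - log_prior(k))
    if np.log(rng.uniform()) < log_alpha:
        k = prop
    samples.append(k)

print("posterior mean of k:", np.mean(samples[1000:]))  # ~0.6
```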

  9. PconsD: ultra rapid, accurate model quality assessment for protein structure prediction.

    PubMed

    Skwark, Marcin J; Elofsson, Arne

    2013-07-15

    Clustering methods are often needed for accurately assessing the quality of modeled protein structures. A recent blind evaluation of quality assessment methods in CASP10 showed that there is little difference between many different methods as far as ranking models and selecting the best model are concerned. When comparing many models, the computational cost of the model comparison can become significant. Here, we present PconsD, a fast, stream-computing method for distance-driven model quality assessment that runs on consumer hardware. PconsD is at least one order of magnitude faster than other methods of comparable accuracy. The source code for PconsD is freely available at http://d.pcons.net/. Supplementary benchmarking data are also available there. Contact: arne@bioinfo.se. Supplementary data are available at Bioinformatics online.

  10. Development and comparison in uncertainty assessment based Bayesian modularization method in hydrological modeling

    NASA Astrophysics Data System (ADS)

    Li, Lu; Xu, Chong-Yu; Engeland, Kolbjørn

    2013-04-01

    With respect to model calibration, parameter estimation and analysis of uncertainty sources, various regression and probabilistic approaches are used in hydrological modeling. A family of Bayesian methods, which incorporates different sources of information into a single analysis through Bayes' theorem, is widely used for uncertainty assessment. However, none of these approaches can treat the impact of high flows in hydrological modeling well. This study proposes a Bayesian modularization uncertainty assessment approach in which the highest streamflow observations are treated as suspect information that should not influence the inference of the main bulk of the model parameters. This study includes a comprehensive comparison and evaluation of uncertainty assessments by our new Bayesian modularization method and standard Bayesian methods using the Metropolis-Hastings (MH) algorithm with the daily hydrological model WASMOD. Three likelihood functions were used in combination with the standard Bayesian method: the AR(1) plus Normal model independent of time (Model 1), the AR(1) plus Normal model dependent on time (Model 2) and the AR(1) plus Multi-normal model (Model 3). The results reveal that the Bayesian modularization method provides the most accurate streamflow estimates, as measured by the Nash-Sutcliffe efficiency, and the best uncertainty estimates for low, medium and entire flows compared to standard Bayesian methods. The study thus provides a new approach for reducing the impact of high flows on the discharge uncertainty assessment of hydrological models via the Bayesian method.

  11. Environmental probabilistic quantitative assessment methodologies

    USGS Publications Warehouse

    Crovelli, R.A.

    1995-01-01

    In this paper, four petroleum resource assessment methodologies are presented as possible pollution assessment methodologies, even though petroleum as a resource is desirable, whereas pollution is undesirable. A methodology is defined in this paper to consist of a probability model and a probabilistic method, where the method is used to solve the model. The following four basic types of probability models are considered: 1) direct assessment, 2) accumulation size, 3) volumetric yield, and 4) reservoir engineering. Three of the four petroleum resource assessment methodologies were written as microcomputer systems, viz. TRIAGG for direct assessment, APRAS for accumulation size, and FASPU for reservoir engineering. A fourth microcomputer system termed PROBDIST supports the three assessment systems. The three assessment systems have different probability models but the same type of probabilistic method. The advantages of the analytic method are computational speed and flexibility, making it ideal for a microcomputer.

  12. The Role of Simulation in Microsurgical Training.

    PubMed

    Evgeniou, Evgenios; Walker, Harriet; Gujral, Sameer

    Simulation has been established as an integral part of microsurgical training. The aim of this study was to assess and categorize the various simulation models in relation to the complexity of the microsurgical skill being taught and analyze the assessment methods commonly employed in microsurgical simulation training. Numerous courses have been established using simulation models. These models can be categorized, according to the level of complexity of the skill being taught, into basic, intermediate, and advanced. Microsurgical simulation training should be assessed using validated assessment methods. Assessment methods vary significantly from subjective expert opinions to self-assessment questionnaires and validated global rating scales. The appropriate assessment method should carefully be chosen based on the simulation modality. Simulation models should be validated, and a model with appropriate fidelity should be chosen according to the microsurgical skill being taught. Assessment should move from traditional simple subjective evaluations of trainee performance to validated tools. Future studies should assess the transferability of skills gained during simulation training to the real-life setting. Copyright © 2018 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  13. Qualitative Assessment of Inquiry-Based Teaching Methods

    ERIC Educational Resources Information Center

    Briggs, Michael; Long, George; Owens, Katrina

    2011-01-01

    A new approach to teaching method assessment using student focused qualitative studies and the theoretical framework of mental models is proposed. The methodology is considered specifically for the advantages it offers when applied to the assessment of inquiry-based teaching methods. The theoretical foundation of mental models is discussed, and…

  14. [Statistical prediction methods in violence risk assessment and its application].

    PubMed

    Liu, Yuan-Yuan; Hu, Jun-Mei; Yang, Min; Li, Xiao-Song

    2013-06-01

    How to improve violence risk assessment is an urgent global problem. As a necessary part of risk assessment, statistical methods have remarkable impacts and effects. In this study, prediction methods in violence risk assessment are reviewed from a statistical point of view. The application of logistic regression as an example of a multivariate statistical model, the decision tree model as an example of a data mining technique, and the neural network model as an example of artificial intelligence technology are all reviewed. This study provides a basis intended to contribute to further research on violence risk assessment.
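
    Of the three model families reviewed, logistic regression is the simplest to illustrate. A sketch on synthetic data; the predictors and effect sizes are hypothetical, not variables from the reviewed studies:

```python
# Sketch of a logistic-regression risk model of the kind reviewed above,
# fitted on synthetic data; predictors and effect sizes are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 3))        # e.g. age, prior incidents, test score
logit = 0.8 * X[:, 0] - 1.2 * X[:, 1] + 0.3 * X[:, 2]
y = rng.uniform(size=500) < 1 / (1 + np.exp(-logit))   # simulated outcomes

clf = LogisticRegression().fit(X, y)
print(clf.coef_)                       # fitted log-odds per predictor
print(clf.predict_proba(X[:3])[:, 1])  # predicted risk for the first cases
```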

  15. Application of ''Earl's Assessment "as", Assessment "for", and Assessment "of" Learning Model'' with Orthopaedic Assessment Clinical Competence

    ERIC Educational Resources Information Center

    Lafave, Mark R.; Katz, Larry; Vaughn, Norman

    2013-01-01

    Context: In order to study the efficacy of assessment methods, a theoretical framework of Earl's model of assessment was introduced. Objective: (1) Introduce the predictive learning assessment model (PLAM) as an application of Earl's model of learning; (2) test Earl's model of learning through the use of the Standardized Orthopedic Assessment Tool…

  16. Dynamic drought risk assessment using crop model and remote sensing techniques

    NASA Astrophysics Data System (ADS)

    Sun, H.; Su, Z.; Lv, J.; Li, L.; Wang, Y.

    2017-02-01

    Drought risk assessment is of great significance for reducing agricultural drought losses and ensuring food security. The usual drought risk assessment method evaluates a specific region's exposure to the hazard and its vulnerability to extended periods of water shortage, which is a static evaluation. Dynamic Drought Risk Assessment (DDRA) estimates drought risk according to crop growth and water stress conditions in real time. In this study, a DDRA method using a crop model and remote sensing techniques is proposed. The crop model employed is the DeNitrification and DeComposition (DNDC) model. The drought risk was quantified by the yield losses predicted by the crop model in a scenario-based method. The crop model was re-calibrated to improve its performance using the Leaf Area Index (LAI) retrieved from MODerate Resolution Imaging Spectroradiometer (MODIS) data, and the in-situ station-based crop model was extended to assess regional drought risk by integrating crop planting maps. The crop planting area was extracted from MODIS data with the extended CPPI method. The study was implemented and validated on maize in Liaoning Province, China.
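
    The scenario-based quantification described above (risk as model-predicted yield loss) can be sketched in a few lines; the predicted_yield() function below is a hypothetical stand-in for a calibrated DNDC run, and the scenario probabilities and yields are assumptions:

```python
# Sketch of scenario-based drought risk as described above: risk is the
# probability-weighted relative yield loss across weather scenarios.
# `predicted_yield` stands in for a calibrated crop-model (DNDC) run.
from typing import Callable, Dict, List

def drought_risk(scenarios: List[Dict],
                 probs: List[float],
                 predicted_yield: Callable[[Dict], float],
                 potential_yield: float) -> float:
    risk = 0.0
    for scenario, p in zip(scenarios, probs):
        loss = (potential_yield - predicted_yield(scenario)) / potential_yield
        risk += p * max(loss, 0.0)   # only shortfalls count as loss
    return risk

# Hypothetical usage: three scenarios with assumed probabilities and yields.
yields = {"wet": 9.5, "normal": 8.8, "dry": 5.1}
print(drought_risk([{"name": k} for k in yields],
                   [0.3, 0.5, 0.2],
                   lambda s: yields[s["name"]],
                   potential_yield=10.0))
```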

  17. [A reliability growth assessment method and its application in the development of equipment in space cabin].

    PubMed

    Chen, J D; Sun, H L

    1999-04-01

    Objective. To assess and predict the reliability of equipment dynamically by making full use of the various test information gathered during product development. Method. A new reliability growth assessment method based on the Army Materiel Systems Analysis Activity (AMSAA) model was developed. The method is composed of the AMSAA model and test data conversion technology. Result. The assessment and prediction results for a space-borne equipment conform to its expectations. Conclusion. It is suggested that this method should be further researched and popularized.

  18. DeepQA: improving the estimation of single protein model quality with deep belief networks.

    PubMed

    Cao, Renzhi; Bhattacharya, Debswapna; Hou, Jie; Cheng, Jianlin

    2016-12-05

    Protein quality assessment (QA), useful for ranking and selecting protein models, has long been viewed as one of the major challenges for protein tertiary structure prediction. In particular, estimating the quality of a single protein model, which is important for selecting a few good models out of a large model pool consisting of mostly low-quality models, is still a largely unsolved problem. We introduce DeepQA, a novel single-model quality assessment method based on a deep belief network that utilizes a number of selected features describing the quality of a model from different perspectives, such as energy, physico-chemical characteristics, and structural information. The deep belief network is trained on several large datasets consisting of models from the Critical Assessment of Protein Structure Prediction (CASP) experiments, several publicly available datasets, and models generated by our in-house ab initio method. Our experiments demonstrate that the deep belief network has better performance compared to Support Vector Machines and Neural Networks on the protein model quality assessment problem, and our method DeepQA achieves state-of-the-art performance on the CASP11 dataset. It also outperformed two well-established methods in selecting good outlier models from a large set of models of mostly low quality generated by ab initio modeling methods. DeepQA is a useful deep learning tool for protein single-model quality assessment and protein structure prediction. The source code, executable, documentation and training/test datasets of DeepQA for Linux are freely available to non-commercial users at http://cactus.rnet.missouri.edu/DeepQA/ .

  19. Strapdown Airborne Gravimetry Quality Assessment Method Based on Single Survey Line Data: A Study by SGA-WZ02 Gravimeter

    PubMed Central

    Wu, Meiping; Cao, Juliang; Zhang, Kaidong; Cai, Shaokun; Yu, Ruihang

    2018-01-01

    Quality assessment is an important part of strapdown airborne gravimetry. The root mean square error (RMSE) evaluation method is a classical way to evaluate gravimetry quality, but classical evaluation methods are preconditioned on extra flights or reference data. Thus, a method that largely avoids the preconditions of classical quality assessment methods and can be used on a single survey line has been developed in this paper. Based on theoretical analysis, the method chooses the stability of the two horizontal attitude angles, the horizontal specific force and the vertical specific force as the determinants of the quality assessment model. Actual data, collected by the SGA-WZ02 from 21 lines in 13 flights of a survey, were used to build the model and elaborate the method. To substantiate the performance of the quality assessment model, it was applied to extra repeat-line flights from two surveys. Compared with the internal RMSE, the standard deviations of the assessment residuals were 0.23 mGal and 0.16 mGal in the two surveys, which shows that the quality assessment method is reliable and stricter. Extra flights no longer need to be specially arranged in the flight routing. The method, developed from the SGA-WZ02, is a feasible approach to assess gravimetry quality using single-line data and is also suitable for other strapdown gravimeters. PMID:29373535

  20. Dynamic Assessment of Water Quality Based on a Variable Fuzzy Pattern Recognition Model

    PubMed Central

    Xu, Shiguo; Wang, Tianxiang; Hu, Suduan

    2015-01-01

    Water quality assessment is an important foundation of water resource protection and is affected by many indicators. The dynamic and fuzzy changes of water quality lead to problems for proper assessment. This paper explores a method that is in accordance with the water quality changes. The proposed method is based on the variable fuzzy pattern recognition (VFPR) model and combines the analytic hierarchy process (AHP) model with the entropy weight (EW) method. The proposed method was applied to dynamically assess the water quality of Biliuhe Reservoir (Dalian, China). The results show that the water quality level is between levels 2 and 3 and is worse in August and September, caused by increasing water temperature and rainfall. Weights and methods are compared and random errors of the indicator values are analyzed. It is concluded that the proposed method has the advantages of dynamism, fuzzification and stability by considering the interval influence of multiple indicators and using the average level characteristic values of four models as results. PMID:25689998
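
    Of the components combined here, the entropy weight (EW) step is the most self-contained and easy to sketch; indicators with more dispersion across samples receive larger weights. The indicator matrix below is hypothetical:

```python
# Sketch of the entropy weight (EW) method used above: indicators with more
# dispersion across samples receive larger weights. Data are hypothetical
# and assumed already normalized to benefit-type (larger = better).
import numpy as np

def entropy_weights(X: np.ndarray) -> np.ndarray:
    """X: (n_samples, n_indicators) normalized indicator matrix."""
    P = X / X.sum(axis=0)                         # column-wise proportions
    P = np.where(P == 0, 1e-12, P)                # avoid log(0)
    n = X.shape[0]
    E = -(P * np.log(P)).sum(axis=0) / np.log(n)  # entropy per indicator
    d = 1.0 - E                                   # degree of diversification
    return d / d.sum()

X = np.array([[0.6, 0.2, 0.9],
              [0.5, 0.8, 0.7],
              [0.4, 0.3, 0.8]])
print(entropy_weights(X))   # weights sum to 1
```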

  21. Dynamic assessment of water quality based on a variable fuzzy pattern recognition model.

    PubMed

    Xu, Shiguo; Wang, Tianxiang; Hu, Suduan

    2015-02-16

    Water quality assessment is an important foundation of water resource protection and is affected by many indicators. The dynamic and fuzzy changes of water quality lead to problems for proper assessment. This paper explores a method that is in accordance with the water quality changes. The proposed method is based on the variable fuzzy pattern recognition (VFPR) model and combines the analytic hierarchy process (AHP) model with the entropy weight (EW) method. The proposed method was applied to dynamically assess the water quality of Biliuhe Reservoir (Dalian, China). The results show that the water quality level is between levels 2 and 3 and is worse in August and September, caused by increasing water temperature and rainfall. Weights and methods are compared and random errors of the indicator values are analyzed. It is concluded that the proposed method has the advantages of dynamism, fuzzification and stability by considering the interval influence of multiple indicators and using the average level characteristic values of four models as results.

  22. United3D: a protein model quality assessment program that uses two consensus based methods.

    PubMed

    Terashi, Genki; Oosawa, Makoto; Nakamura, Yuuki; Kanou, Kazuhiko; Takeda-Shitaka, Mayuko

    2012-01-01

    In protein structure prediction, such as template-based modeling and free modeling (ab initio modeling), the step that assesses the quality of protein models is very important. We have developed a model quality assessment (QA) program, United3D, that uses an optimized clustering method and a simple Cα atom contact-based potential. United3D automatically estimates the quality scores (Qscore) of predicted protein models, which are highly correlated with the actual quality (GDT_TS). The performance of United3D was tested in the ninth Critical Assessment of Techniques for Protein Structure Prediction (CASP9) experiment. In CASP9, United3D showed the lowest average loss of GDT_TS (5.3) among the QA methods that participated in CASP9. This result indicates that the performance of United3D in identifying the high quality models from the models predicted by CASP9 servers on 116 targets was the best among the QA methods tested in CASP9. United3D also produced high average Pearson correlation coefficients (0.93) and acceptable Kendall rank correlation coefficients (0.68) between the Qscore and GDT_TS. This performance was competitive with the other top ranked QA methods tested in CASP9. These results indicate that United3D is a useful tool for selecting high quality models from many candidate model structures provided by various modeling methods. United3D will improve the accuracy of protein structure prediction.

  23. Core Professionalism Education in Surgery: A Systematic Review.

    PubMed

    Sarıoğlu Büke, Akile; Karabilgin Öztürkçü, Özlem Sürel; Yılmaz, Yusuf; Sayek, İskender

    2018-03-15

    Background: Professionalism education is one of the major elements of surgical residency education. Aims: To evaluate the studies on core professionalism education programs in surgical professionalism education. Study Design: Systematic review. Methods: This systematic literature review was performed to analyze core professionalism programs for surgical residency education published in English with at least three of the following features: program developmental model/instructional design method, aims and competencies, methods of teaching, methods of assessment, and program evaluation model or method. A total of 27083 articles were retrieved using EBSCOHOST, PubMed, Science Direct, Web of Science, and manual search. Results: Eight articles met the selection criteria. The instructional design method was presented in only one article, which described the Analysis, Design, Development, Implementation, and Evaluation model. Six articles were based on the Accreditation Council for Graduate Medical Education criterion, although there was significant variability in content. The most common teaching method was role modeling with scenario- and case-based learning. A wide range of assessment methods for evaluating professionalism education were reported. The Kirkpatrick model was reported in one article as a method for program evaluation. Conclusion: It is suggested that for a core surgical professionalism education program, the developmental/instructional design model, aims and competencies, content, teaching methods, assessment methods, and program evaluation methods/models should be well defined, and the content should be comparable.

  24. Novel Method to Assess Arterial Insufficiency in Rodent Hindlimb

    PubMed Central

    Ziegler, Matthew A.; DiStasi, Matthew R.; Miller, Steven J.; Dalsing, Michael C.; Unthank, Joseph L.

    2015-01-01

    Background: Lack of techniques to assess maximal blood flow capacity thwarts the use of rodent models of arterial insufficiency to evaluate therapies for intermittent claudication. We evaluated femoral vein outflow (VO) in combination with stimulated muscle contraction as a potential method to assess functional hindlimb arterial reserve and therapeutic efficacy in a rodent model of subcritical limb ischemia. Materials and methods: VO was measured with perivascular flow probes at rest and during stimulated calf muscle contraction in young healthy rats (Wistar Kyoto, WKY; lean Zucker, LZR) and rats with cardiovascular risk factors (Spontaneously Hypertensive, SHR; Obese Zucker, OZR) with acute and/or chronic femoral arterial occlusion. Therapeutic efficacy was assessed by administration of Ramipril or Losartan to SHR after femoral artery excision. Results: VO measurement in WKY demonstrated the utility of this method to assess hindlimb perfusion at rest and during calf muscle contraction. While application to diseased models (OZR, SHR) demonstrated normal resting perfusion compared to contralateral limbs, a significant reduction in reserve capacity was uncovered with muscle stimulation. Administration of Ramipril and Losartan demonstrated significant improvement in functional arterial reserve. Conclusion: The results demonstrate that this novel method to assess distal limb perfusion in small rodents with subcritical limb ischemia is sufficient to unmask perfusion deficits not apparent at rest, detect impaired compensation in diseased animal models with risk factors, and assess therapeutic efficacy. The approach provides a significant advance in methods to investigate potential mechanisms and novel therapies for subcritical limb ischemia in pre-clinical rodent models. PMID:26850199

  25. A Study of Wind Turbine Comprehensive Operational Assessment Model Based on EM-PCA Algorithm

    NASA Astrophysics Data System (ADS)

    Zhou, Minqiang; Xu, Bin; Zhan, Yangyan; Ren, Danyuan; Liu, Dexing

    2018-01-01

    To assess wind turbine performance accurately and provide a theoretical basis for wind farm management, a hybrid assessment model based on the Entropy Method and Principal Component Analysis (EM-PCA) was established, which takes most factors of operational performance into consideration and reaches a comprehensive result. To verify the model, six wind turbines were chosen as the research objects; the ranking obtained by the method proposed in the paper was 4# > 6# > 1# > 5# > 2# > 3#, completely consistent with the theoretical ranking, which indicates that the reliability and effectiveness of the EM-PCA method are high. The method can guide state comparisons among different units and support wind farm operational assessment.
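
    How the entropy weights and the principal components are combined is not spelled out in the abstract, so the following is only a loose sketch of an EM-PCA-style composite score under assumed choices (entropy-weight the standardized indicators, then project onto the first principal component); the indicator data are hypothetical:

```python
# Loose sketch of an EM-PCA-style composite score under assumed choices:
# entropy-weight the standardized indicators, then rank units by their
# projection onto the first principal component. Data are hypothetical,
# and the PC sign is arbitrary (orient it before interpreting rankings).
import numpy as np

def em_pca_scores(X: np.ndarray) -> np.ndarray:
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)          # standardize
    P = (X / X.sum(axis=0)).clip(1e-12)
    E = -(P * np.log(P)).sum(axis=0) / np.log(X.shape[0])
    w = (1 - E) / (1 - E).sum()                        # entropy weights
    Xw = Xs * w                                        # weighted indicators
    vals, vecs = np.linalg.eigh(np.cov(Xw, rowvar=False))
    return Xw @ vecs[:, -1]                            # first-PC projection

X = np.random.default_rng(6).random((6, 4)) + 0.1      # 6 turbines, 4 metrics
print(em_pca_scores(X))
```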

  26. Comparison between two statistically based methods, and two physically based models developed to compute daily mean streamflow at ungaged locations in the Cedar River Basin, Iowa

    USGS Publications Warehouse

    Linhart, S. Mike; Nania, Jon F.; Christiansen, Daniel E.; Hutchinson, Kasey J.; Sanders, Curtis L.; Archfield, Stacey A.

    2013-01-01

    A variety of individuals, from water resource managers to recreational users, need streamflow information for planning and decisionmaking at locations where there are no streamgages. To address this problem, two statistically based methods, the Flow Duration Curve Transfer method and the Flow Anywhere method, were developed for statewide application, whereas the two physically based models, the Precipitation-Runoff Modeling System and the Soil and Water Assessment Tool, were developed only for the Cedar River Basin. Observed and estimated streamflows from the two methods and two models were compared for goodness of fit at 13 streamgages modeled in the Cedar River Basin by using the Nash-Sutcliffe and percent-bias efficiency values. Based on median and mean Nash-Sutcliffe values for the 13 streamgages, the Precipitation-Runoff Modeling System and Soil and Water Assessment Tool models appear to have performed similarly and better than the Flow Duration Curve Transfer and Flow Anywhere methods. Based on median and mean percent-bias values, the Soil and Water Assessment Tool model appears to have generally overestimated daily mean streamflows, whereas the Precipitation-Runoff Modeling System model and the statistical methods appear to have underestimated daily mean streamflows. The Flow Duration Curve Transfer method produced the lowest median and mean percent-bias values and appears to perform better than the other methods and models.

  27. Assessment of methods for methyl iodide emission reduction and pest control using a simulation model

    USDA-ARS?s Scientific Manuscript database

    Various methods have been developed to reduce atmospheric emissions from the agricultural use of highly volatile pesticides and mitigate their adverse environmental effects. The effectiveness of various methods on emissions reduction and pest control was assessed using a simulation model in this study...

  28. Assessment selection in human-automation interaction studies: The Failure-GAM2E and review of assessment methods for highly automated driving.

    PubMed

    Grane, Camilla

    2018-01-01

    Highly automated driving will change drivers' behavioural patterns. Traditional methods used for assessing manual driving will only be applicable to the parts of human-automation interaction where the driver intervenes, such as in hand-over and take-over situations. Therefore, driver behaviour assessment will need to adapt to the new driving scenarios. This paper aims at simplifying the process of selecting appropriate assessment methods. Thirty-five papers were reviewed to examine potential and relevant methods. The review showed that many studies still rely on traditional driving assessment methods. A new method, the Failure-GAM2E model, intended to aid assessment selection when planning a study, is proposed and exemplified in the paper. Failure-GAM2E includes a systematic step-by-step procedure defining the situation, failures (Failure), goals (G), actions (A), subjective methods (M), objective methods (M) and equipment (E). The use of Failure-GAM2E in a study example resulted in a well-reasoned assessment plan, a new way of measuring trust through feet movements and a proposed Optimal Risk Management Model. Failure-GAM2E and the Optimal Risk Management Model are believed to support the planning process for research studies in the field of human-automation interaction. Copyright © 2017 Elsevier Ltd. All rights reserved.

  29. An Automated System for Skeletal Maturity Assessment by Extreme Learning Machines

    PubMed Central

    Mansourvar, Marjan; Shamshirband, Shahaboddin; Raj, Ram Gopal; Gunalan, Roshan; Mazinani, Iman

    2015-01-01

    Assessing skeletal age is a subjective and tedious examination process. Hence, automated assessment methods have been developed to replace manual evaluation in medical applications. In this study, a new fully automated method based on content-based image retrieval and using extreme learning machines (ELM) is designed and adapted to assess skeletal maturity. The main novelty of this approach is that it overcomes the segmentation problem suffered by existing systems. The estimation results of the ELM models are compared with those of genetic programming (GP) and artificial neural network (ANN) models. The experimental results show improved assessment accuracy over GP and ANN, while generalization capability is retained with the ELM approach. Moreover, the results indicate that the ELM model developed can be used confidently in further work on formulating novel models of skeletal age assessment strategies. According to the experimental results, the new method presented has the capacity to learn many hundreds of times faster than traditional learning methods, and it has sufficient overall performance in many aspects. It has conclusively been found that applying ELM is particularly promising as an alternative method for evaluating skeletal age. PMID:26402795
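
    An extreme learning machine itself is compact: hidden-layer weights are random and fixed, and only the output weights are solved, via a least-squares pseudo-inverse. A minimal regression sketch on synthetic data (not the study's image-retrieval features):

```python
# Minimal extreme learning machine (ELM) regressor as described above:
# hidden weights are random and fixed; only the output weights are solved,
# via the Moore-Penrose pseudo-inverse. Data are synthetic.
import numpy as np

class ELMRegressor:
    def __init__(self, n_hidden=50, seed=0):
        self.n_hidden, self.rng = n_hidden, np.random.default_rng(seed)

    def _hidden(self, X):
        return np.tanh(X @ self.W + self.b)

    def fit(self, X, y):
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        self.beta = np.linalg.pinv(self._hidden(X)) @ y   # least squares
        return self

    def predict(self, X):
        return self._hidden(X) @ self.beta

X = np.random.default_rng(1).normal(size=(200, 5))
y = X[:, 0] - 2 * X[:, 1] ** 2
model = ELMRegressor().fit(X, y)
print(np.corrcoef(model.predict(X), y)[0, 1])  # fit quality on training data
```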

  30. [Assessment on the ability of emergency response at the county center for disease control and prevention level in flooding-prone areas].

    PubMed

    Chen, Wei; Zeng, Guang

    2006-02-01

    To establish a comprehensive assessment model of the ability of emergency response within the public health system in flooding-prone areas, analytic hierarchy process theory was used to establish the initial assessment framework. The Delphi method was used to screen and choose the final indicators and their weights before an assessment model was set up under the 'synthetic scored method' to assess the ability of emergency response among twenty county public health units. We then used analysis of variance (ANOVA) to test the feasibility of distinguishing the ability of emergency response among different county health units, and correlation analysis was used to assess the independence of indicators in the assessment model. A comprehensive model was established, including twenty first-class indicators and fifty-six second-class indicators, and the ability of the public health units to respond to flooding emergencies was evaluated. Five public health units had higher, ten moderate and five lower levels of emergency response ability. The assessment model proved to be a good method for differentiating the ability of public health units, using independent indicators. The assessment model we established appears to be practical and reliable.

  31. MODELS AND MODELING METHODS FOR ASSESSING HUMAN EXPOSURE AND DOSE TO TOXIC CHEMICALS AND POLLUTANTS

    EPA Science Inventory

    This project aims to strengthen the general scientific foundation of EPA's exposure and risk assessment, management, and policy processes by developing state-of-the-art exposure to dose mathematical models and solution methods. The results of this research will be to produce a mo...

  32. Improved model quality assessment using ProQ2.

    PubMed

    Ray, Arjun; Lindahl, Erik; Wallner, Björn

    2012-09-10

    Employing methods to assess the quality of modeled protein structures is now standard practice in bioinformatics. In a broad sense, the techniques can be divided into methods relying on consensus prediction on the one hand, and single-model methods on the other. Consensus methods frequently perform very well when there is a clear consensus, but this is not always the case. In particular, they frequently fail in selecting the best possible model in the hard cases (lacking consensus) or in the easy cases where models are very similar. In contrast, single-model methods do not suffer from these drawbacks and could potentially be applied to any protein of interest to assess quality or as a scoring function for sampling-based refinement. Here, we present a new single-model method, ProQ2, based on ideas from its predecessor, ProQ. ProQ2 is a model quality assessment algorithm that uses support vector machines to predict local as well as global quality of protein models. Improved performance is obtained by combining previously used features with updated structural and predicted features. The most important contribution can be attributed to the use of profile weighting of the residue-specific features and the use of features averaged over the whole model, even though the prediction is still local. ProQ2 is significantly better than its predecessors at detecting high quality models, improving the sum of Z-scores for the selected first-ranked models by 20% and 32% compared to the second-best single-model method in CASP8 and CASP9, respectively. The absolute quality assessment of the models at both the local and global level is also improved. The Pearson's correlation between the correct and predicted local score is improved from 0.59 to 0.70 on CASP8 and from 0.62 to 0.68 on CASP9; for the global score against the correct GDT_TS it improves from 0.75 to 0.80 and from 0.77 to 0.80, again compared to the second-best single-model methods in CASP8 and CASP9, respectively. ProQ2 is available at http://proq2.wallnerlab.org.
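
    A stand-in sketch of the SVM-based single-model idea (regress per-residue quality on features, then average for a global score); the features, targets, and scikit-learn SVR below are placeholders, not ProQ2's actual feature set or training pipeline:

```python
# Stand-in sketch of SVM-based single-model quality assessment: regress a
# per-residue quality score on structural/predicted features, then average
# the local predictions for a global score. Features and targets below are
# synthetic placeholders, not ProQ2's feature set.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(2)
features = rng.normal(size=(1000, 8))       # e.g. contacts, SS/RSA agreement
true_local = 1 / (1 + np.exp(-features @ rng.normal(size=8)))  # synthetic

qa = SVR(kernel="rbf").fit(features, true_local)
model_residues = features[:150]             # residues of one candidate model
global_score = qa.predict(model_residues).mean()
print(global_score)
```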

  33. Core Professionalism Education in Surgery: A Systematic Review

    PubMed Central

    Sarıoğlu Büke, Akile; Karabilgin Öztürkçü, Özlem Sürel; Yılmaz, Yusuf; Sayek, İskender

    2018-01-01

    Background: Professionalism education is one of the major elements of surgical residency education. Aims: To evaluate the studies on core professionalism education programs in surgical professionalism education. Study Design: Systematic review. Methods: This systematic literature review was performed to analyze core professionalism programs for surgical residency education published in English with at least three of the following features: program developmental model/instructional design method, aims and competencies, methods of teaching, methods of assessment, and program evaluation model or method. A total of 27083 articles were retrieved using EBSCOHOST, PubMed, Science Direct, Web of Science, and manual search. Results: Eight articles met the selection criteria. The instructional design method was presented in only one article, which described the Analysis, Design, Development, Implementation, and Evaluation model. Six articles were based on the Accreditation Council for Graduate Medical Education criterion, although there was significant variability in content. The most common teaching method was role modeling with scenario- and case-based learning. A wide range of assessment methods for evaluating professionalism education were reported. The Kirkpatrick model was reported in one article as a method for program evaluation. Conclusion: It is suggested that for a core surgical professionalism education program, developmental/instructional design model, aims and competencies, content, teaching methods, assessment methods, and program evaluation methods/models should be well defined, and the content should be comparable. PMID:29553464

  34. Human Exposure Assessment for Air Pollution.

    PubMed

    Han, Bin; Hu, Li-Wen; Bai, Zhipeng

    2017-01-01

    Assessment of human exposure to air pollution is a fundamental part of the more general process of health risk assessment. Measurement methods for exposure assessment now include personal exposure monitoring, indoor-outdoor sampling, mobile monitoring, and exposure assessment modeling (such as proximity models, interpolation models, air dispersion models, and land-use regression (LUR) models). Among these methods, personal exposure measurement is considered to be the most accurate method of pollutant exposure assessment to date, since it can better quantify observed differences and better reflect exposure among smaller groups of people at ground level. Because of great differences in geographical environment, source distribution, pollution characteristics, economic conditions, and living habits, indoor, outdoor, and individual air pollution exposures vary widely between regions of China. In general, the indoor particles in most Chinese homes comprise infiltrated outdoor particles, particles generated indoors, and a few secondary organic aerosol particles, and in most cases outdoor particle pollution concentrations are a major contributor to indoor concentrations in China. Furthermore, since time, energy, and funding are limited, it is difficult to measure the concentration of pollutants for each individual. In recent years, obtaining the concentrations of air pollutants from a variety of exposure assessment models has become a principal method for handling the increasing number of individuals in epidemiology studies.

  35. Protein single-model quality assessment by feature-based probability density functions.

    PubMed

    Cao, Renzhi; Cheng, Jianlin

    2016-04-04

    Protein quality assessment (QA) has played an important role in protein structure prediction. We developed a novel single-model quality assessment method, Qprob. Qprob calculates the absolute error of each protein feature value against the true quality scores (i.e. GDT-TS scores) of protein structural models, and uses them to estimate its probability density distribution for quality assessment. Qprob was blindly tested in the 11th Critical Assessment of Techniques for Protein Structure Prediction (CASP11) as the MULTICOM-NOVEL server. The official CASP result shows that Qprob ranks as one of the top single-model QA methods. In addition, Qprob contributes to our protein tertiary structure predictor MULTICOM, which was officially ranked 3rd out of 143 predictors. This performance shows that Qprob is effective at assessing the quality of models of hard targets. These results demonstrate that this new probability density distribution based method is effective for protein single-model quality assessment and is useful for protein structure prediction. The web server of Qprob is available at: http://calla.rnet.missouri.edu/qprob/. The software is freely available from the Qprob web server.
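
    A loose sketch of the feature-based density idea, simplified to binned expectations rather than full density estimation; this illustrates the flavor of the approach, not the published Qprob algorithm, and all features and targets are synthetic:

```python
# Loose sketch in the spirit of the feature-based approach above: bin
# training models by each feature value, record the mean true GDT-TS per
# bin, and score a new model by averaging the per-feature expectations.
# A simplified illustration, not the published Qprob algorithm.
import numpy as np

def fit_bins(feat, gdt, n_bins=20):
    edges = np.quantile(feat, np.linspace(0, 1, n_bins + 1))
    idx = np.clip(np.searchsorted(edges, feat) - 1, 0, n_bins - 1)
    means = np.array([gdt[idx == b].mean() if (idx == b).any() else gdt.mean()
                      for b in range(n_bins)])
    return edges, means

def score(x, tables):
    est = []
    for (edges, means), xj in zip(tables, x):
        b = int(np.clip(np.searchsorted(edges, xj) - 1, 0, len(means) - 1))
        est.append(means[b])
    return float(np.mean(est))

rng = np.random.default_rng(5)
feats = rng.normal(size=(500, 3))            # hypothetical QA features
gdt = 1 / (1 + np.exp(-feats.sum(axis=1)))   # synthetic true quality
tables = [fit_bins(feats[:, j], gdt) for j in range(3)]
print(score(feats[0], tables), gdt[0])
```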

  36. High-resolution assessment of land use impacts on biodiversity in life cycle assessment using species habitat suitability models.

    PubMed

    de Baan, Laura; Curran, Michael; Rondinini, Carlo; Visconti, Piero; Hellweg, Stefanie; Koellner, Thomas

    2015-02-17

    Agricultural land use is a main driver of global biodiversity loss. The assessment of land use impacts in decision-support tools such as life cycle assessment (LCA) requires spatially explicit models, but existing approaches are either not spatially differentiated or modeled at very coarse scales (e.g., biomes or ecoregions). In this paper, we develop a high-resolution (900 m) assessment method for land use impacts on biodiversity based on habitat suitability models (HSM) of mammal species. This method considers potential land use effects on individual species, and impacts are weighted by the species' conservation status and global rarity. We illustrate the method using a case study of crop production in East Africa, but the underlying HSMs developed by the Global Mammals Assessment are available globally. We calculate impacts of three major export crops and compare the results to two previously developed methods (focusing on local and regional impacts, respectively) to assess the relevance of the methodological innovations proposed in this paper. The results highlight hotspots of product-related biodiversity impacts that help characterize the links among agricultural production, consumption, and biodiversity loss.

  37. Assessing Argumentative Representation with Bayesian Network Models in Debatable Social Issues

    ERIC Educational Resources Information Center

    Zhang, Zhidong; Lu, Jingyan

    2014-01-01

    This study seeks to obtain argumentation models, which represent argumentative processes and an assessment structure in secondary school debatable issues in the social sciences. The argumentation model was developed based on mixed methods, a combination of both theory-driven and data-driven methods. The coding system provided a combining point by…

  38. Large-scale model quality assessment for improving protein tertiary structure prediction.

    PubMed

    Cao, Renzhi; Bhattacharya, Debswapna; Adhikari, Badri; Li, Jilong; Cheng, Jianlin

    2015-06-15

    Sampling structural models and ranking them are the two major challenges of protein structure prediction. Traditional protein structure prediction methods generally use one or a few quality assessment (QA) methods to select the best-predicted models, which cannot consistently select relatively better models and rank a large number of models well. Here, we develop a novel large-scale model QA method in conjunction with model clustering to rank and select protein structural models. It unprecedentedly applied 14 model QA methods to generate consensus model rankings, followed by model refinement based on model combination (i.e. averaging). Our experiment demonstrates that the large-scale model QA approach is more consistent and robust in selecting models of better quality than any individual QA method. Our method was blindly tested during the 11th Critical Assessment of Techniques for Protein Structure Prediction (CASP11) as MULTICOM group. It was officially ranked third out of all 143 human and server predictors according to the total scores of the first models predicted for 78 CASP11 protein domains and second according to the total scores of the best of the five models predicted for these domains. MULTICOM's outstanding performance in the extremely competitive 2014 CASP11 experiment proves that our large-scale QA approach together with model clustering is a promising solution to one of the two major problems in protein structure modeling. The web server is available at: http://sysbio.rnet.missouri.edu/multicom_cluster/human/. © The Author 2015. Published by Oxford University Press.
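
    The consensus core (averaging normalized scores from many QA methods to rank a pool) is easy to sketch; the score table below is a hypothetical stand-in for the study's 14 component methods:

```python
# Sketch of the consensus step described above: average min-max-normalized
# scores from many QA methods to rank a model pool. The score table is a
# hypothetical stand-in for the study's 14 component methods.
import numpy as np

def consensus_rank(scores: np.ndarray) -> np.ndarray:
    """scores: (n_models, n_qa_methods) raw scores, larger = better."""
    lo, hi = scores.min(axis=0), scores.max(axis=0)
    z = (scores - lo) / np.where(hi > lo, hi - lo, 1.0)  # per-method min-max
    consensus = z.mean(axis=1)
    return np.argsort(consensus)[::-1]                   # best model first

scores = np.random.default_rng(3).random((10, 14))       # 10 models, 14 QAs
print(consensus_rank(scores))
```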

  39. Assessing and reducing hydrogeologic model uncertainty

    USDA-ARS?s Scientific Manuscript database

    NRC is sponsoring research that couples model abstraction techniques with model uncertainty assessment methods. Insights and information from this program will be useful in decision making by NRC staff, licensees and stakeholders in their assessment of subsurface radionuclide transport. All analytic...

  40. Alternative Assessment Methods Based on Categorizations, Supporting Technologies, and a Model for Betterment

    ERIC Educational Resources Information Center

    Ben-Jacob, Marion G.; Ben-Jacob, Tyler E.

    2014-01-01

    This paper explores alternative assessment methods from the perspective of categorizations. It addresses the technologies that support assessment. It discusses initial, formative, and summative assessment, as well as objective and subjective assessment, and formal and informal assessment. It approaches each category of assessment from the…

  41. A new assessment method for urbanization environmental impact: urban environment entropy model and its application.

    PubMed

    Ouyang, Tingping; Fu, Shuqing; Zhu, Zhaoyu; Kuang, Yaoqiu; Huang, Ningsheng; Wu, Zhifeng

    2008-11-01

    The thermodynamic law is one of the most widely used scientific principles. The comparability between the environmental impact of urbanization and thermodynamic entropy was systematically analyzed. Consequently, the concept of "Urban Environment Entropy" was brought forward and an "Urban Environment Entropy" model was established for urbanization environmental impact assessment in this study. The model was then utilized in a case study assessing river water quality in the Pearl River Delta Economic Zone. The results indicated that the assessment results of the model are consistent with those of the equalized synthetic pollution index method. Therefore, it can be concluded that the Urban Environment Entropy model has high reliability and can be applied widely in urbanization environmental assessment research using many different environmental parameters.

  2. Comparison of mathematic models for assessment of glomerular filtration rate with electron-beam CT in pigs.

    PubMed

    Daghini, Elena; Juillard, Laurent; Haas, John A; Krier, James D; Romero, Juan C; Lerman, Lilach O

    2007-02-01

    To prospectively compare in pigs three mathematic models for assessment of glomerular filtration rate (GFR) on electron-beam (EB) computed tomographic (CT) images, with concurrent inulin clearance serving as the reference standard. This study was approved by the institutional animal care and use committee. Inulin clearance was measured in nine pigs (18 kidneys) and compared with single-kidney GFR assessed from renal time-attenuation curves (TACs) obtained with EB CT before and after infusion of the vasodilator acetylcholine. CT-derived GFR was calculated with the original and modified Patlak methods and with previously validated extended gamma variate modeling of first-pass cortical TACs. Statistical analysis was performed to assess correlation between CT methods and inulin clearance for estimation of GFR with least-squares regression analysis and Bland-Altman graphical representation. Comparisons within groups were performed with a paired t test. GFR assessed with the original Patlak method indicated poor correlation with inulin clearance, whereas GFR assessed with the modified Patlak method (P < .001, r = 0.75) and with gamma variate modeling (P < .001, r = 0.79) correlated significantly with inulin clearance and indicated an increase in response to acetylcholine. CT-derived estimates of GFR can be significantly improved by modifications in image analysis methods (eg, use of a cortical region of interest). (c) RSNA, 2007.
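    As a toy illustration of the Patlak approach mentioned above (not the authors' image-analysis pipeline), the sketch below recovers the uptake slope K from synthetic time-attenuation curves generated from the Patlak model itself; K scales with single-kidney GFR. All values are hypothetical.

    ```python
    import numpy as np

    def cumtrapz0(y, t):
        """Cumulative trapezoidal integral with a leading zero."""
        return np.concatenate(([0.0], np.cumsum(np.diff(t) * 0.5 * (y[1:] + y[:-1]))))

    def patlak_slope(t, c_tissue, c_blood):
        """Patlak model: c_tissue/c_blood = K * int(c_blood)/c_blood + V0."""
        x = cumtrapz0(c_blood, t) / c_blood
        y = c_tissue / c_blood
        K, V0 = np.polyfit(x, y, 1)   # slope K, intercept V0
        return K, V0

    # Synthetic TACs generated from the model itself (K = 0.05, V0 = 0.3).
    t = np.linspace(1.0, 40.0, 20)            # seconds after contrast arrival
    c_blood = 100.0 * np.exp(-t / 30.0)       # aortic TAC (illustrative)
    c_tissue = 0.05 * cumtrapz0(c_blood, t) + 0.3 * c_blood
    print(patlak_slope(t, c_tissue, c_blood))  # ~ (0.05, 0.3)
    ```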

  3. Tooth-size discrepancy: A comparison between manual and digital methods

    PubMed Central

    Correia, Gabriele Dória Cabral; Habib, Fernando Antonio Lima; Vogel, Carlos Jorge

    2014-01-01

    Introduction Technological advances in Dentistry have emerged primarily in the area of diagnostic tools. One example is the 3D scanner, which can transform plaster models into three-dimensional digital models. Objective This study aimed to assess the reliability of tooth size-arch length discrepancy analysis measurements performed on three-dimensional digital models, and compare these measurements with those obtained from plaster models. Material and Methods To this end, plaster models of lower dental arches and their corresponding three-dimensional digital models acquired with a 3Shape R700T scanner were used. All of them had lower permanent dentition. Four different tooth size-arch length discrepancy calculations were performed on each model, two by manual methods using calipers and brass wire, and two by digital methods using linear measurements and parabolas. Results Data were statistically assessed using the Friedman test; no statistically significant differences were found between the methods (P > 0.05), with only the linear digital method showing a slight, statistically non-significant deviation. Conclusions Based on the results, it is reasonable to assert that any of these resources used by orthodontists to clinically assess tooth size-arch length discrepancy can be considered reliable. PMID:25279529
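    The Friedman test used in this study is available in SciPy; the sketch below shows the comparison structure, assuming repeated measurements of the same models by the four methods. The numbers are invented for illustration only.

    ```python
    from scipy.stats import friedmanchisquare

    # Hypothetical discrepancy values (mm) for the same 6 models measured by
    # four methods: caliper, brass wire, digital linear, digital parabola.
    caliper = [1.2, -0.5, 0.8, 2.1, -1.0, 0.3]
    wire    = [1.1, -0.6, 0.9, 2.0, -1.2, 0.4]
    dig_lin = [1.5, -0.2, 1.1, 2.4, -0.7, 0.6]
    dig_par = [1.2, -0.4, 0.8, 2.2, -1.0, 0.3]

    stat, p = friedmanchisquare(caliper, wire, dig_lin, dig_par)
    print(f"Friedman chi-square = {stat:.2f}, p = {p:.3f}")  # p > 0.05: no method effect
    ```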

  4. An analytical framework for estimating aquatic species density from environmental DNA

    USGS Publications Warehouse

    Chambert, Thierry; Pilliod, David S.; Goldberg, Caren S.; Doi, Hideyuki; Takahara, Teruhiko

    2018-01-01

    Environmental DNA (eDNA) analysis of water samples is on the brink of becoming a standard monitoring method for aquatic species. This method has improved detection rates over conventional survey methods and thus has demonstrated effectiveness for estimation of site occupancy and species distribution. The frontier of eDNA applications, however, is to infer species density. Building upon previous studies, we present and assess a modeling approach that aims at inferring animal density from eDNA. The modeling combines eDNA and animal count data from a subset of sites to estimate species density (and associated uncertainties) at other sites where only eDNA data are available. As a proof of concept, we first perform a cross-validation study using experimental data on carp in mesocosms. In these data, fish densities are known without error, which allows us to test the performance of the method with known data. We then evaluate the model using field data from a study on a stream salamander species to assess the potential of this method to work in natural settings, where density can never be known with absolute certainty. Two alternative distributions (Normal and Negative Binomial) to model variability in eDNA concentration data are assessed. Assessment based on the proof of concept data (carp) revealed that the Negative Binomial model provided much more accurate estimates than the model based on a Normal distribution, likely because eDNA data tend to be overdispersed. Greater imprecision was found when we applied the method to the field data, but the Negative Binomial model still provided useful density estimates. We call for further model development in this direction, as well as further research targeted at sampling design optimization. It will be important to assess these approaches on a broad range of study systems.
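    A minimal sketch of the calibration idea described above, assuming a negative binomial regression of eDNA counts on log density at sites where density is known, then inverting the fitted curve at a new site. The data values and the exact link function are assumptions; the authors' model is more elaborate (hierarchical, with propagated uncertainty).

    ```python
    import numpy as np
    import statsmodels.api as sm

    # Hypothetical eDNA copy counts at sites with known animal densities.
    density = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])   # animals / m^2
    copies  = np.array([3, 9, 14, 35, 60, 150])           # observed counts

    # NB2 regression: E[copies] = exp(b0 + b1 * log(density)).
    X = sm.add_constant(np.log(density))
    fit = sm.NegativeBinomial(copies, X).fit(disp=0)
    b0, b1 = fit.params[:2]

    # Invert the fitted curve to estimate density at a new eDNA-only site.
    new_count = 25.0
    print(np.exp((np.log(new_count) - b0) / b1))
    ```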

  5. Assessing the Assessment Methods: Climate Change and Hydrologic Impacts

    NASA Astrophysics Data System (ADS)

    Brekke, L. D.; Clark, M. P.; Gutmann, E. D.; Mizukami, N.; Mendoza, P. A.; Rasmussen, R.; Ikeda, K.; Pruitt, T.; Arnold, J. R.; Rajagopalan, B.

    2014-12-01

    The Bureau of Reclamation, the U.S. Army Corps of Engineers, and other water management agencies have an interest in developing reliable, science-based methods for incorporating climate change information into longer-term water resources planning. Such assessments must quantify projections of future climate and hydrology, typically relying on some form of spatial downscaling and bias correction to produce watershed-scale weather information that subsequently drives hydrology and other water resource management analyses (e.g., water demands, water quality, and environmental habitat). Water agencies continue to face challenging method decisions in these endeavors: (1) which downscaling method should be applied, and at what resolution; (2) which observational dataset should be used to drive downscaling and hydrologic analysis; (3) which hydrologic model(s) should be used, and how should these models be configured and calibrated? There is a critical need to understand the ramifications of these method decisions, as they affect the signal and uncertainties produced by climate change assessments and, thus, adaptation planning. This presentation summarizes results from a three-year effort to identify strengths and weaknesses of widely applied methods for downscaling climate projections and assessing hydrologic conditions. Methods were evaluated from two perspectives: historical fidelity, and tendency to modulate a global climate model's climate change signal. On downscaling, four methods were applied at multiple resolutions: statistically, using Bias Correction Spatial Disaggregation, Bias Correction Constructed Analogs, and Asynchronous Regression; and dynamically, using the Weather Research and Forecasting model. Downscaling results were then used to drive hydrologic analyses over the contiguous U.S. using multiple models (VIC, CLM, PRMS), with added focus placed on case study basins within the Colorado Headwaters. The presentation will identify which types of climate changes are expressed robustly across methods versus those that are sensitive to method choice; which method choices seem relatively more important; and where strategic investments in research and development can substantially improve the guidance on climate change provided to water managers.
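    Statistical downscaling methods such as BCSD rest on a quantile-mapping bias-correction step; the sketch below shows an empirical version under simplified assumptions (stationary bias, no spatial disaggregation), using synthetic gamma-distributed precipitation in place of real GCM output.

    ```python
    import numpy as np

    def quantile_map(model_hist, obs_hist, model_fut):
        """Empirical quantile mapping: each future model value is replaced by
        the observed value at the quantile it occupies in the historical
        model distribution."""
        q = np.searchsorted(np.sort(model_hist), model_fut) / len(model_hist)
        q = np.clip(q, 0.0, 1.0)
        return np.quantile(obs_hist, q)

    rng = np.random.default_rng(1)
    obs = rng.gamma(2.0, 3.0, 1000)   # observed daily precipitation (synthetic)
    gcm = rng.gamma(2.0, 4.0, 1000)   # biased model output, historical period
    fut = rng.gamma(2.0, 4.5, 1000)   # model output, future period
    print(quantile_map(gcm, obs, fut).mean())
    ```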

  6. WORKSHOP ON APPLICATION OF STATISTICAL METHODS TO BIOLOGICALLY-BASED PHARMACOKINETIC MODELING FOR RISK ASSESSMENT

    EPA Science Inventory

    Biologically-based pharmacokinetic models are being increasingly used in the risk assessment of environmental chemicals. These models are based on biological, mathematical, statistical and engineering principles. Their potential uses in risk assessment include extrapolation betwe...

  7. Comparison of methods used to estimate conventional undiscovered petroleum resources: World examples

    USGS Publications Warehouse

    Ahlbrandt, T.S.; Klett, T.R.

    2005-01-01

    Various methods for assessing undiscovered oil, natural gas, and natural gas liquid resources were compared in support of the USGS World Petroleum Assessment 2000. Discovery process, linear fractal, parabolic fractal, engineering estimates, PETRIMES, Delphi, and the USGS 2000 methods were compared. Three comparisons of these methods were made in: (1) the Neuquen Basin province, Argentina (different assessors, same input data); (2) provinces in North Africa, Oman, and Yemen (same assessors, different methods); and (3) the Arabian Peninsula, Arabian (Persian) Gulf, and North Sea (different assessors, different methods). A fourth comparison (same assessors, same assessment methods, but different geologic models), between results from structural and stratigraphic assessment units in the North Sea, used only the USGS 2000 method, and hence compared the type of assessment unit rather than the method. In comparing methods, differences arise from inherent differences in assumptions regarding: (1) the underlying distribution of the parent field population (all fields, discovered and undiscovered); (2) the population of fields being estimated, that is, the entire parent distribution or the undiscovered resource distribution; (3) inclusion or exclusion of large outlier fields; (4) inclusion or exclusion of field (reserve) growth; (5) deterministic or probabilistic models; (6) data requirements; and (7) scale and time frame of the assessment. Discovery process, Delphi subjective consensus, and the USGS 2000 method yield comparable results because similar procedures are employed. In mature areas such as the Neuquen Basin province in Argentina, the linear and parabolic fractal and engineering methods were conservative compared to the other five methods and relative to new reserve additions there since 1995. The PETRIMES method gave the most optimistic estimates in the Neuquen Basin. In less mature areas, the linear fractal method yielded larger estimates relative to other methods. A geologically based model, such as one using the total petroleum system approach, is preferred because it combines the elements of petroleum source, reservoir, trap, and seal, and the tectono-stratigraphic history of basin evolution, with petroleum resource potential. Care must be taken to demonstrate that homogeneous populations in terms of geology, geologic risk, exploration, and discovery processes are used in the assessment process. The USGS 2000 method (7th Approximation Model, EMC computational program) is robust; that is, it can be used in both mature and immature areas, and provides comparable results when using different geologic models (e.g. stratigraphic or structural) with differing numbers of subdivisions (assessment units) within the total petroleum system. © 2005 International Association for Mathematical Geology.

  8. Combining exposure and effect modeling into an integrated probabilistic environmental risk assessment for nanoparticles.

    PubMed

    Jacobs, Rianne; Meesters, Johannes A J; Ter Braak, Cajo J F; van de Meent, Dik; van der Voet, Hilko

    2016-12-01

    There is a growing need for good environmental risk assessment of engineered nanoparticles (ENPs). Environmental risk assessment of ENPs has been hampered by lack of data and knowledge about ENPs, their environmental fate, and their toxicity, which leads to uncertainty in the risk assessment. To deal with this uncertainty effectively, probabilistic methods are advantageous. In the present study, the authors developed a method to model both the variability and the uncertainty in environmental risk assessment of ENPs. The method is based on the concentration ratio, that is, the ratio of the exposure concentration to the critical effect concentration, with both concentrations treated as random. Variability and uncertainty are modeled separately, so as to allow the user to see which part of the total variation in the concentration ratio is attributable to uncertainty and which part to variability. The authors illustrate the use of the method with a simplified aquatic risk assessment of nano-titanium dioxide. The method allows a more transparent risk assessment and can also direct further environmental and toxicological research to the areas in which it is most needed. Environ Toxicol Chem 2016;35:2958-2967. © 2016 The Authors. Environmental Toxicology and Chemistry published by Wiley Periodicals, Inc. on behalf of SETAC.
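    Separating variability from uncertainty as described above is commonly implemented as a two-dimensional (nested) Monte Carlo; the sketch below is one plausible arrangement, not the authors' exact model, and all parameter values are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n_unc, n_var = 200, 1000   # outer loop: uncertainty; inner loop: variability

    # Outer loop: sample uncertain distribution parameters (illustrative values).
    mu_exp = rng.normal(np.log(10.0), 0.3, n_unc)   # uncertain mean log-exposure
    mu_eff = rng.normal(np.log(50.0), 0.4, n_unc)   # uncertain mean log-effect conc.

    risks = np.empty(n_unc)
    for i in range(n_unc):
        # Inner loop: variability across individuals / environments.
        exposure = rng.lognormal(mu_exp[i], 0.5, n_var)
        effect   = rng.lognormal(mu_eff[i], 0.5, n_var)
        risks[i] = np.mean(exposure / effect > 1.0)  # P(concentration ratio > 1)

    # The spread of `risks` reflects uncertainty; each element integrates variability.
    print(np.percentile(risks, [5, 50, 95]))
    ```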

  9. Study on the Application of the Kent Index Method on the Risk Assessment of Disastrous Accidents in Subway Engineering

    PubMed Central

    Lu, Hao; Wang, Mingyang; Yang, Baohuai; Rong, Xiaoli

    2013-01-01

    With the development of subway engineering, and given the uncertain factors and serious accidents involved in subway construction, implementing risk assessment is necessary and may bring a number of benefits for construction safety. The Kent index method, used extensively in pipeline construction, is improved here to make risk assessment much more practical for disastrous accidents in subway engineering. In the improved method, the indexes are divided into four categories: basic, design, construction, and consequence indexes. In this study, a risk assessment model containing these four kinds of indexes is provided, and three kinds of risk occurrence modes are listed. The probability index model, which accounts for the interdependence of the indexes, is established according to the risk occurrence modes. The model provides the risk assessment process through the fault tree method and has been applied in the risk assessment of Nanjing subway's river-crossing tunnel construction. Based on the assessment results, the builders were informed of which risks should be noted and what they should do to avoid them. The need for further research is discussed. Overall, this method may provide a tool for builders and improve construction safety. PMID:23710136

  10. Pitfalls in statistical landslide susceptibility modelling

    NASA Astrophysics Data System (ADS)

    Schröder, Boris; Vorpahl, Peter; Märker, Michael; Elsenbeer, Helmut

    2010-05-01

    The use of statistical methods is a well-established approach to predict landslide occurrence probabilities and to assess landslide susceptibility. This is achieved by applying statistical methods relating historical landslide inventories to topographic indices as predictor variables. In our contribution, we compare several new and powerful methods developed in machine learning and well-established in landscape ecology and macroecology for predicting the distribution of shallow landslides in tropical mountain rainforests in southern Ecuador (among others: boosted regression trees, multivariate adaptive regression splines, maximum entropy). Although these methods are powerful, we think it is necessary to follow a basic set of guidelines to avoid some pitfalls regarding data sampling, predictor selection, and model quality assessment, especially if a comparison of different models is contemplated. We therefore suggest applying a novel toolbox to evaluate approaches to the statistical modelling of landslide susceptibility. Additionally, we propose some methods to open the "black box" inherent in machine learning methods, in order to achieve further explanatory insights into the preparatory factors that control landslides. Sampling of training data should be guided by hypotheses regarding the processes that lead to slope failure, taking into account their respective spatial scales. This approach leads to the selection of a set of candidate predictor variables considered at adequate spatial scales. This set should be checked for multicollinearity in order to facilitate the interpretation of model response curves. Model quality assessment evaluates how well a model is able to reproduce independent observations of its response variable. This includes criteria to evaluate different aspects of model performance, i.e. model discrimination, model calibration, and model refinement. In order to assess a possible violation of the assumption of independence in the training samples, or a possible lack of explanatory information in the chosen set of predictor variables, the model residuals need to be checked for spatial autocorrelation; therefore, we calculate spline correlograms. In addition, we investigate partial dependence plots and bivariate interaction plots, considering possible interactions between predictors, to improve model interpretation. Aiming at presenting this toolbox for model quality assessment, we investigate the influence of strategies for the construction of training datasets on the quality of statistical models.
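    As a compact illustration of two of the recommended practices (a boosted-regression-tree susceptibility model and inspection of partial dependence), the sketch below uses synthetic topographic predictors; the predictor names, coefficients, and data are invented, and the AUC shown is apparent (computed on the training data).

    ```python
    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.inspection import partial_dependence
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(7)
    n = 500
    X = np.column_stack([rng.uniform(0, 45, n),    # slope angle (deg), hypothetical
                         rng.uniform(0, 360, n),   # aspect (deg)
                         rng.uniform(0, 30, n)])   # topographic wetness index
    p = 1 / (1 + np.exp(-(0.1 * X[:, 0] + 0.15 * X[:, 2] - 4)))
    y = rng.random(n) < p                          # synthetic landslide inventory

    brt = GradientBoostingClassifier().fit(X, y)
    print("apparent AUC:", roc_auc_score(y, brt.predict_proba(X)[:, 1]))

    # "Opening the black box": partial dependence of risk on slope angle.
    pd_slope = partial_dependence(brt, X, [0])
    print(pd_slope["average"][0][:5])
    ```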

  11. Assessing Interval Estimation Methods for Hill Model ...

    EPA Pesticide Factsheets

    The Hill model of concentration-response is ubiquitous in toxicology, perhaps because its parameters directly relate to biologically significant metrics of toxicity such as efficacy and potency. Point estimates of these parameters obtained through least squares regression or maximum likelihood are commonly used in high-throughput risk assessment, but such estimates typically fail to include reliable information concerning confidence in (or precision of) the estimates. To address this issue, we examined methods for assessing uncertainty in Hill model parameter estimates derived from concentration-response data. In particular, using a sample of ToxCast concentration-response data sets, we applied four methods for obtaining interval estimates that are based on asymptotic theory, bootstrapping (two varieties), and Bayesian parameter estimation, and then compared the results. These interval estimation methods generally did not agree, so we devised a simulation study to assess their relative performance. We generated simulated data by constructing four statistical error models capable of producing concentration-response data sets comparable to those observed in ToxCast. We then applied the four interval estimation methods to the simulated data and compared the actual coverage of the interval estimates to the nominal coverage (e.g., 95%) in order to quantify performance of each of the methods in a variety of cases (i.e., different values of the true Hill model parameters).
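    Of the interval-estimation approaches examined, the bootstrap variety is easy to sketch: fit the Hill model by least squares, then refit on resampled data to obtain percentile intervals. The data and starting values below are illustrative only, not ToxCast values.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def hill(c, top, ec50, n):
        """Hill concentration-response: efficacy `top`, potency `ec50`, slope `n`."""
        return top * c**n / (ec50**n + c**n)

    conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0])
    resp = hill(conc, 100.0, 5.0, 1.2) + np.random.default_rng(3).normal(0, 3, conc.size)

    popt, _ = curve_fit(hill, conc, resp, p0=[100.0, 5.0, 1.0], maxfev=10000)

    # Nonparametric bootstrap: refit on resampled (conc, resp) pairs.
    rng = np.random.default_rng(4)
    boot = []
    for _ in range(500):
        idx = rng.integers(0, conc.size, conc.size)
        try:
            b, _ = curve_fit(hill, conc[idx], resp[idx], p0=popt, maxfev=10000)
            boot.append(b)
        except RuntimeError:   # skip resamples where the fit fails
            continue
    lo, hi = np.percentile(boot, [2.5, 97.5], axis=0)
    print(f"EC50 95% CI: ({lo[1]:.2f}, {hi[1]:.2f})")
    ```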

  12. Improving the use of crop models for risk assessment and climate change adaptation.

    PubMed

    Challinor, Andrew J; Müller, Christoph; Asseng, Senthold; Deva, Chetan; Nicklin, Kathryn Jane; Wallach, Daniel; Vanuytrecht, Eline; Whitfield, Stephen; Ramirez-Villegas, Julian; Koehler, Ann-Kristin

    2018-01-01

    Crop models are used for an increasingly broad range of applications, with a commensurate proliferation of methods. Careful framing of research questions and development of targeted and appropriate methods are therefore increasingly important. In conjunction with the other authors in this special issue, we have developed a set of criteria for the use of crop models in assessments of impacts, adaptation and risk. Our analysis drew on the other papers in this special issue, and on our experience in the UK Climate Change Risk Assessment 2017 and the MACSUR, AgMIP and ISIMIP projects. The criteria were used to assess how improvements could be made to the framing of climate change risks, and to outline the good practice and new developments that are needed to improve risk assessment. Key areas of good practice include: i. the development, running and documentation of crop models, with attention given to issues of spatial scale and complexity; ii. the methods used to form crop-climate ensembles, which can be based on model skill and/or spread; iii. the methods used to assess adaptation, which need broadening to account for technological development and to reflect the full range of options available. The analysis highlights the limitations of focussing only on projections of future impacts and adaptation options using pre-determined time slices. Whilst this long-standing approach may remain an essential component of risk assessments, we identify three further key components: 1. Working with stakeholders to identify the timing of risks: what are the key vulnerabilities of food systems, and what does crop-climate modelling tell us about when those systems are at risk? 2. Use of multiple methods that critically assess the use of climate model output and avoid any presumption that analyses should begin and end with gridded output. 3. Increasing transparency and inter-comparability in risk assessments. Whilst studies frequently produce ranges that quantify uncertainty, the assumptions underlying these ranges are not always clear. We suggest that the contingency of results upon assumptions is made explicit via a common uncertainty reporting format; and/or that studies are assessed against a set of criteria, such as those presented in this paper.

  13. Bridging the etiologic and prognostic outlooks in individualized assessment of absolute risk of an illness: application in lung cancer.

    PubMed

    Karp, Igor; Sylvestre, Marie-Pierre; Abrahamowicz, Michal; Leffondré, Karen; Siemiatycki, Jack

    2016-11-01

    Assessment of individual risk of illness is an important activity in preventive medicine. Development of risk-assessment models has heretofore relied predominantly on studies involving follow-up of cohort-type populations, while case-control studies have generally been considered unfit for this purpose. To present a method for individualized assessment of absolute risk of an illness (as illustrated by lung cancer) based on data from a 'non-nested' case-control study. We used data from a case-control study conducted in Montreal, Canada in 1996-2001. Individuals diagnosed with lung cancer (n = 920) and age- and sex-matched lung-cancer-free subjects (n = 1288) completed questionnaires documenting life-time cigarette-smoking history and occupational, medical, and family history. Unweighted and weighted logistic models were fitted. Model overfitting was assessed using bootstrap-based cross-validation and 'shrinkage.' The discriminating ability was assessed by the c-statistic, and the risk-stratifying performance was assessed by examination of the variability in risk estimates over hypothetical risk-profiles. In the logistic models, the logarithm of incidence-density of lung cancer was expressed as a function of age, sex, cigarette-smoking history, history of respiratory conditions and exposure to occupational carcinogens, and family history of lung cancer. The models entailed a minimal degree of overfitting ('shrinkage' factor: 0.97 for both unweighted and weighted models) and moderately high discriminating ability (c-statistic: 0.82 for the unweighted model and 0.66 for the weighted model). The method's risk-stratifying performance was quite high. The presented method allows for individualized assessment of risk of lung cancer and can be used for development of risk-assessment models for other illnesses.

  14. An Assessment of Iterative Reconstruction Methods for Sparse Ultrasound Imaging

    PubMed Central

    Valente, Solivan A.; Zibetti, Marcelo V. W.; Pipa, Daniel R.; Maia, Joaquim M.; Schneider, Fabio K.

    2017-01-01

    Ultrasonic image reconstruction using inverse problems has recently appeared as an alternative to enhance ultrasound imaging over beamforming methods. This approach depends on the accuracy of the acquisition model used to represent transducers, reflectivity, and medium physics. Iterative methods, well known in general sparse signal reconstruction, are also suited for imaging. In this paper, a discrete acquisition model is assessed by solving a linear system of equations by an ℓ1-regularized least-squares minimization, where the solution sparsity may be adjusted as desired. The paper surveys 11 variants of four well-known algorithms for sparse reconstruction, and assesses their optimization parameters with the goal of finding the best approach for iterative ultrasound imaging. The strategy for the model evaluation consists of using two distinct datasets. We first generate data from a synthetic phantom that mimics real targets inside a professional ultrasound phantom device. This dataset is contaminated with Gaussian noise of an estimated SNR, and all methods are assessed by their resulting images and performance. The model and methods are then assessed with real data collected by a research ultrasound platform when scanning the same phantom device, and the results are compared with beamforming. A distinct real dataset is finally used to further validate the proposed modeling. Although iterative methods require high computational effort, results show that the discrete model may lead to images closer to ground truth than traditional beamforming. However, the computing capabilities of current platforms need to evolve before the frame rates currently delivered by ultrasound equipment become achievable. PMID:28282862
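    One standard iterative scheme for the ℓ1-regularized least-squares problem named above is ISTA (iterative shrinkage-thresholding); the sketch below applies it to a random acquisition matrix as a stand-in for the paper's discrete ultrasound model, with a synthetic sparse reflectivity vector.

    ```python
    import numpy as np

    def ista(A, y, lam, n_iter=200):
        """ISTA for min (1/2)||Ax - y||^2 + lam * ||x||_1."""
        L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
        x = np.zeros(A.shape[1])
        for _ in range(n_iter):
            g = x - (A.T @ (A @ x - y)) / L    # gradient step
            x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft threshold
        return x

    rng = np.random.default_rng(5)
    A = rng.normal(size=(64, 256))             # acquisition model (sketch only)
    x_true = np.zeros(256)
    x_true[[10, 80, 200]] = [1.0, -0.7, 0.5]   # sparse reflectivity
    y = A @ x_true + 0.01 * rng.normal(size=64)
    x_hat = ista(A, y, lam=0.05)
    print(np.flatnonzero(np.abs(x_hat) > 0.1))  # recovered support
    ```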

  15. DEVELOPMENT AND REVIEW OF MONITORING METHODS AND RISK ASSESSMENT MODELS USED TO DETERMINE THE EFFECTS OF BIOSOLIDS LAND APPLICATION ON HUMAN HEALTH AND THE ENVIRONMENT

    EPA Science Inventory

    Development and Review of monitoring methods and risk assessment models for biosolids land application impacts on air and land

    Ronald F Herrmann (NRMRL), Mike Broder (NCEA), and Mike Ware (NERL)

    Science Questions:

    MYP Science Question: What additional model...

  16. Geographic and temporal validity of prediction models: Different approaches were useful to examine model performance

    PubMed Central

    Austin, Peter C.; van Klaveren, David; Vergouwe, Yvonne; Nieboer, Daan; Lee, Douglas S.; Steyerberg, Ewout W.

    2017-01-01

    Objective Validation of clinical prediction models traditionally refers to the assessment of model performance in new patients. We studied different approaches to geographic and temporal validation in the setting of multicenter data from two time periods. Study Design and Setting We illustrated different analytic methods for validation using a sample of 14,857 patients hospitalized with heart failure at 90 hospitals in two distinct time periods. Bootstrap resampling was used to assess internal validity. Meta-analytic methods were used to assess geographic transportability. Each hospital was used once as a validation sample, with the remaining hospitals used for model derivation. Hospital-specific estimates of discrimination (c-statistic) and calibration (calibration intercepts and slopes) were pooled using random effects meta-analysis methods. I2 statistics and prediction interval width quantified geographic transportability. Temporal transportability was assessed using patients from the earlier period for model derivation and patients from the later period for model validation. Results Estimates of reproducibility, pooled hospital-specific performance, and temporal transportability were on average very similar, with c-statistics of 0.75. Between-hospital variation was moderate according to I2 statistics and prediction intervals for c-statistics. Conclusion This study illustrates how performance of prediction models can be assessed in settings with multicenter data at different time periods. PMID:27262237
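    A minimal sketch of the hospital-level validation scheme described above: each hospital serves once as the validation sample, and the spread of hospital-specific c-statistics indicates geographic transportability. The data are synthetic; a full analysis would pool the estimates with random-effects meta-analysis, as the authors do.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(6)
    n, n_hosp = 3000, 10
    hospital = rng.integers(0, n_hosp, n)
    X = rng.normal(size=(n, 4))                        # synthetic predictors
    lin = X @ np.array([0.8, -0.5, 0.3, 0.0])
    lin += 0.2 * rng.normal(size=n_hosp)[hospital]     # hospital-specific effects
    y = (rng.random(n) < 1 / (1 + np.exp(-lin))).astype(int)

    c_stats = []
    for h in range(n_hosp):            # each hospital used once for validation
        train, test = hospital != h, hospital == h
        model = LogisticRegression().fit(X[train], y[train])
        c_stats.append(roc_auc_score(y[test], model.predict_proba(X[test])[:, 1]))

    # Mean and between-hospital spread of the c-statistics.
    print(np.mean(c_stats), np.std(c_stats, ddof=1))
    ```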

  17. Mine safety assessment using gray relational analysis and bow tie model

    PubMed Central

    2018-01-01

    Mine safety assessment is a precondition for ensuring orderly and safe production. The main purpose of this study was to prevent mine accidents more effectively by proposing a composite risk analysis model. First, the weights of the assessment indicators were determined by a revised integrated weight method, in which the objective weights were determined by a variation coefficient method and the subjective weights by the Delphi method; a new formula was then adopted to calculate the integrated weights from the subjective and objective weights. Second, after the assessment indicator weights were determined, gray relational analysis was used to evaluate the safety of mine enterprises: mine enterprise safety was ranked according to the gray relational degree, and weak links in mine safety practices were identified on the basis of the analysis. Third, to validate the revised integrated weight method adopted in the gray relational analysis, the fuzzy evaluation method was applied to the safety assessment of the mine enterprises. Fourth, for the first time, the bow tie model was adopted to identify the causes and consequences of the weak links, allowing corresponding safety measures to be taken to guarantee the mine's safe production. A case study of mine safety assessment is presented to demonstrate the effectiveness and rationality of the proposed composite risk analysis model, which can be applied to other related industries for safety evaluation. PMID:29561875
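    The gray relational analysis step can be sketched as follows: min-max normalize the indicators, measure each mine's gap to the ideal reference series, and combine the relational coefficients using the integrated indicator weights. The data, weights, and the resolution coefficient rho = 0.5 are illustrative assumptions, not values from the study.

    ```python
    import numpy as np

    def gray_relational_degree(data, weights, rho=0.5):
        """Gray relational analysis against the ideal (best) reference series.

        data: (n_mines, n_indicators), oriented so larger = safer.
        weights: indicator weights summing to 1 (e.g., from an integrated
        subjective/objective weighting scheme).
        """
        norm = (data - data.min(0)) / (data.max(0) - data.min(0))  # min-max scale
        delta = np.abs(norm - norm.max(0))                         # gap to reference
        coef = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
        return coef @ weights                                      # relational degree

    data = np.array([[0.8, 0.6, 0.9],
                     [0.5, 0.7, 0.4],
                     [0.9, 0.8, 0.7]])
    w = np.array([0.5, 0.3, 0.2])
    print(gray_relational_degree(data, w))  # higher = closer to ideal safety
    ```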

  18. Prioritization of in silico models and molecular descriptors for the assessment of ready biodegradability.

    PubMed

    Fernández, Alberto; Rallo, Robert; Giralt, Francesc

    2015-10-01

    Ready biodegradability is a key property for evaluating the long-term effects of chemicals on the environment and human health. As such, it is used as a screening test for the assessment of persistent, bioaccumulative and toxic substances. Regulators encourage the use of non-testing methods, such as in silico models, to save money and time. A dataset of 757 chemicals was collected to assess the performance of four freely available in silico models that predict ready biodegradability. They were applied to develop a new consensus method that prioritizes the use of each individual model according to its performance on chemical subsets driven by the presence or absence of different molecular descriptors. This consensus method was capable of almost eliminating unpredictable chemicals, while the performance of combined models was substantially improved with respect to that of the individual models. Copyright © 2015 Elsevier Inc. All rights reserved.

  19. Shifting attention from objective risk factors to patients' self-assessed health resources: a clinical model for general practice.

    PubMed

    Hollnagel, H; Malterud, K

    1995-12-01

    The study was designed to present and apply theoretical and empirical knowledge for the construction of a clinical model intended to shift the attention of the general practitioner from objective risk factors to self-assessed health resources in male and female patients. Selected theoretical models about personal health resources were reviewed, discussed, and analyzed, assessing existing theories according to their emphasis concerning self-assessed vs. doctor-assessed health resources, specific health resources vs. life and coping in general, abstract vs. clinically applicable theory, and whether a gender perspective was explicitly included. Relevant theoretical models of health and coping (salutogenesis, coping and social support, control/demand, locus of control, the health belief model, quality of life) and the perspective of the underprivileged Other (critical theory, feminist standpoint theory, the patient-centred clinical method) were presented and assessed. Components from Antonovsky's salutogenetic perspective and McWhinney's patient-centred clinical method, supported by gender perspectives, were integrated into the clinical model presented here. General practitioners are recommended to shift their attention from objective risk factors to self-assessed health resources by means of this clinical model. The relevance and feasibility of the model should be explored in empirical research.

  20. Multisite-multivariable sensitivity analysis of distributed watershed models: enhancing the perceptions from computationally frugal methods

    USDA-ARS?s Scientific Manuscript database

    This paper assesses the impact of different likelihood functions in identifying sensitive parameters of the highly parameterized, spatially distributed Soil and Water Assessment Tool (SWAT) watershed model for multiple variables at multiple sites. The global one-factor-at-a-time (OAT) method of Morr...

  1. 3D Modelling and Printing Technology to Produce Patient-Specific 3D Models.

    PubMed

    Birbara, Nicolette S; Otton, James M; Pather, Nalini

    2017-11-10

    A comprehensive knowledge of mitral valve (MV) anatomy is crucial in the assessment of MV disease. While the use of three-dimensional (3D) modelling and printing in MV assessment has undergone early clinical evaluation, the precision and usefulness of this technology requires further investigation. This study aimed to assess and validate 3D modelling and printing technology to produce patient-specific 3D MV models. A prototype method for MV 3D modelling and printing was developed from computed tomography (CT) scans of a plastinated human heart. Mitral valve models were printed using four 3D printing methods and validated to assess precision. Cardiac CT and 3D echocardiography imaging data of four MV disease patients was used to produce patient-specific 3D printed models, and 40 cardiac health professionals (CHPs) were surveyed on the perceived value and potential uses of 3D models in a clinical setting. The prototype method demonstrated submillimetre precision for all four 3D printing methods used, and statistical analysis showed a significant difference (p<0.05) in precision between these methods. Patient-specific 3D printed models, particularly using multiple print materials, were considered useful by CHPs for preoperative planning, as well as other applications such as teaching and training. This study suggests that, with further advances in 3D modelling and printing technology, patient-specific 3D MV models could serve as a useful clinical tool. The findings also highlight the potential of this technology to be applied in a variety of medical areas within both clinical and educational settings. Copyright © 2017 Australian and New Zealand Society of Cardiac and Thoracic Surgeons (ANZSCTS) and the Cardiac Society of Australia and New Zealand (CSANZ). Published by Elsevier B.V. All rights reserved.

  2. Using the Monte Carlo method for assessing the tissue and organ doses of patients in dental radiography

    NASA Astrophysics Data System (ADS)

    Makarevich, K. O.; Minenko, V. F.; Verenich, K. A.; Kuten, S. A.

    2016-05-01

    This work is dedicated to modeling dental radiographic examinations to assess patients' absorbed organ doses and effective doses. X-ray spectra are simulated with the TASMIP empirical model. Doses are assessed with the Monte Carlo method, using the MCNP code and the ICRP voxel phantoms. The results of the assessment of doses to individual organs and of effective doses for different types of dental examination and different X-ray tube characteristics are presented.

  3. Waste-to-energy: A review of life cycle assessment and its extension methods.

    PubMed

    Zhou, Zhaozhi; Tang, Yuanjun; Chi, Yong; Ni, Mingjiang; Buekens, Alfons

    2018-01-01

    This article proposes a comprehensive review of evaluation tools based on life cycle thinking, as applied to waste-to-energy. Typically, life cycle assessment is adopted to assess the environmental burdens associated with waste-to-energy initiatives. Based on this framework, several extension methods have been developed to focus on specific aspects: exergetic life cycle assessment for reducing resource depletion, life cycle costing for evaluating the economic burden, and social life cycle assessment for recording social impacts. Additionally, the environment-energy-economy model integrates the life cycle assessment and life cycle costing methods and judges these three features simultaneously for sustainable waste-to-energy conversion. Life cycle assessment of waste-to-energy is well developed, with concrete data inventories and sensitivity analyses, although data and model uncertainty are unavoidable. Compared with life cycle assessment, only a few evaluations of waste-to-energy techniques have been conducted using the extension methods, whose methodology and application need to be developed further. Finally, this article succinctly summarises some recommendations for further research.

  4. Model Diagnostics for Bayesian Networks

    ERIC Educational Resources Information Center

    Sinharay, Sandip

    2006-01-01

    Bayesian networks are frequently used in educational assessments, primarily for learning about students' knowledge and skills. However, there is little published work on assessing the fit of Bayesian networks. This article employs the posterior predictive model checking method, a popular Bayesian model checking tool, to assess the fit of simple Bayesian networks. A…
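    A posterior predictive check compares an observed test statistic with its distribution under replicated data drawn from the posterior; the sketch below shows the mechanic on a deliberately simple Beta-Bernoulli model rather than a Bayesian network, with invented data.

    ```python
    import numpy as np

    rng = np.random.default_rng(8)
    y = rng.binomial(1, 0.7, size=50)              # observed binary responses

    # Posterior for a Bernoulli rate with a Beta(1, 1) prior is Beta(s+1, n-s+1).
    s, n = y.sum(), y.size
    theta = rng.beta(s + 1, n - s + 1, size=2000)  # posterior draws

    # Replicate the test statistic (number of successes) under the posterior.
    y_rep = rng.binomial(n, theta)
    ppp = np.mean(y_rep >= s)                      # posterior predictive p-value
    print(f"posterior predictive p = {ppp:.2f}")   # values near 0 or 1 signal misfit
    ```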

  5. Portfolios: An Alternative Method of Student and Program Assessment

    PubMed Central

    Hannam, Susan E.

    1995-01-01

    The use of performance-based evaluation and alternative assessment techniques has become essential for curriculum programs seeking Commission on Accreditation of Allied Health Education Programs (CAAHEP) accreditation. In athletic training education, few models exist for assessing student performance over the entire course of an educational program. This article describes a model of assessment: a student athletic training portfolio of "best works." The portfolio can serve as a method to assess both student development and program effectiveness. The goals of the program include purposes specific to the five NATA performance domains. In addition, four types of portfolio evidence are described: artifacts, attestations, productions, and reproductions. Quality assignments and projects completed by students as they progress through a six-semester program are identified relative to the type of evidence and the domain(s) they represent. The portfolio assists with student development, provides feedback for curriculum planning, allows for student/faculty collaboration and "coaching" of the student, and assists with job searching. This information will serve as a useful model for those athletic training programs looking for an alternative method of assessing student and program outcomes. PMID:16558359

  6. A method based on IHS cylindrical transform model for quality assessment of image fusion

    NASA Astrophysics Data System (ADS)

    Zhu, Xiaokun; Jia, Yonghong

    2005-10-01

    Image fusion techniques have been widely applied to remote sensing image analysis and processing, and methods for the quality assessment of image fusion in remote sensing have become research issues at home and abroad. Traditional assessment methods combine the calculation of quantitative indexes with visual interpretation to compare fused images quantitatively and qualitatively. However, the existing assessment methods have two defects: on the one hand, most indexes lack the theoretical support needed to compare different fusion methods; on the other hand, there is no uniform preference among most of the quantitative assessment indexes when they are applied to estimate fusion effects. That is, spatial resolution and spectral features cannot be analyzed simultaneously by these indexes, and there is no general method that unifies the assessment of spatial and spectral features. So in this paper, on the basis of an approximate general model of four traditional fusion methods, including Intensity Hue Saturation (IHS) triangle transform fusion, High Pass Filter (HPF) fusion, Principal Component Analysis (PCA) fusion, and Wavelet Transform (WT) fusion, a correlation coefficient assessment method based on the IHS cylindrical transform is proposed. Experiments show that this method can not only evaluate spatial and spectral features under a uniform preference, but can also compare fusion image sources with fused images and reveal differences among fusion methods. Compared with traditional assessment methods, the new method is more intuitive and accords better with subjective estimation.

  7. [The methods of assessment of health risk from exposure to radon and radon daughters].

    PubMed

    Demin, V F; Zhukovskiy, M V; Kiselev, S M

    2014-01-01

    A critical analysis of existing dose-effect relationship (RDE) models for the effects of radon exposure on human health has been performed, and it is concluded that improving these models is both necessary and possible. A new, improved version of the RDE has been developed. A technique for assessing the human health risk of exposure to radon is described, including a method for estimating exposure doses from radon, the improved RDE model, and the risk assessment methodology proper. The methodology is proposed for use in the territory of Russia.

  8. Generating Multiple Imputations for Matrix Sampling Data Analyzed with Item Response Models.

    ERIC Educational Resources Information Center

    Thomas, Neal; Gan, Nianci

    1997-01-01

    Describes and assesses missing data methods currently used to analyze data from matrix sampling designs implemented by the National Assessment of Educational Progress. Several improved methods are developed, and these models are evaluated using an EM algorithm to obtain maximum likelihood estimates followed by multiple imputation of complete data…

  9. Uncertainty estimates of purity measurements based on current information: toward a "live validation" of purity methods.

    PubMed

    Apostol, Izydor; Kelner, Drew; Jiang, Xinzhao Grace; Huang, Gang; Wypych, Jette; Zhang, Xin; Gastwirt, Jessica; Chen, Kenneth; Fodor, Szilan; Hapuarachchi, Suminda; Meriage, Dave; Ye, Frank; Poppe, Leszek; Szpankowski, Wojciech

    2012-12-01

    To predict precision and other performance characteristics of chromatographic purity methods, which represent the most widely used form of analysis in the biopharmaceutical industry. We have conducted a comprehensive survey of purity methods, and show that all performance characteristics fall within narrow measurement ranges. This observation was used to develop a model called Uncertainty Based on Current Information (UBCI), which expresses these performance characteristics as a function of the signal and noise levels, hardware specifications, and software settings. We applied the UBCI model to assess the uncertainty of purity measurements, and compared the results to those from conventional qualification. We demonstrated that the UBCI model is suitable for dynamically assessing method performance characteristics based on information extracted from individual chromatograms. The model provides an opportunity for streamlining qualification and validation studies by implementing a "live validation" of test results, utilizing UBCI as a concurrent assessment of measurement uncertainty. Therefore, UBCI can potentially mitigate the challenges associated with laborious conventional method validation and facilitate the introduction of more advanced analytical technologies during the method lifecycle.

  10. Prioritization of in silico models and molecular descriptors for the assessment of ready biodegradability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fernández, Alberto; Rallo, Robert; Giralt, Francesc

    2015-10-15

    Ready biodegradability is a key property for evaluating the long-term effects of chemicals on the environment and human health. As such, it is used as a screening test for the assessment of persistent, bioaccumulative and toxic substances. Regulators encourage the use of non-testing methods, such as in silico models, to save money and time. A dataset of 757 chemicals was collected to assess the performance of four freely available in silico models that predict ready biodegradability. They were applied to develop a new consensus method that prioritizes the use of each individual model according to its performance on chemical subsets driven by the presence or absence of different molecular descriptors. This consensus method was capable of almost eliminating unpredictable chemicals, while the performance of combined models was substantially improved with respect to that of the individual models. Highlights: • Consensus method to predict ready biodegradability by prioritizing multiple QSARs. • Consensus reduced the amount of unpredictable chemicals to less than 2%. • Performance increased with the number of QSAR models considered. • The absence of 2D atom pairs contributed significantly to the consensus model.

  11. Uncertainty in Agricultural Impact Assessment

    NASA Technical Reports Server (NTRS)

    Wallach, Daniel; Mearns, Linda O.; Rivington, Michael; Antle, John M.; Ruane, Alexander C.

    2014-01-01

    This chapter considers issues concerning uncertainty associated with modeling and its use within agricultural impact assessments. Information about uncertainty is important for those who develop assessment methods, since that information indicates the need for, and the possibility of, improvement of the methods and databases. Such information also allows one to compare alternative methods. Information about the sources of uncertainties is an aid in prioritizing further work on the impact assessment method. Uncertainty information is also necessary for those who apply assessment methods, e.g., for projecting climate change impacts on agricultural production and for stakeholders who want to use the results as part of a decision-making process (e.g., for adaptation planning). For them, uncertainty information indicates the degree of confidence they can place in the simulated results. Quantification of uncertainty also provides stakeholders with an important guideline for making decisions that are robust across the known uncertainties. Thus, uncertainty information is important for any decision based on impact assessment. Ultimately, we are interested in knowledge about uncertainty so that information can be used to achieve positive outcomes from agricultural modeling and impact assessment.

  12. Assessing the Application of a Geographic Presence-Only Model for Land Suitability Mapping

    PubMed Central

    Heumann, Benjamin W.; Walsh, Stephen J.; McDaniel, Phillip M.

    2011-01-01

    Recent advances in ecological modeling have focused on novel methods for characterizing the environment that use presence-only data and machine-learning algorithms to predict the likelihood of species occurrence. These novel methods may have great potential for land suitability applications in the developing world, where detailed land cover information is often unavailable or incomplete. This paper assesses the adaptation and application of the presence-only geographic species distribution model, MaxEnt, for agricultural crop suitability mapping in rural Thailand, where lowland paddy rice and upland field crops predominate. To assess this modeling approach, three independent crop presence datasets were used: a social-demographic survey of farm households, a remote sensing classification of land use/land cover, and ground control points used for geodetic and thematic reference, which vary in their geographic distribution and sample size. Disparate environmental data were integrated to characterize environmental settings across Nang Rong District, a region of approximately 1,300 sq. km in size. Results indicate that the MaxEnt model is capable of modeling crop suitability for upland and lowland crops, including rice varieties, although model results varied between datasets due to the high sensitivity of the model to the distribution of observed crop locations in geographic and environmental space. Accuracy assessments indicate that model outcomes were influenced by the sample size and the distribution of sample points in geographic and environmental space. The need for further research into accuracy assessments of presence-only models lacking true absence data is discussed. We conclude that the MaxEnt model can provide good estimates of crop suitability, but many areas need to be carefully scrutinized, including the geographic distribution of input data and assessment methods, to ensure realistic modeling results. PMID:21860606
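    Presence-only modeling of the MaxEnt type can be approximated by contrasting presence points with background samples; the sketch below uses logistic regression as a simple stand-in for MaxEnt, with invented environmental covariates, to show the structure of such a workflow.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(9)

    # Presence-only workflow: contrast environmental conditions at crop
    # presence points with a random sample of "background" locations.
    presence_env   = rng.normal([1.0, 0.5], 0.5, size=(200, 2))   # e.g. elevation, wetness
    background_env = rng.normal([0.0, 0.0], 1.0, size=(2000, 2))  # study-area sample

    X = np.vstack([presence_env, background_env])
    y = np.r_[np.ones(200), np.zeros(2000)]

    clf = LogisticRegression().fit(X, y)
    suitability = clf.predict_proba(background_env)[:, 1]  # relative suitability
    print(suitability[:5])
    ```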

  13. Assessing and reporting uncertainties in dietary exposure analysis - Part II: Application of the uncertainty template to a practical example of exposure assessment.

    PubMed

    Tennant, David; Bánáti, Diána; Kennedy, Marc; König, Jürgen; O'Mahony, Cian; Kettler, Susanne

    2017-11-01

    A previous publication described methods for assessing and reporting uncertainty in dietary exposure assessments. This follow-up publication uses a case study to develop proposals for representing and communicating uncertainty to risk managers. The food ingredient aspartame is used as the case study in a simple deterministic model (the EFSA FAIM template) and with more sophisticated probabilistic exposure assessment software (FACET). Parameter and model uncertainties are identified for each modelling approach and tabulated. The relative importance of each source of uncertainty is then evaluated using a semi-quantitative scale, and the results are expressed using two different forms of graphical summary. The value of this approach in expressing uncertainties in a manner that is relevant to the exposure assessment and useful to risk managers is then discussed. It was observed that the majority of uncertainties are often associated with data sources rather than the model itself. However, differences in modelling methods can have the greatest impact on uncertainties overall, particularly when the underlying data are the same. It was concluded that improved methods for communicating uncertainties for risk management is the research area where the greatest amount of effort should be placed in future. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  14. Statistical assessment of the learning curves of health technologies.

    PubMed

    Ramsay, C R; Grant, A M; Wallace, S A; Garthwaite, P H; Monk, A F; Russell, I T

    2001-01-01

    (1) To describe systematically studies that directly assessed the learning curve effect of health technologies. (2) Systematically to identify 'novel' statistical techniques applied to learning curve data in other fields, such as psychology and manufacturing. (3) To test these statistical techniques in data sets from studies of varying designs to assess health technologies in which learning curve effects are known to exist. METHODS - STUDY SELECTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): For a study to be included, it had to include a formal analysis of the learning curve of a health technology using a graphical, tabular or statistical technique. METHODS - STUDY SELECTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): For a study to be included, it had to include a formal assessment of a learning curve using a statistical technique that had not been identified in the previous search. METHODS - DATA SOURCES: Six clinical and 16 non-clinical biomedical databases were searched. A limited amount of handsearching and scanning of reference lists was also undertaken. METHODS - DATA EXTRACTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): A number of study characteristics were abstracted from the papers such as study design, study size, number of operators and the statistical method used. METHODS - DATA EXTRACTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): The new statistical techniques identified were categorised into four subgroups of increasing complexity: exploratory data analysis; simple series data analysis; complex data structure analysis, generic techniques. METHODS - TESTING OF STATISTICAL METHODS: Some of the statistical methods identified in the systematic searches for single (simple) operator series data and for multiple (complex) operator series data were illustrated and explored using three data sets. The first was a case series of 190 consecutive laparoscopic fundoplication procedures performed by a single surgeon; the second was a case series of consecutive laparoscopic cholecystectomy procedures performed by ten surgeons; the third was randomised trial data derived from the laparoscopic procedure arm of a multicentre trial of groin hernia repair, supplemented by data from non-randomised operations performed during the trial. RESULTS - HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW: Of 4571 abstracts identified, 272 (6%) were later included in the study after review of the full paper. Some 51% of studies assessed a surgical minimal access technique and 95% were case series. The statistical method used most often (60%) was splitting the data into consecutive parts (such as halves or thirds), with only 14% attempting a more formal statistical analysis. The reporting of the studies was poor, with 31% giving no details of data collection methods. RESULTS - NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH: Of 9431 abstracts assessed, 115 (1%) were deemed appropriate for further investigation and, of these, 18 were included in the study. All of the methods for complex data sets were identified in the non-clinical literature. These were discriminant analysis, two-stage estimation of learning rates, generalised estimating equations, multilevel models, latent curve models, time series models and stochastic parameter models. In addition, eight new shapes of learning curves were identified. RESULTS - TESTING OF STATISTICAL METHODS: No one particular shape of learning curve performed significantly better than another. The performance of 'operation time' as a proxy for learning differed between the three procedures. Multilevel modelling using the laparoscopic cholecystectomy data demonstrated and measured surgeon-specific and confounding effects. The inclusion of non-randomised cases, despite the possible limitations of the method, enhanced the interpretation of learning effects. CONCLUSIONS - HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW: The statistical methods used for assessing learning effects in health technology assessment have been crude and the reporting of studies poor. CONCLUSIONS - NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH: A number of statistical methods for assessing learning effects were identified that had not hitherto been used in health technology assessment. There was a hierarchy of methods for the identification and measurement of learning, and the more sophisticated methods for both have had little if any use in health technology assessment. This demonstrated the value of considering fields outside clinical research when addressing methodological issues in health technology assessment. CONCLUSIONS - TESTING OF STATISTICAL METHODS: It has been demonstrated that the portfolio of techniques identified can enhance investigations of learning curve effects. (ABSTRACT TRUNCATED)
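    Among the methods tested, multilevel (mixed-effects) modeling is straightforward to sketch: operation time is regressed on case number with a surgeon-specific random intercept, separating the average learning rate from between-surgeon differences. The data below are simulated, not the study's case series.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(10)
    rows = []
    for surgeon in range(10):              # ten surgeons, echoing the case series
        base = rng.normal(90, 10)          # surgeon-specific starting time (min)
        for case in range(1, 51):
            t = base - 15 * np.log(case) + rng.normal(0, 8)  # log-shaped learning
            rows.append({"surgeon": surgeon, "logcase": np.log(case), "optime": t})
    df = pd.DataFrame(rows)

    # Random intercept per surgeon; the fixed effect of log(case number)
    # estimates the average learning rate across surgeons.
    m = smf.mixedlm("optime ~ logcase", df, groups=df["surgeon"]).fit()
    print(m.summary())
    ```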

  15. Item Response Theory for Peer Assessment

    ERIC Educational Resources Information Center

    Uto, Masaki; Ueno, Maomi

    2016-01-01

    As an assessment method based on a constructivist approach, peer assessment has become popular in recent years. However, in peer assessment, a problem remains that reliability depends on the rater characteristics. For this reason, some item response models that incorporate rater parameters have been proposed. Those models are expected to improve…

  16. Data-Driven Risk Assessment from Small Scale Epidemics: Estimation and Model Choice for Spatio-Temporal Data with Application to a Classical Swine Fever Outbreak

    PubMed Central

    Gamado, Kokouvi; Marion, Glenn; Porphyre, Thibaud

    2017-01-01

    Livestock epidemics have the potential to give rise to significant economic, welfare, and social costs. Incursions of emerging and re-emerging pathogens may lead to small and repeated outbreaks. Analysis of the resulting data is statistically challenging but can inform disease preparedness reducing potential future losses. We present a framework for spatial risk assessment of disease incursions based on data from small localized historic outbreaks. We focus on between-farm spread of livestock pathogens and illustrate our methods by application to data on the small outbreak of Classical Swine Fever (CSF) that occurred in 2000 in East Anglia, UK. We apply models based on continuous time semi-Markov processes, using data-augmentation Markov Chain Monte Carlo techniques within a Bayesian framework to infer disease dynamics and detection from incompletely observed outbreaks. The spatial transmission kernel describing pathogen spread between farms, and the distribution of times between infection and detection, is estimated alongside unobserved exposure times. Our results demonstrate inference is reliable even for relatively small outbreaks when the data-generating model is known. However, associated risk assessments depend strongly on the form of the fitted transmission kernel. Therefore, for real applications, methods are needed to select the most appropriate model in light of the data. We assess standard Deviance Information Criteria (DIC) model selection tools and recently introduced latent residual methods of model assessment, in selecting the functional form of the spatial transmission kernel. These methods are applied to the CSF data, and tested in simulated scenarios which represent field data, but assume the data generation mechanism is known. Analysis of simulated scenarios shows that latent residual methods enable reliable selection of the transmission kernel even for small outbreaks whereas the DIC is less reliable. Moreover, compared with DIC, model choice based on latent residual assessment correlated better with predicted risk. PMID:28293559

  17. Assessing the accuracy and stability of variable selection methods for random forest modeling in ecology

    EPA Science Inventory

    Random forest (RF) modeling has emerged as an important statistical learning method in ecology due to its exceptional predictive performance. However, for large and complex ecological datasets there is limited guidance on variable selection methods for RF modeling. Typically, e...

  18. Gait Analysis Methods for Rodent Models of Osteoarthritis

    PubMed Central

    Jacobs, Brittany Y.; Kloefkorn, Heidi E.; Allen, Kyle D.

    2014-01-01

    Patients with osteoarthritis (OA) primarily seek treatment due to pain and disability, yet the primary endpoints for rodent OA models tend to be histological measures of joint destruction. The discrepancy between clinical and preclinical evaluations is problematic, given that radiographic evidence of OA in humans does not always correlate to the severity of patient-reported symptoms. Recent advances in behavioral analyses have provided new methods to evaluate disease sequelae in rodents. Of particular relevance to rodent OA models are methods to assess rodent gait. While obvious differences exist between quadrupedal and bipedal gait sequences, the gait abnormalities seen in humans and in rodent OA models reflect similar compensatory behaviors that protect an injured limb from loading. The purpose of this review is to describe these compensations and current methods used to assess rodent gait characteristics, while detailing important considerations for the selection of gait analysis methods in rodent OA models. PMID:25160712

  19. A novel no-reference objective stereoscopic video quality assessment method based on visual saliency analysis

    NASA Astrophysics Data System (ADS)

    Yang, Xinyan; Zhao, Wei; Ye, Long; Zhang, Qin

    2017-07-01

    This paper proposes a no-reference objective stereoscopic video quality assessment method, motivated by the goal of bringing objective results closer to subjective judgments. We argue that image regions of differing visual saliency should not receive the same weights in an assessment metric. Therefore, we first apply the GBVS algorithm to each frame pair and separate both the left and right viewing images into regions of strong, general, and weak saliency. Local feature information such as blockiness, zero-crossing, and depth is then extracted and combined in a mathematical model to calculate a quality assessment score, with regions of different saliency assigned different weights in the model. Experimental results demonstrate the superiority of our method compared with existing state-of-the-art no-reference objective stereoscopic video quality assessment methods.

  20. Safety assessment for In-service Pressure Bending Pipe Containing Incomplete Penetration Defects

    NASA Astrophysics Data System (ADS)

    Wang, M.; Tang, P.; Xia, J. F.; Ling, Z. W.; Cai, G. Y.

    2017-12-01

    Incomplete penetration is a common defect in the welded joints of pressure pipes. However, the safety classification of pressure pipes containing incomplete penetration defects under current periodic inspection regulations is rather conservative. To reduce unnecessary repairs of incomplete penetration defects, a scientific and applicable safety assessment method for pressure pipes is needed. In this paper, a stress analysis model of the pipe system was established for an in-service pressure bending pipe containing incomplete penetration defects. A local finite element model was set up to analyze the stress distribution at the defect location and to perform stress linearization. The applicability of two assessment methods, the simplified assessment and the U-factor assessment method, to incomplete penetration defects located in pressure bending pipes was then analyzed. The results can provide technical support for the safety assessment of complex pipelines in the future.

  1. Assessment of optional sediment transport functions via the complex watershed simulation model SWAT

    USDA-ARS?s Scientific Manuscript database

    The Soil and Water Assessment Tool 2012 (SWAT2012) offers four sediment routing methods as optional alternatives to the default simplified Bagnold method. Previous studies compared only one of these alternative sediment routing methods with the default method. The proposed study evaluated the impac...

  2. Tracer kinetics of forearm endothelial function: comparison of an empirical method and a quantitative modeling technique.

    PubMed

    Zhao, Xueli; Arsenault, Andre; Lavoie, Kim L; Meloche, Bernard; Bacon, Simon L

    2007-01-01

    Forearm Endothelial Function (FEF) is a marker that has been shown to discriminate patients with cardiovascular disease (CVD). FEF has been assessed using several parameters: the Rate of Uptake Ratio (RUR), the Elbow-to-Wrist Uptake Ratio (EWUR), and the Elbow-to-Wrist Relative Uptake Ratio (EWRUR). However, more robust models of FEF are required. The present study was designed to compare an empirical method with a quantitative modeling technique to better estimate the physiological parameters and understand the complex dynamic processes. The fitted time-activity curves of the forearms, estimating blood and muscle components, were assessed using both an empirical method and a two-compartment model. Although correlational analyses suggested a good correlation between the methods for RUR (r=.90) and EWUR (r=.79), but not EWRUR (r=.34), Bland-Altman plots found poor agreement between the methods for all three parameters. These results indicate that there is a large discrepancy between the empirical and computational methods for FEF. Further work is needed to establish the physiological and mathematical validity of the two modeling approaches.
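
    The paper's exact compartmental formulation is not reproduced in the record above; as a generic illustration, a two-compartment model with a bi-exponential impulse response can be fitted to a time-activity curve by nonlinear least squares (all names and values below are synthetic):

        import numpy as np
        from scipy.optimize import curve_fit

        def two_compartment(t, a1, k1, a2, k2):
            # Bi-exponential response, a common two-compartment form
            return a1 * np.exp(-k1 * t) + a2 * np.exp(-k2 * t)

        t = np.linspace(0.0, 120.0, 60)                    # time points (synthetic)
        activity = two_compartment(t, 800, 0.08, 300, 0.01) \
            + np.random.default_rng(0).normal(0.0, 10.0, t.size)

        params, cov = curve_fit(two_compartment, t, activity,
                                p0=[500, 0.1, 200, 0.01])
        a1, k1, a2, k2 = params                            # estimated amplitudes and rates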

  3. Assessment of quantitative structure-activity relationship of toxicity prediction models for Korean chemical substance control legislation

    PubMed Central

    Kim, Kwang-Yon; Shin, Seong Eun; No, Kyoung Tai

    2015-01-01

    Objectives For successful adoption of legislation controlling the registration and assessment of chemical substances, it is important to obtain sufficient toxicological experimental evidence and other related information. It is also essential to obtain a sufficient number of predicted risk and toxicity results. In particular, methods for predicting the toxicities of chemical substances during the acquisition of required data ultimately become an economical means of dealing with new substances in the future. Although the need for such methods is gradually increasing, the required information about their reliability and applicability domain has not been systematically provided. Methods There are various representative environmental and human toxicity models based on quantitative structure-activity relationships (QSAR). We secured 10 representative QSAR-based prediction models, together with their accompanying information, that can make predictions about substances expected to be regulated. We used the models to predict and confirm the usability of the information expected to be collected and submitted according to the legislation. After collecting and evaluating each predictive model and its relevant data, we prepared methods for quantifying scientific validity and reliability, which are essential conditions for using predictive models. Results We calculated predicted values for the models. Furthermore, we deduced and compared the adequacies of the models using the Alternative Non-testing Method Assessed for Registration, Evaluation, Authorization, and Restriction of Chemical Substances scoring system, and deduced the applicability domains for each model. Additionally, we calculated and compared inclusion rates of substances expected to be regulated, to confirm applicability. Conclusions We evaluated and compared the data, adequacy, and applicability of our selected QSAR-based toxicity prediction models, and included them in a database. Based on these data, we aimed to construct a system that can be used with predicted toxicity results. Furthermore, by presenting the suitability of individual predicted results, we aimed to provide a foundation that could be used in actual assessments and regulations. PMID:26206368

  4. Quality assessment of protein model-structures using evolutionary conservation.

    PubMed

    Kalman, Matan; Ben-Tal, Nir

    2010-05-15

    Programs that evaluate the quality of a protein structural model are important both for validating the structure determination procedure and for guiding the model-building process. Such programs are based on properties of native structures that are generally not expected for faulty models. One such property, which is rarely used for automatic structure quality assessment, is the tendency for conserved residues to be located at the structural core and for variable residues to be located at the surface. We present ConQuass, a novel quality assessment program based on the consistency between the model structure and the protein's conservation pattern. We show that it can identify problematic structural models, and that the scores it assigns to the server models in CASP8 correlate with the similarity of the models to the native structure. We also show that when the conservation information is reliable, the method's performance is comparable and complementary to that of the other single-structure quality assessment methods that participated in CASP8 and that do not use additional structural information from homologs. A perl implementation of the method, as well as the various perl and R scripts used for the analysis, is available at http://bental.tau.ac.il/ConQuass/. Contact: nirb@tauex.tau.ac.il. Supplementary data are available at Bioinformatics online.

  5. Assessment of Differential Item Functioning under Cognitive Diagnosis Models: The DINA Model Example

    ERIC Educational Resources Information Center

    Li, Xiaomin; Wang, Wen-Chung

    2015-01-01

    The assessment of differential item functioning (DIF) is routinely conducted to ensure test fairness and validity. Although many DIF assessment methods have been developed in the context of classical test theory and item response theory, they are not applicable for cognitive diagnosis models (CDMs), as the underlying latent attributes of CDMs are…

  6. A dose assessment method for arbitrary geometries with virtual reality in the nuclear facilities decommissioning

    NASA Astrophysics Data System (ADS)

    Chao, Nan; Liu, Yong-kuo; Xia, Hong; Ayodeji, Abiodun; Bai, Lu

    2018-03-01

    During the decommissioning of nuclear facilities, a large number of cutting and demolition activities are performed, which results in frequent structural changes and produces many irregular objects. In order to assess dose rates during the cutting and demolition process, a flexible dose assessment method for arbitrary geometries and radiation sources was proposed based on virtual reality technology and the Point-Kernel method. The initial geometry is designed with three-dimensional computer-aided design tools. An approximate model is built automatically during geometric modeling via three procedures, namely space division, rough modeling of the body, and fine modeling of the surface, in combination with the collision detection of virtual reality technology. Point kernels are then generated by sampling within the approximate model and, once the material and radiometric attributes are specified, dose rates can be calculated with the Point-Kernel method. To account for radiation scattering effects, buildup factors are calculated with the Geometric-Progression fitting formula. The effectiveness and accuracy of the proposed method were verified by simulations using different geometries, and the dose rate results were compared with those derived from the CIDEC code, the MCNP code, and experimental measurements.
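
    The Point-Kernel step described above amounts to summing, over the sampled kernels, each kernel's attenuated inverse-square contribution scaled by a buildup factor. A minimal sketch (not the authors' implementation; the buildup function, e.g. a Geometric-Progression fit, is left as an input):

        import numpy as np

        def dose_rate(detector, kernels, strengths, mu, buildup=lambda mfp: 1.0):
            """Point-kernel dose rate at one detector position.

            detector: (3,) position; kernels: (N, 3) sampled kernel positions;
            strengths: (N,) source strength apportioned to each kernel;
            mu: linear attenuation coefficient of the medium;
            buildup: buildup factor vs. mean free paths (default ignores scatter).
            """
            r = np.linalg.norm(kernels - detector, axis=1)  # kernel-detector distances
            mfp = mu * r                                    # optical thickness
            flux = strengths * buildup(mfp) * np.exp(-mfp) / (4.0 * np.pi * r**2)
            return flux.sum()  # multiply by a flux-to-dose conversion factor for dose units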

  7. Modeling an internal gear pump

    NASA Astrophysics Data System (ADS)

    Chen, Zongbin; Xu, Rongwu; He, Lin; Liao, Jian

    2018-05-01

    Considering the nature and characteristics of construction waste piles, this paper analyzed the factors affecting the stability of the slopes of construction waste piles and established a system of assessment indexes for the slope failure risks of construction waste piles. Based on the basic principles and methods of fuzzy mathematics, the factor set and the remark set were established. The membership grades of continuous factor indexes were determined using the "ridge row distribution" function, while those for the discrete factor indexes were determined by the Delphi method. For the factor weights, the subjective weights were determined by the Analytic Hierarchy Process (AHP) and the objective weights by the entropy weight method, and a distance function was introduced to determine the combination coefficient. This paper established a fuzzy comprehensive assessment model of the slope failure risks of construction waste piles and assessed pile slopes in the two dimensions of hazard and vulnerability. The root mean square of the hazard assessment result and the vulnerability assessment result was taken as the final assessment result. The paper then used a particular construction waste pile slope as an example for analysis, assessed the risks of the four stages of a landfill, verified the assessment model, and analyzed the slope's failure risks and preventive measures against sliding.
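
    A minimal numeric sketch of the fuzzy comprehensive assessment described above, assuming a simple linear combination of the AHP and entropy weights (the paper derives the combination coefficient from a distance function; all values below are illustrative):

        import numpy as np

        # Membership matrix R: rows = factors, columns = risk grades
        R = np.array([[0.1, 0.3, 0.4, 0.2],
                      [0.0, 0.2, 0.5, 0.3],
                      [0.3, 0.4, 0.2, 0.1]])

        w_ahp = np.array([0.5, 0.3, 0.2])      # subjective weights (AHP)
        w_entropy = np.array([0.4, 0.4, 0.2])  # objective weights (entropy method)
        alpha = 0.6                            # combination coefficient

        w = alpha * w_ahp + (1.0 - alpha) * w_entropy
        grade_memberships = w @ R              # weighted-average fuzzy operator
        risk_grade = grade_memberships.argmax()  # maximum-membership principle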

  8. A Review of Methods Applied by the U.S. Geological Survey in the Assessment of Identified Geothermal Resources

    USGS Publications Warehouse

    Williams, Colin F.; Reed, Marshall J.; Mariner, Robert H.

    2008-01-01

    The U.S. Geological Survey (USGS) is conducting an updated assessment of geothermal resources in the United States. The primary method applied in assessments of identified geothermal systems by the USGS and other organizations is the volume method, in which the recoverable heat is estimated from the thermal energy available in a reservoir. An important focus in the assessment project is on the development of geothermal resource models consistent with the production histories and observed characteristics of exploited geothermal fields. The new assessment will incorporate some changes in the models for temperature and depth ranges for electric power production, preferred chemical geothermometers for estimates of reservoir temperatures, estimates of reservoir volumes, and geothermal energy recovery factors. Monte Carlo simulations are used to characterize uncertainties in the estimates of electric power generation. These new models for the recovery of heat from heterogeneous, fractured reservoirs provide a physically realistic basis for evaluating the production potential of natural geothermal reservoirs.
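
    A minimal sketch of the volume method with Monte Carlo uncertainty propagation, as described above: the thermal energy in place is rho*c*V*(T - Tref), and recoverable electric power follows from sampled recovery factors and a conversion efficiency (all distributions and constants below are illustrative, not values from the USGS assessment):

        import numpy as np

        rng = np.random.default_rng(0)
        n = 100_000

        volume = rng.triangular(1e9, 3e9, 8e9, n)   # reservoir volume, m^3
        temp = rng.normal(250.0, 15.0, n)           # reservoir temperature, deg C
        t_ref = 15.0                                # reference temperature, deg C
        rho_c = 2.7e6                               # volumetric heat capacity, J/(m^3 K)
        recovery = rng.uniform(0.08, 0.2, n)        # recovery factor
        efficiency = 0.12                           # heat-to-power conversion

        thermal = rho_c * volume * (temp - t_ref)            # thermal energy in place, J
        electrical = thermal * recovery * efficiency         # recoverable electrical energy, J
        mwe_30yr = electrical / (30 * 365.25 * 24 * 3600) / 1e6  # sustained MWe over 30 years

        print(np.percentile(mwe_30yr, [5, 50, 95]))  # uncertainty on the power estimate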

  9. Educational research methods for researching innovations in teaching, learning and assessment: The nursing lecturer as researcher.

    PubMed

    Marks-Maran, Diane

    2015-11-01

    The author, who has had previous experience as a nurse researcher, has been engaged in helping nurse lecturers to undertake evaluation research studies into innovations in their teaching, learning and assessment methods. In order to undertake this work successfully, it was important to move from thinking like a nurse researcher to thinking like an educational researcher and developing the role of the nursing lecturer as researcher of their teaching. This article explores the difference between evaluation and evaluation research and argues for the need to use educational research methods when undertaking evaluation research into innovations in teaching, learning and assessment. A new model for educational evaluation research is presented together with two case examples of the model in use. The model has been tested on over 30 research studies into innovations in teaching, learning and assessment over the past 8 years. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Energy Information Systems

    Science.gov Websites

    Resources from the Energy Analytics Campaign, including the 2014-2018 Assessment of Automated M&V Methods (an assessment of automated measurement and verification methods; Granderson, J. et al., Lawrence Berkeley National Laboratory; PDF, 726 KB) and Performance Metrics and Objective Testing Methods for Energy Baseline Modeling Software.

  11. Assessing the accuracy and stability of variable selection ...

    EPA Pesticide Factsheets

    Random forest (RF) modeling has emerged as an important statistical learning method in ecology due to its exceptional predictive performance. However, for large and complex ecological datasets there is limited guidance on variable selection methods for RF modeling. Typically, either a preselected set of predictor variables are used, or stepwise procedures are employed which iteratively add/remove variables according to their importance measures. This paper investigates the application of variable selection methods to RF models for predicting probable biological stream condition. Our motivating dataset consists of the good/poor condition of n=1365 stream survey sites from the 2008/2009 National Rivers and Stream Assessment, and a large set (p=212) of landscape features from the StreamCat dataset. Two types of RF models are compared: a full variable set model with all 212 predictors, and a reduced variable set model selected using a backwards elimination approach. We assess model accuracy using RF's internal out-of-bag estimate, and a cross-validation procedure with validation folds external to the variable selection process. We also assess the stability of the spatial predictions generated by the RF models to changes in the number of predictors, and argue that model selection needs to consider both accuracy and stability. The results suggest that RF modeling is robust to the inclusion of many variables of moderate to low importance. We found no substanti
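
    A minimal sketch of the backwards-elimination variant described above, assuming scikit-learn and a pandas DataFrame of predictors (the paper's exact elimination schedule may differ, and for honest accuracy estimates the cross-validation folds must stay external to this loop):

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        def backward_elimination(X, y, keep_min=5, drop_frac=0.2):
            """Iteratively drop the least-important predictors, tracking OOB accuracy.

            X: pandas DataFrame of predictors; y: binary condition labels.
            """
            selected = list(X.columns)
            history = []
            while len(selected) > keep_min:
                rf = RandomForestClassifier(n_estimators=500, oob_score=True,
                                            n_jobs=-1, random_state=0)
                rf.fit(X[selected], y)
                history.append((len(selected), rf.oob_score_))
                n_drop = max(1, int(drop_frac * len(selected)))
                order = np.argsort(rf.feature_importances_)  # ascending importance
                selected = [selected[i] for i in sorted(order[n_drop:])]
            return selected, history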

  12. Implementation of the nursing process in a health area: models and assessment structures used

    PubMed Central

    Huitzi-Egilegor, Joseba Xabier; Elorza-Puyadena, Maria Isabel; Urkia-Etxabe, Jose Maria; Asurabarrena-Iraola, Carmen

    2014-01-01

    OBJECTIVE: to analyze what nursing models and nursing assessment structures have been used in the implementation of the nursing process at the public and private centers in the health area Gipuzkoa (Basque Country). METHOD: a retrospective study was undertaken, based on the analysis of the nursing records used at the 158 centers studied. RESULTS: the Henderson model, Carpenito's bifocal structure, Gordon's assessment structure and the Resident Assessment Instrument Nursing Home 2.0 have been used as nursing models and assessment structures to implement the nursing process. At some centers, the selected model or assessment structure has varied over time. CONCLUSION: Henderson's model has been the most used to implement the nursing process. Furthermore, the trend is observed to complement or replace Henderson's model by nursing assessment structures. PMID:25493672

  13. A new method for constructing networks from binary data

    NASA Astrophysics Data System (ADS)

    van Borkulo, Claudia D.; Borsboom, Denny; Epskamp, Sacha; Blanken, Tessa F.; Boschloo, Lynn; Schoevers, Robert A.; Waldorp, Lourens J.

    2014-08-01

    Network analysis is entering fields where network structures are unknown, such as psychology and the educational sciences. A crucial step in the application of network models lies in the assessment of network structure. Current methods either have serious drawbacks or are only suitable for Gaussian data. In the present paper, we present a method for assessing network structures from binary data. Although models for binary data are infamous for their computational intractability, we present a computationally efficient model for estimating network structures. The approach, which is based on Ising models as used in physics, combines logistic regression with model selection based on a Goodness-of-Fit measure to identify relevant relationships between variables that define connections in a network. A validation study shows that this method succeeds in revealing the most relevant features of a network for realistic sample sizes. We apply our proposed method to estimate the network of depression and anxiety symptoms from symptom scores of 1108 subjects. Possible extensions of the model are discussed.
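
    A minimal sketch of the nodewise approach described above: each binary variable is regressed on all the others with l1-regularised logistic regression, and the penalty is chosen by a goodness-of-fit criterion (an extended BIC is used here for illustration; the grid and gamma value are assumptions, not the authors' settings):

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        def fit_node(X, j, c_grid=np.logspace(-2, 1, 20), gamma=0.25):
            """Estimate the neighbourhood of node j in a binary (0/1) data matrix X."""
            y = X[:, j]
            others = np.delete(X, j, axis=1)
            n, p = others.shape
            best_ebic, best_coef = np.inf, None
            for c in c_grid:
                model = LogisticRegression(penalty="l1", C=c, solver="liblinear")
                model.fit(others, y)
                prob = model.predict_proba(others)[:, 1]
                loglik = np.sum(y * np.log(prob + 1e-12)
                                + (1 - y) * np.log(1 - prob + 1e-12))
                k = np.count_nonzero(model.coef_)            # selected neighbours
                ebic = -2 * loglik + k * np.log(n) + 2 * gamma * k * np.log(p)
                if ebic < best_ebic:
                    best_ebic, best_coef = ebic, model.coef_.ravel()
            return np.insert(best_coef, j, 0.0)              # zero self-edge

    An edge is typically retained only if it is selected in both directions (the AND rule), which sparsifies the resulting network.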

  14. Risk assessment of flood disaster and forewarning model at different spatial-temporal scales

    NASA Astrophysics Data System (ADS)

    Zhao, Jun; Jin, Juliang; Xu, Jinchao; Guo, Qizhong; Hang, Qingfeng; Chen, Yaqian

    2018-05-01

    Aiming to reduce losses from flood disasters, a risk assessment and forewarning model for flood disaster is studied. The model is built upon risk indices in the flood disaster system, proceeding from the whole structure and its parts at different spatial-temporal scales. On the one hand, a long-term forewarning model is established for the surface area, with three levels: prediction, evaluation, and forewarning. A structure-adaptive back-propagation neural network with peak identification is used to simulate the indices in the prediction sub-model. Set pair analysis is employed to calculate the connection degrees of a single index, the comprehensive index, and the systematic risk through the multivariate connection number, and the comprehensive assessment is made by assessment matrices in the evaluation sub-model. In the forewarning sub-model, a comparative judgment method is adopted to grade the flood disaster warning degree from the comprehensive risk assessment index against forewarning standards, yielding the long-term local conditions for proposing planning schemes. On the other hand, a real-time forewarning model is set up for specific sites, introducing a Kalman-filter real-time correction technique based on a hydrological model with a forewarning index, yielding the real-time local conditions for presenting an emergency plan. This study takes the Tunxi area, Huangshan City, China, as an example. After establishing and applying the risk assessment and forewarning model at different spatial-temporal scales to actual and simulated data from 1989 to 2008, the forewarning results show that flood disaster risk declines on the whole from 2009 to 2013, despite a rise in 2011. At the macroscopic level, project and non-project measures are advanced, while at the microscopic level, the time, place, and method are specified. This suggests that the proposed model is feasible in both theory and application, offering a way to assess and forewarn of flood disaster risk.

  15. [Modeling the academic performance of medical students in basic sciences and pre-clinical courses: a longitudinal study].

    PubMed

    Zúñiga, Denisse; Mena, Beltrán; Oliva, Rose; Pedrals, Nuria; Padilla, Oslando; Bitran, Marcela

    2009-10-01

    The study of predictors of academic performance is relevant for medical education. Most studies of academic performance use global ratings as the outcome measure and do not evaluate the influence of the assessment methods. To model, by multivariate analysis, the academic performance of medical students, considering, besides academic and demographic variables, the methods used to assess students' learning and their preferred modes of information processing. Two hundred seventy-two students admitted to the medical school of the Pontificia Universidad Católica de Chile from 2000 to 2003. Six groups of variables were studied to model the students' performance in five basic science courses (Anatomy, Biology, Calculus, Chemistry and Physics) and two pre-clinical courses (Integrated Medical Clinic I and II). The assessment methods examined were multiple-choice question tests, the Objective Structured Clinical Examination, and tutor appraisal. The results of the university admission tests (high school grades, mathematics and biology tests), the assessment methods used, the curricular year, and previous application to medical school were predictors of academic performance. The information processing modes influenced academic performance, but only in interaction with other variables. Perception (abstract or concrete) interacted with the assessment methods, and information use (active or reflexive) with sex. The correlation between the real and predicted grades was 0.7. In addition to the academic results obtained prior to university entrance, the methods of assessment used in the university and the information processing modes influence the academic performance of medical students in basic science and pre-clinical courses.

  16. Improving threading algorithms for remote homology modeling by combining fragment and template comparisons

    PubMed Central

    Zhou, Hongyi; Skolnick, Jeffrey

    2010-01-01

    In this work, we develop a method called FTCOM for assessing the global quality of protein structural models for targets of medium and hard difficulty (remote homology) produced by structure prediction approaches such as threading or ab initio structure prediction. FTCOM requires the Cα coordinates of full-length models and assesses model quality based on fragment comparison and a score derived from comparison of the model to top threading templates. On a set of 361 medium/hard targets, FTCOM was applied to and assessed for its ability to improve upon the results from the SP3, SPARKS, PROSPECTOR_3, and PRO-SP3-TASSER threading algorithms. The average TM-score of the first model selected by the new method improves by 5%-10% over models obtained by the original selection procedures of the respective threading methods. Moreover, the number of foldable targets (TM-score ≥0.4) increases by at least 7.6% (for SP3) and by as much as 54% (for SPARKS). Thus, FTCOM is a promising approach to template selection. PMID:20455261

  17. EFFECTS-BASED CUMULATIVE RISK ASSESSMENT IN A LOW-INCOME URBAN COMMUNITY NEAR A SUPERFUND SITE

    EPA Science Inventory

    We will introduce into the cumulative risk assessment framework novel methods for non-cancer risk assessment, techniques for dose-response modeling that extend insights from chemical mixtures frameworks to non-chemical stressors, multilevel statistical methods used to address ...

  18. Photons Revisited

    NASA Astrophysics Data System (ADS)

    Batic, Matej; Begalli, Marcia; Han, Min Cheol; Hauf, Steffen; Hoff, Gabriela; Kim, Chan Hyeong; Kim, Han Sung; Grazia Pia, Maria; Saracco, Paolo; Weidenspointner, Georg

    2014-06-01

    A systematic review of methods and data for the Monte Carlo simulation of photon interactions is in progress: it concerns a wide set of theoretical modeling approaches and data libraries available for this purpose. Models and data libraries are assessed quantitatively with respect to an extensive collection of experimental measurements documented in the literature to determine their accuracy; this evaluation exploits rigorous statistical analysis methods. The computational performance of the associated modeling algorithms is evaluated as well. An overview of the assessment of photon interaction models and results of the experimental validation are presented.

  19. A comparative analysis of extended water cloud model and backscatter modelling for above-ground biomass assessment in Corbett Tiger Reserve

    NASA Astrophysics Data System (ADS)

    Kumar, Yogesh; Singh, Sarnam; Chatterjee, R. S.; Trivedi, Mukul

    2016-04-01

    Forest biomass plays a central role in regulating climate by storing carbon, so the assessment of forest biomass is crucial for understanding the dynamics of the environment. Traditionally, destructive methods were adopted for biomass assessment; these later gave way to non-destructive methods. The allometric equations developed from destructive sampling were reused in non-destructive assessment, but they were mostly applicable to woody/commercial timber species. Nowadays, remote sensing data are primarily used for assessing the geospatial pattern of biomass. Optical remote sensing data (Landsat 8, LISS III, etc.) have been used very successfully for the estimation of above-ground biomass (AGB). However, optical data are not suitable for all atmospheric conditions because they cannot penetrate clouds and haze. Radar is thus an alternative means of acquiring data irrespective of weather and light conditions. This paper examines the potential of ALOS PALSAR L-band dual-polarisation data for the estimation of AGB in the Corbett Tiger Reserve (CTR), covering an area of 889 km2. The main focus of this study is to compare the accuracy of a polarimetric scattering model, the Extended Water Cloud Model (EWCM), with that of a backscatter model in the assessment of AGB. The parameters of the EWCM were estimated using decomposition components (Raney decomposition) and plot-level information. The above-ground biomass in the CTR ranges from 9.6 t/ha to 322.6 t/ha.
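
    For context on the model family compared above, the basic (non-extended) water cloud model expresses backscatter as a canopy term plus a soil term attenuated by the two-way canopy transmission, and its coefficients can be fitted to plot data. A minimal sketch with synthetic numbers (the EWCM used in the paper additionally incorporates decomposition components, which are not modelled here):

        import numpy as np
        from scipy.optimize import curve_fit

        def water_cloud(V, A, B, sigma_soil, theta=np.deg2rad(35)):
            """Basic water cloud model; V is a vegetation descriptor (here AGB)."""
            tau = np.exp(-2.0 * B * V / np.cos(theta))  # two-way canopy transmission
            return A * (1.0 - tau) + sigma_soil * tau

        V = np.array([20.0, 60.0, 110.0, 160.0, 210.0, 280.0])  # biomass, t/ha (synthetic)
        sigma0 = np.array([0.020, 0.034, 0.045, 0.050, 0.053, 0.055])  # linear backscatter

        (A, B, sigma_soil), _ = curve_fit(water_cloud, V, sigma0,
                                          p0=[0.06, 0.005, 0.01])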

  20. Contribution of Submarine Groundwater on the Water-Food Nexus in Coastal Ecosystems: Effects on Biodiversity and Fishery Production

    NASA Astrophysics Data System (ADS)

    Shoji, J.; Sugimoto, R.; Honda, H.; Tominaga, O.; Taniguchi, M.

    2014-12-01

    In the past decade, machine-learning methods for empirical rainfall-runoff modeling have seen extensive development. However, the majority of research has focused on a small number of methods, such as artificial neural networks (ANNs), while not considering other approaches for non-parametric regression that have been developed in recent years. These methods may be able to achieve comparable predictive accuracy to ANNs and more easily provide physical insights into the system of interest through evaluation of covariate influence. Additionally, these methods could provide a straightforward, computationally efficient way of evaluating climate change impacts in basins where data to support physical hydrologic models are limited. In this paper, we use multiple regression and machine-learning approaches to predict monthly streamflow in five highly seasonal rivers in the highlands of Ethiopia. We find that generalized additive models, random forests, and cubist models achieve better predictive accuracy than ANNs in many of the basins assessed and are also able to outperform physical models developed for the same region. We discuss some challenges that could hinder the use of such models for climate impact assessment, such as biases resulting from model formulation and prediction under extreme climate conditions, and suggest methods for preventing and addressing these challenges. Finally, we demonstrate how predictor variable influence can be assessed to provide insights into the physical functioning of data-sparse watersheds.

  1. Triangular model integrating clinical teaching and assessment

    PubMed Central

    Abdelaziz, Adel; Koshak, Emad

    2014-01-01

    Structuring clinical teaching is a challenge facing medical education curriculum designers. A variety of instructional methods on different domains of learning are indicated to accommodate different learning styles. Conventional methods of clinical teaching, like training in ambulatory care settings, are prone to the factor of coincidence in having varieties of patient presentations. Accordingly, alternative methods of instruction are indicated to compensate for the deficiencies of these conventional methods. This paper presents an initiative that can be used to design a checklist as a blueprint to guide appropriate selection and implementation of teaching/learning and assessment methods in each of the educational courses and modules based on educational objectives. Three categories of instructional methods were identified, and within each a variety of methods were included. These categories are classroom-type settings, health services-based settings, and community service-based settings. Such categories have framed our triangular model of clinical teaching and assessment. PMID:24624002

  2. Triangular model integrating clinical teaching and assessment.

    PubMed

    Abdelaziz, Adel; Koshak, Emad

    2014-01-01

    Structuring clinical teaching is a challenge facing medical education curriculum designers. A variety of instructional methods on different domains of learning are indicated to accommodate different learning styles. Conventional methods of clinical teaching, like training in ambulatory care settings, are prone to the factor of coincidence in having varieties of patient presentations. Accordingly, alternative methods of instruction are indicated to compensate for the deficiencies of these conventional methods. This paper presents an initiative that can be used to design a checklist as a blueprint to guide appropriate selection and implementation of teaching/learning and assessment methods in each of the educational courses and modules based on educational objectives. Three categories of instructional methods were identified, and within each a variety of methods were included. These categories are classroom-type settings, health services-based settings, and community service-based settings. Such categories have framed our triangular model of clinical teaching and assessment.

  3. Sensors vs. experts - A performance comparison of sensor-based fall risk assessment vs. conventional assessment in a sample of geriatric patients

    PubMed Central

    2011-01-01

    Background Fall events contribute significantly to mortality, morbidity and costs in our ageing population. In order to identify persons at risk and to target preventive measures, many scores and assessment tools have been developed. These often require expertise and are costly to implement. Recent research investigates the use of wearable inertial sensors to provide objective data on motion features which can be used to assess individual fall risk automatically. So far it is unknown how well this new method performs in comparison with conventional fall risk assessment tools. The aim of our research is to compare the predictive performance of our new sensor-based method with conventional and established methods, based on prospective data. Methods In a first study phase, 119 inpatients of a geriatric clinic took part in motion measurements using a wireless triaxial accelerometer during a Timed Up&Go (TUG) test and a 20 m walk. Furthermore, the St. Thomas Risk Assessment Tool in Falling Elderly Inpatients (STRATIFY) was performed, and the multidisciplinary geriatric care team estimated the patients' fall risk. In a second follow-up phase of the study, 46 of the participants were interviewed after one year, including a fall and activity assessment. The predictive performances of the TUG, the STRATIFY and team scores are compared. Furthermore, two automatically induced logistic regression models, based on conventional clinical and assessment data (CONV) and on sensor data (SENSOR), are compared. Results Among the risk assessment scores, the geriatric team score (sensitivity 56%, specificity 80%) outperforms STRATIFY and TUG. The induced logistic regression models CONV and SENSOR achieve similar performance values (sensitivity 68%/58%, specificity 74%/78%, AUC 0.74/0.72, +LR 2.64/2.61). Both models are able to identify more persons at risk than the simple scores. Conclusions Sensor-based objective measurements of motion parameters in geriatric patients can be used to assess individual fall risk, and our prediction model's performance matches that of a model based on conventional clinical and assessment data. Sensor-based measurements using a small wearable device may contribute significant information to conventional methods and are feasible in an unsupervised setting. More prospective research is needed to assess the cost-benefit relation of our approach. PMID:21711504
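
    The performance figures quoted above are linked by standard definitions: the positive likelihood ratio is sensitivity divided by (1 - specificity). A quick check against the reported values (small differences arise from rounding of the published sensitivity and specificity):

        def screening_metrics(tp, fn, fp, tn):
            """Standard diagnostic accuracy measures from a 2x2 confusion table."""
            sensitivity = tp / (tp + fn)
            specificity = tn / (tn + fp)
            positive_lr = sensitivity / (1.0 - specificity)  # +LR
            return sensitivity, specificity, positive_lr

        # Reproducing the reported +LR for the CONV model:
        sens, spec = 0.68, 0.74
        print(sens / (1.0 - spec))   # ~2.62, consistent with the reported 2.64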

  4. Determining if an older adult can make and execute decisions to live safely at home: a capacity assessment and intervention model

    PubMed Central

    Skelton, Felicia; Kunik, Mark E.; Regev, Tziona; Naik, Aanand D.

    2009-01-01

    Determining an older adult’s capacity to live safely and independently in the community presents a serious and complicated challenge to the health care system. Evaluating one’s ability to make and execute decisions regarding safe and independent living incorporates clinical assessments, bioethical considerations, and often legal declarations of capacity. Capacity assessments usually result in life changes for patients and their families, including a caregiver managing some everyday tasks, placement outside of the home, and even legal guardianship. The process of determining capacity and recommending intervention is often inefficient and highly variable in most cases. Physicians are rarely trained to conduct capacity assessments and assessment methods are heterogeneous. An interdisciplinary team of clinicians developed the capacity assessment and intervention (CAI) model at a community outpatient geriatrics clinic to address these critical gaps. This report follows one patient through the entire CAI model, describing processes for a typical case. It then examines two additional case reports that highlight common challenges in capacity assessment. The CAI model uses assessment methods common to geriatrics clinical practice and conducts assessments and interventions in a standardized fashion. Reliance on common, validated measures increases generalizability of the model across geriatrics practice settings and patient populations. PMID:19481271

  5. Teaching and assessing procedural skills using simulation: metrics and methodology.

    PubMed

    Lammers, Richard L; Davenport, Moira; Korley, Frederick; Griswold-Theodorson, Sharon; Fitch, Michael T; Narang, Aneesh T; Evans, Leigh V; Gross, Amy; Rodriguez, Elliot; Dodge, Kelly L; Hamann, Cara J; Robey, Walter C

    2008-11-01

    Simulation allows educators to develop learner-focused training and outcomes-based assessments. However, the effectiveness and validity of simulation-based training in emergency medicine (EM) requires further investigation. Teaching and testing technical skills require methods and assessment instruments that are somewhat different than those used for cognitive or team skills. Drawing from work published by other medical disciplines as well as educational, behavioral, and human factors research, the authors developed six research themes: measurement of procedural skills; development of performance standards; assessment and validation of training methods, simulator models, and assessment tools; optimization of training methods; transfer of skills learned on simulator models to patients; and prevention of skill decay over time. The article reviews relevant and established educational research methodologies and identifies gaps in our knowledge of how physicians learn procedures. The authors present questions requiring further research that, once answered, will advance understanding of simulation-based procedural training and assessment in EM.

  6. Geometry reconstruction method for patient-specific finite element models for the assessment of tibia fracture risk in osteogenesis imperfecta.

    PubMed

    Caouette, Christiane; Ikin, Nicole; Villemure, Isabelle; Arnoux, Pierre-Jean; Rauch, Frank; Aubin, Carl-Éric

    2017-04-01

    Lower limb deformation in children with osteogenesis imperfecta (OI) impairs ambulation and may lead to fracture. Corrective surgery is based on empirical assessment criteria. The objective was to develop a reconstruction method of the tibia for OI patients that could be used as input to a comprehensive finite element model to assess fracture risk. Data were obtained from three children with OI and tibia deformities. Four pQCT scans were registered to biplanar radiographs, and a template mesh was deformed to fit the bone outline. Cortical bone thickness was computed. Sensitivity of the model to missing slices of pQCT was assessed by calculating the maximal von Mises stress for a vertical hopping load case. Sensitivity of the model to ±5% variation in cortical thickness measurements was assessed by calculating loads at fracture. The difference between the mesh contour and the bone outline on the radiographs was below 1 mm. Removal of one pQCT slice increased maximal von Mises stress by up to 10%. Simulated ±5% variation of cortical bone thickness led to variations of up to 4.1% in predicted fracture loads. Using clinically available tibia imaging from children with OI, the developed reconstruction method allowed the building of patient-specific finite element models.

  7. Proposed method for hazard mapping of landslide propagation zone

    NASA Astrophysics Data System (ADS)

    Serbulea, Manole-Stelian; Gogu, Radu; Manoli, Daniel-Marcel; Gaitanaru, Dragos Stefan; Priceputu, Adrian; Andronic, Adrian; Anghel, Alexandra; Liviu Bugea, Adrian; Ungureanu, Constantin; Niculescu, Alexandru

    2013-04-01

    Sustainable development of communities situated in areas with landslide potential requires a full understanding of the mechanisms that govern the triggering of the phenomenon as well as the propagation of the sliding mass, with its catastrophic consequences for nearby inhabitants and the environment. Modern analysis methods for areas affected by the movement of soil bodies are presented in this work, as well as a new procedure to assess landslide hazard. Classical soil mechanics offers sufficient numerical models to assess the landslide triggering zone, such as limit equilibrium methods (Fellenius, Janbu, Morgenstern-Price, Bishop, Spencer, etc.), block or progressive-mobilization models, and Lagrange-based finite element methods. The computational methods for assessing the propagation zones are quite recent and have high computational requirements, and thus have not been used enough in practice to confirm their feasibility. The proposed procedure aims to assess not only the landslide hazard factor but also the affected areas, by means of simple mathematical operations. The method can easily be employed in GIS software without requiring engineering training. The result is obtained by computing the first and second derivatives of the digital terrain model (slope and curvature maps). Using the curvature maps, it is shown that one can assess the areas most likely to be affected by the propagation of the sliding masses. The procedure is first applied to a simple theoretical model and then used on a representative section of a high-exposure area in Romania. The method is described by comparison with Romanian legislation for risk and vulnerability assessment, which specifies that landslide hazard is to be assessed using an average hazard factor Km obtained from various other factors. Following the employed example, it is observed that using the Km factor there is an inconsistent distribution of the polygonal surfaces corresponding to different landslide potentials. For small values of Km (0.00-0.10), the polygonal surfaces have reduced dimensions along the slopes belonging to the main rivers. This can be corrected by including in the analysis the potential areas affected by soil instability. Finally, it is shown that the proposed procedure can be used to better assess these areas and to produce more reliable landslide hazard maps. This work was supported by a grant of the Romanian National Authority for Scientific Research, Program for research - Space Technology and Advanced Research - STAR, project number 30/2012.
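
    A minimal sketch of the core computation described above, the first and second derivatives of a digital terrain model on a regular grid (a generic illustration, not the authors' GIS procedure):

        import numpy as np

        def slope_and_curvature(dem, cell=1.0):
            """Slope (degrees) and curvature (Laplacian) maps from a DEM array.

            Negative curvature marks convex zones (ridges); positive curvature
            marks concave zones where sliding masses tend to accumulate.
            """
            gy, gx = np.gradient(dem, cell)          # first derivatives
            slope = np.degrees(np.arctan(np.hypot(gx, gy)))
            gyy, _ = np.gradient(gy, cell)
            _, gxx = np.gradient(gx, cell)
            curvature = gxx + gyy                    # second derivative (Laplacian)
            return slope, curvature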

  8. AN INTEGRATED PERSPECTIVE ON THE ASSESSMENT OF TECHNOLOGIES: INTEGRATE-HTA.

    PubMed

    Wahlster, Philip; Brereton, Louise; Burns, Jacob; Hofmann, Björn; Mozygemba, Kati; Oortwijn, Wija; Pfadenhauer, Lisa; Polus, Stephanie; Rehfuess, Eva; Schilling, Imke; van der Wilt, Gert Jan; Gerhardus, Ansgar

    2017-01-01

    Current health technology assessment (HTA) is not well equipped to assess complex technologies as insufficient attention is being paid to the diversity in patient characteristics and preferences, context, and implementation. Strategies to integrate these and several other aspects, such as ethical considerations, in a comprehensive assessment are missing. The aim of the European research project INTEGRATE-HTA was to develop a model for an integrated HTA of complex technologies. A multi-method, four-stage approach guided the development of the INTEGRATE-HTA Model: (i) definition of the different dimensions of information to be integrated, (ii) literature review of existing methods for integration, (iii) adjustment of concepts and methods for assessing distinct aspects of complex technologies in the frame of an integrated process, and (iv) application of the model in a case study and subsequent revisions. The INTEGRATE-HTA Model consists of five steps, each involving stakeholders: (i) definition of the technology and the objective of the HTA; (ii) development of a logic model to provide a structured overview of the technology and the system in which it is embedded; (iii) evidence assessment on effectiveness, economic, ethical, legal, and socio-cultural aspects, taking variability of participants, context, implementation issues, and their interactions into account; (iv) populating the logic model with the data generated in step 3; (v) structured process of decision-making. The INTEGRATE-HTA Model provides a structured process for integrated HTAs of complex technologies. Stakeholder involvement in all steps is essential as a means of ensuring relevance and meaningful interpretation of the evidence.

  9. Applying under-sampling techniques and cost-sensitive learning methods on risk assessment of breast cancer.

    PubMed

    Hsu, Jia-Lien; Hung, Ping-Cheng; Lin, Hung-Yen; Hsieh, Chung-Ho

    2015-04-01

    Breast cancer is one of the most common causes of cancer mortality. Early detection through mammography screening could significantly reduce mortality from breast cancer. However, most screening methods consume large amounts of resources. We propose a computational model, based solely on personal health information, for breast cancer risk assessment. Our model can serve as a pre-screening program in low-cost settings. In our study, the data set, consisting of 3976 records, was collected from Taipei City Hospital from 2008.1.1 to 2008.12.31. Based on this dataset, we first applied sampling techniques and a dimension-reduction method to preprocess the data. Then, we constructed various kinds of classifiers (including basic classifiers, ensemble methods, and cost-sensitive methods) to predict risk. The cost-sensitive method with a random forest classifier was able to achieve a recall (or sensitivity) of 100%. At 100% recall, the precision (positive predictive value, PPV) and specificity of the cost-sensitive method with the random forest classifier were 2.9% and 14.87%, respectively. In our study, we built a breast cancer risk assessment model using data mining techniques. Our model has the potential to serve as an assisting tool in breast cancer screening.
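
    A minimal sketch of the cost-sensitive learning idea described above, using class weights on a random forest and synthetic imbalanced data (the hospital records are not public, and the weight of 50 is an illustrative choice, not the paper's setting):

        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_predict
        from sklearn.metrics import recall_score, precision_score

        # Synthetic stand-in for imbalanced screening data (~3% positives)
        X, y = make_classification(n_samples=4000, n_features=20,
                                   weights=[0.97], random_state=0)

        clf = RandomForestClassifier(
            n_estimators=500,
            class_weight={0: 1, 1: 50},   # heavy cost on missing a positive case
            random_state=0,
        )
        pred = cross_val_predict(clf, X, y, cv=10)
        print("recall:", recall_score(y, pred),
              "precision:", precision_score(y, pred))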

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Longgao; Yang, Xiaoyan (School of Environmental Science and Spatial Informatics, China University of Mining and Technology, Xuzhou 221116)

    The implementation of land use planning (LUP) has a large impact on environmental quality. There is no widely accepted, consolidated approach to assessing the environmental impact of LUP using Strategic Environmental Assessment (SEA). In this paper, we developed a state-impact-state (SIS) model employed in LUP environmental impact assessment (LUPEA). Using the Matter-element (ME) and Extenics methods, a methodology based on the SIS model was established and applied to the LUPEA of Zoucheng County, China. The results show that: (1) the methodology provides an intuitive and easily understood logical model for both theoretical analysis and application of LUPEA; (2) the spatial multi-temporal assessment from the base year through the near-future year to the planning target year suggests a positive impact on environmental quality in the whole county, despite certain environmental degradation in some towns; (3) besides the spatial assessment, other achievements were obtained, including the environmental elements influenced by land use and their weights, the identification of key indicators in LUPEA, and appropriate environmental mitigation measures; and (4) the methodology can be used for multi-temporal assessment of the LUP environmental impact at the county or town level in other areas. - Highlights: • A State-Impact-State model for Land Use Planning Environmental Assessment (LUPEA). • Matter-element (ME) and Extenics methods were embedded in the LUPEA. • The model was applied to the LUPEA of Zoucheng County. • The assessment shows improving environmental quality since 2000 in Zoucheng County. • The method provides a useful tool for LUPEA at the county level.

  11. Rapid condition assessment of structural condition after a blast using state-space identification

    NASA Astrophysics Data System (ADS)

    Eskew, Edward; Jang, Shinae

    2015-04-01

    After a blast event, it is important to quickly quantify the structural damage for emergency operations. In order to improve the speed, accuracy, and efficiency of condition assessments after a blast, the authors have previously developed a methodology for rapid assessment of the structural condition of a building after a blast. The method involved determining a post-event equivalent stiffness matrix using vibration measurements and a finite element (FE) model. A structural model was built for the damaged structure based on the equivalent stiffness, and inter-story drifts from the blast were determined using numerical simulations, with forces determined from the blast parameters. The inter-story drifts were then compared to blast design conditions to assess the structure's damage. This method still involved engineering judgment in determining significant frequencies, which can lead to error, especially with noisy measurements. In an effort to improve accuracy and automate the process, this paper will look into a similar method of rapid condition assessment using subspace state-space identification. The accuracy of the method will be tested using a benchmark structural model, as well as experimental testing. The blast damage assessments will be validated using pressure-impulse (P-I) diagrams, which present the condition limits across blast parameters. Comparisons between P-I diagrams generated using the true system parameters and the equivalent parameters will show the accuracy of the rapid condition-based blast assessments.
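
    Once a discrete-time state matrix has been identified from vibration measurements, modal properties follow from its eigenvalues, and shifts in the natural frequencies are what an equivalent-stiffness update can exploit. A minimal post-processing sketch (generic, not the authors' method):

        import numpy as np

        def modal_parameters(A, dt):
            """Natural frequencies (Hz) and damping ratios from an identified
            discrete-time state matrix A sampled at interval dt."""
            lam = np.linalg.eigvals(A)
            s = np.log(lam.astype(complex)) / dt   # map discrete poles to continuous
            freq_hz = np.abs(s) / (2.0 * np.pi)
            damping = -np.real(s) / np.abs(s)
            keep = np.imag(s) > 0                  # one of each conjugate pair
            return freq_hz[keep], damping[keep]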

  12. Cluster-based upper body marker models for three-dimensional kinematic analysis: Comparison with an anatomical model and reliability analysis.

    PubMed

    Boser, Quinn A; Valevicius, Aïda M; Lavoie, Ewen B; Chapman, Craig S; Pilarski, Patrick M; Hebert, Jacqueline S; Vette, Albert H

    2018-04-27

    Quantifying angular joint kinematics of the upper body is a useful method for assessing upper limb function. Joint angles are commonly obtained via motion capture, tracking markers placed on anatomical landmarks. This method is associated with limitations including administrative burden, soft tissue artifacts, and intra- and inter-tester variability. An alternative method involves the tracking of rigid marker clusters affixed to body segments, calibrated relative to anatomical landmarks or known joint angles. The accuracy and reliability of applying this cluster method to the upper body has, however, not been comprehensively explored. Our objective was to compare three different upper body cluster models with an anatomical model, with respect to joint angles and reliability. Non-disabled participants performed two standardized functional upper limb tasks with anatomical and cluster markers applied concurrently. Joint angle curves obtained via the marker clusters with three different calibration methods were compared to those from an anatomical model, and between-session reliability was assessed for all models. The cluster models produced joint angle curves which were comparable to and highly correlated with those from the anatomical model, but exhibited notable offsets and differences in sensitivity for some degrees of freedom. Between-session reliability was comparable between all models, and good for most degrees of freedom. Overall, the cluster models produced reliable joint angles that, however, cannot be used interchangeably with anatomical model outputs to calculate kinematic metrics. Cluster models appear to be an adequate, and possibly advantageous alternative to anatomical models when the objective is to assess trends in movement behavior. Copyright © 2018 Elsevier Ltd. All rights reserved.

  13. Assessing the accuracy and reproducibility of modality independent elastography in a murine model of breast cancer

    PubMed Central

    Weis, Jared A.; Flint, Katelyn M.; Sanchez, Violeta; Yankeelov, Thomas E.; Miga, Michael I.

    2015-01-01

    Cancer progression has been linked to mechanics. Therefore, there has been recent interest in developing noninvasive imaging tools for cancer assessment that are sensitive to changes in tissue mechanical properties. We have developed one such method, modality independent elastography (MIE), that estimates the relative elastic properties of tissue by fitting anatomical image volumes acquired before and after the application of compression to biomechanical models. The aim of this study was to assess the accuracy and reproducibility of the method using phantoms and a murine breast cancer model. Magnetic resonance imaging data were acquired, and the MIE method was used to estimate relative volumetric stiffness. Accuracy was assessed using phantom data by comparing to gold-standard mechanical testing of elasticity ratios. Validation error was <12%. Reproducibility analysis was performed on animal data, and within-subject coefficients of variation ranged from 2 to 13% at the bulk level and 32% at the voxel level. To our knowledge, this is the first study to assess the reproducibility of an elasticity imaging metric in a preclinical cancer model. Our results suggest that the MIE method can reproducibly generate accurate estimates of the relative mechanical stiffness and provide guidance on the degree of change needed in order to declare biological changes rather than experimental error in future therapeutic studies. PMID:26158120

  14. Comparison of hydromorphological assessment methods: Application to the Boise River, USA

    NASA Astrophysics Data System (ADS)

    Benjankar, Rohan; Koenig, Frauke; Tonina, Daniele

    2013-06-01

    Recent national and international legislation (e.g., the European Water Framework Directive) has identified the need to quantify the ecological condition of river systems as a critical component of an integrated river management approach. An important defining driver of ecological condition is stream hydromorphology. Several methodologies have been proposed, ranging from simple table-based approaches to complex hydraulics-based models. In this paper, three different methods for river hydromorphological assessment are applied to the Boise River, United States of America (USA): (1) the German LAWA overview method (Bund/Laender Arbeitsgemeinschaft Wasser/German Working Group on water issues of the Federal States and the Federal Government represented by the Federal Environment Ministry), (2) a special approach for the hydromorphological assessment of urban rivers, and (3) a hydraulics-based method. The hydraulics-based method assessed stream conditions from a statistical analysis of flow properties predicted with hydrodynamic modeling. The investigation focuses on comparing the three methods and on the transferability of the methods between different contexts, Europe and the western United States. It also provides a comparison of the hydromorphological conditions of urban and rural reaches of the Boise River.

  15. Prediction of global and local model quality in CASP8 using the ModFOLD server.

    PubMed

    McGuffin, Liam J

    2009-01-01

    The development of effective methods for predicting the quality of three-dimensional (3D) models is fundamentally important for the success of tertiary structure (TS) prediction strategies. Since CASP7, the Quality Assessment (QA) category has existed to gauge the ability of various model quality assessment programs (MQAPs) at predicting the relative quality of individual 3D models. For the CASP8 experiment, automated predictions were submitted in the QA category using two methods from the ModFOLD server: ModFOLD version 1.1 and ModFOLDclust. ModFOLD version 1.1 is a single-model machine-learning based method, which was used for automated predictions of global model quality (QMODE1). ModFOLDclust is a simple clustering-based method, which was used for automated predictions of both global and local quality (QMODE2). In addition, manual predictions of model quality were made using ModFOLD version 2.0, an experimental method that combines the scores from ModFOLDclust and ModFOLD v1.1. Predictions from the ModFOLDclust method were the most successful of the three in terms of global model quality, whilst the ModFOLD v1.1 method was comparable in performance to other single-model based methods. In addition, the ModFOLDclust method performed well at predicting the per-residue, or local, model quality scores. Predictions of the per-residue errors in our own 3D models, selected using the ModFOLD v2.0 method, were also the most accurate compared with those from other methods. All of the MQAPs described are publicly accessible via the ModFOLD server at: http://www.reading.ac.uk/bioinf/ModFOLD/. The methods are also freely available to download from: http://www.reading.ac.uk/bioinf/downloads/. Copyright 2009 Wiley-Liss, Inc.
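
    The consensus principle behind clustering-based methods such as ModFOLDclust can be illustrated in a few lines: score each model by its mean pairwise structural similarity to the rest of the pool. A minimal sketch, assuming a similarity function (e.g. a TM-score wrapper) is available; this illustrates the general consensus idea, not the ModFOLDclust algorithm itself:

        import numpy as np

        def consensus_scores(models, similarity):
            """Predicted quality = mean pairwise similarity to all other models.

            models: list of candidate structures; similarity: callable returning
            a score in [0, 1] for a pair of models (assumed, not defined here).
            """
            n = len(models)
            sims = np.zeros((n, n))
            for i in range(n):
                for j in range(i + 1, n):
                    sims[i, j] = sims[j, i] = similarity(models[i], models[j])
            return sims.sum(axis=1) / (n - 1)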

  16. Spatio-temporal pattern clustering for skill assessment of the Korea Operational Oceanographic System

    NASA Astrophysics Data System (ADS)

    Kim, J.; Park, K.

    2016-12-01

    In order to evaluate the performance of operational forecast models in the Korea Operational Oceanographic System (KOOS), which was developed by the Korea Institute of Ocean Science and Technology (KIOST), a skill assessment (SA) tool has been developed that provides multiple skill metrics, including correlation and error scores comparing predictions with observations as well as pattern clustering across numerical models, satellite data, and in situ observations. The KOOS produces 72-hour forecasts of atmospheric and hydrodynamic variables (wind, pressure, current, tide, wave, temperature, and salinity) every 12 hours by operating numerical models such as WRF, ROMS, MOM5, MOHID, WW-III, and SWAN, and the SA is used to evaluate these forecasts. Quantitative assessment of an operational ocean forecast model is essential for providing accurate forecast information both to the general public and in support of ocean-related problems. In this work, we propose a pattern clustering method that uses machine learning and GIS-based spatial analytics to evaluate the spatial distributions of numerical model output against spatial observation data such as satellite and HF radar measurements. For the clustering, we use 10 to 15 years of reanalysis data computed by the KOOS, ECMWF, and HYCOM to build best-matching clusters, classified by physical meaning and time variation, which are then compared with the forecast data. Moreover, for evaluating currents, we develop a method for extracting the dominant flow and apply it to hydrodynamic model output and HF radar sea surface current data. Pattern clustering allows a more accurate and effective assessment of forecast performance because it compares not only specific observation stations but also the spatio-temporal distribution of the whole model domain. We believe the proposed method will be very useful for examining and evaluating large amounts of numerical model output as well as satellite data.
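    To make the clustering step concrete, the sketch below applies k-means to a stack of gridded fields so that dominant spatial patterns can be extracted and compared cluster-by-cluster between model output and observations. It is a generic illustration on synthetic data, not the KOOS implementation; the grid size, the variable, and the choice of four clusters are all assumptions.

```python
# Illustrative sketch (not the KOOS code): cluster spatio-temporal fields
# with k-means so model and observed patterns can be compared cluster-wise.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Hypothetical reanalysis: 120 monthly fields on a 20 x 30 grid.
n_time, n_lat, n_lon = 120, 20, 30
fields = rng.normal(size=(n_time, n_lat, n_lon))

# Each time step becomes one sample; each grid cell one feature.
X = fields.reshape(n_time, n_lat * n_lon)

# Remove the time mean so clusters capture anomaly patterns.
X = X - X.mean(axis=0)

kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)

# Cluster centroids, reshaped to maps, are the dominant spatial patterns;
# labels give each time step's pattern, to be matched against forecasts.
patterns = kmeans.cluster_centers_.reshape(4, n_lat, n_lon)
print(kmeans.labels_[:12])
```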

  17. Highly efficient model updating for structural condition assessment of large-scale bridges.

    DOT National Transportation Integrated Search

    2015-02-01

    For efficiently updating models of large-scale structures, the response surface (RS) method based on radial basis functions (RBFs) is proposed to model the input-output relationship of structures. The key issues for applying the proposed method a...
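    The record is truncated, but the named technique can still be sketched. A radial-basis-function response surface emulates the expensive structural model from a modest number of input-output samples, so that model updating can query the cheap surrogate instead. Below is a minimal hand-rolled Gaussian-RBF fit on invented data; the two-parameter stiffness-to-frequency function is a hypothetical stand-in for a finite-element run.

```python
# Minimal sketch of an RBF response surface, assuming the goal is to
# emulate an expensive structural model y = f(x) from sampled runs.
import numpy as np

def rbf_fit(X, y, eps=1.0):
    """Fit Gaussian RBF weights on training inputs X (n x d) and outputs y."""
    r = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    Phi = np.exp(-(eps * r) ** 2)
    return np.linalg.solve(Phi, y)

def rbf_predict(X_train, w, X_new, eps=1.0):
    r = np.linalg.norm(X_new[:, None, :] - X_train[None, :, :], axis=-1)
    return np.exp(-(eps * r) ** 2) @ w

# Hypothetical example: two stiffness parameters -> first natural frequency.
rng = np.random.default_rng(1)
X = rng.uniform(0.5, 1.5, size=(40, 2))
y = np.sqrt(X[:, 0] + 2.0 * X[:, 1])              # stand-in for the FE model
w = rbf_fit(X, y)
print(rbf_predict(X, w, np.array([[1.0, 1.0]])))  # approx sqrt(3)
```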

  18. Assessment of Technologies for the Space Shuttle External Tank Thermal Protection System and Recommendations for Technology Improvement - Part III: Material Property Characterization, Analysis, and Test Methods

    NASA Technical Reports Server (NTRS)

    Gates, Thomas S.; Johnson, Theodore F.; Whitley, Karen S.

    2005-01-01

    The objective of this report is to contribute to the independent assessment of the Space Shuttle External Tank Foam Material. This report specifically addresses material modeling, characterization testing, data reduction methods, and data pedigree. A brief description of the External Tank foam materials, locations, and standard failure modes is provided to develop suitable background information. A review of mechanics based analysis methods from the open literature is used to provide an assessment of the state-of-the-art in material modeling of closed cell foams. Further, this report assesses the existing material property database and investigates sources of material property variability. The report presents identified deficiencies in testing methods and procedures, recommendations for additional testing as required, identification of near-term improvements that should be pursued, and long-term capabilities or enhancements that should be developed.

  19. Application of a Cloud Model-Set Pair Analysis in Hazard Assessment for Biomass Gasification Stations.

    PubMed

    Yan, Fang; Xu, Kaili

    2017-01-01

    Because a biomass gasification station involves various hazard factors, hazard assessment is both necessary and significant. In this article, the cloud model (CM) is employed to improve set pair analysis (SPA), and a novel hazard assessment method for a biomass gasification station is proposed based on the cloud model-set pair analysis (CM-SPA). In this method, cloud weight is proposed as the index weight. In contrast to the index weights of other methods, cloud weight is expressed by cloud descriptors; hence, the randomness and fuzziness of cloud weight make it effective in reflecting the linguistic variables of experts. Then, the cloud connection degree (CCD) is proposed to replace the connection degree (CD), and the calculation algorithm of CCD is worked out. By utilizing the CCD, the hazard assessment results are expressed by normal clouds reflected by cloud descriptors; meanwhile, the hazard grade is confirmed by analyzing the cloud descriptors. After that, two biomass gasification stations undergo hazard assessment via CM-SPA and AHP-based SPA, respectively. The comparison of assessment results illustrates that CM-SPA is suitable and effective for the hazard assessment of a biomass gasification station and makes the assessment results more reasonable and scientific.

  20. Application of a Cloud Model-Set Pair Analysis in Hazard Assessment for Biomass Gasification Stations

    PubMed Central

    Yan, Fang; Xu, Kaili

    2017-01-01

    Because a biomass gasification station involves various hazard factors, hazard assessment is both necessary and significant. In this article, the cloud model (CM) is employed to improve set pair analysis (SPA), and a novel hazard assessment method for a biomass gasification station is proposed based on the cloud model-set pair analysis (CM-SPA). In this method, cloud weight is proposed as the index weight. In contrast to the index weights of other methods, cloud weight is expressed by cloud descriptors; hence, the randomness and fuzziness of cloud weight make it effective in reflecting the linguistic variables of experts. Then, the cloud connection degree (CCD) is proposed to replace the connection degree (CD), and the calculation algorithm of CCD is worked out. By utilizing the CCD, the hazard assessment results are expressed by normal clouds reflected by cloud descriptors; meanwhile, the hazard grade is confirmed by analyzing the cloud descriptors. After that, two biomass gasification stations undergo hazard assessment via CM-SPA and AHP-based SPA, respectively. The comparison of assessment results illustrates that CM-SPA is suitable and effective for the hazard assessment of a biomass gasification station and makes the assessment results more reasonable and scientific. PMID:28076440

  1. Ecosystem Model Skill Assessment. Yes We Can!

    PubMed Central

    Olsen, Erik; Fay, Gavin; Gaichas, Sarah; Gamble, Robert; Lucey, Sean; Link, Jason S.

    2016-01-01

    Accelerated changes to global ecosystems call for holistic and integrated analyses of past, present and future states under various pressures to adequately understand current and projected future system states. Ecosystem models can inform management of human activities in a complex and changing environment, but are these models reliable? Ensuring that models are reliable for addressing management questions requires evaluating their skill in representing real-world processes and dynamics. Skill has been evaluated for just a limited set of some biophysical models. A range of skill assessment methods have been reviewed but skill assessment of full marine ecosystem models has not yet been attempted. We assessed the skill of the Northeast U.S. (NEUS) Atlantis marine ecosystem model by comparing 10-year model forecasts with observed data. Model forecast performance was compared to that obtained from a 40-year hindcast. Multiple metrics (average absolute error, root mean squared error, modeling efficiency, and Spearman rank correlation), and a suite of time-series (species biomass, fisheries landings, and ecosystem indicators) were used to adequately measure model skill. Overall, the NEUS model performed above average and thus better than expected for the key species that had been the focus of the model tuning. Model forecast skill was comparable to the hindcast skill, showing that model performance does not degenerate in a 10-year forecast mode, an important characteristic for an end-to-end ecosystem model to be useful for strategic management purposes. We identify best-practice approaches for end-to-end ecosystem model skill assessment that would improve both operational use of other ecosystem models and future model development. We show that it is possible to not only assess the skill of a complicated marine ecosystem model, but that it is necessary to do so to instill confidence in model results and encourage their use for strategic management. Our methods are applicable to any type of predictive model, and should be considered for use in fields outside ecology (e.g. economics, climate change, and risk assessment). PMID:26731540
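    The four metrics named in the abstract are standard and easy to state in code. The sketch below computes them for a single observed/predicted time-series pair; the biomass numbers are invented and the function is not the authors' code.

```python
# Hedged sketch of the four skill metrics named above, for one time series.
import numpy as np
from scipy.stats import spearmanr

def skill_metrics(obs, pred):
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    aae  = np.mean(np.abs(pred - obs))              # average absolute error
    rmse = np.sqrt(np.mean((pred - obs) ** 2))      # root mean squared error
    # Modeling efficiency (Nash-Sutcliffe): 1 = perfect, <0 = worse than mean.
    mef  = 1.0 - np.sum((pred - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)
    rho, _ = spearmanr(obs, pred)                   # Spearman rank correlation
    return dict(AAE=aae, RMSE=rmse, MEF=mef, rho=rho)

# Hypothetical biomass series (model forecast vs. observations).
obs  = np.array([3.1, 2.8, 3.5, 4.0, 3.7, 3.2])
pred = np.array([3.0, 3.0, 3.3, 3.8, 3.9, 3.1])
print(skill_metrics(obs, pred))
```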

  2. Some suggested future directions of quantitative resource assessments

    USGS Publications Warehouse

    Singer, D.A.

    2001-01-01

    Future quantitative assessments will be expected to estimate quantities, values, and locations of undiscovered mineral resources in a form that conveys both economic viability and the uncertainty associated with the resources. Historically, declining metal prices point to the need for larger deposits over time. Sensitivity analysis demonstrates that the greatest opportunity for reducing uncertainty in assessments lies in lowering the uncertainty associated with tonnage estimates. Of all errors possible in assessments, those affecting tonnage estimates are by far the most important. Selecting the correct deposit model is the most important way of controlling errors because, given the dominance of tonnage, deposit models are the best known predictors of tonnage. In many large regions, much of the surface is covered with apparently barren rocks and sediments. Because many exposed mineral deposits are believed to have been found, a prime concern is the presence of possible mineralized rock under cover. Assessments of areas with resources under cover must rely on extrapolation from surrounding areas, new geologic maps of rocks under cover, or analogy with other well-explored areas that can be considered training tracts. Cover has a profound effect on uncertainty and on the methods and procedures of assessments because the geology is seldom known and geophysical methods typically have attenuated responses. Many earlier assessment methods were based on relationships of geochemical and geophysical variables to deposits learned from deposits exposed on the surface; these will need to be relearned for covered deposits. Mineral-deposit models are important in quantitative resource assessments for two reasons: (1) grades and tonnages of most deposit types are significantly different, and (2) deposit types are present in different geologic settings that can be identified from geologic maps. Mineral-deposit models are the keystone in combining the diverse geoscience information on geology, mineral occurrences, geophysics, and geochemistry used in resource assessments and mineral exploration. Grade and tonnage models and the development of quantitative descriptive, economic, and deposit density models will help reduce the uncertainty of these new assessments.

  3. The Terrestrial Investigation Model: A probabilistic risk assessment model for birds exposed to pesticides

    EPA Science Inventory

    One of the major recommendations of the National Academy of Science to the USEPA, NMFS and USFWS was to utilize probabilistic methods when assessing the risks of pesticides to federally listed endangered and threatened species. The Terrestrial Investigation Model (TIM, version 3....

  4. On the Estimation of Standard Errors in Cognitive Diagnosis Models

    ERIC Educational Resources Information Center

    Philipp, Michel; Strobl, Carolin; de la Torre, Jimmy; Zeileis, Achim

    2018-01-01

    Cognitive diagnosis models (CDMs) are an increasingly popular method to assess mastery or nonmastery of a set of fine-grained abilities in educational or psychological assessments. Several inference techniques are available to quantify the uncertainty of model parameter estimates, to compare different versions of CDMs, or to check model…

  5. REDUCING AMBIGUITY IN THE FUNCTIONAL ASSESSMENT OF PROBLEM BEHAVIOR

    PubMed Central

    Rooker, Griffin W.; DeLeon, Iser G.; Borrero, Carrie S. W.; Frank-Crawford, Michelle A.; Roscoe, Eileen M.

    2015-01-01

    Severe problem behavior (e.g., self-injury and aggression) remains among the most serious challenges for the habilitation of persons with intellectual disabilities and is a significant obstacle to community integration. The current standard of behavior analytic treatment for problem behavior in this population consists of a functional assessment and treatment model. Within that model, the first step is to assess the behavior–environment relations that give rise to and maintain problem behavior, a functional behavioral assessment. Conventional methods of assessing behavioral function include indirect, descriptive, and experimental assessments of problem behavior. Clinical investigators have produced a rich literature demonstrating the relative effectiveness for each method, but in clinical practice, each can produce ambiguous or difficult-to-interpret outcomes that may impede treatment development. This paper outlines potential sources of variability in assessment outcomes and then reviews the evidence on strategies for avoiding ambiguous outcomes and/or clarifying initially ambiguous results. The end result for each assessment method is a set of best practice guidelines, given the available evidence, for conducting the initial assessment. PMID:26236145

  6. Affordable non-traditional source data mining for context assessment to improve distributed fusion system robustness

    NASA Astrophysics Data System (ADS)

    Bowman, Christopher; Haith, Gary; Steinberg, Alan; Morefield, Charles; Morefield, Michael

    2013-05-01

    This paper describes methods to affordably improve the robustness of distributed fusion systems by opportunistically leveraging non-traditional data sources. Adaptive methods help find relevant data, create models, and characterize model quality. These methods can also measure the conformity of non-traditional data with fusion system products, including situation modeling and mission impact prediction. Non-traditional data can improve the quantity, quality, availability, timeliness, and diversity of the baseline fusion system sources and therefore can improve prediction and estimation accuracy and robustness at all levels of fusion. Techniques are described that automatically learn to characterize and search non-traditional contextual data, enabling operators to integrate the data with high-level fusion systems and ontologies. These techniques apply the extension of the Data Fusion & Resource Management Dual Node Network (DNN) technical architecture at Level 4. The DNN architecture supports effective assessment and management of the expanded portfolio of data sources, entities of interest, models, and algorithms, including data pattern discovery and context conformity. Affordable model-driven and data-driven data mining methods to discover unknown models from non-traditional and 'big data' sources are used to automatically learn entity behaviors and correlations with fusion products [14, 15]. This paper describes our context assessment software development and demonstrates context assessment of non-traditional data against an intelligence, surveillance, and reconnaissance fusion product based upon an IED POI workflow.

  7. Conceptual modeling for Prospective Health Technology Assessment.

    PubMed

    Gantner-Bär, Marion; Djanatliev, Anatoli; Prokosch, Hans-Ulrich; Sedlmayr, Martin

    2012-01-01

    Prospective Health Technology Assessment (ProHTA) is a new and innovative approach for analyzing and assessing new technologies, methods, and procedures in health care. Simulation is used to model innovations before the cost-intensive design and development phase, so that effects on patient care, the health care system, and health economics can be estimated. Generating simulation models requires a valid information base, for which conceptual modeling is most suitable. Methods and characteristics of simulation modeling, refined for the project, are combined in the ProHTA Conceptual Modeling Process, which was initially applied to acute ischemic stroke treatment in Germany. The project also aims to simulate other diseases and health care systems. ProHTA is an interdisciplinary research project within the Cluster of Excellence for Medical Technology - Medical Valley European Metropolitan Region Nuremberg (EMN), funded by the German Federal Ministry of Education and Research (BMBF), project grant No. 01EX1013B.

  8. Assessing the utility of frequency dependent nudging for reducing biases in biogeochemical models

    NASA Astrophysics Data System (ADS)

    Lagman, Karl B.; Fennel, Katja; Thompson, Keith R.; Bianucci, Laura

    2014-09-01

    Bias errors, resulting from inaccurate boundary and forcing conditions, incorrect model parameterization, etc. are a common problem in environmental models including biogeochemical ocean models. While it is important to correct bias errors wherever possible, it is unlikely that any environmental model will ever be entirely free of such errors. Hence, methods for bias reduction are necessary. A widely used technique for online bias reduction is nudging, where simulated fields are continuously forced toward observations or a climatology. Nudging is robust and easy to implement, but suppresses high-frequency variability and introduces artificial phase shifts. As a solution to this problem Thompson et al. (2006) introduced frequency dependent nudging where nudging occurs only in prescribed frequency bands, typically centered on the mean and the annual cycle. They showed this method to be effective for eddy resolving ocean circulation models. Here we add a stability term to the previous form of frequency dependent nudging which makes the method more robust for non-linear biological models. Then we assess the utility of frequency dependent nudging for biological models by first applying the method to a simple predator-prey model and then to a 1D ocean biogeochemical model. In both cases we only nudge in two frequency bands centered on the mean and the annual cycle, and then assess how well the variability in higher frequency bands is recovered. We evaluate the effectiveness of frequency dependent nudging in comparison to conventional nudging and find significant improvements with the former.
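    A toy illustration of the idea (not the authors' ocean-model code): the model-minus-observation difference is projected onto the mean and annual-harmonic band with a slow running filter, and only that band-limited difference is nudged, leaving higher-frequency variability untouched. The scalar dynamics, the bias, and every parameter value below are invented.

```python
# Toy sketch of frequency-dependent nudging: nudge only the mean and
# annual-cycle components of the model-minus-observation difference.
import numpy as np

dt, years = 1.0, 10                      # daily step, 10-year run
n = int(365 * years)
t = np.arange(n) * dt
w = 2.0 * np.pi / 365.0                  # annual frequency

rng = np.random.default_rng(2)
obs = 10.0 + 3.0 * np.sin(w * t) + rng.normal(0.0, 0.5, n)  # synthetic "truth"

gamma, tau_f = 0.05, 730.0               # nudging rate; filter memory (days)
alpha = dt / tau_f                       # EMA weight of the slow filter
m0 = mc = ms = 0.0                       # running mean / annual harmonics of u - obs
u, out = 8.0, np.empty(n)                # model state with an initial bias

for k in range(n):
    d = u - obs[k]
    # Slow running projection of the difference onto {1, cos, sin}:
    m0 += alpha * (d - m0)
    mc += alpha * (d * np.cos(w * t[k]) - mc)
    ms += alpha * (d * np.sin(w * t[k]) - ms)
    d_band = m0 + 2.0 * (mc * np.cos(w * t[k]) + ms * np.sin(w * t[k]))
    # Toy biased dynamics, plus nudging of the band-limited difference only:
    u += dt * (-(u - 12.0) / 50.0) - gamma * d_band
    out[k] = u

print("final-year mean bias:", round(float((out[-365:] - obs[-365:]).mean()), 2))
```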

  9. Using HEC-RAS to Enhance Interpretive Capabilities of Geomorphic Assessments

    NASA Astrophysics Data System (ADS)

    Keefer, L. L.

    2005-12-01

    The purpose of a geomorphic assessment is to characterize and evaluate a fluvial system, determining past watershed and channel conditions, current geomorphic character, and potential future channel adjustments. The geomorphic assessment approach utilized by the Illinois State Water Survey assesses channel response to disturbance at multiple temporal and spatial scales to help identify the underlying factors and events that led to the existing channel morphology. This is accomplished through two phases of investigation that involve a historical and physical analysis of the watershed, disturbance history, and field work at increasing levels of detail. To infer future channel adjustments, the geomorphic assessment protocol combines two methods of analysis whose use depends on the quantity and detail of the available data. The first method is the compilation of multiple lines of evidence using qualitative information related to the dominant fluvial environment, channel gradient, stream power thresholds, and channel evolution models. The second method is the use of hydraulic models, which provide additional interpretive capability for evaluating potential channel adjustments. The structured data collection framework of the geomorphic assessment approach is used for the development of a HEC-RAS model. The model results are then used to determine the influence of bridges and control structures on channel stability, to produce stream power profiles that identify potential channel bed degradation zones, and to provide data for physically based bank stability models. This poster will demonstrate the advantages of using a hydraulic model, such as HEC-RAS, to expand the interpretive capabilities of geomorphic assessments. The results from applying this approach will be demonstrated for the Big Creek watershed of the Cache River Basin in southern Illinois.

  10. Concurrent validity of different functional and neuroproteomic pain assessment methods in the rat osteoarthritis monosodium iodoacetate (MIA) model.

    PubMed

    Otis, Colombe; Gervais, Julie; Guillot, Martin; Gervais, Julie-Anne; Gauvin, Dominique; Péthel, Catherine; Authier, Simon; Dansereau, Marc-André; Sarret, Philippe; Martel-Pelletier, Johanne; Pelletier, Jean-Pierre; Beaudry, Francis; Troncy, Eric

    2016-06-23

    Lack of validity in osteoarthritis pain models and assessment methods is suspected. Our goals were to 1) assess the repeatability and reproducibility of measurement and the influence of environment and acclimatization on different pain assessment outcomes in normal rats, and 2) test the concurrent validity of the most reliable methods in relation to the expression of different spinal neuropeptides in a chemical model of osteoarthritic pain. Repeatability and inter-rater reliability of reflexive nociceptive mechanical thresholds, spontaneous static weight-bearing, treadmill, rotarod, and the operant place escape/avoidance paradigm (PEAP) were assessed by the intraclass correlation coefficient (ICC). The most reliable acclimatization protocol was determined by comparing coefficients of variation. In a pilot comparative study, the sensitivity and responsiveness to treatment of the most reliable methods were tested in the monosodium iodoacetate (MIA) model over 21 days. Two MIA (2 mg) groups (including one lidocaine treatment group) and one sham group (0.9% saline) received an intra-articular (50 μL) injection. No effect of environment (observer, inverted circadian cycle, or exercise) was observed; all tested methods except mechanical sensitivity (ICC <0.3) offered good repeatability (ICC ≥0.7). The most reliable acclimatization protocol included five assessments over two weeks. MIA-related osteoarthritic change in pain was demonstrated with static weight-bearing, punctate tactile allodynia evaluation, treadmill exercise and operant PEAP, the latter being the most responsive to analgesic intra-articular lidocaine. Substance P and calcitonin gene-related peptide were higher in MIA groups compared to naive (adjusted P (adj-P) = 0.016) or sham-treated (adj-P = 0.029) rats. Repeated post-MIA lidocaine injection resulted in 34 times lower downregulation of spinal substance P compared with MIA alone (adj-P = 0.029), with a concomitant increase of 17% in time spent on the PEAP dark side (indicative of increased comfort). This study of normal rats and rats with pain established the most reliable and sensitive pain assessment methods and an optimized acclimatization protocol. Operant PEAP testing was more responsive to lidocaine analgesia than the other tests used, while spinal neuropeptide concentration is an objective quantification method attractive for supporting and validating different centralized pain functional assessment methods.

  11. Taming Log Files from Game/Simulation-Based Assessments: Data Models and Data Analysis Tools. Research Report. ETS RR-16-10

    ERIC Educational Resources Information Center

    Hao, Jiangang; Smith, Lawrence; Mislevy, Robert; von Davier, Alina; Bauer, Malcolm

    2016-01-01

    Extracting information efficiently from game/simulation-based assessment (G/SBA) logs requires two things: a well-structured log file and a set of analysis methods. In this report, we propose a generic data model specified as an extensible markup language (XML) schema for the log files of G/SBAs. We also propose a set of analysis methods for…

  12. Dynamic building risk assessment theoretic model for rainstorm-flood utilization ABM and ABS

    NASA Astrophysics Data System (ADS)

    Lai, Wenze; Li, Wenbo; Wang, Hailei; Huang, Yingliang; Wu, Xuelian; Sun, Bingyun

    2015-12-01

    Flood is one of the natural disasters with the worst losses in the world, so assessing flood disaster risk is necessary for reducing those losses, and disaster management practice requires dynamic risk results at the building level. A rainstorm flood disaster system is a typical complex system: from the view of complex system theory, flood disaster risk is the interaction result of hazard-affected objects, rainstorm flood hazard factors, and hazard environments. Agent-based modeling (ABM) is an important tool for complex system modeling. This paper proposes a rainstorm-flood building risk dynamic assessment method (RFBRDAM) using ABM and designs the interior structures and procedures of the different agents in the proposed method. On the NetLogo platform, the proposed method was implemented to assess building risk changes during the rainstorm flood disaster in the Huaihe River Basin using agent-based simulation (ABS). The results indicated that the proposed method can dynamically assess building risk over the whole process of a rainstorm flood disaster. The results of this paper can provide a new approach for flood disaster building risk dynamic assessment and flood disaster management.
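    A minimal agent-based sketch of the general idea, with an invented structure far simpler than the paper's RFBRDAM agents: building agents accumulate damage as a simulated flood hydrograph passes, and aggregate risk is read off at each step.

```python
# Minimal ABM sketch (assumed structure, not the paper's model): building
# agents update a damage state as a simulated flood water level evolves.
import random

class BuildingAgent:
    def __init__(self, elevation_m, value):
        self.elevation_m = elevation_m   # ground elevation of the building
        self.value = value               # exposed asset value
        self.damage = 0.0                # accumulated damage fraction

    def step(self, water_level_m):
        depth = max(0.0, water_level_m - self.elevation_m)
        # Simple depth-damage rule: damage accumulates and saturates at 1.
        self.damage = min(1.0, self.damage + 0.1 * depth)

    def risk(self):
        return self.damage * self.value

random.seed(3)
agents = [BuildingAgent(random.uniform(0.0, 2.0), random.uniform(1e5, 5e5))
          for _ in range(100)]

# Drive the agents with a rising-then-falling storm hydrograph (meters).
for water_level in [0.5, 1.2, 2.0, 1.5, 0.8]:
    for a in agents:
        a.step(water_level)
    print(round(sum(a.risk() for a in agents) / 1e6, 2), "M$ at risk")
```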

  13. Assessing Graduate Attributes: Building a Criteria-Based Competency Model

    ERIC Educational Resources Information Center

    Ipperciel, Donald; ElAtia, Samira

    2014-01-01

    Graduate attributes (GAs) have become a necessary framework of reference for the 21st century competency-based model of higher education. However, the issue of evaluating and assessing GAs still remains unchartered territory. In this article, we present a criteria-based method of assessment that allows for an institution-wide comparison of the…

  14. A Model for Making Decisions about Ethical Dilemmas in Student Assessment

    ERIC Educational Resources Information Center

    Johnson, Robert L.; Liu, Jin; Burgess, Yin

    2017-01-01

    In this mixed-methods study we investigated the development of a generalized ethics decision-making model that can be applied in considering ethical dilemmas related to student assessment. For the study, we developed five scenarios that describe ethical dilemmas associated with student assessment. Survey participants (i.e., educators) completed an…

  15. Assessment of Matrix Multiplication Learning with a Rule-Based Analytical Model--"A Bayesian Network Representation"

    ERIC Educational Resources Information Center

    Zhang, Zhidong

    2016-01-01

    This study explored an alternative assessment procedure to examine learning trajectories of matrix multiplication. It took rule-based analytical and cognitive task analysis methods specifically to break down operation rules for a given matrix multiplication. Based on the analysis results, a hierarchical Bayesian network, an assessment model,…

  16. Ecosystem Model Skill Assessment. Yes We Can!

    PubMed

    Olsen, Erik; Fay, Gavin; Gaichas, Sarah; Gamble, Robert; Lucey, Sean; Link, Jason S

    2016-01-01

    Accelerated changes to global ecosystems call for holistic and integrated analyses of past, present and future states under various pressures to adequately understand current and projected future system states. Ecosystem models can inform management of human activities in a complex and changing environment, but are these models reliable? Ensuring that models are reliable for addressing management questions requires evaluating their skill in representing real-world processes and dynamics. Skill has been evaluated for just a limited set of some biophysical models. A range of skill assessment methods have been reviewed but skill assessment of full marine ecosystem models has not yet been attempted. We assessed the skill of the Northeast U.S. (NEUS) Atlantis marine ecosystem model by comparing 10-year model forecasts with observed data. Model forecast performance was compared to that obtained from a 40-year hindcast. Multiple metrics (average absolute error, root mean squared error, modeling efficiency, and Spearman rank correlation), and a suite of time-series (species biomass, fisheries landings, and ecosystem indicators) were used to adequately measure model skill. Overall, the NEUS model performed above average and thus better than expected for the key species that had been the focus of the model tuning. Model forecast skill was comparable to the hindcast skill, showing that model performance does not degenerate in a 10-year forecast mode, an important characteristic for an end-to-end ecosystem model to be useful for strategic management purposes. We identify best-practice approaches for end-to-end ecosystem model skill assessment that would improve both operational use of other ecosystem models and future model development. We show that it is possible to not only assess the skill of a complicated marine ecosystem model, but that it is necessary to do so to instill confidence in model results and encourage their use for strategic management. Our methods are applicable to any type of predictive model, and should be considered for use in fields outside ecology (e.g. economics, climate change, and risk assessment).

  17. Assessing groundwater vulnerability to agrichemical contamination in the Midwest US

    USGS Publications Warehouse

    Burkart, M.R.; Kolpin, D.W.; James, D.E.

    1999-01-01

    Agrichemicals (herbicides and nitrate) are significant sources of diffuse pollution to groundwater. Indirect methods are needed to assess the potential for groundwater contamination by diffuse sources because groundwater monitoring is too costly to adequately define the geographic extent of contamination at a regional or national scale. This paper presents examples of the application of statistical, overlay and index, and process-based modeling methods for groundwater vulnerability assessments to a variety of data from the Midwest U.S. The principles for vulnerability assessment include both intrinsic (pedologic, climatologic, and hydrogeologic factors) and specific (contaminant and other anthropogenic factors) vulnerability of a location. Statistical methods use the frequency of contaminant occurrence, contaminant concentration, or contamination probability as a response variable. Statistical assessments are useful for defining the relations among explanatory and response variables whether they define intrinsic or specific vulnerability. Multivariate statistical analyses are useful for ranking variables critical to estimating water quality responses of interest. Overlay and index methods involve intersecting maps of intrinsic and specific vulnerability properties and indexing the variables by applying appropriate weights. Deterministic models use process-based equations to simulate contaminant transport and are distinguished from the other methods in their potential to predict contaminant transport in both space and time. An example of a one-dimensional leaching model linked to a geographic information system (GIS) to define a regional metamodel for contamination in the Midwest is included.
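    The overlay-and-index family of methods is straightforward to sketch: rate each raster layer, weight it, and sum. The layer names and weights below echo DRASTIC-style indices but are invented for illustration, not taken from the paper.

```python
# Sketch of the overlay-and-index idea: intersect raster layers of intrinsic
# properties and combine them with weights into a vulnerability index.
# Layer names and weights are hypothetical.
import numpy as np

rng = np.random.default_rng(4)
shape = (50, 50)                                 # toy raster grid

layers = {                                       # each rescaled to 0-10 ratings
    "depth_to_water": rng.uniform(0, 10, shape),
    "net_recharge":   rng.uniform(0, 10, shape),
    "soil_media":     rng.uniform(0, 10, shape),
    "topography":     rng.uniform(0, 10, shape),
}
weights = {"depth_to_water": 5, "net_recharge": 4, "soil_media": 2, "topography": 1}

index = sum(weights[k] * layers[k] for k in layers)      # weighted overlay
index /= sum(weights.values())                           # normalize to 0-10

print("share of grid in the most vulnerable class:", float((index > 7.5).mean()))
```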

  18. Fuzzy-probabilistic model for risk assessment of radioactive material railway transportation.

    PubMed

    Avramenko, M; Bolyatko, V; Kosterev, V

    2005-01-01

    Transportation of radioactive materials is obviously accompanied by a certain risk. A model for risk assessment of emergency situations and terrorist attacks may be useful for choosing possible routes and for comparing the various defence strategies. In particular, risk assessment is crucial for safe transportation of excess weapons-grade plutonium arising from the removal of plutonium from military employment. A fuzzy-probabilistic model for risk assessment of railway transportation has been developed taking into account the different natures of risk-affecting parameters (probabilistic and not probabilistic but fuzzy). Fuzzy set theory methods as well as standard methods of probability theory have been used for quantitative risk assessment. Information-preserving transformations are applied to realise the correct aggregation of probabilistic and fuzzy parameters. Estimations have also been made of the inhalation doses resulting from possible accidents during plutonium transportation. The obtained data show the scale of possible consequences that may arise from plutonium transportation accidents.
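    One common way to mix probabilistic and fuzzy inputs, shown below as a hedged sketch rather than the authors' information-preserving transformations, is alpha-cut arithmetic: a crisp accident frequency scales a triangular fuzzy consequence, giving a fuzzy risk described by nested intervals. All numbers are invented.

```python
# Hedged sketch of combining probabilistic and fuzzy parameters via
# alpha-cuts of a triangular fuzzy number. Values are illustrative only.
import numpy as np

def tri_alpha_cut(a, b, c, alpha):
    """Interval of the triangular fuzzy number (a, b, c) at membership alpha."""
    return a + alpha * (b - a), c - alpha * (c - b)

p_accident = 1e-4                       # probabilistic: per-shipment frequency
consequence = (10.0, 50.0, 200.0)       # fuzzy expert estimate (person-Sv)

for alpha in np.linspace(0.0, 1.0, 5):
    lo, hi = tri_alpha_cut(*consequence, alpha)
    # Risk interval at this alpha level = frequency x consequence interval.
    print(f"alpha={alpha:.2f}: risk in [{p_accident*lo:.2e}, {p_accident*hi:.2e}]")
```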

  19. An Experimental Comparison of Similarity Assessment Measures for 3D Models on Constrained Surface Deformation

    NASA Astrophysics Data System (ADS)

    Quan, Lulin; Yang, Zhixin

    2010-05-01

    To address issues in the area of design customization, this paper describes the specification and application of constrained surface deformation and reports an experimental performance comparison of three prevailing similarity assessment algorithms on the constrained surface deformation domain. Constrained surface deformation is a promising method that supports various downstream applications of customized design. Similarity assessment is regarded as the key technology for inspecting the success of a new design: it measures the difference between the deformed new design and the initial sample model and indicates whether that difference is within the limitation. According to our theoretical analysis and pre-experiments, three similarity assessment algorithms are suitable for this domain: the shape histogram based method, the skeleton based method, and the U system moment based method. We analyze their basic functions and implementation methodologies in detail and run a series of experiments in various situations to test their accuracy and efficiency using precision-recall diagrams. A shoe model is chosen as the industrial example for the experiments. The shape histogram based method achieved the best performance in the comparison. Based on this result, we propose a novel approach that integrates surface constraints with the shape histogram description via an adaptive weighting method, emphasizing the role of constraints during the assessment. Limited initial experimental results demonstrate that our algorithm outperforms the other three. A clear direction for future development is drawn at the end of the paper.
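    The winning shape-histogram idea can be sketched in the spirit of the classic D2 shape distribution: sample pairwise distances between surface points, histogram them, and compare histograms. The random point clouds below stand in for the shoe models; the bin count and sampling sizes are assumptions.

```python
# Sketch of a shape-histogram similarity measure (D2-style distribution):
# histogram pairwise point distances on each model and compare histograms.
import numpy as np

def d2_histogram(points, bins=32, n_pairs=20000, seed=0):
    rng = np.random.default_rng(seed)
    i = rng.integers(0, len(points), n_pairs)
    j = rng.integers(0, len(points), n_pairs)
    d = np.linalg.norm(points[i] - points[j], axis=1)
    h, _ = np.histogram(d / d.max(), bins=bins, range=(0.0, 1.0))
    return h / h.sum()

def histogram_distance(h1, h2):
    return 0.5 * np.abs(h1 - h2).sum()           # L1 / total variation

rng = np.random.default_rng(5)
original = rng.normal(size=(2000, 3))            # stand-in for the sample model
deformed = original + 0.05 * rng.normal(size=original.shape)

score = histogram_distance(d2_histogram(original), d2_histogram(deformed))
print("dissimilarity:", round(float(score), 4))  # small => deformation acceptable
```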

  20. Methods to assess performance of models estimating risk of death in intensive care patients: a review.

    PubMed

    Cook, D A

    2006-04-01

    Models that estimate the probability of death of intensive care unit patients can be used to stratify patients according to the severity of their condition and to control for casemix and severity of illness. These models have been used for risk adjustment in quality monitoring, administration, management and research and as an aid to clinical decision making. Models such as the Mortality Prediction Model family, SAPS II, APACHE II, APACHE III and the organ system failure models provide estimates of the probability of in-hospital death of ICU patients. This review examines methods to assess the performance of these models. The key attributes of a model are discrimination (the accuracy of the ranking in order of probability of death) and calibration (the extent to which the model's prediction of probability of death reflects the true risk of death). These attributes should be assessed in existing models that predict the probability of patient mortality, and in any subsequent model that is developed for the purposes of estimating these probabilities. The literature contains a range of approaches for assessment which are reviewed and a survey of the methodologies used in studies of intensive care mortality models is presented. The systematic approach used by Standards for Reporting Diagnostic Accuracy provides a framework to incorporate these theoretical considerations of model assessment and recommendations are made for evaluation and presentation of the performance of models that estimate the probability of death of intensive care patients.
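    The two key attributes translate directly into code: discrimination is commonly summarized by the c-statistic (AUC), and calibration by a Hosmer-Lemeshow-type comparison of observed and expected deaths across risk strata. The sketch below uses simulated predictions and outcomes, not any particular ICU model.

```python
# Illustrative check of discrimination (AUC) and calibration
# (Hosmer-Lemeshow-type statistic) for a mortality prediction model.
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(6)
p = rng.uniform(0.01, 0.9, 500)                  # predicted death probabilities
y = (rng.uniform(size=500) < p).astype(int)      # simulated outcomes

# Discrimination: probability a death is ranked above a survival.
pos, neg = p[y == 1], p[y == 0]
auc = (pos[:, None] > neg[None, :]).mean() + 0.5 * (pos[:, None] == neg[None, :]).mean()

# Calibration: observed vs expected deaths within risk deciles.
edges = np.quantile(p, np.linspace(0, 1, 11))
hl = 0.0
for lo, hi in zip(edges[:-1], edges[1:]):
    m = (p >= lo) & (p < hi) if hi < edges[-1] else (p >= lo)
    o, e, n = y[m].sum(), p[m].sum(), m.sum()
    hl += (o - e) ** 2 / (e * (1 - e / n))
print(f"AUC={auc:.3f}  HL={hl:.1f}  p={chi2.sf(hl, df=8):.3f}")
```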

  1. A method to assess the allocation suitability of recreational activities: An economic approach

    NASA Astrophysics Data System (ADS)

    Wang, Hsiao-Lin

    1996-03-01

    Most existing planning methods focus on the development of a recreational area; less consideration is given to the allocation of recreational activities within a recreational area. Most existing research emphasizes the economic benefits of developing a recreational area; few authors have assessed the allocation suitability of recreational activities from an economic point of view. The purpose of this work was to develop a model that assesses the allocation suitability of recreational activities by applying cost-benefit analysis under a premise of ecological concern. The model was verified with a case study in Taiwan. We suggest that the proposed model should form a critical part of recreational planning.

  2. Assessing the accuracy and stability of variable selection methods for random forest modeling in ecology.

    PubMed

    Fox, Eric W; Hill, Ryan A; Leibowitz, Scott G; Olsen, Anthony R; Thornbrugh, Darren J; Weber, Marc H

    2017-07-01

    Random forest (RF) modeling has emerged as an important statistical learning method in ecology due to its exceptional predictive performance. However, for large and complex ecological data sets, there is limited guidance on variable selection methods for RF modeling. Typically, either a preselected set of predictor variables are used or stepwise procedures are employed which iteratively remove variables according to their importance measures. This paper investigates the application of variable selection methods to RF models for predicting probable biological stream condition. Our motivating data set consists of the good/poor condition of n = 1365 stream survey sites from the 2008/2009 National Rivers and Stream Assessment, and a large set (p = 212) of landscape features from the StreamCat data set as potential predictors. We compare two types of RF models: a full variable set model with all 212 predictors and a reduced variable set model selected using a backward elimination approach. We assess model accuracy using RF's internal out-of-bag estimate, and a cross-validation procedure with validation folds external to the variable selection process. We also assess the stability of the spatial predictions generated by the RF models to changes in the number of predictors and argue that model selection needs to consider both accuracy and stability. The results suggest that RF modeling is robust to the inclusion of many variables of moderate to low importance. We found no substantial improvement in cross-validated accuracy as a result of variable reduction. Moreover, the backward elimination procedure tended to select too few variables and exhibited numerous issues such as upwardly biased out-of-bag accuracy estimates and instabilities in the spatial predictions. We use simulations to further support and generalize results from the analysis of real data. A main purpose of this work is to elucidate issues of model selection bias and instability to ecologists interested in using RF to develop predictive models with large environmental data sets.
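    The backward-elimination procedure examined in the paper can be sketched as follows: fit a random forest, rank predictors by importance, drop the least important fraction, and repeat while tracking out-of-bag accuracy. Synthetic data replace the StreamCat predictors; the final comment mirrors the paper's warning about selection bias.

```python
# Hedged sketch of backward elimination for a random forest: repeatedly
# drop the least important predictors and track out-of-bag accuracy.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=600, n_features=60, n_informative=8,
                           random_state=0)
keep = np.arange(X.shape[1])

while len(keep) > 5:
    rf = RandomForestClassifier(n_estimators=300, oob_score=True,
                                random_state=0).fit(X[:, keep], y)
    print(f"{len(keep):3d} predictors  OOB accuracy={rf.oob_score_:.3f}")
    order = np.argsort(rf.feature_importances_)   # least important first
    keep = keep[order[len(keep) // 5:]]           # drop bottom 20 percent

# Caveat (as the paper stresses): the OOB accuracy above is internal to the
# selection loop; an external cross-validation, with selection redone inside
# each fold, is needed for an unbiased accuracy estimate.
```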

  3. Nondestructive testing methods to predict effect of degradation on wood : a critical assessment

    Treesearch

    J. Kaiserlik

    1978-01-01

    Results are reported for an assessment of methods for predicting strength of wood, wood-based, or related material. Research directly applicable to nondestructive strength prediction was very limited. In wood, strength prediction research is limited to vibration decay, wave attenuation, and multiparameter "degradation models." Nonwood methods with potential...

  4. Evaluating the implementation of confusion assessment method-intensive care unit using a quality improvement approach.

    PubMed

    Stewart, C; Bench, S

    2018-05-15

    Quality improvement (QI) is a way through which health care delivery can be made safer and more effective. Various models of quality improvement exist in health care today; these models can help guide and manage the process of introducing changes into clinical practice. The aim of this project was to implement a delirium assessment tool in three adult critical care units within the same hospital using a QI approach, with the objective of improving the identification and management of delirium. Using the Model for Improvement framework, a multidisciplinary working group was established. A delirium assessment tool was introduced via a series of educational initiatives, and new local guidelines on delirium assessment and management were produced for the multidisciplinary team. Audit data were collected at 6 weeks and 5 months post-implementation to evaluate compliance with the use of the tool across the three critical care units in a single London hospital. At 6 weeks, the tool was deemed to be used appropriately at 134 of a possible 202 assessment points, meaning that 60% of patients received timely assessment; 18% of patients were identified as delirious in the first audit. Five months later, only 95 of a possible 199 assessment points were being appropriately assessed (47%); however, a greater proportion of patients (32%) were identified as delirious. This project emphasizes the complexity of changing practice in a large, busy critical care centre. Despite an initial increase in delirium assessment, this was not sustained over time. The use of a QI model highlights the continuous process of embedding changes into clinical practice and the need to use a QI method that can address the challenging nature of modern health care. QI models guide changes in practice, and consideration should be given to the type of QI model used. © 2018 British Association of Critical Care Nurses.

  5. Assessment of liquefaction-induced hazards using Bayesian networks based on standard penetration test data

    NASA Astrophysics Data System (ADS)

    Tang, Xiao-Wei; Bai, Xu; Hu, Ji-Lei; Qiu, Jiang-Nan

    2018-05-01

    Liquefaction-induced hazards such as sand boils, ground cracks, settlement, and lateral spreading are responsible for considerable damage to engineering structures during major earthquakes. Presently, there is no effective empirical approach that can assess different liquefaction-induced hazards in one model. This is because of the uncertainties and complexity of the factors related to seismic liquefaction and liquefaction-induced hazards. In this study, Bayesian networks (BNs) are used to integrate multiple factors related to seismic liquefaction, sand boils, ground cracks, settlement, and lateral spreading into a model based on standard penetration test data. The constructed BN model can assess four different liquefaction-induced hazards together. In a case study, the BN method outperforms an artificial neural network and Ishihara and Yoshimine's simplified method in terms of accuracy, Brier score, recall, precision, and area under the curve (AUC) of the receiver operating characteristic (ROC). This demonstrates that the BN method is a good alternative tool for the risk assessment of liquefaction-induced hazards. Furthermore, the performance of the BN model in estimating liquefaction-induced hazards in Japan's 2011 Tōhoku earthquake confirms its correctness and reliability compared with the liquefaction potential index approach. The proposed BN model can also predict whether the soil becomes liquefied after an earthquake and can deduce the chain reaction process of liquefaction-induced hazards and perform backward reasoning. The assessment results from the proposed model provide informative guidelines for decision-makers to detect the damage state of a field following liquefaction.
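    A drastically simplified discrete Bayesian network, with an invented structure and invented conditional probabilities rather than the paper's SPT-based model, still shows the backward-reasoning capability the abstract highlights: observing a hazard (sand boils) updates the probability that liquefaction occurred.

```python
# Toy discrete Bayesian network with enumeration inference; structure and
# numbers are invented for illustration, not taken from the paper.
from itertools import product

p_shaking = {"strong": 0.3, "weak": 0.7}
p_soil    = {"loose": 0.4, "dense": 0.6}
p_liq = {("strong", "loose"): 0.8, ("strong", "dense"): 0.2,
         ("weak",   "loose"): 0.3, ("weak",   "dense"): 0.05}
p_boils_given_liq = {True: 0.7, False: 0.02}

def posterior_liq(boils_observed=True):
    """P(liquefaction | sand boils) by summing over all joint states."""
    num = den = 0.0
    for s, g, liq in product(p_shaking, p_soil, (True, False)):
        pl = p_liq[(s, g)] if liq else 1.0 - p_liq[(s, g)]
        pb = p_boils_given_liq[liq] if boils_observed else 1.0 - p_boils_given_liq[liq]
        joint = p_shaking[s] * p_soil[g] * pl * pb
        den += joint
        if liq:
            num += joint                 # accumulate liquefaction-true mass
    return num / den

print("P(liquefaction | sand boils) =", round(posterior_liq(True), 3))
```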

  6. Interactive Rapid Dose Assessment Model (IRDAM): reactor-accident assessment methods. Vol. 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poeton, R.W.; Moeller, M.P.; Laughlin, G.J.

    1983-05-01

    As part of the continuing emphasis on emergency preparedness, the US Nuclear Regulatory Commission (NRC) sponsored the development of a rapid dose assessment system by Pacific Northwest Laboratory (PNL). This system, the Interactive Rapid Dose Assessment Model (IRDAM), is a micro-computer based program for rapidly assessing the radiological impact of accidents at nuclear power plants. This document describes the technical bases for IRDAM including methods, models and assumptions used in calculations. IRDAM calculates whole body (5-cm depth) and infant thyroid doses at six fixed downwind distances between 500 and 20,000 meters. Radionuclides considered primarily consist of noble gases and radioiodines. In order to provide a rapid assessment capability consistent with the capacity of the Osborne-1 computer, certain simplifying approximations and assumptions are made. These are described, along with default values (assumptions used in the absence of specific input), in the text of this document. Two companion volumes to this one provide additional information on IRDAM. The User's Guide (NUREG/CR-3012, Volume 1) describes the setup and operation of equipment necessary to run IRDAM. Scenarios for Comparing Dose Assessment Models (NUREG/CR-3012, Volume 3) provides the results of calculations made by IRDAM and other models for specific accident scenarios.

  7. Marsupials from space: fluctuating asymmetry, geographical information systems and animal conservation

    PubMed Central

    Teixeira, Camila Palhares; Hirsch, André; Perini, Henrique; Young, Robert John

    2006-01-01

    We report the development of a new quantitative method for assessing the effects of anthropogenic impacts on living beings; this method allows us to assess actual impacts and to travel backwards in time to assess past impacts. In this method, we have combined data on fluctuating asymmetry (FA, a measure of environmental or genetic stress), using Didelphis albiventris as a model species, with geographical information systems data on environmental composition. Our results show that more impacted environments resulted in statistically higher levels of FA. Our method appears to be a useful and flexible conservation tool for assessing anthropogenic impacts. PMID:16627287

  8. Assessing Continuous Operator Workload With a Hybrid Scaffolded Neuroergonomic Modeling Approach.

    PubMed

    Borghetti, Brett J; Giametta, Joseph J; Rusnock, Christina F

    2017-02-01

    We aimed to predict operator workload from neurological data using statistical learning methods to fit neurological-to-state-assessment models. Adaptive systems require real-time mental workload assessment to perform dynamic task allocations or operator augmentation as workload issues arise. Neuroergonomic measures have great potential for informing adaptive systems, and we combine these measures with models of task demand as well as information about critical events and performance to clarify the inherent ambiguity of interpretation. We use machine learning algorithms on electroencephalogram (EEG) input to infer operator workload based upon Improved Performance Research Integration Tool workload model estimates. Cross-participant models predict workload of other participants, statistically distinguishing between 62% of the workload changes. Machine learning models trained from Monte Carlo resampled workload profiles can be used in place of deterministic workload profiles for cross-participant modeling without incurring a significant decrease in machine learning model performance, suggesting that stochastic models can be used when limited training data are available. We employed a novel temporary scaffold of simulation-generated workload profile truth data during the model-fitting process. A continuous workload profile serves as the target to train our statistical machine learning models. Once trained, the workload profile scaffolding is removed and the trained model is used directly on neurophysiological data in future operator state assessments. These modeling techniques demonstrate how to use neuroergonomic methods to develop operator state assessments, which can be employed in adaptive systems.
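    The cross-participant workflow can be sketched generically: extract band-power features from EEG epochs, train a classifier on one participant's data, and score it on another's. The simulated signals, the theta/alpha feature choice, and the logistic-regression model below are illustrative assumptions, not the study's EEG pipeline or its workload-model labels.

```python
# Hedged sketch of cross-participant workload classification from
# simulated EEG band-power features.
import numpy as np
from sklearn.linear_model import LogisticRegression

fs = 256                                          # sampling rate (Hz)

def band_power(epoch, lo, hi):
    """Mean spectral power of a 1-D epoch in the [lo, hi) Hz band."""
    spec = np.abs(np.fft.rfft(epoch)) ** 2
    freqs = np.fft.rfftfreq(epoch.size, d=1.0 / fs)
    return spec[(freqs >= lo) & (freqs < hi)].mean()

def features(epochs):
    # Theta (4-8 Hz) and alpha (8-13 Hz) power are common workload markers.
    return np.array([[band_power(e, 4, 8), band_power(e, 8, 13)] for e in epochs])

rng = np.random.default_rng(7)
def fake_participant(n=80):
    y = rng.integers(0, 2, n)                     # 0 = low, 1 = high workload
    t = np.arange(fs * 2) / fs                    # 2-second epochs
    X = np.array([np.sin(2 * np.pi * 6 * t) * (1 + yi) + rng.normal(0, 1, t.size)
                  for yi in y])                   # stronger theta when workload high
    return features(X), y

X_tr, y_tr = fake_participant()                   # "training" participant
X_te, y_te = fake_participant()                   # held-out participant
clf = LogisticRegression().fit(X_tr, y_tr)
print("cross-participant accuracy:", clf.score(X_te, y_te))
```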

  9. Numerical Exposure Assessment Method for Low Frequency Range and Application to Wireless Power Transfer.

    PubMed

    Park, SangWook; Kim, Minhyuk

    2016-01-01

    In this paper, a numerical exposure assessment method is presented for a quasi-static analysis by the use of finite-difference time-domain (FDTD) algorithm. The proposed method is composed of scattered field FDTD method and quasi-static approximation for analyzing of the low frequency band electromagnetic problems. The proposed method provides an effective tool to compute induced electric fields in an anatomically realistic human voxel model exposed to an arbitrary non-uniform field source in the low frequency ranges. The method is verified, and excellent agreement with theoretical solutions is found for a dielectric sphere model exposed to a magnetic dipole source. The assessment method serves a practical example of the electric fields, current densities, and specific absorption rates induced in a human head and body in close proximity to a 150-kHz wireless power transfer system for cell phone charging. The results are compared to the limits recommended by the International Commission on Non-Ionizing Radiation Protection (ICNIRP) and the IEEE standard guidelines.
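    As background for readers unfamiliar with the algorithm family, here is a minimal free-space 1-D FDTD (Yee) update loop. The paper's scattered-field formulation, quasi-static approximation, and anatomical voxel model are well beyond this toy, which only shows the leapfrog field updates and a soft source.

```python
# Generic 1-D FDTD (Yee) sketch: leapfrog E/H updates with a soft source.
import numpy as np

c0, nz, nt = 3e8, 400, 800
dz = 1e-3                       # 1 mm cells
dt = dz / (2 * c0)              # satisfies the Courant stability limit
ez = np.zeros(nz)               # electric field on integer grid points
hy = np.zeros(nz - 1)           # magnetic field, staggered half a cell

mu0, eps0 = 4e-7 * np.pi, 8.854e-12

for n in range(nt):
    hy += dt / (mu0 * dz) * (ez[1:] - ez[:-1])        # Faraday update
    ez[1:-1] += dt / (eps0 * dz) * (hy[1:] - hy[:-1]) # Ampere update
    ez[nz // 4] += np.exp(-((n - 60) / 20.0) ** 2)    # soft Gaussian source

print("peak |Ez| after propagation:", float(np.abs(ez).max()))
```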

  10. Numerical Exposure Assessment Method for Low Frequency Range and Application to Wireless Power Transfer

    PubMed Central

    Kim, Minhyuk

    2016-01-01

    In this paper, a numerical exposure assessment method is presented for a quasi-static analysis by the use of finite-difference time-domain (FDTD) algorithm. The proposed method is composed of scattered field FDTD method and quasi-static approximation for analyzing of the low frequency band electromagnetic problems. The proposed method provides an effective tool to compute induced electric fields in an anatomically realistic human voxel model exposed to an arbitrary non-uniform field source in the low frequency ranges. The method is verified, and excellent agreement with theoretical solutions is found for a dielectric sphere model exposed to a magnetic dipole source. The assessment method serves a practical example of the electric fields, current densities, and specific absorption rates induced in a human head and body in close proximity to a 150-kHz wireless power transfer system for cell phone charging. The results are compared to the limits recommended by the International Commission on Non-Ionizing Radiation Protection (ICNIRP) and the IEEE standard guidelines. PMID:27898688

  11. Accuracy and precision of polyurethane dental arch models fabricated using a three-dimensional subtractive rapid prototyping method with an intraoral scanning technique.

    PubMed

    Kim, Jae-Hong; Kim, Ki-Baek; Kim, Woong-Chul; Kim, Ji-Hwan; Kim, Hae-Young

    2014-03-01

    This study aimed to evaluate the accuracy and precision of polyurethane (PUT) dental arch models fabricated using a three-dimensional (3D) subtractive rapid prototyping (RP) method with an intraoral scanning technique by comparing linear measurements obtained from PUT models and conventional plaster models. Ten plaster models were duplicated using a selected standard master model and conventional impression, and 10 PUT models were duplicated using the 3D subtractive RP technique with an oral scanner. Six linear measurements were evaluated in terms of x, y, and z-axes using a non-contact white light scanner. Accuracy was assessed using mean differences between two measurements, and precision was examined using four quantitative methods and the Bland-Altman graphical method. Repeatability was evaluated in terms of intra-examiner variability, and reproducibility was assessed in terms of inter-examiner and inter-method variability. The mean difference between plaster models and PUT models ranged from 0.07 mm to 0.33 mm. Relative measurement errors ranged from 2.2% to 7.6% and intraclass correlation coefficients ranged from 0.93 to 0.96, when comparing plaster models and PUT models. The Bland-Altman plot showed good agreement. The accuracy and precision of PUT dental models for evaluating the performance of oral scanner and subtractive RP technology was acceptable. Because of the recent improvements in block material and computerized numeric control milling machines, the subtractive RP method may be a good choice for dental arch models.
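    The agreement analysis described above can be sketched with the Bland-Altman quantities: the mean difference (bias) and the 95% limits of agreement for paired measurements from the two model types. The six paired values below are invented, not the study's data.

```python
# Sketch of a Bland-Altman agreement analysis for paired measurements.
import numpy as np

plaster = np.array([35.2, 28.9, 41.7, 33.0, 25.4, 38.8])   # mm, hypothetical
put     = np.array([35.4, 29.2, 41.5, 33.3, 25.9, 39.0])

diff = put - plaster
bias = diff.mean()                       # systematic offset (accuracy)
loa  = 1.96 * diff.std(ddof=1)           # half-width of 95% limits of agreement
print(f"bias = {bias:.2f} mm, limits of agreement = "
      f"[{bias - loa:.2f}, {bias + loa:.2f}] mm")
```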

  12. Data collection costs in industrial environments for three occupational posture exposure assessment methods

    PubMed Central

    2012-01-01

    Background: Documentation of posture measurement costs is rare, and the cost models that do exist are generally naïve. This paper provides a comprehensive cost model for biomechanical exposure assessment in occupational studies, documents the monetary costs of three exposure assessment methods for different stakeholders in data collection, and uses simulations to evaluate the relative importance of cost components. Methods: Trunk and shoulder posture variables were assessed for 27 aircraft baggage handlers for 3 full shifts each using three methods typical of ergonomic studies: self-report via questionnaire, observation via video film, and full-shift inclinometer registration. The cost model accounted for expenses related to meetings to plan the study, administration, recruitment, equipment, training of data collectors, travel, and onsite data collection. Sensitivity analyses were conducted using simulated study parameters and cost components to investigate the impact on total study cost. Results: Inclinometry was the most expensive method (with a total study cost of €66,657), followed by observation (€55,369) and then self-report (€36,865). The majority of costs (90%) were borne by researchers. Study design parameters such as sample size, measurement scheduling and spacing, concurrent measurements, location and travel, and equipment acquisition were shown to have wide-ranging impacts on costs. Conclusions: This study provided a general cost modeling approach that can facilitate decision making and planning of data collection in future studies, as well as investigation into cost efficiency and cost-efficient study design. Empirical cost data from a large field study demonstrated the usefulness of the proposed models. PMID:22738341
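    A hedged sketch of such a cost model: total study cost as fixed method costs plus per-subject and per-shift components, with a Monte Carlo sweep over design parameters to probe sensitivity. Every component cost below is invented; only the structure (fixed plus variable terms) follows the description.

```python
# Illustrative study-cost model with a Monte Carlo sensitivity sweep.
# All euro amounts are invented placeholders, not the paper's figures.
import numpy as np

def study_cost(n_subjects, shifts_per_subject, method):
    fixed = {"self-report": 4000, "observation": 9000, "inclinometer": 15000}
    per_shift = {"self-report": 30, "observation": 250, "inclinometer": 180}
    travel_per_subject = 120
    return (fixed[method]
            + n_subjects * travel_per_subject
            + n_subjects * shifts_per_subject * per_shift[method])

rng = np.random.default_rng(8)
for method in ("self-report", "observation", "inclinometer"):
    # Vary sample size and repeat measurements to see cost sensitivity.
    costs = [study_cost(rng.integers(15, 40), rng.integers(1, 4), method)
             for _ in range(1000)]
    print(f"{method:12s} median cost ~ EUR {int(np.median(costs))}")
```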

  13. Goodness-of-Fit Assessment of Item Response Theory Models

    ERIC Educational Resources Information Center

    Maydeu-Olivares, Alberto

    2013-01-01

    The article provides an overview of goodness-of-fit assessment methods for item response theory (IRT) models. It is now possible to obtain accurate "p"-values of the overall fit of the model if bivariate information statistics are used. Several alternative approaches are described. As the validity of inferences drawn on the fitted model…

  14. Bayes Nets in Educational Assessment: Where Do the Numbers Come from? CSE Technical Report.

    ERIC Educational Resources Information Center

    Mislevy, Robert J.; Almond, Russell G.; Yan, Duanli; Steinberg, Linda S.

    Educational assessments that exploit advances in technology and cognitive psychology can produce observations and pose student models that outstrip familiar test-theoretic models and analytic methods. Bayesian inference networks (BINs), which include familiar models and techniques as special cases, can be used to manage belief about students'…

  15. [Health assessment and economic assessment in health: introduction to the debate on the points of intersection].

    PubMed

    Sancho, Leyla Gomes; Dain, Sulamis

    2012-03-01

    The study aims to infer the existence of a continuum between Health Assessment and Economic Assessment in Health, by highlighting points of intersection of these forms of appraisal. To achieve this, a review of the theoretical foundations, methods and approaches of both forms of assessment was conducted. It was based on the theoretical model of health evaluation as reported by Hartz et al and economic assessment in health approaches reported by Brouwer et al. It was seen that there is a continuum between the theoretical model of evaluative research and the extrawelfarist approach for economic assessment in health, and between the normative theoretical model for health assessment and the welfarist approaches for economic assessment in health. However, in practice the assessment is still conducted using the normative theoretical model and with a welfarist approach.

  16. New parsimonious simulation methods and tools to assess future food and environmental security of farm populations

    PubMed Central

    Antle, John M.; Stoorvogel, Jetse J.; Valdivia, Roberto O.

    2014-01-01

    This article presents conceptual and empirical foundations for new parsimonious simulation models that are being used to assess future food and environmental security of farm populations. The conceptual framework integrates key features of the biophysical and economic processes on which the farming systems are based. The approach represents a methodological advance by coupling important behavioural processes, for example, self-selection in adaptive responses to technological and environmental change, with aggregate processes, such as changes in market supply and demand conditions or environmental conditions such as climate. Suitable biophysical and economic data are a critical limiting factor in modelling these complex systems, particularly for the characterization of out-of-sample counterfactuals in ex ante analyses. Parsimonious, population-based simulation methods are described that exploit available observational, experimental, modelled and expert data. The analysis makes use of a new scenario design concept called representative agricultural pathways. A case study illustrates how these methods can be used to assess food and environmental security. The concluding section addresses generalizations of parametric forms and linkages of regional models to global models. PMID:24535388

  17. New parsimonious simulation methods and tools to assess future food and environmental security of farm populations.

    PubMed

    Antle, John M; Stoorvogel, Jetse J; Valdivia, Roberto O

    2014-04-05

    This article presents conceptual and empirical foundations for new parsimonious simulation models that are being used to assess future food and environmental security of farm populations. The conceptual framework integrates key features of the biophysical and economic processes on which the farming systems are based. The approach represents a methodological advance by coupling important behavioural processes, for example, self-selection in adaptive responses to technological and environmental change, with aggregate processes, such as changes in market supply and demand conditions or environmental conditions such as climate. Suitable biophysical and economic data are a critical limiting factor in modelling these complex systems, particularly for the characterization of out-of-sample counterfactuals in ex ante analyses. Parsimonious, population-based simulation methods are described that exploit available observational, experimental, modelled and expert data. The analysis makes use of a new scenario design concept called representative agricultural pathways. A case study illustrates how these methods can be used to assess food and environmental security. The concluding section addresses generalizations of parametric forms and linkages of regional models to global models.

  18. A Comparison of Fuzzy Models in Similarity Assessment of Misregistered Area Class Maps

    NASA Astrophysics Data System (ADS)

    Brown, Scott

    Spatial uncertainty refers to unknown error and vagueness in geographic data. It is relevant to land change and urban growth modelers, soil and biome scientists, geological surveyors and others, who must assess thematic maps for similarity, or categorical agreement. In this paper I build upon prior map comparison research, testing the effectiveness of similarity measures on misregistered data. Though several methods compare uncertain thematic maps, few methods have been tested on misregistration. My objective is to test five map comparison methods for sensitivity to misregistration, including sub-pixel errors in both position and rotation. The methods included four fuzzy categorical models: the fuzzy kappa model, fuzzy inference, cell aggregation, and the epsilon band. The fifth method used conventional crisp classification. I applied these methods to a case study map and simulated data in two sets: a test set with misregistration error, and a control set with equivalent uniform random error. For all five methods, I used raw accuracy or the kappa statistic to measure similarity. Rough-set epsilon bands report the largest similarity increase in test maps relative to control data. Conversely, the fuzzy inference model reports a decrease in test map similarity.
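
    For the crisp baseline, the raw accuracy and kappa statistic mentioned above follow directly from a confusion matrix of two co-registered maps. A minimal sketch, assuming integer class rasters of equal shape; the fuzzy models themselves are not reproduced here.

    ```python
    import numpy as np

    def kappa(map_a, map_b, n_classes):
        """Cohen's kappa between two co-registered categorical rasters."""
        a = np.asarray(map_a).ravel()
        b = np.asarray(map_b).ravel()
        # Confusion matrix between the two maps, normalized to proportions.
        cm = np.zeros((n_classes, n_classes))
        np.add.at(cm, (a, b), 1)
        cm /= cm.sum()
        observed = np.trace(cm)                             # raw accuracy
        expected = (cm.sum(axis=1) * cm.sum(axis=0)).sum()  # chance agreement
        return (observed - expected) / (1.0 - expected)

    rng = np.random.default_rng(0)
    m1 = rng.integers(0, 4, size=(100, 100))
    # Second map: 80% agreement with the first, the rest random noise.
    m2 = np.where(rng.random((100, 100)) < 0.8, m1,
                  rng.integers(0, 4, size=(100, 100)))
    print(f"kappa = {kappa(m1, m2, n_classes=4):.3f}")
    ```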

  19. Alignment of Standards and Assessment: A Theoretical and Empirical Study of Methods for Alignment

    ERIC Educational Resources Information Center

    Nasstrom, Gunilla; Henriksson, Widar

    2008-01-01

    Introduction: In a standards-based school-system alignment of policy documents with standards and assessment is important. To be able to evaluate whether schools and students have reached the standards, the assessment should focus on the standards. Different models and methods can be used for measuring alignment, i.e. the correspondence between…

  20. An Assessment of the Department of Education's Approach and Model for Analyzing Lender Profitability.

    ERIC Educational Resources Information Center

    Jenkins, Sarah; And Others

    An assessment was done of the Department of Education's (ED) approach to determining lender profitability for Guaranteed Student Loans. The assessment described the current net present value (NPV) method as well as discussing its strengths and weaknesses. The NPV method has been widely accepted for determining the profitability of different…

  1. Model Selection for Monitoring CO2 Plume during Sequestration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2014-12-31

    The model selection method developed as part of this project mainly includes four steps: (1) assessing the connectivity/dynamic characteristics of a large prior ensemble of models, (2) model clustering using multidimensional scaling coupled with k-means clustering, (3) model selection using Bayes' rule in the reduced model space, (4) model expansion using iterative resampling of the posterior models. The fourth step expresses one of the advantages of the method: it provides a built-in means of quantifying the uncertainty in predictions made with the selected models. In our application to plume monitoring, by expanding the posterior space of models, the final ensemble of representations of the geological model can be used to assess the uncertainty in predicting the future displacement of the CO2 plume. The software implementation of this approach is attached here.
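
    A minimal sketch of step (2), assuming a precomputed dissimilarity matrix between ensemble members; the distance metric, cluster count, and representative-selection rule are placeholder choices, and the Bayesian selection and resampling steps are only indicated in comments.

    ```python
    import numpy as np
    from sklearn.manifold import MDS
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(1)
    # Stand-in for a dissimilarity matrix between prior models, e.g.
    # distances between simulated plume responses (the project's actual
    # metric is not reproduced here).
    n_models = 200
    features = rng.normal(size=(n_models, 5))
    d = np.linalg.norm(features[:, None, :] - features[None, :, :], axis=-1)

    # Step 2: embed the models with multidimensional scaling, then group
    # them with k-means in the reduced space.
    embedding = MDS(n_components=2, dissimilarity="precomputed",
                    random_state=0).fit_transform(d)
    km = KMeans(n_clusters=8, n_init=10, random_state=0).fit(embedding)

    # One representative per cluster (closest to its centroid) would then
    # be carried forward to the Bayesian selection step (3).
    for k in range(8):
        members = np.flatnonzero(km.labels_ == k)
        medoid = members[np.argmin(np.linalg.norm(
            embedding[members] - km.cluster_centers_[k], axis=1))]
        print(f"cluster {k}: {members.size} models, representative #{medoid}")
    ```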

  2. Accuracy and precision of polyurethane dental arch models fabricated using a three-dimensional subtractive rapid prototyping method with an intraoral scanning technique

    PubMed Central

    Kim, Jae-Hong; Kim, Ki-Baek; Kim, Woong-Chul; Kim, Ji-Hwan

    2014-01-01

    Objective This study aimed to evaluate the accuracy and precision of polyurethane (PUT) dental arch models fabricated using a three-dimensional (3D) subtractive rapid prototyping (RP) method with an intraoral scanning technique by comparing linear measurements obtained from PUT models and conventional plaster models. Methods Ten plaster models were duplicated using a selected standard master model and conventional impression, and 10 PUT models were duplicated using the 3D subtractive RP technique with an oral scanner. Six linear measurements were evaluated in terms of x, y, and z-axes using a non-contact white light scanner. Accuracy was assessed using mean differences between two measurements, and precision was examined using four quantitative methods and the Bland-Altman graphical method. Repeatability was evaluated in terms of intra-examiner variability, and reproducibility was assessed in terms of inter-examiner and inter-method variability. Results The mean difference between plaster models and PUT models ranged from 0.07 mm to 0.33 mm. Relative measurement errors ranged from 2.2% to 7.6% and intraclass correlation coefficients ranged from 0.93 to 0.96, when comparing plaster models and PUT models. The Bland-Altman plot showed good agreement. Conclusions The accuracy and precision of PUT dental models for evaluating the performance of the oral scanner and subtractive RP technology were acceptable. Because of the recent improvements in block material and computerized numeric control milling machines, the subtractive RP method may be a good choice for dental arch models. PMID:24696823

  3. Assessing Security of Supply: Three Methods Used in Finland

    NASA Astrophysics Data System (ADS)

    Sivonen, Hannu

    Public Private Partnership (PPP) has an important role in securing supply in Finland. Three methods are used in assessing the level of security of supply. First, in national expert groups, a linear mathematical model has been used. The model is based on interdependency estimates. It ranks societal functions or its more detailed components, such as items in the food supply chain, according to the effect and risk pertinent to the interdependencies. Second, the security of supply is assessed in industrial branch committees (clusters and pools) in the form of indicators. The level of security of supply is assessed against five generic factors (dimension 1) and tens of business branch specific functions (dimension 2). Third, in two thousand individual critical companies, the maturity of operational continuity management is assessed using Capability Maturity Model (CMM) in an extranet application. The pool committees and authorities obtain an anonymous summary. The assessments are used in allocating efforts for securing supply. The efforts may be new instructions, training, exercising, and in some cases, investment and regulation.

  4. Airframe noise prediction evaluation

    NASA Technical Reports Server (NTRS)

    Yamamoto, Kingo J.; Donelson, Michael J.; Huang, Shumei C.; Joshi, Mahendra C.

    1995-01-01

    The objective of this study is to evaluate the accuracy and adequacy of current airframe noise prediction methods using available airframe noise measurements from tests of a narrow body transport (DC-9) and a wide body transport (DC-10) in addition to scale model test data. General features of the airframe noise from these aircraft and models are outlined. The results of the assessment of two airframe prediction methods, Fink's and Munson's methods, against flight test data of these aircraft and scale model wind tunnel test data are presented. These methods were extensively evaluated against measured data from several configurations including clean, slat deployed, landing gear-deployed, flap deployed, and landing configurations of both DC-9 and DC-10. They were also assessed against a limited number of configurations of scale models. The evaluation was conducted in terms of overall sound pressure level (OASPL), tone corrected perceived noise level (PNLT), and one-third-octave band sound pressure level (SPL).
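
    One of the reported metrics, OASPL, is simply the energy sum of the band levels; the sketch below shows that conversion from one-third-octave band SPLs. PNLT, which additionally requires noy weighting and tone correction, is not attempted here, and the band values are illustrative.

    ```python
    import numpy as np

    def oaspl(band_spl_db):
        """Overall sound pressure level from one-third-octave band SPLs (dB):
        convert each band to energy, sum, and convert back to decibels."""
        return 10.0 * np.log10(np.sum(10.0 ** (np.asarray(band_spl_db) / 10.0)))

    # Example: a flat spectrum of 24 one-third-octave bands at 80 dB each.
    bands = np.full(24, 80.0)
    print(f"OASPL = {oaspl(bands):.1f} dB")   # 80 + 10*log10(24) ~ 93.8 dB
    ```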

  5. Quantification for complex assessment: uncertainty estimation in final year project thesis assessment

    NASA Astrophysics Data System (ADS)

    Kim, Ho Sung

    2013-12-01

    A quantitative method was developed for estimating the expected uncertainty (reliability and validity) in assessment results arising from the relativity between four variables, viz. the examiner's expertise, the examinee's achieved expertise, the assessment task difficulty, and the examinee's performance; it is applicable to complex assessment such as final year project thesis assessment, including peer assessment. A guide map can be generated by the method for finding expected uncertainties prior to the assessment implementation with a given set of variables. It employs a scale for visualisation of expertise levels, the derivation of which is based on quantified clarities of mental images for levels of the examiner's expertise and the examinee's achieved expertise. To identify the relevant expertise areas that depend on the complexity of the assessment format, a graphical continuum model was developed. The continuum model consists of assessment task, assessment standards and criterion for the transition towards complex assessment owing to the relativity between implicitness and explicitness, and is capable of identifying the areas of expertise required for scale development.

  6. Perceptual video quality assessment in H.264 video coding standard using objective modeling.

    PubMed

    Karthikeyan, Ramasamy; Sainarayanan, Gopalakrishnan; Deepa, Subramaniam Nachimuthu

    2014-01-01

    Since usage of digital video is widespread nowadays, quality considerations have become essential, and industry demand for video quality measurement is rising. This proposal provides a method of perceptual quality assessment for the H.264 standard encoder using objective modeling. For this purpose, quality impairments are calculated and a model is developed to compute the perceptual video quality metric based on a no-reference method. Because of the subtle differences between the original video and the encoded video, the quality of the encoded picture is degraded; this quality difference is introduced by encoding processes such as intra- and inter-prediction. The proposed model takes into account the artifacts introduced by these spatial and temporal activities in hybrid block-based coding methods, and an objective modeling of these artifacts into a subjective quality estimate is proposed. The proposed model calculates the objective quality metric using subjective impairments (blockiness, blur and jerkiness) rather than the bitrate-only calculation defined in the ITU-T G.1070 model. The accuracy of the proposed perceptual video quality metrics is compared against popular full-reference objective methods as defined by VQEG.
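
    The sketch below illustrates the general idea of pooling spatial and temporal impairment estimates into a single no-reference score. The detectors and the linear weights are simplified stand-ins, not the paper's calibrated model.

    ```python
    import numpy as np

    def blockiness(frame, block=8):
        """Mean luminance jump across 8-pixel block boundaries minus the
        mean jump elsewhere; larger values indicate visible block edges."""
        d = np.abs(np.diff(frame.astype(float), axis=1))
        at_edges = d[:, block - 1::block].mean()
        elsewhere = np.delete(d, np.s_[block - 1::block], axis=1).mean()
        return at_edges - elsewhere

    def blur(frame):
        """Inverse sharpness: low Laplacian variance suggests a blurred frame."""
        f = frame.astype(float)
        lap = (np.roll(f, 1, 0) + np.roll(f, -1, 0) +
               np.roll(f, 1, 1) + np.roll(f, -1, 1) - 4 * f)
        return 1.0 / (1.0 + lap.var())

    def jerkiness(frames):
        """Proportion of near-identical consecutive frames (frame repeats)."""
        diffs = [np.abs(b.astype(float) - a.astype(float)).mean()
                 for a, b in zip(frames, frames[1:])]
        return np.mean([d < 0.5 for d in diffs])

    def nr_quality(frames, w=(0.4, 0.4, 0.2)):
        """Hypothetical linear pooling of the three impairments; the weights
        are arbitrary, not the calibrated values from the paper."""
        b = np.mean([blockiness(f) for f in frames])
        u = np.mean([blur(f) for f in frames])
        j = jerkiness(frames)
        return w[0] * b + w[1] * u + w[2] * j

    rng = np.random.default_rng(0)
    frames = [rng.integers(0, 256, (64, 64)) for _ in range(10)]
    print(f"score: {nr_quality(frames):.3f}")
    ```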

  7. Predictive models for Escherichia coli concentrations at inland lake beaches and relationship of model variables to pathogen detection

    EPA Science Inventory

    Methods are needed to improve the timeliness and accuracy of recreational water-quality assessments. Traditional culture methods require 18–24 h to obtain results and may not reflect current conditions. Predictive models, based on environmental and water quality variables, have been...

  8. Fold assessment for comparative protein structure modeling.

    PubMed

    Melo, Francisco; Sali, Andrej

    2007-11-01

    Accurate and automated assessment of both geometrical errors and incompleteness of comparative protein structure models is necessary for an adequate use of the models. Here, we describe a composite score for discriminating between models with the correct and incorrect fold. To find an accurate composite score, we designed and applied a genetic algorithm method that searched for a most informative subset of 21 input model features as well as their optimized nonlinear transformation into the composite score. The 21 input features included various statistical potential scores, stereochemistry quality descriptors, sequence alignment scores, geometrical descriptors, and measures of protein packing. The optimized composite score was found to depend on (1) a statistical potential z-score for residue accessibilities and distances, (2) model compactness, and (3) percentage sequence identity of the alignment used to build the model. The accuracy of the composite score was compared with the accuracy of assessment by single and combined features as well as by other commonly used assessment methods. The testing set was representative of models produced by automated comparative modeling on a genomic scale. The composite score performed better than any other tested score in terms of the maximum correct classification rate (i.e., 3.3% false positives and 2.5% false negatives) as well as the sensitivity and specificity across the whole range of thresholds. The composite score was implemented in our program MODELLER-8 and was used to assess models in the MODBASE database that contains comparative models for domains in approximately 1.3 million protein sequences.

  9. Cross-validation pitfalls when selecting and assessing regression and classification models.

    PubMed

    Krstajic, Damjan; Buturovic, Ljubomir J; Leahy, David E; Thomas, Simon

    2014-03-29

    We address the problem of selecting and assessing classification and regression models using cross-validation. Current state-of-the-art methods can yield models with high variance, rendering them unsuitable for a number of practical applications including QSAR. In this paper we describe and evaluate best practices which improve reliability and increase confidence in selected models. A key operational component of the proposed methods is cloud computing which enables routine use of previously infeasible approaches. We describe in detail an algorithm for repeated grid-search V-fold cross-validation for parameter tuning in classification and regression, and we define a repeated nested cross-validation algorithm for model assessment. As regards variable selection and parameter tuning we define two algorithms (repeated grid-search cross-validation and double cross-validation), and provide arguments for using the repeated grid-search in the general case. We show results of our algorithms on seven QSAR datasets. The variation of the prediction performance, which is the result of choosing different splits of the dataset in V-fold cross-validation, needs to be taken into account when selecting and assessing classification and regression models. We demonstrate the importance of repeating cross-validation when selecting an optimal model, as well as the importance of repeating nested cross-validation when assessing a prediction error.
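
    A compact sketch of the two proposed procedures, repeated grid-search cross-validation for tuning nested inside a repeated outer loop for assessment, using scikit-learn. The learner, grid, and repeat counts are arbitrary choices for illustration, not the paper's QSAR setup.

    ```python
    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.model_selection import GridSearchCV, KFold, cross_val_score
    from sklearn.svm import SVR

    X, y = make_regression(n_samples=200, n_features=20, noise=10.0,
                           random_state=0)
    param_grid = {"C": [0.1, 1, 10], "gamma": ["scale", 0.01, 0.1]}

    scores = []
    for repeat in range(10):  # repeat with different fold splits
        inner = KFold(n_splits=5, shuffle=True, random_state=repeat)
        outer = KFold(n_splits=5, shuffle=True, random_state=100 + repeat)
        # Inner loop: grid-search tuning inside each outer training fold.
        model = GridSearchCV(SVR(), param_grid, cv=inner)
        # Outer loop: assesses the error of the entire tuning procedure.
        scores.extend(cross_val_score(model, X, y, cv=outer,
                                      scoring="neg_mean_squared_error"))

    scores = -np.asarray(scores)
    # The spread across repeats is exactly the split-dependent variation
    # the paper argues must be reported alongside the mean.
    print(f"MSE over repeats: {scores.mean():.1f} +/- {scores.std():.1f}")
    ```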

  10. Validation of a Cognitive Diagnostic Model across Multiple Forms of a Reading Comprehension Assessment

    ERIC Educational Resources Information Center

    Clark, Amy K.

    2013-01-01

    The present study sought to fit a cognitive diagnostic model (CDM) across multiple forms of a passage-based reading comprehension assessment using the attribute hierarchy method. Previous research on CDMs for reading comprehension assessments served as a basis for the attributes in the hierarchy. The two attribute hierarchies were fit to data from…

  11. Assessing CO2 Mitigation Options Utilizing Detailed Electricity Characteristics and Including Renewable Generation

    NASA Astrophysics Data System (ADS)

    Bensaida, K.; Alie, Colin; Elkamel, A.; Almansoori, A.

    2017-08-01

    This paper presents a novel techno-economic optimization model for assessing the effectiveness of CO2 mitigation options for the electricity generation sub-sector that includes renewable energy generation. The optimization problem was formulated as a MINLP model using the GAMS modeling system. The model seeks the minimization of power generation costs under CO2 emission constraints by dispatching power from low CO2 emission-intensity units. The model considers the detailed operation of the electricity system to effectively assess the performance of GHG mitigation strategies and integrates load balancing, carbon capture and carbon taxes as methods for reducing CO2 emissions. Two case studies are discussed to analyze the benefits and challenges of the CO2 reduction methods in the electricity system. The proposed mitigation options would not only benefit the environment but would also improve the marginal cost of producing energy, which represents an advantage for stakeholders.
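
    The paper's formulation is a MINLP in GAMS; as a much-simplified linear relative, the sketch below dispatches three hypothetical units to meet load at minimum cost under an hourly CO2 cap. All unit data are illustrative assumptions.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Hypothetical three-unit system: marginal cost (EUR/MWh) and
    # CO2 intensity (t/MWh); values are illustrative only.
    cost      = np.array([20.0, 35.0, 60.0])     # coal, gas, peaker
    intensity = np.array([0.95, 0.40, 0.55])
    capacity  = np.array([400.0, 300.0, 200.0])  # MW
    demand  = 600.0   # MW to be served this hour
    co2_cap = 350.0   # tonnes allowed this hour

    res = linprog(
        c=cost,
        A_ub=[intensity], b_ub=[co2_cap],        # emissions constraint
        A_eq=[[1.0, 1.0, 1.0]], b_eq=[demand],   # load balance
        bounds=list(zip([0.0] * 3, capacity)),
    )
    # The cap forces cheap, carbon-heavy coal to be backed off in favour
    # of the lower-intensity units.
    print("dispatch (MW):", np.round(res.x, 1), " cost (EUR):", res.fun)
    ```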

  12. Application of 3D reconstruction system in diabetic foot ulcer injury assessment

    NASA Astrophysics Data System (ADS)

    Li, Jun; Jiang, Li; Li, Tianjian; Liang, Xiaoyao

    2018-04-01

    To deal with the considerable deviation of the transparency tracing method and digital planimetry method used in current clinical diabetic foot ulcer injury assessment, this paper proposes a 3D reconstruction system which can be used to obtain a foot model with a good-quality texture; injury assessment is then done by measuring the reconstructed model. The system uses the Intel RealSense SR300 depth camera, which is based on infrared structured light, as the input device; the required data from different views are collected by moving the camera around the scanned object. The geometry model is reconstructed by fusing the collected data, the mesh is then subdivided to increase the number of mesh vertices, and the color of each vertex is determined using a non-linear optimization; all colored vertices compose the surface texture of the reconstructed model. Experimental results indicate that the reconstructed model has millimeter-level geometric accuracy and a texture with few artifacts.

  13. Comparing exposure assessment methods for traffic-related air pollution in an adverse pregnancy outcome study

    PubMed Central

    Wu, Jun; Wilhelm, Michelle; Chung, Judith; Ritz, Beate

    2011-01-01

    Background Previous studies reported adverse impacts of traffic-related air pollution exposure on pregnancy outcomes. Yet, little information exists on how effect estimates are impacted by the different exposure assessment methods employed in these studies. Objectives To compare effect estimates for traffic-related air pollution exposure and preeclampsia, preterm birth (gestational age less than 37 weeks), and very preterm birth (gestational age less than 30 weeks) based on four commonly-used exposure assessment methods. Methods We identified 81,186 singleton births during 1997–2006 at four hospitals in Los Angeles and Orange Counties, California. Exposures were assigned to individual subjects based on residential address at delivery using the nearest ambient monitoring station data [carbon monoxide (CO), nitrogen dioxide (NO2), nitric oxide (NO), nitrogen oxides (NOx), ozone (O3), and particulate matter less than 2.5 (PM2.5) or less than 10 (PM10) μm in aerodynamic diameter], both unadjusted and temporally-adjusted land-use regression (LUR) model estimates (NO, NO2, and NOx), CALINE4 line-source air dispersion model estimates (NOx and PM2.5), and a simple traffic-density measure. We employed unconditional logistic regression to analyze preeclampsia in our birth cohort, while for gestational age-matched risk sets with preterm and very preterm birth we employed conditional logistic regression. Results We observed elevated risks for preeclampsia, preterm birth, and very preterm birth from maternal exposures to traffic air pollutants measured at ambient stations (CO, NO, NO2, and NOx) and modeled through CALINE4 (NOx and PM2.5) and LUR (NO2 and NOx). Increased risks of preterm birth and very preterm birth were also positively associated with PM10 and PM2.5 air pollution measured at ambient stations. For LUR-modeled NO2 and NOx exposures, elevated risks for all the outcomes were observed in Los Angeles only – the region for which the LUR models were initially developed. Unadjusted LUR models often produced odds ratios somewhat larger in size than temporally-adjusted models. The size of effect estimates was smaller for exposures based on simpler traffic density measures than for the other exposure assessment methods. Conclusion We generally confirmed that traffic-related air pollution was associated with adverse reproductive outcomes regardless of the exposure assessment method employed, yet the size of the estimated effect depended on how both temporal and spatial variations were incorporated into exposure assessment. The LUR model was not transferable even between two contiguous areas within the same large metropolitan area in Southern California. PMID:21453913

  14. Sensors vs. experts - a performance comparison of sensor-based fall risk assessment vs. conventional assessment in a sample of geriatric patients.

    PubMed

    Marschollek, Michael; Rehwald, Anja; Wolf, Klaus-Hendrik; Gietzelt, Matthias; Nemitz, Gerhard; zu Schwabedissen, Hubertus Meyer; Schulze, Mareike

    2011-06-28

    Fall events contribute significantly to mortality, morbidity and costs in our ageing population. In order to identify persons at risk and to target preventive measures, many scores and assessment tools have been developed. These often require expertise and are costly to implement. Recent research investigates the use of wearable inertial sensors to provide objective data on motion features which can be used to assess individual fall risk automatically. So far it is unknown how well this new method performs in comparison with conventional fall risk assessment tools. The aim of our research is to compare the predictive performance of our new sensor-based method with conventional and established methods, based on prospective data. In the first study phase, 119 inpatients of a geriatric clinic took part in motion measurements using a wireless triaxial accelerometer during a Timed Up&Go (TUG) test and a 20 m walk. Furthermore, the St. Thomas Risk Assessment Tool in Falling Elderly Inpatients (STRATIFY) was performed, and the multidisciplinary geriatric care team estimated the patients' fall risk. In a second follow-up phase of the study, 46 of the participants were interviewed after one year, including a fall and activity assessment. The predictive performances of the TUG, the STRATIFY and the team scores are compared. Furthermore, two automatically induced logistic regression models based on conventional clinical and assessment data (CONV) as well as sensor data (SENSOR) are compared. Among the risk assessment scores, the geriatric team score (sensitivity 56%, specificity 80%) outperforms STRATIFY and TUG. The induced logistic regression models CONV and SENSOR achieve similar performance values (sensitivity 68%/58%, specificity 74%/78%, AUC 0.74/0.72, +LR 2.64/2.61). Both models are able to identify more persons at risk than the simple scores. Sensor-based objective measurements of motion parameters in geriatric patients can be used to assess individual fall risk, and our prediction model's performance matches that of a model based on conventional clinical and assessment data. Sensor-based measurements using a small wearable device may contribute significant information to conventional methods and are feasible in an unsupervised setting. More prospective research is needed to assess the cost-benefit relation of our approach.

  15. Climate Change Impacts at Department of Defense Installations

    DTIC Science & Technology

    2017-06-16

    …locations. The ease of use of this method and its flexibility have led to a wide variety of applications for assessing impacts of climate change… versions of these statistical methods to provide the basis for regional climate assessments for various states, regions, and government agencies… the reliability ensemble averaging (REA) method proposed by Giorgi and Mearns (2002). This method assigns reliability classifications for the multi-model ensemble simulation by…

  16. Review of methods for developing regional probabilistic risk assessments, part 2: modeling invasive plant, insect, and pathogen species

    Treesearch

    P. B. Woodbury; D. A. Weinstein

    2010-01-01

    We reviewed probabilistic regional risk assessment methodologies to identify the methods that are currently in use and are capable of estimating threats to ecosystems from fire and fuels, invasive species, and their interactions with stressors. In a companion chapter, we highlight methods useful for evaluating risks from fire. In this chapter, we highlight methods...

  17. Quantitative body DW-MRI biomarkers uncertainty estimation using unscented wild-bootstrap.

    PubMed

    Freiman, M; Voss, S D; Mulkern, R V; Perez-Rossello, J M; Warfield, S K

    2011-01-01

    We present a new method for the uncertainty estimation of diffusion parameters for quantitative body DW-MRI assessment. Diffusion parameter uncertainty estimation from DW-MRI is necessary for clinical applications that use these parameters to assess pathology. However, uncertainty estimation using traditional techniques requires repeated acquisitions, which is undesirable in routine clinical use. Model-based bootstrap techniques, for example, assume an underlying linear model for residual rescaling and cannot be utilized directly for body diffusion parameter uncertainty estimation due to the non-linearity of the body diffusion model. To offset this limitation, our method uses the unscented transform to compute the residual rescaling parameters from the non-linear body diffusion model, and then applies the wild-bootstrap method to infer the body diffusion parameter uncertainty. Validation through phantom and human subject experiments shows that our method correctly identifies the regions with higher uncertainty in body DW-MRI model parameters, with a relative error of -36% in the uncertainty values.
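
    A bare-bones illustration of the wild-bootstrap idea on a monoexponential diffusion decay. The paper's actual body diffusion model and the unscented-transform rescaling of residuals are omitted; plain sign-flipping of residuals is used instead, and all signal values are synthetic.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def mono_exp(b, s0, adc):
        """Monoexponential diffusion signal model S(b) = S0 * exp(-b * ADC)."""
        return s0 * np.exp(-b * adc)

    rng = np.random.default_rng(0)
    b_values = np.array([0., 50., 100., 200., 400., 600., 800.])
    signal = mono_exp(b_values, 1000.0, 1.5e-3) \
             + rng.normal(0, 15.0, size=b_values.size)

    popt, _ = curve_fit(mono_exp, b_values, signal, p0=(1000.0, 1e-3))
    residuals = signal - mono_exp(b_values, *popt)

    # Wild bootstrap: flip residual signs at random and refit, so the
    # per-measurement error magnitude is preserved without assuming
    # homoscedasticity.
    adcs = []
    for _ in range(500):
        flips = rng.choice([-1.0, 1.0], size=residuals.size)
        y_star = mono_exp(b_values, *popt) + flips * residuals
        p_star, _ = curve_fit(mono_exp, b_values, y_star, p0=popt)
        adcs.append(p_star[1])

    print(f"ADC = {popt[1]:.2e}, bootstrap SD = {np.std(adcs):.2e}")
    ```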

  18. Fish Consumption Advisories: Toward a Unified, Scientifically Credible Approach

    EPA Science Inventory

    A model is proposed for fish consumption advisories based on consensus-derived risk assessment values for common contaminants in fish and the latest risk assessment methods. The model accounts in part for the expected toxicity of mixtures of chemicals, the underlying uncertainties...

  19. The model of flood control using servqual method and importance performance analysis in Surakarta City – Indonesia

    NASA Astrophysics Data System (ADS)

    Titi Purwantini, V.; Sutanto, Yusuf

    2018-05-01

    This research aims to create a model of flood control in the city of Surakarta using the Servqual method and Importance Performance Analysis. Service quality is generally defined as the overall assessment of a service by the customers, or the extent to which a service meets the customer's needs or expectations. The first purpose of this study is to find a model of flood control that is appropriate to the condition of the community, that is, a model that can provide satisfactory service for the people of Surakarta who live in the flood-affected locations. The second is to find the right model to improve the service performance of the Surakarta City Government in serving the people in flood locations. The method used to determine public satisfaction with the quality of service is to measure the difference between the quality of service expected by the community and the quality actually experienced; this is the Servqual method. The performance of city government officials is assessed by comparing actual performance with the quality of services provided; this is Importance Performance Analysis. Samples were people living in flooded areas in the city of Surakarta. The result of this research is Satisfaction = Responsiveness + Reliability + Assurance + Empathy + Tangibles (Servqual model), and from the Importance Performance Analysis Cartesian diagram a flood control formula can be derived: Flood Control = High Performance.
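
    A small sketch of the two instruments as they are usually operationalized: per-dimension SERVQUAL gap scores (perception minus expectation) and an Importance Performance Analysis quadrant assignment. The ratings below are invented, not the Surakarta survey data, and using expectation as the importance proxy is one common convention.

    ```python
    import numpy as np

    dimensions = ["responsiveness", "reliability", "assurance",
                  "empathy", "tangibles"]
    # Hypothetical mean survey ratings on a 1-5 scale (not study data):
    expectation = np.array([4.6, 4.7, 4.4, 4.3, 4.1])
    perception  = np.array([3.8, 4.0, 4.2, 3.9, 3.5])

    gap = perception - expectation     # SERVQUAL gap score per dimension

    # Importance-Performance Analysis: classify each dimension into a
    # quadrant using the means of importance and performance as cut lines.
    importance = expectation           # a common proxy choice
    perf_cut, imp_cut = perception.mean(), importance.mean()
    for dim, g, p, i in zip(dimensions, gap, perception, importance):
        quadrant = ("concentrate here" if i >= imp_cut and p < perf_cut else
                    "keep up the good work" if i >= imp_cut else
                    "low priority" if p < perf_cut else
                    "possible overkill")
        print(f"{dim:15s} gap={g:+.1f}  -> {quadrant}")
    ```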

  20. Landslide Susceptibility Statistical Methods: A Critical and Systematic Literature Review

    NASA Astrophysics Data System (ADS)

    Mihir, Monika; Malamud, Bruce; Rossi, Mauro; Reichenbach, Paola; Ardizzone, Francesca

    2014-05-01

    Landslide susceptibility assessment, the subject of this systematic review, is aimed at understanding the spatial probability of slope failures under a set of geomorphological and environmental conditions. It is estimated that about 375 of the landslides that occur globally each year are fatal, with around 4600 people killed per year. Past studies have brought out the increasing cost of landslide damages, which primarily can be attributed to human occupation and increased human activities in vulnerable environments. Many scientists, to evaluate and reduce landslide risk, have made an effort to efficiently map landslide susceptibility using different statistical methods. In this paper, we do a critical and systematic landslide susceptibility literature review, in terms of the different statistical methods used. For each of a broad set of studies reviewed we note: (i) study geographic region and areal extent, (ii) landslide types, (iii) inventory type and temporal period covered, (iv) mapping technique, (v) thematic variables used, (vi) statistical models, (vii) assessment of model skill, (viii) uncertainty assessment methods, (ix) validation methods. We then pulled out broad trends within our review of landslide susceptibility, particularly regarding the statistical methods. We found that the most common statistical methods used in the study of landslide susceptibility include logistic regression, artificial neural networks, discriminant analysis and weight of evidence. Although most of the studies we reviewed assessed the model skill, very few assessed model uncertainty. In terms of geographic extent, the largest number of landslide susceptibility zonations were in Turkey, Korea, Spain, Italy and Malaysia. However, there are also many landslides and fatalities in other localities, particularly India, China, the Philippines, Nepal, Indonesia, Guatemala, and Pakistan, where there are much fewer landslide susceptibility studies available in the peer-reviewed literature. This raises some concern that existing studies do not always cover all the regions globally that currently experience landslides and landslide fatalities.

  1. APOLLO: a quality assessment service for single and multiple protein models.

    PubMed

    Wang, Zheng; Eickholt, Jesse; Cheng, Jianlin

    2011-06-15

    We built a web server named APOLLO, which can evaluate the absolute global and local qualities of a single protein model using machine learning methods or the global and local qualities of a pool of models using a pair-wise comparison approach. Based on our evaluations on 107 CASP9 (Critical Assessment of Techniques for Protein Structure Prediction) targets, the predicted quality scores generated from our machine learning and pair-wise methods have an average per-target correlation of 0.671 and 0.917, respectively, with the true model quality scores. Based on our test on 92 CASP9 targets, our predicted absolute local qualities have an average difference of 2.60 Å from the actual distances to the native structure. The server is available at http://sysbio.rnet.missouri.edu/apollo/; single and pair-wise global quality assessment software is also available at the site.
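
    The pair-wise track of such servers scores each model by its average structural similarity to the rest of the pool. A minimal sketch, using Kabsch superposition and negative mean RMSD as the consensus score; APOLLO's actual similarity measure and machine-learning track are not reproduced here.

    ```python
    import numpy as np

    def kabsch_rmsd(p, q):
        """RMSD between two Nx3 coordinate sets after optimal superposition."""
        p = p - p.mean(axis=0)
        q = q - q.mean(axis=0)
        u, s, vt = np.linalg.svd(p.T @ q)
        d = np.sign(np.linalg.det(u @ vt))   # guard against reflections
        rot = u @ np.diag([1.0, 1.0, d]) @ vt
        return np.sqrt(np.mean(np.sum((p @ rot - q) ** 2, axis=1)))

    def pairwise_consensus(models):
        """Score each model by its average similarity (negative RMSD) to all
        others; models close to the consensus of the pool rank highest."""
        n = len(models)
        rmsd = np.zeros((n, n))
        for i in range(n):
            for j in range(i + 1, n):
                rmsd[i, j] = rmsd[j, i] = kabsch_rmsd(models[i], models[j])
        return -rmsd.sum(axis=1) / (n - 1)

    rng = np.random.default_rng(0)
    native = rng.normal(size=(50, 3)) * 10
    pool = [native + rng.normal(scale=s, size=native.shape)
            for s in (0.5, 0.5, 0.8, 1.0, 3.0)]   # last model is an outlier
    print(np.round(pairwise_consensus(pool), 2))  # outlier scores worst
    ```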

  2. A comparative analysis of modeled and monitored ambient hazardous air pollutants in Texas: a novel approach using concordance correlation.

    PubMed

    Lupo, Philip J; Symanski, Elaine

    2009-11-01

    Often, in studies evaluating the health effects of hazardous air pollutants (HAPs), researchers rely on ambient air levels to estimate exposure. Two potential data sources are modeled estimates from the U.S. Environmental Protection Agency (EPA) Assessment System for Population Exposure Nationwide (ASPEN) and ambient air pollutant measurements from monitoring networks. The goal was to conduct comparisons of modeled and monitored estimates of HAP levels in the state of Texas using traditional approaches and a previously unexploited method, concordance correlation analysis, to better inform decisions regarding agreement. Census tract-level ASPEN estimates and monitoring data for all HAPs throughout Texas, available from the EPA Air Quality System, were obtained for 1990, 1996, and 1999. Monitoring sites were mapped to census tracts using U.S. Census data. Exclusions were applied to restrict the monitored data to measurements collected using a common sampling strategy with minimal missing values over time. Comparisons were made for 28 HAPs in 38 census tracts located primarily in urban areas throughout Texas. For each pollutant and by year of assessment, modeled and monitored air pollutant annual levels were compared using standard methods (i.e., ratios of model-to-monitor annual levels). Concordance correlation analysis was also used, which assesses linearity and agreement while providing a formal method of statistical inference. Forty-eight percent of the median model-to-monitor values fell between 0.5 and 2, whereas only 17% of concordance correlation coefficients were significant and greater than 0.5. On the basis of concordance correlation analysis, the findings indicate there is poorer agreement when compared with the previously applied ad hoc methods to assess comparability between modeled and monitored levels of ambient HAPs.
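
    Lin's concordance correlation coefficient penalizes both scatter and systematic deviation from the 45-degree line, which is what distinguishes it from the ratio summaries used in earlier comparisons. A minimal sketch with invented modeled and monitored values:

    ```python
    import numpy as np

    def concordance_cc(x, y):
        """Lin's concordance correlation coefficient: agreement of the
        (x, y) pairs with the line of perfect concordance y = x."""
        x, y = np.asarray(x, float), np.asarray(y, float)
        sxy = np.mean((x - x.mean()) * (y - y.mean()))
        return 2 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

    modeled   = np.array([0.8, 1.2, 2.1, 0.5, 3.3, 1.7])  # hypothetical ASPEN
    monitored = np.array([1.0, 1.1, 2.5, 0.4, 2.9, 2.0])  # hypothetical AQS
    print(f"CCC = {concordance_cc(modeled, monitored):.3f}")
    print(f"median model/monitor ratio = {np.median(modeled / monitored):.2f}")
    ```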

  3. A method for independent modelling in support of regulatory review of dose assessments.

    PubMed

    Dverstorp, Björn; Xu, Shulan

    2017-11-01

    Several countries consider geological disposal facilities as the preferred option for spent nuclear fuel due to their potential to provide isolation from the surface environment on very long timescales. In 2011 the Swedish Nuclear Fuel & Waste Management Co. (SKB) submitted a license application for construction of a spent nuclear fuel repository. The disposal method involves disposing spent fuel in copper canisters with a cast iron insert at about 500 m depth in crystalline basement rock, and each canister is surrounded by a buffer of swelling bentonite clay. SKB's license application is supported by a post-closure safety assessment, SR-Site. SR-Site has been reviewed by the Swedish Radiation Safety Authority (SSM) for five years. The main method for review of SKB's license application is document review, which is carried out by SSM's staff and supported by SSM's external experts. The review has proven a challenging task due to its broad scope, complexity and multidisciplinary nature. SSM and its predecessors have, for several decades, been developing independent models to support regulatory reviews of post-closure safety assessments for geological repositories. For the review of SR-Site, SSM has developed a modelling approach with a structured application of independent modelling activities, including replication modelling, use of alternative conceptual models and bounding calculations, to complement the traditional document review. This paper describes this scheme and its application to biosphere and dose assessment modelling. SSM's independent modelling has provided important insights regarding quality and reasonableness of SKB's rather complex biosphere modelling and has helped quantifying conservatisms and highlighting conceptual uncertainty. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Tenax extraction as a simple approach to improve environmental risk assessments.

    PubMed

    Harwood, Amanda D; Nutile, Samuel A; Landrum, Peter F; Lydy, Michael J

    2015-07-01

    It is well documented that using exhaustive chemical extractions is not an effective means of assessing exposure of hydrophobic organic compounds in sediments and that bioavailability-based techniques are an improvement over traditional methods. One technique that has shown special promise as a method for assessing the bioavailability of hydrophobic organic compounds in sediment is the use of Tenax-extractable concentrations. A 6-h or 24-h single-point Tenax-extractable concentration correlates to both bioaccumulation and toxicity. This method has demonstrated effectiveness for several hydrophobic organic compounds in various organisms under both field and laboratory conditions. In addition, a Tenax bioaccumulation model was developed for multiple compounds relating 24-h Tenax-extractable concentrations to oligochaete tissue concentrations exposed in both the laboratory and field. This model has demonstrated predictive capacity for additional compounds and species. Use of Tenax-extractable concentrations to estimate exposure is rapid, simple, straightforward, and relatively inexpensive, as well as accurate. Therefore, this method would be an invaluable tool if implemented in risk assessments. © 2015 SETAC.

  5. Description and evaluation of an interference assessment for a slotted-wall wind tunnel

    NASA Technical Reports Server (NTRS)

    Kemp, William B., Jr.

    1991-01-01

    A wind-tunnel interference assessment method applicable to test sections with discrete finite-length wall slots is described. The method is based on high order panel method technology and uses mixed boundary conditions to satisfy both the tunnel geometry and wall pressure distributions measured in the slotted-wall region. Both the test model and its sting support system are represented by distributed singularities. The method yields interference corrections to the model test data as well as surveys through the interference field at arbitrary locations. These results include the equivalent of tunnel Mach calibration, longitudinal pressure gradient, tunnel flow angularity, wall interference, and an inviscid form of sting interference. Alternative results which omit the direct contribution of the sting are also produced. The method was applied to the National Transonic Facility at NASA Langley Research Center for both tunnel calibration tests and tests of two models of subsonic transport configurations.

  6. Designing Cognitively Diagnostic Assessment for Algebraic Content Knowledge and Thinking Skills

    ERIC Educational Resources Information Center

    Zhang, Zhidong

    2018-01-01

    This study explored a diagnostic assessment method that emphasized the cognitive process of algebra learning. The study utilized a design and a theory-driven model to examine content knowledge. Using the theory-driven model, the thinking skills of algebra learning were also examined. A Bayesian network model was applied to represent the theory…

  7. Probabilistic assessment methodology for continuous-type petroleum accumulations

    USGS Publications Warehouse

    Crovelli, R.A.

    2003-01-01

    The analytic resource assessment method, called ACCESS (Analytic Cell-based Continuous Energy Spreadsheet System), was developed to calculate estimates of petroleum resources for the geologic assessment model, called FORSPAN, in continuous-type petroleum accumulations. The ACCESS method is based upon mathematical equations derived from probability theory in the form of a computer spreadsheet system. ?? 2003 Elsevier B.V. All rights reserved.

  8. Minimizing effects of methodological decisions on interpretation and prediction in species distribution studies: An example with background selection

    USGS Publications Warehouse

    Jarnevich, Catherine S.; Talbert, Marian; Morisette, Jeffrey T.; Aldridge, Cameron L.; Brown, Cynthia; Kumar, Sunil; Manier, Daniel; Talbert, Colin; Holcombe, Tracy R.

    2017-01-01

    Evaluating the conditions where a species can persist is an important question in ecology both to understand tolerances of organisms and to predict distributions across landscapes. Presence data combined with background or pseudo-absence locations are commonly used with species distribution modeling to develop these relationships. However, there is not a standard method to generate background or pseudo-absence locations, and method choice affects model outcomes. We evaluated combinations of both model algorithms (simple and complex generalized linear models, multivariate adaptive regression splines, Maxent, boosted regression trees, and random forest) and background methods (random, minimum convex polygon, and continuous and binary kernel density estimator (KDE)) to assess the sensitivity of model outcomes to choices made. We evaluated six questions related to model results, including five beyond the common comparison of model accuracy assessment metrics (biological interpretability of response curves, cross-validation robustness, independent data accuracy and robustness, and prediction consistency). For our case study with cheatgrass in the western US, random forest was least sensitive to background choice and the binary KDE method was least sensitive to model algorithm choice. While this outcome may not hold for other locations or species, the methods we used can be implemented to help determine appropriate methodologies for particular research questions.
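
    A toy sketch of two of the compared background strategies: uniform random points across the extent versus density-weighted sampling in the spirit of the continuous KDE method. Coordinates, bandwidth, and sample sizes are invented for illustration.

    ```python
    import numpy as np
    from sklearn.neighbors import KernelDensity

    rng = np.random.default_rng(0)
    # Hypothetical presence coordinates clustered in one part of a
    # unit-square study area.
    presence = rng.normal(loc=[0.6, 0.4], scale=0.08, size=(300, 2))

    # Method 1: uniform random background across the study extent.
    random_bg = rng.uniform(0, 1, size=(1000, 2))

    # Method 2: continuous-KDE background, sampling candidate points in
    # proportion to the kernel density of the presence locations.
    kde = KernelDensity(bandwidth=0.1).fit(presence)
    candidates = rng.uniform(0, 1, size=(20000, 2))
    weights = np.exp(kde.score_samples(candidates))  # density at candidates
    weights /= weights.sum()
    kde_bg = candidates[rng.choice(len(candidates), size=1000, p=weights)]

    print("random background mean:", random_bg.mean(axis=0))
    print("KDE background mean:   ", kde_bg.mean(axis=0))
    ```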

  9. Combining a dispersal model with network theory to assess habitat connectivity.

    PubMed

    Lookingbill, Todd R; Gardner, Robert H; Ferrari, Joseph R; Keller, Cherry E

    2010-03-01

    Assessing the potential for threatened species to persist and spread within fragmented landscapes requires the identification of core areas that can sustain resident populations and dispersal corridors that can link these core areas with isolated patches of remnant habitat. We developed a set of GIS tools, simulation methods, and network analysis procedures to assess potential landscape connectivity for the Delmarva fox squirrel (DFS; Sciurus niger cinereus), an endangered species inhabiting forested areas on the Delmarva Peninsula, USA. Information on the DFS's life history and dispersal characteristics, together with data on the composition and configuration of land cover on the peninsula, were used as input data for an individual-based model to simulate dispersal patterns of millions of squirrels. Simulation results were then assessed using methods from graph theory, which quantifies habitat attributes associated with local and global connectivity. Several bottlenecks to dispersal were identified that were not apparent from simple distance-based metrics, highlighting specific locations for landscape conservation, restoration, and/or squirrel translocations. Our approach links simulation models, network analysis, and available field data in an efficient and general manner, making these methods useful and appropriate for assessing the movement dynamics of threatened species within landscapes being altered by human and natural disturbances.
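
    A small sketch of the network-analysis step: once simulated dispersal defines which patches are linked, cut vertices and betweenness centrality flag potential bottlenecks. The patch graph below is hypothetical, not the Delmarva network.

    ```python
    import networkx as nx

    # Hypothetical patch network: nodes are habitat patches, edges connect
    # patches whose simulated dispersers reached one another.
    g = nx.Graph()
    g.add_edges_from([
        ("core_A", "p1"), ("p1", "p2"), ("p2", "corridor"),
        ("corridor", "p3"), ("p3", "core_B"), ("core_A", "p2"),
    ])

    # Patches whose loss disconnects the network (articulation points) are
    # candidate dispersal bottlenecks for conservation or translocation.
    print("bottleneck patches:", list(nx.articulation_points(g)))

    # Betweenness centrality ranks patches by how much dispersal flow
    # must pass through them.
    for patch, score in sorted(nx.betweenness_centrality(g).items(),
                               key=lambda kv: -kv[1]):
        print(f"{patch:10s} {score:.2f}")
    ```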

  10. Method for Identification of Results of Dynamic Overloads in Assessment of Safety Use of the Mine Auxiliary Transportation System

    NASA Astrophysics Data System (ADS)

    Tokarczyk, Jarosław

    2016-12-01

    A method for identifying the effects of dynamic overloads on people, which may occur in an emergency state of a suspended monorail, is presented in the paper. The braking curve was determined using a multi-body system (MBS) simulation. For this purpose a computational MBS model of the suspended monorail was developed and two different variants of numerical calculations were carried out. An algorithm for conducting numerical simulations to assess the effects of dynamic overloads acting on suspended monorail users is also presented in the paper. An example of a computational FEM (Finite Element Method) model composed of the technical means and the anthropometric ATB (Articulated Total Body) model is shown. The simulation results are presented: a graph of the HIC (Head Injury Criterion) parameter and successive phases of displacement of the ATB model. A generator of computational models for the safety criterion, which enables preparation of input data and remote starting of simulations, is proposed.
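
    The HIC parameter mentioned above has a standard definition: the maximum over time windows of the 2.5th power of the mean acceleration times the window length. A direct sketch of that computation on a synthetic pulse (the signal and sampling values are illustrative, not the monorail simulation output).

    ```python
    import numpy as np

    def hic(accel_g, dt, max_window=0.036):
        """Head Injury Criterion from a resultant head acceleration trace.

        accel_g : acceleration samples in units of g
        dt      : sampling interval in seconds
        Searches all windows up to max_window (36 ms for HIC36)."""
        n = len(accel_g)
        cum = np.concatenate(([0.0], np.cumsum(accel_g) * dt))  # integral of a
        best = 0.0
        max_len = int(max_window / dt)
        for i in range(n):
            for j in range(i + 1, min(i + max_len, n) + 1):
                t = (j - i) * dt
                avg = (cum[j] - cum[i]) / t       # mean acceleration in window
                best = max(best, avg ** 2.5 * t)
        return best

    # Example: a half-sine 60 g pulse lasting 20 ms, sampled at 10 kHz.
    dt = 1e-4
    t = np.arange(0, 0.02, dt)
    pulse = 60.0 * np.sin(np.pi * t / 0.02)
    print(f"HIC36 = {hic(pulse, dt):.0f}")
    ```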

  11. Laboratory-based versus non-laboratory-based method for assessment of cardiovascular disease risk: the NHANES I Follow-up Study cohort

    PubMed Central

    Gaziano, Thomas A; Young, Cynthia R; Fitzmaurice, Garrett; Atwood, Sidney; Gaziano, J Michael

    2008-01-01

    Summary Background Around 80% of all cardiovascular deaths occur in developing countries. Assessment of those patients at high risk is an important strategy for prevention. Since developing countries have limited resources for prevention strategies that require laboratory testing, we assessed if a risk prediction method that did not require any laboratory tests could be as accurate as one requiring laboratory information. Methods The National Health and Nutrition Examination Survey (NHANES) was a prospective cohort study of 14 407 US participants aged between 25–74 years at the time they were first examined (between 1971 and 1975). Our follow-up study population included participants with complete information on these surveys who did not report a history of cardiovascular disease (myocardial infarction, heart failure, stroke, angina) or cancer, yielding an analysis dataset N=6186. We compared how well either method could predict first-time fatal and non-fatal cardiovascular disease events in this cohort. For the laboratory-based model, which required blood testing, we used standard risk factors to assess risk of cardiovascular disease: age, systolic blood pressure, smoking status, total cholesterol, reported diabetes status, and current treatment for hypertension. For the non-laboratory-based model, we substituted body-mass index for cholesterol. Findings In the cohort of 6186, there were 1529 first-time cardiovascular events and 578 (38%) deaths due to cardiovascular disease over 21 years. In women, the laboratory-based model was useful for predicting events, with a c statistic of 0·829. The c statistic of the non-laboratory-based model was 0·831. In men, the results were similar (0·784 for the laboratory-based model and 0·783 for the non-laboratory-based model). Results were similar between the laboratory-based and non-laboratory-based models in both men and women when restricted to fatal events only. Interpretation A method that uses non-laboratory-based risk factors predicted cardiovascular events as accurately as one that relied on laboratory-based values. This approach could simplify risk assessment in situations where laboratory testing is inconvenient or unavailable. PMID:18342687
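
    A schematic re-creation of the comparison on synthetic data: two logistic models differing only in whether cholesterol or its non-laboratory substitute (body-mass index) is included, each scored by the c statistic (area under the ROC curve). The coefficients and distributions below are invented, so the numbers will not match the study's.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n = 4000
    age = rng.uniform(25, 74, n)
    sbp = rng.normal(130, 18, n)
    smoker = rng.integers(0, 2, n)
    chol = rng.normal(5.5, 1.0, n)
    bmi = 20 + 1.2 * chol + rng.normal(0, 1.5, n)  # BMI as a noisy lipid proxy

    # Synthetic event generation from an assumed logistic risk function.
    risk = -9 + 0.06 * age + 0.015 * sbp + 0.6 * smoker + 0.25 * chol
    event = rng.random(n) < 1 / (1 + np.exp(-risk))

    lab     = np.column_stack([age, sbp, smoker, chol])  # laboratory-based
    non_lab = np.column_stack([age, sbp, smoker, bmi])   # cholesterol -> BMI

    for name, X in (("laboratory", lab), ("non-laboratory", non_lab)):
        Xtr, Xte, ytr, yte = train_test_split(X, event, random_state=0)
        m = LogisticRegression(max_iter=1000).fit(Xtr, ytr)
        c = roc_auc_score(yte, m.predict_proba(Xte)[:, 1])
        print(f"{name:15s} c statistic = {c:.3f}")
    ```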

  12. Goodness-of-fit tests and model diagnostics for negative binomial regression of RNA sequencing data.

    PubMed

    Mi, Gu; Di, Yanming; Schafer, Daniel W

    2015-01-01

    This work is about assessing model adequacy for negative binomial (NB) regression, particularly (1) assessing the adequacy of the NB assumption, and (2) assessing the appropriateness of models for NB dispersion parameters. Tools for the first are appropriate for NB regression generally; those for the second are primarily intended for RNA sequencing (RNA-Seq) data analysis. The typically small number of biological samples and large number of genes in RNA-Seq analysis motivate us to address the trade-offs between robustness and statistical power using NB regression models. One widely-used power-saving strategy, for example, is to assume some commonalities of NB dispersion parameters across genes via simple models relating them to mean expression rates, and many such models have been proposed. As RNA-Seq analysis is becoming ever more popular, it is appropriate to make more thorough investigations into power and robustness of the resulting methods, and into practical tools for model assessment. In this article, we propose simulation-based statistical tests and diagnostic graphics to address model adequacy. We provide simulated and real data examples to illustrate that our proposed methods are effective for detecting the misspecification of the NB mean-variance relationship as well as judging the adequacy of fit of several NB dispersion models.

  13. Assessment of the integration capability of system architectures from a complex and distributed software systems perspective

    NASA Astrophysics Data System (ADS)

    Leuchter, S.; Reinert, F.; Müller, W.

    2014-06-01

    Procurement and design of system architectures capable of network centric operations demand an assessment scheme in order to compare different alternative realizations. In this contribution an assessment method for system architectures targeted at the C4ISR domain is presented. The method addresses the integration capability of software systems from a complex and distributed software system perspective, focusing on communication, interfaces and software. The aim is to evaluate the capability to integrate a system or its functions within a system-of-systems network. This method uses approaches from software architecture quality assessment and applies them at the system architecture level. It features a specific goal tree of several dimensions that are relevant for enterprise integration. These dimensions have to be weighed against each other and totalized using methods from normative decision theory in order to reflect the intention of the particular enterprise integration effort. The indicators and measurements for many of the considered quality features rely on a model-based view of systems, networks, and the enterprise. That means the method is applicable to system-of-systems specifications based on enterprise architectural frameworks relying on defined meta-models or domain ontologies for defining views and viewpoints. In the defense context we use the NATO Architecture Framework (NAF) to ground respective system models. The proposed assessment method allows evaluating and comparing competing system designs regarding their future integration potential. It is a contribution to the system-of-systems engineering methodology.

  14. Risk assessment of storm surge disaster based on numerical models and remote sensing

    NASA Astrophysics Data System (ADS)

    Liu, Qingrong; Ruan, Chengqing; Zhong, Shan; Li, Jian; Yin, Zhonghui; Lian, Xihu

    2018-06-01

    Storm surge is one of the most serious ocean disasters in the world. Risk assessment of storm surge disaster for coastal areas has important implications for planning economic development and reducing disaster losses. Based on risk assessment theory, this paper uses coastal hydrological observations, a numerical storm surge model and multi-source remote sensing data, proposes methods for evaluating the hazard and vulnerability of storm surge, and builds a storm surge risk assessment model. Storm surges for different recurrence periods are simulated in numerical models and the flooded areas and depths are calculated, which are used for assessing the hazard of storm surge; remote sensing data and GIS technology are used to extract key coastal objects and classify coastal land use, which is used for the vulnerability assessment of storm surge disaster. The storm surge risk assessment model is applied to a typical coastal city, and the result shows the reliability and validity of the risk assessment model. The building and application of the storm surge risk assessment model provides a basic reference for city development planning and strengthens disaster prevention and mitigation.

  15. Comparison of the Various Methodologies Used in Studying Runoff and Sediment Load in the Yellow River Basin

    NASA Astrophysics Data System (ADS)

    Xu, M., III; Liu, X.

    2017-12-01

    In the past 60 years, both the runoff and the sediment load in the Yellow River Basin showed significant decreasing trends owing to the influences of human activities and climate change. Quantifying the impact of each factor (e.g. precipitation, sediment-trapping dams, pasture, terraces, etc.) on the runoff and sediment load is among the key issues for guiding the implementation of water and soil conservation measures and for predicting future trends. Hundreds of methods have been developed for studying the runoff and sediment load in the Yellow River Basin. Generally, these methods can be classified into empirical methods and physically based models. The empirical methods, including the hydrological method, the soil and water conservation method, etc., are widely used in Yellow River management engineering. These methods generally apply statistical analyses such as regression analysis to build empirical relationships between the main characteristic variables in a river basin. The elasticity method extensively used in hydrological research can be classified as an empirical method, as it is mathematically equivalent to the hydrological method. Physically based models mainly include conceptual models and distributed models. The conceptual models are usually lumped models (e.g. the SYMHD model) and can be regarded as a transition between empirical models and distributed models. The published literature shows that fewer studies have applied distributed models than empirical models, as the runoff and sediment load simulations from distributed models (e.g. the Digital Yellow Integrated Model, the Geomorphology-Based Hydrological Model) were usually less satisfactory owing to the intensive human activities in the Yellow River Basin. Therefore, this study primarily summarizes the empirical models applied in the Yellow River Basin and theoretically analyzes the main causes of the significantly different results obtained with different empirical methods. In addition, we put forward an assessment framework for methods of studying the runoff and sediment load variations in the Yellow River Basin from the point of view of input data, model structure and result output, and applied this framework to the Huangfuchuan River.

  16. Statistical Methods for Assessments in Simulations and Serious Games. Research Report. ETS RR-14-12

    ERIC Educational Resources Information Center

    Fu, Jianbin; Zapata, Diego; Mavronikolas, Elia

    2014-01-01

    Simulation or game-based assessments produce outcome data and process data. In this article, some statistical models that can potentially be used to analyze data from simulation or game-based assessments are introduced. Specifically, cognitive diagnostic models that can be used to estimate latent skills from outcome data so as to scale these…

  17. A call to improve methods for estimating tree biomass for regional and national assessments

    Treesearch

    Aaron R. Weiskittel; David W. MacFarlane; Philip J. Radtke; David L.R. Affleck; Hailemariam Temesgen; Christopher W. Woodall; James A. Westfall; John W. Coulston

    2015-01-01

    Tree biomass is typically estimated using statistical models. This review highlights five limitations of most tree biomass models, which include the following: (1) biomass data are costly to collect and alternative sampling methods are used; (2) belowground data and models are generally lacking; (3) models are often developed from small and geographically limited data...

  18. Bayesian Revision of Residual Detection Power

    NASA Technical Reports Server (NTRS)

    DeLoach, Richard

    2013-01-01

    This paper addresses some issues with quality assessment and quality assurance in response surface modeling experiments executed in wind tunnels. The role of data volume in quality assurance for response surface models is reviewed. Specific wind tunnel response surface modeling experiments are considered for which apparent discrepancies exist between fit quality expectations based on implemented quality assurance tactics, and the actual fit quality achieved in those experiments. These discrepancies are resolved by using Bayesian inference to account for certain imperfections in the assessment methodology. Estimates of the fraction of out-of-tolerance model predictions based on traditional frequentist methods are revised to account for uncertainty in the residual assessment process. The number of sites in the design space for which residuals are out of tolerance is seen to exceed the number of sites where the model actually fails to fit the data. A method is presented to estimate how much of the design space is inadequately modeled by low-order polynomial approximations to the true but unknown underlying response function.
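
    A minimal sketch of this kind of Bayesian correction, assuming hypothetical values for the residual check's detection power and false-positive rate (the paper's actual priors and procedure are not reproduced here):

```python
# Hypothetical sketch: revise an apparent out-of-tolerance fraction for an
# imperfect residual check. 'power' and 'fpr' (probability of flagging a
# truly bad / truly good prediction site) are assumed inputs, not values
# from the paper.

def revised_failure_fraction(apparent_frac, power=0.95, fpr=0.10):
    """Invert P(flag) = power*p + fpr*(1-p) for the true failure fraction p."""
    p = (apparent_frac - fpr) / (power - fpr)
    return min(max(p, 0.0), 1.0)

# 15% of sites flagged, but far fewer truly fail to fit:
print(revised_failure_fraction(0.15))   # ~0.059
```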

  19. ProQ3: Improved model quality assessments using Rosetta energy terms

    PubMed Central

    Uziela, Karolis; Shu, Nanjiang; Wallner, Björn; Elofsson, Arne

    2016-01-01

    Quality assessment of protein models using no other information than the structure of the model itself has been shown to be useful for structure prediction. Here, we introduce two novel methods, ProQRosFA and ProQRosCen, inspired by the state-of-the-art method ProQ2, but using a completely different description of a protein model. ProQ2 uses contacts and other features calculated from a model, while the new predictors are based on Rosetta energies: ProQRosFA uses the full-atom energy function that takes into account all atoms, while ProQRosCen uses the coarse-grained centroid energy function. The two new predictors also include residue conservation and terms corresponding to the agreement of a model with predicted secondary structure and surface area, as in ProQ2. We show that the performance of these predictors is on par with ProQ2 and significantly better than all other model quality assessment programs. Furthermore, we show that by combining the input features from all three predictors, the resulting predictor, ProQ3, performs better than any of the individual methods. ProQ3, ProQRosFA and ProQRosCen are freely available both as a webserver and stand-alone programs at http://proq3.bioinfo.se/. PMID:27698390

  20. Assessing and reporting uncertainties in dietary exposure analysis: Mapping of uncertainties in a tiered approach.

    PubMed

    Kettler, Susanne; Kennedy, Marc; McNamara, Cronan; Oberdörfer, Regina; O'Mahony, Cian; Schnabel, Jürgen; Smith, Benjamin; Sprong, Corinne; Faludi, Roland; Tennant, David

    2015-08-01

    Uncertainty analysis is an important component of dietary exposure assessments, needed to correctly understand the strengths and limits of the results. Often, standard screening procedures are applied in a first step, which results in conservative estimates. If those screening procedures indicate a potential exceedance of health-based guidance values, more refined models are applied within the tiered approach. However, the sources and types of uncertainties in deterministic and probabilistic models can vary or differ. A key objective of this work has been the mapping of different sources and types of uncertainties to better understand how best to use uncertainty analysis to generate a more realistic understanding of dietary exposure. In dietary exposure assessments, uncertainties can be introduced by knowledge gaps about the exposure scenario, the parameters and the model itself. With this mapping, general and model-independent uncertainties have been identified and described, as well as those which can be introduced and influenced by the specific model during the tiered approach. This analysis identifies general uncertainties common to point estimates (screening or deterministic methods) and probabilistic exposure assessment methods. To provide further clarity, general sources of uncertainty affecting many dietary exposure assessments should be separated from model-specific uncertainties. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  1. Competency in health care management: a training model in epidemiologic methods for assessing and improving the quality of clinical practice through evidence-based decision making.

    PubMed

    Hudak, R P; Jacoby, I; Meyer, G S; Potter, A L; Hooper, T I; Krakauer, H

    1997-01-01

    This article describes a training model that focuses on health care management by applying epidemiologic methods to assess and improve the quality of clinical practice. The model's uniqueness is its focus on integrating clinical evidence-based decision making with fundamental principles of resource management to achieve attainable, cost-effective, high-quality health outcomes. The target students are current and prospective clinical and administrative executives who must optimize decision making at the clinical and managerial levels of health care organizations.

  2. A scoping review of malaria forecasting: past work and future directions

    PubMed Central

    Zinszer, Kate; Verma, Aman D; Charland, Katia; Brewer, Timothy F; Brownstein, John S; Sun, Zhuoyu; Buckeridge, David L

    2012-01-01

    Objectives: There is a growing body of literature on malaria forecasting methods, and the objective of our review is to identify and assess methods, including predictors, used to forecast malaria. Design: Scoping review. Two independent reviewers searched information sources, assessed studies for inclusion and extracted data from each study. Information sources: Search strategies were developed and the following databases were searched: CAB Abstracts, EMBASE, Global Health, MEDLINE, ProQuest Dissertations & Theses and Web of Science. Key journals and websites were also manually searched. Eligibility criteria for included studies: We included studies that forecasted incidence, prevalence or epidemics of malaria over time. A description of the forecasting model and an assessment of the forecast accuracy of the model were requirements for inclusion. Studies were restricted to human populations and to autochthonous transmission settings. Results: We identified 29 different studies that met our inclusion criteria for this review. The forecasting approaches included statistical modelling, mathematical modelling and machine learning methods. Climate-related predictors were used consistently in forecasting models, with the most common predictors being rainfall, relative humidity, temperature and the normalised difference vegetation index. Model evaluation was typically based on a reserved portion of data, and accuracy was measured in a variety of ways including mean-squared error and correlation coefficients. We could not compare the forecast accuracy of models from the different studies as the evaluation measures differed across the studies. Conclusions: Applying different forecasting methods to the same data, exploring the predictive ability of non-environmental variables, including transmission-reducing interventions and using common forecast accuracy measures will allow malaria researchers to compare and improve models and methods, which should improve the quality of malaria forecasting. PMID:23180505

  3. Cost assessment of natural hazards in Europe - state-of-the-art, knowledge gaps and recommendations

    NASA Astrophysics Data System (ADS)

    Meyer, V.; Becker, N.; Markantonis, V.; Schwarze, R.; van den Bergh, J. C. J. M.; Bouwer, L. M.; Bubeck, P.; Ciavola, P.; Thieken, A. H.; Genovese, E.; Green, C.; Hallegatte, S.; Kreibich, H.; Lequeux, Q.; Viavattenne, C.; Logar, I.; Papyrakis, E.; Pfurtscheller, C.; Poussin, J.; Przyluski, V.

    2012-04-01

    Effective and efficient reduction of natural hazard risks requires a thorough understanding of the costs of natural hazards in order to develop sustainable risk management strategies. The current methods that assess the costs of different natural hazards employ a diversity of terminologies and approaches for different hazards and impacted sectors. This makes it difficult to arrive at robust, comprehensive and comparable cost figures. The CONHAZ (Costs of Natural Hazards) project aimed to compile and synthesise current knowledge on cost assessment methods in order to strengthen the role of cost assessments in the development of integrated natural hazard management and adaptation planning. In order to achieve this, CONHAZ adopted a comprehensive approach, considering natural hazards ranging from droughts, floods and coastal hazards to Alpine hazards, as well as different impacted sectors and cost types. Its specific objectives were 1) to compile state-of-the-art methods for cost assessment; 2) to analyse and assess these methods in terms of technical aspects, as well as terminology, data quality and availability, and research gaps; and 3) to synthesise the resulting knowledge into recommendations and to identify further research needs. This presentation summarises the main results of CONHAZ. CONHAZ differentiates between direct tangible damages, losses due to business interruption, indirect damages, intangible effects, and costs of risk mitigation. It is shown that the main focus of cost assessment methods and their application in practice is on direct costs, while existing methods for assessing intangible and indirect effects are rarely applied, and methods for assessing indirect effects often cannot be used on the scale of interest (e.g. the regional scale). Furthermore, methods often focus on single sectors and/or hazards, and only very few are able to reflect several sectors or multiple hazards. Process understanding and its use in cost assessment is poor, leading to highly uncertain results; however, sensitivity and uncertainty analyses, as well as validations, are seldom undertaken. One important recommendation is that cost assessments can be made more comprehensive by including indirect and intangible effects. The importance of identifying sources of uncertainty, reducing them effectively and documenting those that remain is also highlighted. One source of uncertainty concerns data sources. A framework for supporting data collection on the European level, ensuring minimum data quality standards, would facilitate the development and consistency of European and national databases. Furthermore, methods need to be improved through a better understanding and modelling of the damaging processes. In particular, there is a need for a better understanding of the economic response to external shocks and for improved models of indirect cost assessment based on this understanding. Models for estimating direct economic damage also need to be based on more knowledge about the complex processes that lead to damage. Moreover, the dynamics of risk due to climate and socio-economic change have to be better represented in the models in order to reveal uncertainties about future developments in the costs of natural hazards. Finally, there is a need for appropriate and transparent tools and guidance to support decision makers in the integration of uncertain cost assessment figures into decision making.

  4. Optimization of DRASTIC method by supervised committee machine artificial intelligence to assess groundwater vulnerability for Maragheh-Bonab plain aquifer, Iran

    NASA Astrophysics Data System (ADS)

    Fijani, Elham; Nadiri, Ata Allah; Asghari Moghaddam, Asghar; Tsai, Frank T.-C.; Dixon, Barnali

    2013-10-01

    Contamination of wells with nitrate-N (NO3-N) poses various threats to human health. Contamination of groundwater is a complex process, full of uncertainty at the regional scale. Development of an integrative vulnerability assessment methodology can be useful for effectively managing (including prioritizing limited resource allocation to monitor high-risk areas) and protecting this valuable freshwater source. This study introduces a supervised committee machine with artificial intelligence (SCMAI) model to improve the DRASTIC method for groundwater vulnerability assessment for the Maragheh-Bonab plain aquifer in Iran. Four different AI models are considered in the SCMAI model, whose input is the DRASTIC parameters. The SCMAI model improves the committee machine artificial intelligence (CMAI) model by replacing the linear combination in the CMAI with a nonlinear supervised ANN framework. To calibrate the AI models, NO3-N concentration data are divided into two datasets for training and validation purposes. The target value of the AI models in the training step is the corrected vulnerability indices corresponding to the first NO3-N concentration dataset. After model training, the AI models are verified against the second NO3-N concentration dataset. The results show that the four AI models are able to improve the DRASTIC method. Since no single AI model's performance is dominant, the SCMAI model is used to combine the advantages of the individual AI models to achieve optimal performance. The SCMAI method re-predicts the groundwater vulnerability based on the different AI models' prediction values. The results show that the SCMAI outperforms the individual AI models and the CMAI model. The SCMAI model ensures that no water well with high NO3-N levels would be classified as low risk and vice versa. The study concludes that the SCMAI model is an effective model to improve the DRASTIC model and provides a confident estimate of the pollution risk.
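
    A minimal sketch of the supervised-committee idea on synthetic data: member models are trained on DRASTIC-style inputs, and a supervising ANN learns a nonlinear combination of their outputs. The model choices, scikit-learn usage and all numbers are illustrative assumptions, not the paper's configuration:

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.neural_network import MLPRegressor
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(1)
X = rng.uniform(0, 10, size=(200, 7))                 # 7 DRASTIC-style parameters
y = X @ np.array([.25, .1, .2, .05, .15, .1, .15]) + rng.normal(0, .5, 200)

# Train the committee members on the first 150 samples.
members = [Ridge(alpha=1.0),
           DecisionTreeRegressor(max_depth=4, random_state=0),
           MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)]
for m in members:
    m.fit(X[:150], y[:150])

# The supervising ANN learns a nonlinear combination of member predictions.
P_train = np.column_stack([m.predict(X[:150]) for m in members])
combiner = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
combiner.fit(P_train, y[:150])

P_test = np.column_stack([m.predict(X[150:]) for m in members])
print("committee predictions (first 3):", combiner.predict(P_test)[:3].round(2))
```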

  5. Using the Draw-a-Scientist Test for Inquiry and Evaluation

    ERIC Educational Resources Information Center

    Miele, Eleanor

    2014-01-01

    The Draw-a-Scientist Test (DAST) is a tool to assess stereotypical imagery of scientists. This paper describes the use of the DAST as both a model for inquiry and as a method of assessing the affective domain. The DAST was administered in a science education methods course for undergraduate students of elementary education, a methods course for…

  6. GIS-based regionalized life cycle assessment: how big is small enough? Methodology and case study of electricity generation.

    PubMed

    Mutel, Christopher L; Pfister, Stephan; Hellweg, Stefanie

    2012-01-17

    We describe a new methodology for performing regionalized life cycle assessment and systematically choosing the spatial scale of regionalized impact assessment methods. We extend standard matrix-based calculations to include matrices that describe the mapping from inventory to impact assessment spatial supports. Uncertainty in inventory spatial data is modeled using a discrete spatial distribution function, which in a case study is derived from empirical data. The minimization of global spatial autocorrelation is used to choose the optimal spatial scale of impact assessment methods. We demonstrate these techniques on electricity production in the United States, using regionalized impact assessment methods for air emissions and freshwater consumption. Case study results show important differences between site-generic and regionalized calculations, and provide specific guidance for future improvements of inventory data sets and impact assessment methods.
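
    A minimal numerical sketch of the extended matrix calculation, assuming a toy two-process system and an invented mapping matrix that distributes an inventory flow over two impact-assessment regions:

```python
import numpy as np

# Hypothetical sketch: the standard matrix chain (technosphere A, biosphere B,
# final demand f) extended with a mapping matrix G that apportions each
# inventory flow to the spatial units of the impact method. Numbers invented.

A = np.array([[1.0, -0.2],      # technosphere: 2 processes
              [0.0,  1.0]])
B = np.array([[0.5, 1.5]])      # biosphere: 1 emission per unit of each process
f = np.array([1.0, 0.0])        # functional-unit demand

s = np.linalg.solve(A, f)       # process scaling factors
g = B @ s                       # total emission, not yet regionalized

G = np.array([[0.7, 0.3]])      # share of the emission occurring in regions 1, 2
cf = np.array([2.0, 10.0])      # region-specific characterization factors

regional_g = g[:, None] * G     # emission apportioned to regions
impact = float((regional_g * cf).sum())
site_generic = float(g.sum() * cf.mean())   # non-regionalized comparison
print(f"regionalized impact: {impact:.2f}, site-generic: {site_generic:.2f}")
```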

  7. Combining correlative and mechanistic habitat suitability models to improve ecological compensation.

    PubMed

    Meineri, Eric; Deville, Anne-Sophie; Grémillet, David; Gauthier-Clerc, Michel; Béchet, Arnaud

    2015-02-01

    Only a few studies have shown positive impacts of ecological compensation on species dynamics affected by human activities. We argue that this is due to inappropriate methods used to forecast required compensation in environmental impact assessments. These assessments are mostly descriptive and only valid at limited spatial and temporal scales. However, habitat suitability models developed to predict the impacts of environmental changes on potential species' distributions should provide rigorous science-based tools for compensation planning. Here we describe the two main classes of predictive models: correlative models and individual-based mechanistic models. We show how these models can be used alone or synoptically to improve compensation planning. While correlative models are easier to implement, they tend to ignore underlying ecological processes and lack accuracy. On the contrary, individual-based mechanistic models can integrate biological interactions, dispersal ability and adaptation. Moreover, among mechanistic models, those considering animal energy balance are particularly efficient at predicting the impact of foraging habitat loss. However, mechanistic models require more field data compared to correlative models. Hence we present two approaches which combine both methods for compensation planning, especially in relation to the spatial scale considered. We show how the availability of biological databases and software enabling fast and accurate population projections could be advantageously used to assess ecological compensation requirement efficiently in environmental impact assessments. © 2014 The Authors. Biological Reviews © 2014 Cambridge Philosophical Society.

  8. An integrated eco-hydrologic modeling framework for assessing the effects of interacting stressors on forest ecosystem services

    EPA Science Inventory

    The U.S. Environmental Protection Agency recently established the Ecosystem Services Research Program to help formulate methods and models for conducting comprehensive risk assessments that quantify how multiple ecosystem services interact and respond in concert to environmental ...

  9. An integrated eco-hydrologic modeling framework for assessing the effects of interacting stressors on multiple ecosystem services

    EPA Science Inventory

    The U.S. Environmental Protection Agency recently established the Ecosystem Services Research Program to help formulate methods and models for conducting comprehensive risk assessments that quantify how multiple ecosystem services interact and respond in concert to environmental ...

  10. Model of environmental life cycle assessment for coal mining operations.

    PubMed

    Burchart-Korol, Dorota; Fugiel, Agata; Czaplicka-Kolarz, Krystyna; Turek, Marian

    2016-08-15

    This paper presents a novel approach to the environmental assessment of coal mining operations, which enables assessment of factors that affect the environment both directly and indirectly and are associated with the production of raw materials and energy used in the processes. The primary novelty of the paper is the development of a computational environmental life cycle assessment (LCA) model for coal mining operations and the application of the model to coal mining operations in Poland. The LCA model enables the assessment of environmental indicators for all identified unit processes in hard coal mines with the life cycle approach. The proposed model enables the assessment of greenhouse gas emissions (GHGs) based on the IPCC method and the assessment of damage categories, such as human health, ecosystems and resources, based on the ReCiPe method. The model enables the assessment of GHGs for hard coal mining operations in three time frames: 20, 100 and 500 years. The model was used to evaluate the coal mines in Poland. It was demonstrated that the largest environmental impacts in the damage categories were associated with the use of fossil fuels, methane emissions and the use of electricity, processing of wastes, heat, and steel supports. It was concluded that an environmental assessment of coal mining operations, apart from the direct influence of processing waste, methane emissions and drainage water, should include the use of electricity, heat and steel, particularly for steel supports. Because the model allows the comparison of environmental impact assessments across various unit processes, it can be used for all hard coal mines, not only in Poland but also worldwide. This development is an important step forward in the study of the impacts of fossil fuels on the environment, with the potential to mitigate the impact of the coal industry on the environment. Copyright © 2016 Elsevier B.V. All rights reserved.

  11. Fast Geometric Consensus Approach for Protein Model Quality Assessment

    PubMed Central

    Adamczak, Rafal; Pillardy, Jaroslaw; Vallat, Brinda K.

    2011-01-01

    Model quality assessment (MQA) is an integral part of protein structure prediction methods that typically generate multiple candidate models. The challenge lies in ranking and selecting the best models using a variety of physical, knowledge-based, and geometric consensus (GC)-based scoring functions. In particular, 3D-Jury and related GC methods assume that well-predicted (sub-)structures are more likely to occur frequently in a population of candidate models, compared to incorrectly folded fragments. While this approach is very successful in the context of diversified sets of models, identifying similar substructures is computationally expensive since all pairs of models need to be superimposed using MaxSub or related heuristics for structure-to-structure alignment. Here, we consider a fast alternative, in which structural similarity is assessed using 1D profiles, e.g., consisting of relative solvent accessibilities and secondary structures of equivalent amino acid residues in the respective models. We show that the new approach, dubbed 1D-Jury, makes it possible to implicitly compare and rank N models in O(N) time, as opposed to the quadratic complexity of 3D-Jury and related clustering-based methods. In addition, 1D-Jury avoids computationally expensive 3D superposition of pairs of models. At the same time, structural similarity scores based on 1D profiles are shown to correlate strongly with those obtained using MaxSub. In terms of the ability to select the best models as top candidates, 1D-Jury performs on par with other GC methods. Other potential applications of the new approach, including fast clustering of large numbers of intermediate structures generated by folding simulations, are discussed as well. PMID:21244273
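
    A minimal sketch of scoring each model against a consensus 1D profile instead of all-vs-all 3D superposition; the profiles, weighting and scoring function are invented for illustration and are not the published 1D-Jury formula:

```python
import numpy as np

# Each model is reduced to a 1D profile: a secondary-structure string and a
# relative-solvent-accessibility vector. Scoring every model against the
# pool consensus takes O(N) comparisons in total.

ss_profiles = ["HHHEEC", "HHHEEC", "HHCEEC", "CCCEEC"]   # one per model
rsa = np.array([[.1, .2, .1, .6, .7, .9],
                [.1, .2, .2, .6, .7, .8],
                [.2, .3, .2, .5, .7, .9],
                [.5, .5, .4, .6, .6, .9]])

# Consensus: majority secondary structure and mean accessibility per residue.
cons_ss = "".join(max(set(col), key=col.count) for col in zip(*ss_profiles))
cons_rsa = rsa.mean(axis=0)

def score(ss, acc):
    ss_agree = np.mean([a == b for a, b in zip(ss, cons_ss)])
    rsa_agree = 1.0 - np.abs(acc - cons_rsa).mean()
    return 0.5 * ss_agree + 0.5 * rsa_agree   # assumed equal weighting

ranked = sorted(range(len(ss_profiles)),
                key=lambda i: -score(ss_profiles[i], rsa[i]))
print("model ranking (best first):", ranked)
```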

  12. Methods of Comprehensive Assessment for China’s Energy Sustainability

    NASA Astrophysics Data System (ADS)

    Xu, Zhijin; Song, Yankui

    2018-02-01

    In order to assess the sustainable development of China's energy objectively and accurately, we need to establish a reasonable indicator system for energy sustainability and make a targeted comprehensive assessment with scientific methods. This paper constructs a comprehensive indicator system for energy sustainability covering five aspects (economy, society, environment, energy resources and energy technology), based on the theory of sustainable development and the theory of symbiosis. On this basis, it establishes and discusses assessment models and general assessment methods for energy sustainability with the help of fuzzy mathematics. This provides a reference for promoting the sustainable development of China's energy, economy and society.
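
    A minimal sketch of a fuzzy comprehensive assessment of the kind described, assuming invented indicator weights and membership degrees for the five aspects:

```python
import numpy as np

# Hypothetical sketch: indicator weights W are combined with a membership
# matrix R (degree to which each indicator belongs to each rating grade)
# via the weighted-average operator B = W @ R. All values are invented.

W = np.array([0.25, 0.20, 0.25, 0.15, 0.15])   # economy, society, environment,
                                               # energy resources, technology
R = np.array([[0.1, 0.3, 0.4, 0.2],            # rows: indicators
              [0.2, 0.4, 0.3, 0.1],            # cols: grades (poor..excellent)
              [0.0, 0.2, 0.5, 0.3],
              [0.1, 0.2, 0.4, 0.3],
              [0.0, 0.1, 0.4, 0.5]])

B = W @ R                                      # fuzzy evaluation vector
grades = ["poor", "fair", "good", "excellent"]
print(dict(zip(grades, B.round(3))), "->", grades[int(B.argmax())])
```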

  13. The construction and assessment of a statistical model for the prediction of protein assay data.

    PubMed

    Pittman, J; Sacks, J; Young, S Stanley

    2002-01-01

    The focus of this work is the development of a statistical model for a bioinformatics database whose distinctive structure makes model assessment an interesting and challenging problem. The key components of the statistical methodology, including a fast approximation to the singular value decomposition and the use of adaptive spline modeling and tree-based methods, are described, and preliminary results are presented. These results are shown to compare favorably to selected results achieved using competing methods. An attempt to determine the predictive ability of the model through the use of cross-validation experiments is discussed. In conclusion, a synopsis of the results of these experiments and their implications for the analysis of bioinformatic databases in general is presented.

  14. Pesticide Environmental Accounting: a method for assessing the external costs of individual pesticide applications.

    PubMed

    Leach, A W; Mumford, J D

    2008-01-01

    The Pesticide Environmental Accounting (PEA) tool provides a monetary estimate of environmental and health impacts per hectare-application for any pesticide. The model combines the Environmental Impact Quotient method and a methodology for absolute estimates of external pesticide costs in the UK, USA and Germany. For many countries, resources are not available for intensive assessments of external pesticide costs. The model therefore converts external costs of a pesticide in the UK, USA and Germany to Mediterranean countries. Economic and policy applications include estimating the impacts of pesticide reduction policies or the benefits of technologies replacing pesticides, such as the sterile insect technique. The system integrates disparate data and approaches into a single logical method. The assumptions in the system provide transparency and consistency, but at the cost of some specificity and precision, a reasonable trade-off for a method that provides both comparative estimates of pesticide impacts and area-based assessments of absolute impacts.

  15. A new method to quantify the health risks from sources of perfluoroalkyl substances, combined with positive matrix factorization and risk assessment models.

    PubMed

    Xu, Jiao; Shi, Guo-Liang; Guo, Chang-Sheng; Wang, Hai-Ting; Tian, Ying-Ze; Huangfu, Yan-Qi; Zhang, Yuan; Feng, Yin-Chang; Xu, Jian

    2018-01-01

    A hybrid model based on the positive matrix factorization (PMF) model and the health risk assessment model for assessing risks associated with sources of perfluoroalkyl substances (PFASs) in water was established and applied at Dianchi Lake to test its applicability. The new method contains 2 stages: 1) the sources of PFASs were apportioned by the PMF model, and 2) the contribution of health risks from each source was calculated by the new hybrid model. Two factors were extracted by PMF, with factor 1 identified as an aqueous fire-fighting foam source and factor 2 as a fluoropolymer manufacturing and processing and perfluorooctanoic acid production source. The health risk of PFASs in the water assessed by the health risk assessment model was 9.54 × 10⁻⁷ a⁻¹ on average, showing no obvious adverse effects on human health. The two sources' risks estimated by the new hybrid model ranged from 2.95 × 10⁻¹⁰ to 6.60 × 10⁻⁶ a⁻¹ and from 1.64 × 10⁻⁷ to 1.62 × 10⁻⁶ a⁻¹, respectively. The new hybrid model can provide useful information on the health risks of PFAS sources, which is helpful for pollution control and environmental management. Environ Toxicol Chem 2018;37:107-115. © 2017 SETAC.
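
    A minimal sketch of the two-stage idea using a generic non-negative matrix factorization in place of a full PMF run (a real PMF would also weight residuals by measurement uncertainty); the data, factorization settings and total-risk figure are synthetic:

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(2)
X = np.abs(rng.normal(1.0, 0.3, size=(50, 6)))   # samples x PFAS species

# Stage 1: apportion concentrations to two sources.
model = NMF(n_components=2, init="nndsvda", max_iter=1000, random_state=0)
G = model.fit_transform(X)                       # per-sample source contributions
F = model.components_                            # source profiles

# Stage 2: scale a total health-risk estimate by each source's
# (approximate) contribution share.
share = G.sum(axis=0) / G.sum()
total_risk = 9.5e-7                              # assumed total risk, a^-1
print("per-source risk:", share * total_risk)
```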

  16. A Quantitative Climate-Match Score for Risk-Assessment Screening of Reptile and Amphibian Introductions

    NASA Astrophysics Data System (ADS)

    van Wilgen, Nicola J.; Roura-Pascual, Núria; Richardson, David M.

    2009-09-01

    Assessing climatic suitability provides a good preliminary estimate of the invasive potential of a species to inform risk assessment. We examined two approaches for bioclimatic modeling for 67 reptile and amphibian species introduced to California and Florida. First, we modeled the worldwide distribution of the biomes found in the introduced range to highlight similar areas worldwide from which invaders might arise. Second, we modeled potentially suitable environments for species based on climatic factors in their native ranges, using three sources of distribution data. Performance of the three datasets and both approaches were compared for each species. Climate match was positively correlated with species establishment success (maximum predicted suitability in the introduced range was more strongly correlated with establishment success than mean suitability). Data assembled from the Global Amphibian Assessment through NatureServe provided the most accurate models for amphibians, while ecoregion data compiled by the World Wide Fund for Nature yielded models which described reptile climatic suitability better than available point-locality data. We present three methods of assigning a climate-match score for use in risk assessment using both the mean and maximum climatic suitabilities. Managers may choose to use different methods depending on the stringency of the assessment and the available data, facilitating higher resolution and accuracy for herpetofaunal risk assessment. Climate-matching has inherent limitations and other factors pertaining to ecological interactions and life-history traits must also be considered for thorough risk assessment.

  17. Exploring precipitation pattern scaling methodologies and robustness among CMIP5 models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kravitz, Ben; Lynch, Cary; Hartin, Corinne

    Pattern scaling is a well-established method for approximating modeled spatial distributions of changes in temperature by assuming a time-invariant pattern that scales with changes in global mean temperature. We compare two methods of pattern scaling for annual mean precipitation (regression and epoch difference) and evaluate which method is better in particular circumstances by quantifying their robustness to interpolation/extrapolation in time, inter-model variations, and inter-scenario variations. Both the regression and epoch-difference methods (the two most commonly used methods of pattern scaling) have good absolute performance in reconstructing the climate model output, measured as an area-weighted root mean square error. We decompose the precipitation response in the RCP8.5 scenario into a CO2 portion and a non-CO2 portion. Extrapolating RCP8.5 patterns to reconstruct precipitation change in the RCP2.6 scenario results in large errors due to violations of pattern scaling assumptions when this CO2-/non-CO2-forcing decomposition is applied. As a result, the methodologies discussed in this paper can help provide precipitation fields to be utilized in other models (including integrated assessment models or impacts assessment models) for a wide variety of scenarios of future climate change.

  18. Exploring precipitation pattern scaling methodologies and robustness among CMIP5 models

    DOE PAGES

    Kravitz, Ben; Lynch, Cary; Hartin, Corinne; ...

    2017-05-12

    Pattern scaling is a well-established method for approximating modeled spatial distributions of changes in temperature by assuming a time-invariant pattern that scales with changes in global mean temperature. We compare two methods of pattern scaling for annual mean precipitation (regression and epoch difference) and evaluate which method is better in particular circumstances by quantifying their robustness to interpolation/extrapolation in time, inter-model variations, and inter-scenario variations. Both the regression and epoch-difference methods (the two most commonly used methods of pattern scaling) have good absolute performance in reconstructing the climate model output, measured as an area-weighted root mean square error. We decompose the precipitation response in the RCP8.5 scenario into a CO2 portion and a non-CO2 portion. Extrapolating RCP8.5 patterns to reconstruct precipitation change in the RCP2.6 scenario results in large errors due to violations of pattern scaling assumptions when this CO2-/non-CO2-forcing decomposition is applied. As a result, the methodologies discussed in this paper can help provide precipitation fields to be utilized in other models (including integrated assessment models or impacts assessment models) for a wide variety of scenarios of future climate change.
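
    A minimal sketch of the two pattern-scaling variants compared above, on synthetic grid-cell precipitation driven by a prescribed global-mean temperature series (all data invented):

```python
import numpy as np

rng = np.random.default_rng(3)
years, ncell = 100, 4
Tglob = np.linspace(0.0, 4.0, years) + 0.1 * rng.standard_normal(years)
true_pattern = np.array([0.5, -0.2, 0.1, 0.8])          # mm/day per K
P = Tglob[:, None] * true_pattern + 0.05 * rng.standard_normal((years, ncell))

# Regression method: least-squares slope of each cell against Tglob.
slopes = np.linalg.lstsq(np.column_stack([Tglob, np.ones(years)]), P,
                         rcond=None)[0][0]

# Epoch-difference method: (late-epoch mean - early-epoch mean) / delta T.
epoch = (P[-20:].mean(0) - P[:20].mean(0)) / (Tglob[-20:].mean() - Tglob[:20].mean())

print("regression pattern:      ", slopes.round(2))
print("epoch-difference pattern:", epoch.round(2))
```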

  19. Assessment of medical communication skills by computer: assessment method and student experiences.

    PubMed

    Hulsman, R L; Mollema, E D; Hoos, A M; de Haes, J C J M; Donnison-Speijer, J D

    2004-08-01

    A computer-assisted assessment (CAA) program for communication skills, designated ACT, was developed using the objective structured video examination (OSVE) format. This method features assessment of the cognitive scripts underlying communication behaviour, a broad range of communication problems covered in 1 assessment, highly standardised assessment and rating procedures, and large group assessments without complex organisation. Setting: The Academic Medical Centre (AMC) at the University of Amsterdam, the Netherlands. Aims: To describe the development of the AMC Communication Test (ACT); to describe our experiences with the examination and rating procedures; to present test score descriptives, and to present the students' opinions of ACT. Methods: The ACT presents films on history taking, breaking bad news and shared decision making. Each film is accompanied by 3 types of short essay questions derived from our assessment model: "knows", "knows why/when" and "knows how". Evaluation questions about ACT were integrated into the assessment. Participants: A total of 210 third year medical undergraduates were assessed. This study reports on the 110 (53%) students who completed all evaluation questions. Results: Marking 210 examinations took about 17 days. The test score matched a normal distribution and showed a good level of discrimination between students. About 75% passed the examination. Some support for the validity of our assessment model was found in the students' differential performance on the 3 types of questions. The ACT was well received. Student evaluations confirmed our efforts to develop realistic films that related well to the communication training programme. The ACT is a useful assessment method which complements interpersonal assessment methods for the evaluation of the medical communication skills of undergraduates.

  20. Assessment of glutathione levels in model solution and grape ferments supplemented with glutathione-enriched inactive dry yeast preparations using a novel UPLC-MS/MS method.

    PubMed

    Kritzinger, E C; Stander, M A; Du Toit, W J

    2013-01-01

    A novel, robust and fast ultra-high performance liquid chromatography-MS method has been developed for the simultaneous quantification of reduced glutathione (GSH) and oxidised glutathione (GSSG) in grape juice, wine and model wine solution. Sample preparation is minimal and does not require derivatisation. The method has very good performance in terms of sensitivity and selectivity. The limits of detection were 0.002 and 0.001 mg L⁻¹ for GSH and GSSG, respectively. The amount of GSH and GSSG released by commercial glutathione-enriched inactivated dry yeast preparations (GSH-IDYs) into a model solution was assessed. Significant differences in the amount of GSH and/or GSSG released into a model wine by different GSH-IDYs were observed, with ethanol influencing this release under certain conditions. The GSH and GSSG levels in grape juice fermentations supplemented with GSH-IDY were also assessed in relation to different addition times during fermentation. GSH-IDY addition can lead to elevated wine GSH levels, provided the supplementation is done early during alcoholic fermentation.

  1. No-Reference Image Quality Assessment by Wide-Perceptual-Domain Scorer Ensemble Method.

    PubMed

    Liu, Tsung-Jung; Liu, Kuan-Hsien

    2018-03-01

    A no-reference (NR) learning-based approach to assess image quality is presented in this paper. The devised features are extracted from wide perceptual domains, including brightness, contrast, color, distortion, and texture. These features are used to train a model (scorer) which can predict scores. The scorer selection algorithms are utilized to help simplify the proposed system. In the final stage, the ensemble method is used to combine the prediction results from selected scorers. Two multiple-scale versions of the proposed approach are also presented along with the single-scale one. They turn out to have better performances than the original single-scale method. Because of having features from five different domains at multiple image scales and using the outputs (scores) from selected score prediction models as features for multi-scale or cross-scale fusion (i.e., ensemble), the proposed NR image quality assessment models are robust with respect to more than 24 image distortion types. They also can be used on the evaluation of images with authentic distortions. The extensive experiments on three well-known and representative databases confirm the performance robustness of our proposed model.

  2. How Can Historical Understanding Best be Assessed? Use of Prediction Tasks To Assess How Students Understand the Role of Causal Factors that Produce Historical Events.

    ERIC Educational Resources Information Center

    Alonso-Tapia, Jesus; Villa, Jose Luis

    1999-01-01

    Examines the viability of using hypothetical problems that need the application of causal models for their solution as a method to assessing understanding in the social sciences. Explains that this method was used to describe how seventh-grade students understand causal factors affecting the "discovery and colonization of America." (CMK)

  3. Analysis of academic programs: comparing nursing and other university majors in the application of a quality, potential and cost model.

    PubMed

    Booker, Kathy; Hilgenberg, Cheryl

    2010-01-01

    Nursing is often considered expensive in the cost analysis of academic programs. Yet nursing programs have the power to attract many students, and the national nursing shortage has resulted in a high demand for nurses. Methods to systematically assess programs across an entire university academic division are often dissimilar in technique and outcome. At a small, private, Midwestern university, a model for comprehensive program assessment, titled the Quality, Potential and Cost (QPC) model, was developed and applied to each major offered at the university through the collaborative effort of directors, chairs, deans, and the vice president for academic affairs. The QPC model provides a means of equalizing data so that single measures (such as cost) are not viewed in isolation. It also provides a common language to ensure that all academic leaders at an institution apply consistent methods for assessment of individual programs. The application of the QPC model allowed for consistent, fair assessments and the ability to allocate resources to programs according to strategic direction. In this article, the application of the QPC model to School of Nursing majors and other selected university majors will be illustrated. Copyright 2010 Elsevier Inc. All rights reserved.

  4. Advanced quantitative measurement methodology in physics education research

    NASA Astrophysics Data System (ADS)

    Wang, Jing

    The ultimate goal of physics education research (PER) is to develop a theoretical framework to understand and improve the learning process. In this journey of discovery, assessment serves as our headlamp and alpenstock. It sometimes detects signals in student mental structures, and sometimes presents the difference between expert understanding and novice understanding. Quantitative assessment is an important area in PER. Developing effective research-based assessment instruments and making meaningful inferences based on these instruments have always been important goals of the PER community. Quantitative studies are often conducted to provide bases for test development and result interpretation. Statistics are frequently used in quantitative studies. The selection of statistical methods, and the interpretation of the results they produce, should be connected to the educational context. In making this connection, questions about the underlying educational models often arise. Many widely used statistical methods make no assumptions about the mental structure of subjects, nor do they provide explanations tailored to the educational audience. Other methods do consider mental structure and are tailored to provide strong connections between statistics and education; these methods often involve model assumptions and parameter estimation, and are mathematically complicated. The dissertation provides a practical view of some advanced quantitative assessment methods. The common feature of these methods is that they all make educational/psychological model assumptions beyond the minimum mathematical model. The purpose of the study is to provide a comparison between these advanced methods and purely mathematical methods, based on the performance of the two types of methods in physics education settings. In particular, the comparison uses both physics content assessments and scientific ability assessments. The dissertation includes three parts. The first part involves the comparison between item response theory (IRT) and classical test theory (CTT). The two theories both provide test item statistics for educational inferences and decisions. Both are applied to Force Concept Inventory data obtained from students enrolled at The Ohio State University. Effort was made to examine the similarities and differences between the two theories, and possible explanations for the differences. The study suggests that item response theory is more sensitive to the context and conceptual features of the test items than classical test theory. The IRT parameters provide a better measure than CTT parameters for the educational audience to investigate item features. The second part of the dissertation concerns measures of association for binary data. In quantitative assessment, binary data are often encountered because of their simplicity. The currently popular measures of association fail under some extremely unbalanced conditions, yet the occurrence of such conditions is not rare in educational data. Two popular association measures, the Pearson correlation and the tetrachoric correlation, are examined. A new method, model-based association, is introduced, and an educational testing constraint is discussed. The existing popular methods are compared with the model-based association measure with and without the constraint. Connections between the value of association and the context and conceptual features of questions are discussed in detail. Results show that all the methods have their advantages and disadvantages, and special attention to the test and data conditions is necessary. The last part of the dissertation is focused on exploratory factor analysis (EFA). The theoretical advantages of EFA are discussed, and typical misunderstandings and misuses of EFA are explored. The EFA is performed on Lawson's Classroom Test of Scientific Reasoning (LCTSR), a widely used assessment of scientific reasoning skills. The reasoning ability structures for U.S. and Chinese students at different educational levels are given by the analysis. A final discussion on the advanced quantitative assessment methodology and the purely mathematical methodology is presented at the end.

  5. Asymptotic and resampling strategies for assessing and comparing indirect effects in multiple mediator models.

    PubMed

    Preacher, Kristopher J; Hayes, Andrew F

    2008-08-01

    Hypotheses involving mediation are common in the behavioral sciences. Mediation exists when a predictor affects a dependent variable indirectly through at least one intervening variable, or mediator. Methods to assess mediation involving multiple simultaneous mediators have received little attention in the methodological literature despite a clear need. We provide an overview of simple and multiple mediation and explore three approaches that can be used to investigate indirect processes, as well as methods for contrasting two or more mediators within a single model. We present an illustrative example, assessing and contrasting potential mediators of the relationship between the helpfulness of socialization agents and job satisfaction. We also provide SAS and SPSS macros, as well as Mplus and LISREL syntax, to facilitate the use of these methods in applications.
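
    A minimal sketch of a percentile-bootstrap test of an indirect effect for a single mediator (the multiple-mediator case adds one a×b product per mediator); this uses synthetic data and plain NumPy, not the authors' SAS/SPSS macros:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 300
X = rng.standard_normal(n)
M = 0.5 * X + rng.standard_normal(n)            # path a = 0.5
Y = 0.4 * M + 0.2 * X + rng.standard_normal(n)  # path b = 0.4, direct c' = 0.2

def indirect(x, m, y):
    a = np.polyfit(x, m, 1)[0]                  # slope of M on X
    # b: slope of Y on M, controlling for X
    coef = np.linalg.lstsq(np.column_stack([m, x, np.ones_like(x)]), y,
                           rcond=None)[0]
    return a * coef[0]

# Percentile bootstrap of the indirect effect a*b.
idx = [rng.integers(0, n, n) for _ in range(2000)]
boot = np.array([indirect(X[i], M[i], Y[i]) for i in idx])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect 95% CI: [{lo:.3f}, {hi:.3f}]")   # excludes 0 -> mediation
```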

  6. Linking stressors and ecological responses

    USGS Publications Warehouse

    Gentile, J.H.; Solomon, K.R.; Butcher, J.B.; Harrass, M.; Landis, W.G.; Power, M.; Rattner, B.A.; Warren-Hicks, W.J.; Wenger, R.; Foran, Jeffery A.; Ferenc, Susan A.

    1999-01-01

    To characterize risk, it is necessary to quantify the linkages and interactions between chemical, physical and biological stressors and endpoints in the conceptual framework for ecological risk assessment (ERA). This can present challenges in a multiple stressor analysis, and it will not always be possible to develop a quantitative stressor-response profile. This review commences with a conceptual representation of the problem of developing a linkage analysis for multiple stressors and responses. The remainder of the review surveys a variety of mathematical and statistical methods (e.g., ranking methods, matrix models, multivariate dose-response for mixtures, indices, visualization, simulation modeling and decision-oriented methods) for accomplishing the linkage analysis for multiple stressors. Describing the relationships between multiple stressors and ecological effects is a critical component of 'effects assessment' in the ecological risk assessment framework.

  7. Hood entry coefficients of compound exhaust hoods.

    PubMed

    Figueroa, Crescente E

    2011-12-01

    A traditional method for assessing the flow rate in ventilation systems is based on multiple readings of velocity or velocity pressure (VP) (usually 10 or 20 points) taken in ductwork sections located away from fittings (more than seven duct diameters of straight duct). This study seeks to eliminate the need for a multiple-point evaluation and replace it with a simplified method that requires only a single measurement of hood static pressure (SPh) taken at a more accessible location (less than three duct diameters of straight duct from the hood entry). The SPh method is widely used for the assessment of flow rate in simple hoods. However, industrial applications quite often use compound hoods, typically of the slot/plenum type. For these hoods, a "compound coefficient of entry" has not been published, which makes the use of the hood static pressure method infeasible. This study proposes a model for the computation of a "compound coefficient of entry" and validates the use of this model to assess flow rate in two systems of well-defined geometry (multi-slotted/plenum and single-slotted/tapered or "fish-tail" types). When using a conservative value of the slot loss factor (1.78), the proposed model yielded an estimate of the volumetric flow rate within 10% of that provided by a more comprehensive method of assessment. The simplicity of the hood static pressure method makes it very desirable, even in the upper range of experimental error found in this study.
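
    A minimal sketch of computing flow rate from a single hood static pressure reading with a compound coefficient of entry, using the standard-air relation V = 4005·sqrt(VP) (V in fpm, pressures in inches w.g.); the loss factors, geometry and the exact form of the compound Ce are assumptions and may differ in detail from the paper's model:

```python
import math

F_slot, F_duct = 1.78, 0.25          # slot and duct-entry loss factors (assumed)
area_slot, area_duct = 0.5, 1.0      # ft^2 (assumed geometry)

def compound_Ce(F_s, F_d, A_s, A_d):
    # Assumed slot/plenum hood relation: SPh = VP_d*(1 + F_d) + F_s*VP_s,
    # with VP_s/VP_d = (A_d/A_s)^2 by continuity, and Ce = sqrt(VP_d / SPh).
    ratio = (A_d / A_s) ** 2
    return math.sqrt(1.0 / (1.0 + F_d + F_s * ratio))

Ce = compound_Ce(F_slot, F_duct, area_slot, area_duct)
SPh = 1.50                            # single measured hood static pressure, in. w.g.
Q = 4005.0 * Ce * area_duct * math.sqrt(SPh)
print(f"Ce = {Ce:.3f}, Q = {Q:.0f} cfm")
```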

  8. Models and Methods of Aggregating Linguistic Information in Multi-criteria Hierarchical Quality Assessment Systems

    NASA Astrophysics Data System (ADS)

    Azarnova, T. V.; Titova, I. A.; Barkalov, S. A.

    2018-03-01

    The article presents an algorithm for obtaining an integral assessment of the quality of an organization from the perspective of its customers, based on a method for aggregating linguistic information over a multilevel hierarchical quality assessment system. The algorithm is constructive: it provides not only an integral evaluation but also a quality improvement strategy, based on the method of linguistic decomposition, which identifies the minimum set of areas of client-facing work whose improvement will achieve the required level of the integrated quality assessment.

  9. The Oral Minimal Model Method

    PubMed Central

    Cobelli, Claudio; Dalla Man, Chiara; Toffolo, Gianna; Basu, Rita; Vella, Adrian; Rizza, Robert

    2014-01-01

    The simultaneous assessment of insulin action, secretion, and hepatic extraction is key to understanding postprandial glucose metabolism in nondiabetic and diabetic humans. We review the oral minimal model method (i.e., models that allow the estimation of insulin sensitivity, β-cell responsivity, and hepatic insulin extraction from a mixed-meal or an oral glucose tolerance test). Both of these oral tests are more physiologic and simpler to administer than those based on an intravenous test (e.g., a glucose clamp or an intravenous glucose tolerance test). The focus of this review is on indices provided by physiological-based models and their validation against the glucose clamp technique. We discuss first the oral minimal model method rationale, data, and protocols. Then we present the three minimal models and the indices they provide. The disposition index paradigm, a widely used β-cell function metric, is revisited in the context of individual versus population modeling. Adding a glucose tracer to the oral dose significantly enhances the assessment of insulin action by segregating insulin sensitivity into its glucose disposal and hepatic components. The oral minimal model method, by quantitatively portraying the complex relationships between the major players of glucose metabolism, is able to provide novel insights regarding the regulation of postprandial metabolism. PMID:24651807
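
    A minimal sketch of the oral glucose minimal model equations, with invented parameter values and insulin/meal-appearance curves (in practice the insulin sensitivity and the appearance profile are estimated from test data, not assumed):

```python
import numpy as np
from scipy.integrate import solve_ivp

# Oral glucose minimal model (sketch):
#   dG/dt = -(SG + X)*G + SG*Gb + Ra(t)/V
#   dX/dt = -p2*X + p2*SI*(I(t) - Ib)
# All parameter values and forcing curves below are illustrative assumptions.

SG, SI, p2, V = 0.025, 7e-4, 0.02, 1.6       # min^-1, per uU/mL/min, min^-1, dL/kg
Gb, Ib = 90.0, 10.0                           # basal glucose (mg/dL), insulin (uU/mL)

I = lambda t: Ib + 60.0 * (t / 30.0) * np.exp(-t / 60.0)   # insulin excursion
Ra = lambda t: 4.0 * (t / 40.0) * np.exp(-t / 40.0)        # meal appearance, mg/kg/min

def rhs(t, y):
    G, X = y
    return [-(SG + X) * G + SG * Gb + Ra(t) / V,
            -p2 * X + p2 * SI * (I(t) - Ib)]

sol = solve_ivp(rhs, (0, 300), [Gb, 0.0], t_eval=np.linspace(0, 300, 7))
print(np.round(sol.y[0], 1))   # postprandial glucose trajectory, mg/dL
```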

  10. Assessing protein conformational sampling methods based on bivariate lag-distributions of backbone angles

    PubMed Central

    Maadooliat, Mehdi; Huang, Jianhua Z.

    2013-01-01

    Despite considerable progress in the past decades, protein structure prediction remains one of the major unsolved problems in computational biology. Angular-sampling-based methods have been extensively studied recently due to their ability to capture the continuous conformational space of protein structures. The literature has focused on using a variety of parametric models of the sequential dependencies between angle pairs along the protein chains. In this article, we present a thorough review of angular-sampling-based methods by assessing three main questions: What is the best distribution type to model the protein angles? What is a reasonable number of components in a mixture model that should be considered to accurately parameterize the joint distribution of the angles? and What is the order of the local sequence–structure dependency that should be considered by a prediction method? We assess the model fits for different methods using bivariate lag-distributions of the dihedral/planar angles. Moreover, the main information across the lags can be extracted using a technique called Lag singular value decomposition (LagSVD), which considers the joint distribution of the dihedral/planar angles over different lags using a nonparametric approach and monitors the behavior of the lag-distribution of the angles using singular value decomposition. As a result, we developed graphical tools and numerical measurements to compare and evaluate the performance of different model fits. Furthermore, we developed a web-tool (http://www.stat.tamu.edu/∼madoliat/LagSVD) that can be used to produce informative animations. PMID:22926831
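
    A minimal sketch of the lag-distribution-plus-SVD idea on synthetic dihedral angles: build the joint histogram of an angle paired with its value several residues downstream, then inspect the singular values of that matrix (the binning, data and persistence model are invented; the published LagSVD procedure differs in detail):

```python
import numpy as np

rng = np.random.default_rng(5)
# Two-state (helix/sheet-like) angle sequence with persistence, so nearby
# residues are correlated and the dependency decays with increasing lag.
state = np.zeros(10000, dtype=int)
for i in range(1, state.size):
    state[i] = state[i - 1] if rng.random() < 0.9 else 1 - state[i - 1]
phi = np.array([-60.0, -120.0])[state] + rng.normal(0, 15, state.size)

def lag_spectrum(angles, lag, bins=36):
    H, _, _ = np.histogram2d(angles[:-lag], angles[lag:],
                             bins=bins, range=[[-180, 180]] * 2)
    H /= H.sum()                                 # joint lag-distribution
    return np.linalg.svd(H, compute_uv=False)

# With real sequential dependency, more than one singular value is
# appreciable at short lags; the spectrum flattens toward independence.
for lag in (1, 2, 4):
    print(f"lag {lag}: top singular values {np.round(lag_spectrum(phi, lag)[:3], 4)}")
```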

  11. Cost-Effectiveness of HBV and HCV Screening Strategies – A Systematic Review of Existing Modelling Techniques

    PubMed Central

    Geue, Claudia; Wu, Olivia; Xin, Yiqiao; Heggie, Robert; Hutchinson, Sharon; Martin, Natasha K.; Fenwick, Elisabeth; Goldberg, David

    2015-01-01

    Introduction: Studies evaluating the cost-effectiveness of screening for Hepatitis B Virus (HBV) and Hepatitis C Virus (HCV) are generally heterogeneous in terms of risk groups, settings, screening intervention, outcomes and the economic modelling framework. It is therefore difficult to compare cost-effectiveness results between studies. This systematic review aims to summarise and critically assess existing economic models for HBV and HCV in order to identify the main methodological differences in modelling approaches. Methods: A structured search strategy was developed and a systematic review carried out. A critical assessment of the decision-analytic models was carried out according to the guidelines and framework developed for assessment of decision-analytic models in Health Technology Assessment of health care interventions. Results: The overall approach to analysing the cost-effectiveness of screening strategies was found to be broadly consistent for HBV and HCV. However, modelling parameters and related structure differed between models, producing different results. More recent publications performed better against a performance matrix evaluating model components and methodology. Conclusion: When assessing screening strategies for HBV and HCV infection, the focus should be on more recent studies, which applied the latest treatment regimens and test methods and had better and more complete data on which to base their models. In addition to parameter selection and associated assumptions, careful consideration of dynamic versus static modelling is recommended. Future research may want to focus on these methodological issues. In addition, the ability to evaluate screening strategies for multiple infectious diseases (e.g. HCV and HIV at the same time) might prove important for decision makers. PMID:26689908

  12. Causal modelling applied to the risk assessment of a wastewater discharge.

    PubMed

    Paul, Warren L; Rokahr, Pat A; Webb, Jeff M; Rees, Gavin N; Clune, Tim S

    2016-03-01

    Bayesian networks (BNs), or causal Bayesian networks, have become quite popular in ecological risk assessment and natural resource management because of their utility as a communication and decision-support tool. Since their development in the field of artificial intelligence in the 1980s, however, Bayesian networks have evolved and merged with structural equation modelling (SEM). Unlike BNs, which are constrained to encode causal knowledge in conditional probability tables, SEMs encode this knowledge in structural equations, which is thought to be a more natural language for expressing causal information. This merger has clarified the causal content of SEMs and generalised the method such that it can now be performed using standard statistical techniques. As it was with BNs, the utility of this new generation of SEM in ecological risk assessment will need to be demonstrated with examples to foster an understanding and acceptance of the method. Here, we applied SEM to the risk assessment of a wastewater discharge to a stream, with a particular focus on the process of translating a causal diagram (conceptual model) into a statistical model which might then be used in the decision-making and evaluation stages of the risk assessment. The process of building and testing a spatial causal model is demonstrated using data from a spatial sampling design, and the implications of the resulting model are discussed in terms of the risk assessment. It is argued that a spatiotemporal causal model would have greater external validity than the spatial model, enabling broader generalisations to be made regarding the impact of a discharge, and greater value as a tool for evaluating the effects of potential treatment plant upgrades. Suggestions are made on how the causal model could be augmented to include temporal as well as spatial information, including suggestions for appropriate statistical models and analyses.

  13. Invited seminar, University of North Texas: An integrated eco-hydrologic modeling framework for assessing the effects of interacting stressors

    EPA Science Inventory

    The U.S. Environmental Protection Agency recently established the Ecosystem Services Research Program to help formulate methods and models for conducting comprehensive risk assessments that quantify how multiple ecosystem services interact and respond in concert to environmental ...

  14. Invited OSU class lecture: An integrated eco-hydrologic modeling framework for assessing the effects of interacting stressors on multiple ecosystem services

    EPA Science Inventory

    The U.S. Environmental Protection Agency recently established the Ecosystem Services Research Program to help formulate methods and models for conducting comprehensive risk assessments that quantify how multiple ecosystem services interact and respond in concert to environmental ...

  15. An integrated eco-hydrologic modeling framework for assessing the effects of interacting stressors on forest ecosystem services - ESRP mtg

    EPA Science Inventory

    The U.S. Environmental Protection Agency recently established the Ecosystem Services Research Program to help formulate methods and models for conducting comprehensive risk assessments that quantify how multiple ecosystem services interact and respond in concert to environmental ...

  16. An integrated eco-hydrologic modeling framework for assessing the effects of interacting stressors on multiple ecosystem services - 4/27/10

    EPA Science Inventory

    The U.S. Environmental Protection Agency recently established the Ecosystem Services Research Program to help formulate methods and models for conducting comprehensive risk assessments that quantify how multiple ecosystem services interact and respond in concert to environmental ...

  17. Population Validity for Educational Data Mining Models: A Case Study in Affect Detection

    ERIC Educational Resources Information Center

    Ocumpaugh, Jaclyn; Baker, Ryan; Gowda, Sujith; Heffernan, Neil; Heffernan, Cristina

    2014-01-01

    Information and communication technology (ICT)-enhanced research methods such as educational data mining (EDM) have allowed researchers to effectively model a broad range of constructs pertaining to the student, moving from traditional assessments of knowledge to assessment of engagement, meta-cognition, strategy and affect. The automated…

  18. Stepwise Analysis of Differential Item Functioning Based on Multiple-Group Partial Credit Model.

    ERIC Educational Resources Information Center

    Muraki, Eiji

    1999-01-01

    Extended an Item Response Theory (IRT) method for detection of differential item functioning to the partial credit model and applied the method to simulated data using a stepwise procedure. Then applied the stepwise DIF analysis based on the multiple-group partial credit model to writing trend data from the National Assessment of Educational…

  19. Assessing ecoregional-scale habitat suitability index models for priority landbirds

    Treesearch

    John M. Tirpak; D. Todd Jones-Farrand; Frank R. Thompson; Daniel J. Twedt; Charles K. Baxter; Jane A. Fitzgerald; William B. Uihlein

    2009-01-01

    Emerging methods in habitat and wildlife population modeling promise new horizons in conservation but only if these methods provide robust population-habitat linkages. We used Breeding Bird Survey (BBS) data to verify and validate newly developed habitat suitability index (HSI) models for 40 priority landbird species in the Central Hardwoods and West Gulf Coastal Plain...

  20. A closer look at cross-validation for assessing the accuracy of gene regulatory networks and models.

    PubMed

    Tabe-Bordbar, Shayan; Emad, Amin; Zhao, Sihai Dave; Sinha, Saurabh

    2018-04-26

    Cross-validation (CV) is a technique to assess the generalizability of a model to unseen data. This technique relies on assumptions that may not be satisfied when studying genomics datasets. For example, random CV (RCV) assumes that a randomly selected set of samples, the test set, well represents unseen data. This assumption does not hold where samples are obtained from different experimental conditions and the goal is to learn regulatory relationships among the genes that generalize beyond the observed conditions. In this study, we investigated how the CV procedure affects the assessment of supervised learning methods used to learn gene regulatory networks (and in other applications). We compared the performance of a regression-based method for gene expression prediction estimated using RCV with that estimated using a clustering-based CV (CCV) procedure. Our analysis illustrates that RCV can produce over-optimistic estimates of the model's generalizability compared to CCV. Next, we defined the 'distinctness' of the test set from the training set and showed that this measure is predictive of the performance of the regression method. Finally, we introduced a simulated annealing method to construct partitions with gradually increasing distinctness and showed that the performance of different gene expression prediction methods can be better evaluated using this method.
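    The contrast between random CV and a condition-aware CV can be sketched with scikit-learn's KFold versus GroupKFold on synthetic grouped data; the group labels stand in for experimental conditions, and GroupKFold stands in for the paper's clustering-based CV. Data and model below are hypothetical.

```python
# Sketch: random K-fold CV vs. group-based CV on synthetic "conditions" data.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import KFold, GroupKFold, cross_val_score

rng = np.random.default_rng(1)
n_groups, per_group, p = 5, 40, 10
groups = np.repeat(np.arange(n_groups), per_group)
# Each condition shifts the features, so samples within a condition are similar.
X = rng.normal(size=(n_groups * per_group, p)) \
    + rng.normal(size=(n_groups, p)).repeat(per_group, axis=0)
y = X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 1, n_groups * per_group)

model = Ridge(alpha=1.0)
rcv = cross_val_score(model, X, y, cv=KFold(5, shuffle=True, random_state=0))
ccv = cross_val_score(model, X, y, cv=GroupKFold(5), groups=groups)
print(f"random CV R^2:  {rcv.mean():.2f}")   # typically optimistic
print(f"grouped CV R^2: {ccv.mean():.2f}")   # held-out conditions
```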

  1. Evaluating the impact of field-scale management strategies on sediment transport to the watershed outlet.

    PubMed

    Sommerlot, Andrew R; Pouyan Nejadhashemi, A; Woznicki, Sean A; Prohaska, Michael D

    2013-10-15

    Non-point source pollution from agricultural lands is a significant contributor of sediment pollution in United States lakes and streams. Therefore, quantifying the impact of individual field management strategies at the watershed scale provides valuable information to watershed managers and conservation agencies to enhance decision-making. In this study, four methods employing some of the most cited models in field- and watershed-scale analysis were compared to find a practical yet accurate method for evaluating field management strategies at the watershed outlet. The models used in this study included a field-scale model (the Revised Universal Soil Loss Equation 2 - RUSLE2), a spatially explicit overland sediment delivery model (SEDMOD), and a watershed-scale model (the Soil and Water Assessment Tool - SWAT). These models were used to develop four modeling strategies (methods) for the River Raisin watershed: Method 1) predefined field-scale subbasin and reach layers were used in the SWAT model; Method 2) a subbasin-scale sediment delivery ratio was employed; Method 3) results obtained from the field-scale RUSLE2 model were incorporated as point-source inputs to the SWAT watershed model; and Method 4) a hybrid solution combining analyses from the RUSLE2, SEDMOD, and SWAT models. Method 4 was selected as the most accurate among the studied methods. In addition, the effectiveness of six best management practices (BMPs) in terms of water quality improvement and associated cost was assessed. Economic analysis was performed using Method 4, and producer-requested prices for BMPs were compared with prices defined by the Environmental Quality Incentives Program (EQIP). On a per unit area basis, producers requested higher prices than EQIP in four out of six BMP categories. Meanwhile, the true cost of sediment reduction at the field and watershed scales was greater than EQIP in five of six BMP categories according to producer-requested prices. Copyright © 2013 Elsevier Ltd. All rights reserved.
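    A toy sketch of the field-to-outlet idea underlying Methods 2-4: gross erosion from a RUSLE-style factor product, routed to the outlet with a sediment delivery ratio. All parameter values are illustrative, not taken from the study.

```python
# Sketch: RUSLE-style field erosion routed with a sediment delivery ratio
# (all factor values are hypothetical, for illustration only).
R, K, LS, C, P = 120.0, 0.30, 1.4, 0.20, 0.9   # rainfall, soil, slope, cover, practice
A = R * K * LS * C * P                          # gross field erosion (t/ha/yr)

area_ha = 25.0                                  # field area
sdr = 0.35                                      # fraction of eroded soil reaching the outlet
outlet_load = A * area_ha * sdr
print(f"field erosion {A:.1f} t/ha/yr -> outlet load {outlet_load:.1f} t/yr")
```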

  2. Data collection costs in industrial environments for three occupational posture exposure assessment methods.

    PubMed

    Trask, Catherine; Mathiassen, Svend Erik; Wahlström, Jens; Heiden, Marina; Rezagholi, Mahmoud

    2012-06-27

    Documentation of posture measurement costs is rare and the cost models that do exist are generally naïve. This paper provides a comprehensive cost model for biomechanical exposure assessment in occupational studies, documents the monetary costs of three exposure assessment methods for different stakeholders in data collection, and uses simulations to evaluate the relative importance of cost components. Trunk and shoulder posture variables were assessed for 27 aircraft baggage handlers for 3 full shifts each using three methods typical of ergonomic studies: self-report via questionnaire, observation via video film, and full-shift inclinometer registration. The cost model accounted for expenses related to meetings to plan the study, administration, recruitment, equipment, training of data collectors, travel, and onsite data collection. Sensitivity analyses were conducted using simulated study parameters and cost components to investigate the impact on total study cost. Inclinometry was the most expensive method (with a total study cost of €66,657), followed by observation (€55,369) and then self-report (€36,865). The majority of costs (90%) were borne by researchers. Study design parameters such as sample size, measurement scheduling and spacing, concurrent measurements, location and travel, and equipment acquisition were shown to have wide-ranging impacts on costs. This study provides a general cost modeling approach that can facilitate decision making and planning of data collection in future studies, as well as investigation into cost efficiency and cost-efficient study design. Empirical cost data from a large field study demonstrated the usefulness of the proposed models.
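    The structure of such a cost model can be sketched as fixed plus per-subject and per-shift components, with a sensitivity sweep over sample size of the kind the paper simulates. The cost categories loosely follow those named in the abstract (planning, recruitment, equipment, data collection); all unit costs are hypothetical.

```python
# Sketch of a study cost model with fixed and per-measurement components
# (hypothetical unit costs; structure only loosely follows the paper).
def study_cost(n_workers, shifts_per_worker, setup=5000.0,
               recruit_per_worker=150.0, equipment=12000.0,
               collect_per_shift=300.0):
    fixed = setup + equipment
    variable = n_workers * (recruit_per_worker
                            + shifts_per_worker * collect_per_shift)
    return fixed + variable

# Simple sensitivity sweep over sample size.
for n in (10, 27, 50):
    print(n, "workers:", study_cost(n, shifts_per_worker=3))
```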

  3. Multisensor satellite data for water quality analysis and water pollution risk assessment: decision making under deep uncertainty with fuzzy algorithm in framework of multimodel approach

    NASA Astrophysics Data System (ADS)

    Kostyuchenko, Yuriy V.; Sztoyka, Yulia; Kopachevsky, Ivan; Artemenko, Igor; Yuschenko, Maxim

    2017-10-01

    A multi-model approach to remote sensing data processing and interpretation is described, and the problem of satellite data utilization in a multi-modeling approach for socio-ecological risk assessment is formally defined. A method for utilizing observation, measurement and modeling data within the multi-model approach is described, and the methodology and models for risk assessment within a decision support approach are defined. A method of water quality assessment using satellite observation data is presented, based on analysis of the spectral reflectance of aquifers. Spectral signatures of freshwater bodies and offshore waters are analyzed, and correlations between spectral reflectance, pollution and selected water quality parameters are quantified. Data from the MODIS, MISR, AIRS and Landsat sensors received in 2002-2014 were utilized and verified by in-field spectrometry and lab measurements. A fuzzy logic based approach for decision support on water quality degradation risk is discussed: the decision on the water quality category is made by a fuzzy algorithm using a limited set of uncertain parameters. Data from satellite observations, field measurements and modeling are utilized within the proposed approach, and it is shown that the algorithm allows estimation of the water quality degradation rate and pollution risks. Problems of constructing spatial and temporal distributions of the calculated parameters, as well as the problem of data regularization, are discussed. Using the proposed approach, maps of surface water pollution risk from point and diffuse sources are calculated and discussed.
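    A minimal sketch of a fuzzy decision on a water quality category from a limited set of uncertain parameters, using triangular membership functions and a min/max rule. The indicator names, thresholds and category definitions are hypothetical, not the authors' algorithm.

```python
# Sketch: fuzzy classification of water quality from two uncertain indicators
# (triangular membership functions; all thresholds are hypothetical).
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

turbidity, chlorophyll = 18.0, 35.0   # example retrieved indicator values
categories = {
    "good":     min(tri(turbidity, 0, 5, 15),   tri(chlorophyll, 0, 10, 30)),
    "moderate": min(tri(turbidity, 5, 15, 30),  tri(chlorophyll, 10, 30, 60)),
    "poor":     min(tri(turbidity, 15, 30, 60), tri(chlorophyll, 30, 60, 120)),
}
label = max(categories, key=categories.get)   # highest joint membership wins
print(categories, "->", label)
```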

  4. The importance of hydrological uncertainty assessment methods in climate change impact studies

    NASA Astrophysics Data System (ADS)

    Honti, M.; Scheidegger, A.; Stamm, C.

    2014-08-01

    Climate change impact assessments have become more and more popular in hydrology since the mid-1980s, with a recent boost after the publication of the IPCC AR4 report. From hundreds of impact studies a quasi-standard methodology has emerged, shaped to a large extent by the growing public demand for predicting how water resources management or flood protection should change in the coming decades. The "standard" workflow relies on a model cascade from global circulation model (GCM) predictions for selected IPCC scenarios to future catchment hydrology. Uncertainty is present at each level and propagates through the model cascade. There is an emerging consensus among many studies on the relative importance of the different uncertainty sources; the prevailing perception is that GCM uncertainty dominates hydrological impact studies. Our hypothesis was that the relative importance of climatic and hydrologic uncertainty is (among other factors) heavily influenced by the uncertainty assessment method. To test this we carried out a climate change impact assessment and estimated the relative importance of the uncertainty sources. The study was performed on two small catchments in the Swiss Plateau with a lumped conceptual rainfall-runoff model. In the climatic part we applied the standard ensemble approach to quantify uncertainty, but in hydrology we used formal Bayesian uncertainty assessment with two different likelihood functions. One was a time series error model able to deal with the complicated statistical properties of hydrological model residuals; the second was an approximate likelihood function for the flow quantiles. The results showed that the expected climatic impact on flow quantiles was small compared to prediction uncertainty. The choice of uncertainty assessment method actually determined which sources of uncertainty could be identified at all. This demonstrates that one can arrive at rather different conclusions about the causes behind predictive uncertainty for the same hydrological model and calibration data when considering different objective functions for calibration.
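    The hydrological half of such an analysis can be sketched as Metropolis sampling of one model parameter under an iid Gaussian likelihood. The paper's time-series and flow-quantile likelihoods are more elaborate, and swapping the likelihood in the sketch below is exactly the kind of choice that changes which uncertainty gets identified. Data and model are synthetic.

```python
# Sketch: formal Bayesian calibration of a one-parameter runoff coefficient
# with a Metropolis sampler and an iid Gaussian error model (illustrative only).
import numpy as np

rng = np.random.default_rng(2)
rain = rng.gamma(2.0, 3.0, 100)
k_true = 0.6
obs = k_true * rain + rng.normal(0, 1.0, rain.size)   # synthetic "observed" flow

def log_like(k, sigma=1.0):
    resid = obs - k * rain
    return -0.5 * np.sum((resid / sigma) ** 2)

k, chain = 0.3, []                                    # flat prior, symmetric proposal
for _ in range(5000):
    prop = k + rng.normal(0, 0.02)
    if np.log(rng.uniform()) < log_like(prop) - log_like(k):
        k = prop
    chain.append(k)
post = np.array(chain[1000:])                         # discard burn-in
print(f"posterior mean k = {post.mean():.3f} +/- {post.std():.3f}")
```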

  5. The Assessment of Hyperactivity in Preschool Populations: A Multidisciplinary Perspective.

    ERIC Educational Resources Information Center

    Rosenberg, Michael S.; And Others

    1989-01-01

    The variety of methods available for the assessment of hyperactivity in preschool populations is reviewed. Specific procedures for assessment are presented from a multidisciplinary perspective, integrating biophysical, behavioral, cognitive, and ecological models. (Author/JDD)

  6. The Risk Assessment Study for Electric Power Marketing Competitiveness Based on Cloud Model and TOPSIS

    NASA Astrophysics Data System (ADS)

    Li, Cunbin; Wang, Yi; Lin, Shuaishuai

    2017-09-01

    With the rapid development of the energy internet and the deepening of electric power reform, the traditional electric power marketing mode no longer suits most electric power enterprises, which must therefore seek a breakthrough. In the face of increasingly complex marketing information, making a quick, reasonable transformation and rendering the assessment of electric power marketing competitiveness more accurate and objective is a major problem. In this paper, a method combining the cloud model and TOPSIS is proposed. Firstly, an evaluation index system for electric power marketing competitiveness is built. Then the cloud model is used to transform qualitative evaluations of the marketing data into quantitative values, and the entropy weight method is used to weaken the influence of subjective factors on the evaluation index weights. Finally, the closeness degrees of the alternatives are obtained by the TOPSIS method. This method provides a novel solution for the evaluation of electric power marketing competitiveness. The effectiveness and feasibility of the model are verified through a case analysis.
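    A minimal numpy sketch of the entropy-weight and TOPSIS steps, on a hypothetical decision matrix with benefit-type indicators; the cloud-model transformation of qualitative data is omitted. Rows are alternatives, columns are indicators.

```python
# Sketch: entropy weighting plus TOPSIS ranking (hypothetical decision matrix;
# all columns treated as benefit-type indicators).
import numpy as np

X = np.array([[7.0, 0.60, 82.0],
              [9.0, 0.45, 75.0],
              [6.0, 0.70, 90.0]])

# Entropy weights: more dispersed columns carry more information.
P = X / X.sum(axis=0)
e = -(P * np.log(P)).sum(axis=0) / np.log(X.shape[0])
w = (1 - e) / (1 - e).sum()

# TOPSIS on the weighted, vector-normalised matrix.
V = w * X / np.linalg.norm(X, axis=0)
ideal, anti = V.max(axis=0), V.min(axis=0)
d_pos = np.linalg.norm(V - ideal, axis=1)
d_neg = np.linalg.norm(V - anti, axis=1)
closeness = d_neg / (d_pos + d_neg)     # higher = closer to the ideal solution
print("weights:", w.round(3), "closeness:", closeness.round(3))
```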

  7. Performance measurement of PSF modeling reconstruction (True X) on Siemens Biograph TruePoint TrueV PET/CT.

    PubMed

    Lee, Young Sub; Kim, Jin Su; Kim, Kyeong Min; Kang, Joo Hyun; Lim, Sang Moo; Kim, Hee-Joung

    2014-05-01

    The Siemens Biograph TruePoint TrueV (B-TPTV) positron emission tomography (PET) scanner performs 3D PET reconstruction using a system matrix with point spread function (PSF) modeling (called the True X reconstruction). PET resolution was dramatically improved with the True X method. In this study, we assessed the spatial resolution and image quality on a B-TPTV PET scanner. In addition, we assessed the feasibility of animal imaging with a B-TPTV PET and compared it with a microPET R4 scanner. Spatial resolution was measured at center and at 8 cm offset from the center in transverse plane with warm background activity. True X, ordered subset expectation maximization (OSEM) without PSF modeling, and filtered back-projection (FBP) reconstruction methods were used. Percent contrast (% contrast) and percent background variability (% BV) were assessed according to NEMA NU2-2007. The recovery coefficient (RC), non-uniformity, spill-over ratio (SOR), and PET imaging of the Micro Deluxe Phantom were assessed to compare image quality of B-TPTV PET with that of the microPET R4. When True X reconstruction was used, spatial resolution was <3.65 mm with warm background activity. % contrast and % BV with True X reconstruction were higher than those with the OSEM reconstruction algorithm without PSF modeling. In addition, the RC with True X reconstruction was higher than that with the FBP method and the OSEM without PSF modeling method on the microPET R4. The non-uniformity with True X reconstruction was higher than that with FBP and OSEM without PSF modeling on microPET R4. SOR with True X reconstruction was better than that with FBP or OSEM without PSF modeling on the microPET R4. This study assessed the performance of the True X reconstruction. Spatial resolution with True X reconstruction was improved by 45 % and its % contrast was significantly improved compared to those with the conventional OSEM without PSF modeling reconstruction algorithm. The noise level was higher than that with the other reconstruction algorithm. Therefore, True X reconstruction should be used with caution when quantifying PET data.
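    The image-quality metrics reported above can be sketched from ROI means, assuming the familiar NEMA NU2-style definitions of percent contrast (for a hot sphere with a known activity ratio) and percent background variability; all ROI values below are hypothetical.

```python
# Sketch: NEMA NU2-style image-quality metrics from ROI means
# (hypothetical values; C_H/C_B are hot-sphere/background ROI means,
# a_H/a_B is the sphere-to-background activity ratio).
import numpy as np

C_H, a_H, a_B = 41.2, 4.0, 1.0                      # 4:1 activity ratio
bg_rois = np.array([10.1, 9.8, 10.4, 10.0, 9.7])    # background ROI means
C_B = bg_rois.mean()

percent_contrast = (C_H / C_B - 1.0) / (a_H / a_B - 1.0) * 100.0
percent_bv = bg_rois.std(ddof=1) / C_B * 100.0
print(f"% contrast = {percent_contrast:.1f}, % BV = {percent_bv:.1f}")
```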

  8. Risk assessment of tropical cyclone rainfall flooding in the Delaware River Basin

    NASA Astrophysics Data System (ADS)

    Lu, P.; Lin, N.; Smith, J. A.; Emanuel, K.

    2016-12-01

    Rainfall-induced inland flooding is a leading cause of death, injury, and property damage from tropical cyclones (TCs). In the context of climate change, it has been shown that extreme precipitation from TCs is likely to increase during the 21st century. Assessing the long-term risk of inland flooding associated with landfalling TCs is therefore an important task. Standard risk assessment techniques, which are based on observations from rain gauges and stream gauges, are not broadly applicable to TC induced flooding, since TCs are rare, extreme events with very limited historical observations at any specific location. Also, rain gauges and stream gauges can hardly capture the complex spatial variation of TC rainfall and flooding. Furthermore, the utility of historically based assessments is compromised by climate change. Regional dynamical downscaling models can resolve many features of TC precipitation. In terms of risk assessment, however, it is computationally demanding to run such models to obtain long-term climatology of TC induced flooding. Here we apply a computationally efficient climatological-hydrological method to assess the risk of inland flooding associated with landfalling TCs. It includes: 1) a deterministic TC climatology modeling method to generate large numbers of synthetic TCs with physically correlated characteristics (i.e., track, intensity, size) under observed and projected climates; 2) a simple physics-based tropical cyclone rainfall model which is able to simulate rainfall fields associated with each synthetic storm; 3) a hydrologic modeling system that takes in rainfall fields to simulate flood peaks over an entire drainage basin. We will present results of this method applied to the Delaware River Basin in the mid-Atlantic US.

  9. A Comparison of Lifting-Line and CFD Methods with Flight Test Data from a Research Puma Helicopter

    NASA Technical Reports Server (NTRS)

    Bousman, William G.; Young, Colin; Toulmay, Francois; Gilbert, Neil E.; Strawn, Roger C.; Miller, Judith V.; Maier, Thomas H.; Costes, Michel; Beaumier, Philippe

    1996-01-01

    Four lifting-line methods were compared with flight test data from a research Puma helicopter and the accuracy assessed over a wide range of flight speeds. Hybrid Computational Fluid Dynamics (CFD) methods were also examined for two high-speed conditions. A parallel analytical effort was performed with the lifting-line methods to assess the effects of modeling assumptions and this provided insight into the adequacy of these methods for load predictions.

  10. Reconsidering the psychometrics of quality of life assessment in light of response shift and appraisal

    PubMed Central

    Schwartz, Carolyn E; Rapkin, Bruce D

    2004-01-01

    The increasing evidence for response shift phenomena in quality of life (QOL) assessment points to the necessity to reconsider both the measurement model and the application of psychometric analyses. The proposed psychometric model posits that the QOL true score is always contingent upon parameters of the appraisal process. This new model calls into question existing methods for establishing the reliability and validity of QOL assessment tools and suggests several new approaches for describing the psychometric properties of these scales. Recommendations for integrating the assessment of appraisal into QOL research and clinical practice are discussed. PMID:15038830

  11. Construction risk assessment of deep foundation pit in metro station based on G-COWA method

    NASA Astrophysics Data System (ADS)

    You, Weibao; Wang, Jianbo; Zhang, Wei; Liu, Fangmeng; Yang, Diying

    2018-05-01

    In order to gain an accurate understanding of the construction safety of deep foundation pits in metro stations and reduce the probability and loss of risk occurrence, a risk assessment method based on G-COWA is proposed. Firstly, drawing on specific engineering examples and the construction characteristics of deep foundation pits, an evaluation index system based on the five factors of “human, management, technology, material and environment” is established. Secondly, the C-OWA operator is introduced to weight the evaluation indexes and weaken the negative influence of subjective expert preference. Grey cluster analysis and the fuzzy comprehensive evaluation method are combined to construct a construction risk assessment model for deep foundation pits, which can effectively handle the uncertainties involved. Finally, the model is applied to the actual deep foundation pit project of Qingdao Metro North Station; its construction risk rating is determined to be “medium”, and the model is shown to be feasible and reasonable. Corresponding control measures are then put forward and a useful reference is provided.
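    A minimal sketch of the weighting step, assuming the common form of the C-OWA operator in which ordered-weighted-averaging weights are derived from binomial coefficients and applied to expert scores sorted in descending order; the scores are hypothetical.

```python
# Sketch: C-OWA aggregation of expert scores for one index (a common
# binomial-coefficient form of the operator; hypothetical scores).
from math import comb

scores = [8.5, 7.0, 9.0, 6.5, 8.0]            # one index, five experts
n = len(scores)
weights = [comb(n - 1, j) / 2 ** (n - 1) for j in range(n)]  # sums to 1
ordered = sorted(scores, reverse=True)         # extreme scores get small weights
value = sum(w * s for w, s in zip(weights, ordered))
print("OWA weights:", weights, "-> aggregated score:", round(value, 3))
```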

  12. Project risk management in the construction of high-rise buildings

    NASA Astrophysics Data System (ADS)

    Titarenko, Boris; Hasnaoui, Amir; Titarenko, Roman; Buzuk, Liliya

    2018-03-01

    This paper presents project risk management methods that allow risks in the construction of high-rise buildings to be better identified and managed throughout the life cycle of the project. One of the project risk management processes is the quantitative analysis of risks, which usually includes the assessment of the potential impact of project risks and their probabilities. This paper reviews the most popular methods of risk probability assessment and seeks to show the advantages of the robust approach over traditional methods. Within the framework of the project risk management model, P. Huber's robust approach is applied and extended to the tasks of regression analysis of project data. The suggested algorithms used to estimate the parameters in statistical models yield reliable estimates. A review of the theoretical problems in the development of robust models built on the methodology of minimax estimates was carried out, and an algorithm for the situation of asymmetric "contamination" was developed.
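    A small illustration of why a robust estimator helps under asymmetric contamination, using scikit-learn's HuberRegressor as a stand-in for the paper's minimax-based algorithms; the project data below are synthetic.

```python
# Sketch: robust (Huber) vs. ordinary least-squares fits on data with
# asymmetric "contamination" (synthetic; one-sided outliers mimic cost overruns).
import numpy as np
from sklearn.linear_model import HuberRegressor, LinearRegression

rng = np.random.default_rng(3)
x = rng.uniform(0, 10, 80)
y = 2.0 + 0.8 * x + rng.normal(0, 0.5, 80)
y[:8] += rng.uniform(5, 15, 8)                 # one-sided outliers

X = x.reshape(-1, 1)
ols = LinearRegression().fit(X, y)
hub = HuberRegressor().fit(X, y)
print("OLS slope:  ", ols.coef_[0])            # pulled up by the contamination
print("Huber slope:", hub.coef_[0])            # stays close to the true 0.8
```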

  13. Geomatic methods at the service of water resources modelling

    NASA Astrophysics Data System (ADS)

    Molina, José-Luis; Rodríguez-Gonzálvez, Pablo; Molina, Mª Carmen; González-Aguilera, Diego; Espejo, Fernando

    2014-02-01

    Acquisition, management and/or use of spatial information are crucial for the quality of water resources studies. In this sense, several geomatic methods arise at the service of water modelling, aimed at the generation of cartographic products, especially 3D models and orthophotos. They may also serve as tools for problem solving and decision making. However, choosing the right geomatic method is still a challenge in this field, mostly due to the complexity of the different applications and variables involved in water resources management. This study aims to provide a guide to best practices in this context through a thorough review of geomatic methods and an assessment of their suitability for the following study types: Surface Hydrology, Groundwater Hydrology, Hydraulics, Agronomy, Morphodynamics and Geotechnical Processes. The assessment is driven by several decision variables grouped in two categories, classified depending on their nature as geometric or radiometric. As a result, the reader is guided to the best choice or choices of method, depending on the type of water resources modelling study at hand.

  14. Methods and Model Dependency of Extreme Event Attribution: The 2015 European Drought

    NASA Astrophysics Data System (ADS)

    Hauser, Mathias; Gudmundsson, Lukas; Orth, René; Jézéquel, Aglaé; Haustein, Karsten; Vautard, Robert; van Oldenborgh, Geert J.; Wilcox, Laura; Seneviratne, Sonia I.

    2017-10-01

    Science on the role of anthropogenic influence on extreme weather events, such as heatwaves or droughts, has evolved rapidly in the past years. The approach of "event attribution" compares the occurrence-probability of an event in the present, factual climate with its probability in a hypothetical, counterfactual climate without human-induced climate change. Several methods can be used for event attribution, based on climate model simulations and observations, and usually researchers only assess a subset of methods and data sources. Here, we explore the role of methodological choices for the attribution of the 2015 meteorological summer drought in Europe. We present contradicting conclusions on the relevance of human influence as a function of the chosen data source and event attribution methodology. Assessments using the maximum number of models and counterfactual climates with pre-industrial greenhouse gas concentrations point to an enhanced drought risk in Europe. However, other evaluations show contradictory evidence. These results highlight the need for a multi-model and multi-method framework in event attribution research, especially for events with a low signal-to-noise ratio and high model dependency such as regional droughts.
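    The core attribution calculation can be sketched from ensemble exceedance counts: the probability ratio compares the event's occurrence probability in the factual and counterfactual climates, and the fraction of attributable risk follows from it. The counts below are hypothetical.

```python
# Sketch: probability ratio and fraction of attributable risk from ensemble
# exceedance counts (hypothetical numbers).
n_factual, hits_factual = 1000, 120   # simulated years exceeding the drought threshold
n_counter, hits_counter = 1000, 60

p1 = hits_factual / n_factual         # probability in today's climate
p0 = hits_counter / n_counter         # probability without human influence
pr = p1 / p0                          # probability ratio
far = 1.0 - 1.0 / pr                  # fraction of attributable risk
print(f"PR = {pr:.2f}, FAR = {far:.2f}")
```

    With a low signal-to-noise ratio, the sampling uncertainty of p1 and p0 (and the choice of models behind them) can flip PR across 1, which is precisely the methodological sensitivity the study demonstrates.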

  15. Assessment of sustainable urban transport development based on entropy and unascertained measure.

    PubMed

    Li, Yancang; Yang, Jing; Shi, Huawang; Li, Yijie

    2017-01-01

    To find a more effective method for the assessment of sustainable urban transport development, a comprehensive assessment model was established based on the unascertained measure. Taking into account the factors influencing urban transport development, comprehensive assessment indexes were selected, including urban economic development, transport demand, environmental quality and energy consumption, and an assessment system for sustainable urban transport development was proposed. In view of the different influencing factors of urban transport development, the index weights were calculated with the entropy weight coefficient method. Qualitative and quantitative analyses were conducted according to the actual conditions. Then, the grade was obtained using the credible degree recognition criterion, from which the urban transport development level can be determined. Finally, a comprehensive assessment method for urban transport development was introduced. Application in practice showed that the method can be used reasonably and effectively for the comprehensive assessment of urban transport development.
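    A minimal sketch of the credible degree recognition step, assuming the usual criterion of selecting the first grade at which the cumulative measure reaches a credibility threshold (often 0.6); the grade labels and membership vector are hypothetical.

```python
# Sketch: credible-degree recognition on an unascertained-measure vector
# (hypothetical membership values over ordered grades; lambda = 0.6).
import numpy as np

grades = ["excellent", "good", "fair", "poor"]
mu = np.array([0.15, 0.40, 0.30, 0.15])   # measure vector, sums to 1
lam = 0.6                                 # credibility threshold

cum = np.cumsum(mu)                       # 0.15, 0.55, 0.85, 1.00
k0 = int(np.argmax(cum >= lam))           # first grade where cumulative >= lambda
print("grade:", grades[k0])               # -> "fair" for these numbers
```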

  16. Instability risk assessment of construction waste pile slope based on fuzzy entropy

    NASA Astrophysics Data System (ADS)

    Ma, Yong; Xing, Huige; Yang, Mao; Nie, Tingting

    2018-05-01

    Considering the nature and characteristics of construction waste piles, this paper analyzed the factors affecting the stability of construction waste pile slopes and established a system of assessment indexes for slope failure risks. Based on the basic principles and methods of fuzzy mathematics, the factor set and the remark set were established. The membership grades of continuous factor indexes are determined using the "ridge row distribution" function, while those of discrete factor indexes are determined by the Delphi Method. For the factor weights, the subjective weights were determined by the Analytic Hierarchy Process (AHP) and the objective weights by the entropy weight method, and a distance function was introduced to determine the combination coefficient. This paper established a fuzzy comprehensive assessment model of the slope failure risks of construction waste piles and assessed pile slopes in the two dimensions of hazard and vulnerability; the root mean square of the hazard and vulnerability assessment results is the final assessment result. The paper then used a construction waste pile slope as an example, assessed the risks of the four stages of a landfill, verified the assessment model, and analyzed the slope's failure risks and preventive measures against a slide.

  17. Toward a Model-Based Approach to the Clinical Assessment of Personality Psychopathology

    PubMed Central

    Eaton, Nicholas R.; Krueger, Robert F.; Docherty, Anna R.; Sponheim, Scott R.

    2015-01-01

    Recent years have witnessed tremendous growth in the scope and sophistication of statistical methods available to explore the latent structure of psychopathology, involving continuous, discrete, and hybrid latent variables. The availability of such methods has fostered optimism that they can facilitate movement from classification primarily crafted through expert consensus to classification derived from empirically based models of psychopathological variation. The explication of diagnostic constructs with empirically supported structures can then facilitate the development of assessment tools that appropriately characterize these constructs. Our goal in this paper is to illustrate how new statistical methods can inform the conceptualization of personality psychopathology and therefore its assessment. We use magical thinking as an example, because both theory and earlier empirical work suggested the possibility of discrete aspects to the latent structure of personality psychopathology, particularly forms of psychopathology involving distortions of reality testing, yet other data suggest that personality psychopathology is generally continuous in nature. For explanatory purposes, we directly compared the fit of a variety of latent variable models to magical thinking data from a sample enriched with clinically significant variation in psychotic symptomatology. Findings generally suggested that a continuous latent variable model best represented magical thinking, but results varied somewhat depending on different indices of model fit. We discuss the implications of the findings for classification and applied personality assessment. We also highlight some limitations of this type of approach that are illustrated by these data, including the importance of substantive interpretation, in addition to the use of model fit indices, when evaluating competing structural models. PMID:24007309

  18. CD-SEM metrology and OPC modeling for 2D patterning in advanced technology nodes (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Wallow, Thomas I.; Zhang, Chen; Fumar-Pici, Anita; Chen, Jun; Laenens, Bart; Spence, Christopher A.; Rio, David; van Adrichem, Paul; Dillen, Harm; Wang, Jing; Yang, Peng-Cheng; Gillijns, Werner; Jaenen, Patrick; van Roey, Frieda; van de Kerkhove, Jeroen; Babin, Sergey

    2017-03-01

    In the course of assessing OPC compact modeling capabilities and future requirements, we chose to investigate the interface between CD-SEM metrology methods and OPC modeling in some detail. Two linked observations motivated our study: 1) OPC modeling is, in principle, agnostic of metrology methods and best practice implementation. 2) Metrology teams across the industry use a wide variety of equipment, hardware settings, and image/data analysis methods to generate the large volumes of CD-SEM measurement data that are required for OPC in advanced technology nodes. Initial analyses led to the conclusion that many independent best practice metrology choices based on systematic study as well as accumulated institutional knowledge and experience can be reasonably made. Furthermore, these choices can result in substantial variations in measurement of otherwise identical model calibration and verification patterns. We will describe several experimental 2D test cases (i.e., metal, via/cut layers) that examine how systematic changes in metrology practice impact both the metrology data itself and the resulting full chip compact model behavior. Assessment of specific methodology choices will include: • CD-SEM hardware configurations and settings: these may range from SEM beam conditions (voltage, current, etc.,) to magnification, to frame integration optimizations that balance signal-to-noise vs. resist damage. • Image and measurement optimization: these may include choice of smoothing filters for noise suppression, threshold settings, etc. • Pattern measurement methodologies: these may include sampling strategies, CD- and contour- based approaches, and various strategies to optimize the measurement of complex 2D shapes. In addition, we will present conceptual frameworks and experimental methods that allow practitioners of OPC metrology to assess impacts of metrology best practice choices on model behavior. Finally, we will also assess requirements posed by node scaling on OPC model accuracy, and evaluate potential consequences for CD-SEM metrology capabilities and practices.

  19. Virtual reality based adaptive dose assessment method for arbitrary geometries in nuclear facility decommissioning.

    PubMed

    Liu, Yong-Kuo; Chao, Nan; Xia, Hong; Peng, Min-Jun; Ayodeji, Abiodun

    2018-05-17

    This paper presents an improved and efficient virtual reality-based adaptive dose assessment method (VRBAM) applicable to the cutting and dismantling tasks in nuclear facility decommissioning. The method combines the modeling strength of virtual reality with the flexibility of adaptive technology. The initial geometry is designed with the three-dimensional computer-aided design tools, and a hybrid model composed of cuboids and a point-cloud is generated automatically according to the virtual model of the object. In order to improve the efficiency of dose calculation while retaining accuracy, the hybrid model is converted to a weighted point-cloud model, and the point kernels are generated by adaptively simplifying the weighted point-cloud model according to the detector position, an approach that is suitable for arbitrary geometries. The dose rates are calculated with the Point-Kernel method. To account for radiation scattering effects, buildup factors are calculated with the Geometric-Progression formula in the fitting function. The geometric modeling capability of VRBAM was verified by simulating basic geometries, which included a convex surface, a concave surface, a flat surface and their combination. The simulation results show that the VRBAM is more flexible and superior to other approaches in modeling complex geometries. In this paper, the computation time and dose rate results obtained from the proposed method were also compared with those obtained using the MCNP code and an earlier virtual reality-based method (VRBM) developed by the same authors. © 2018 IOP Publishing Ltd.
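    A heavily simplified sketch of the Point-Kernel idea described above: each point kernel contributes attenuated, geometrically diluted flux scaled by a buildup factor. The crude linear buildup below is only a stand-in for the Geometric-Progression formula, and all constants (source strengths, attenuation coefficient, conversion factor) are hypothetical.

```python
# Sketch: Point-Kernel dose rate from a small point-cloud source model
# (illustrative constants; real use needs nuclide-specific attenuation and
# Geometric-Progression buildup coefficients).
import numpy as np

points = np.array([[0.0, 0.0, 0.0],
                   [0.5, 0.0, 0.0],
                   [0.0, 0.5, 0.0]])      # kernel positions (m)
S = np.array([1e8, 5e7, 5e7])             # photon emission rate per kernel (1/s)
detector = np.array([2.0, 0.0, 0.0])
mu = 0.07                                 # linear attenuation coefficient (1/m)
k_flux_to_dose = 1e-9                     # hypothetical flux-to-dose factor

r = np.linalg.norm(points - detector, axis=1)
B = 1.0 + mu * r                          # crude linear stand-in for GP buildup
flux = S * B * np.exp(-mu * r) / (4.0 * np.pi * r ** 2)
print("dose rate ~", k_flux_to_dose * flux.sum(), "(hypothetical units)")
```

    The adaptive step in the paper amounts to coarsening this point cloud with distance from the detector so that far-away kernels are merged, trading negligible accuracy for large speedups.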

  20. Estimating Gravity Biases with Wavelets in Support of a 1-cm Accurate Geoid Model

    NASA Astrophysics Data System (ADS)

    Ahlgren, K.; Li, X.

    2017-12-01

    Systematic errors that reside in surface gravity datasets are one of the major hurdles in constructing a high-accuracy geoid model at high resolutions. The National Oceanic and Atmospheric Administration's (NOAA) National Geodetic Survey (NGS) has an extensive historical surface gravity dataset consisting of approximately 10 million gravity points that are known to have systematic biases at the mGal level (Saleh et al. 2013). As most relevant metadata is absent, estimating and removing these errors to be consistent with a global geopotential model and airborne data in the corresponding wavelength is quite a difficult endeavor. However, this is crucial to support a 1-cm accurate geoid model for the United States. With recently available independent gravity information from GRACE/GOCE and airborne gravity from the NGS Gravity for the Redefinition of the American Vertical Datum (GRAV-D) project, several different methods of bias estimation are investigated which utilize radial basis functions and wavelet decomposition. We estimate a surface gravity value by incorporating a satellite gravity model, airborne gravity data, and forward-modeled topography at wavelet levels according to each dataset's spatial wavelength. Considering the estimated gravity values over an entire gravity survey, an estimate of the bias and/or correction for the entire survey can be found and applied. In order to assess the accuracy of each bias estimation method, two techniques are used. First, each bias estimation method is used to predict the bias for two high-quality (unbiased and high accuracy) geoid slope validation surveys (GSVS) (Smith et al. 2013 & Wang et al. 2017). Since these surveys are unbiased, the various bias estimation methods should reflect that and provide an absolute accuracy metric for each of the bias estimation methods. Secondly, the corrected gravity datasets from each of the bias estimation methods are used to build a geoid model. The accuracy of each geoid model provides an additional metric to assess the performance of each bias estimation method. The geoid model accuracies are assessed using the two GSVS lines and GPS-leveling data across the United States.

  1. Assessing methane emission estimation methods based on atmospheric measurements from oil and gas production using LES simulations

    NASA Astrophysics Data System (ADS)

    Saide, P. E.; Steinhoff, D.; Kosovic, B.; Weil, J.; Smith, N.; Blewitt, D.; Delle Monache, L.

    2017-12-01

    There are a wide variety of methods that have been proposed and used to estimate methane emissions from oil and gas production by using air composition and meteorology observations in conjunction with dispersion models. Although there has been some verification of these methodologies using controlled releases and concurrent atmospheric measurements, it is difficult to assess the accuracy of these methods for more realistic scenarios considering factors such as terrain, emissions from multiple components within a well pad, and time-varying emissions representative of typical operations. In this work we use a large-eddy simulation (LES) to generate controlled but realistic synthetic observations, which can be used to test multiple source term estimation methods, also known as an Observing System Simulation Experiment (OSSE). The LES is based on idealized simulations of the Weather Research & Forecasting (WRF) model at 10 m horizontal grid-spacing covering an 8 km by 7 km domain with terrain representative of a region located in the Barnett shale. Well pads are setup in the domain following a realistic distribution and emissions are prescribed every second for the components of each well pad (e.g., chemical injection pump, pneumatics, compressor, tanks, and dehydrator) using a simulator driven by oil and gas production volume, composition and realistic operational conditions. The system is setup to allow assessments under different scenarios such as normal operations, during liquids unloading events, or during other prescribed operational upset events. Methane and meteorology model output are sampled following the specifications of the emission estimation methodologies and considering typical instrument uncertainties, resulting in realistic observations (see Figure 1). We will show the evaluation of several emission estimation methods including the EPA Other Test Method 33A and estimates using the EPA AERMOD regulatory model. We will also show source estimation results from advanced methods such as variational inverse modeling, and Bayesian inference and stochastic sampling techniques. Future directions including other types of observations, other hydrocarbons being considered, and assessment of additional emission estimation methods will be discussed.

  2. Development and evaluation of nursing user interface screens using multiple methods.

    PubMed

    Hyun, Sookyung; Johnson, Stephen B; Stetson, Peter D; Bakken, Suzanne

    2009-12-01

    Building upon the foundation of the Structured Narrative Electronic Health Record (EHR) model, we applied theory-based (combined Technology Acceptance Model and Task-Technology Fit Model) and user-centered methods to explore nurses' perceptions of functional requirements for an electronic nursing documentation system, design user interface screens reflective of the nurses' perspectives, and assess nurses' perceptions of the usability of the prototype user interface screens. The methods resulted in user interface screens that were perceived to be easy to use, potentially useful, and well-matched to nursing documentation tasks associated with Nursing Admission Assessment, Blood Administration, and Nursing Discharge Summary. The methods applied in this research may serve as a guide for others wishing to implement user-centered processes to develop or extend EHR systems. In addition, some of the insights obtained in this study may be informative to the development of safe and efficient user interface screens for nursing document templates in EHRs.

  3. Quantification of myocardial perfusion based on signal intensity of flow sensitized MRI

    NASA Astrophysics Data System (ADS)

    Abeykoon, Sumeda B.

    The quantitative assessment of perfusion is important for early recognition of a variety of heart diseases, determination of disease severity, and their treatment. In the conventional approach of measuring cardiac perfusion by arterial spin labeling, the relative difference in the apparent T1 relaxation times in response to selective and non-selective inversion of blood entering the region of interest is related to perfusion via a two-compartment tissue model. However, accurate determination of T1 in small animal hearts is difficult and prone to errors due to long scan times. The purpose of this study is to develop a fast, robust and simple method to quantitatively assess myocardial perfusion using arterial spin labeling. The proposed method is based on the signal intensities (SI) of inversion recovery slice-select, non-select and steady-state images; notably, data are acquired at a single inversion time and at short repetition times. The study began by investigating the accuracy of perfusion assessment in a two-compartment system. First, determination of perfusion by T1 and SI was implemented for a simple, two-compartment phantom model. The mathematical model developed for full spin exchange (in-vivo experiments) by solving a modified Bloch equation was adapted to develop mathematical models (T1 and SI) for a phantom (zero spin exchange). The phantom results at different flow rates demonstrate the accuracy of the two-compartment model and of the SI and T1 methods; the SI method has less error propagation and a shorter scan time. Next, twelve healthy C57BL/6 mice were scanned for quantitative perfusion assessment, and three of them were repeatedly scanned at three different time points for a reproducibility test. The myocardial perfusion of healthy mice obtained by the SI method, 5.7+/-1.6 ml/g/min, was similar (p=0.38) to that obtained by the conventional T1 method, 5.6+/-2.3 ml/g/min. The reproducibility of the SI method shows acceptable results: the maximum percentage deviation is about 5%. The SI method was then used alongside a delayed-enhancement method to qualitatively and quantitatively assess perfusion deficits in an ischemia-reperfusion (IR) mouse model; the infarcted region of the perfusion map is comparable to the hyperintense region of the delayed-enhancement image of the IR mouse. The SI method was also used for a chronological comparison of perfusion in delta-sarcoglycan-null (DSG) mice. Perfusion of DSG and wild-type (WT) mice at ages of 12 weeks and 32 weeks was compared and the percentage change in perfusion was estimated; the results show that perfusion changes considerably in DSG mice. Finally, the SI method was implemented on a 3 Tesla Philips scanner by modifying the data acquisition method. The perfusion obtained in this way is consistent with literature values, but further adjustment of the pulse sequence and modification of the numerical solution are needed. The most important benefit of the SI method is that it reduces scan time by 30%-40% and lessens motion artifacts compared to the T1 method. This study demonstrates that the signal intensity-based ASL method is a robust alternative to the conventional T1 method.

  4. Integrated Assessment of Health-related Economic Impacts of U.S. Air Pollution Policy

    NASA Astrophysics Data System (ADS)

    Saari, R. K.; Rausch, S.; Selin, N. E.

    2012-12-01

    We examine the environmental impacts, health-related economic benefits, and distributional effects of new US regulations to reduce smog from power plants, namely: the Cross-State Air Pollution Rule. Using integrated assessment methods, linking atmospheric and economic models, we assess the magnitude of economy-wide effects and distributional consequences that are not captured by traditional regulatory impact assessment methods. We study the Cross-State Air Pollution Rule, a modified allowance trading scheme that caps emissions of nitrogen oxides and sulfur dioxide from power plants in the eastern United States and thus reduces ozone and particulate matter pollution. We use results from the regulatory regional air quality model, CAMx (the Comprehensive Air Quality Model with extensions), and epidemiologic studies in BenMAP (Environmental Benefits Mapping and Analysis Program), to quantify differences in morbidities and mortalities due to this policy. To assess the economy-wide and distributional consequences of these health impacts, we apply a recently developed economic and policy model, the US Regional Energy and Environmental Policy Model (USREP), a multi-region, multi-sector, multi-household, recursive dynamic computable general equilibrium economic model of the US that provides a detailed representation of the energy sector, and the ability to represent energy and environmental policies. We add to USREP a representation of air pollution impacts, including the estimation and valuation of health outcomes and their effects on health services, welfare, and factor markets. We find that the economic welfare benefits of the Rule are underestimated by traditional methods, which omit economy-wide impacts. We also quantify the distribution of benefits, which have varying effects across US regions, income groups, and pollutants, and we identify factors influencing this distribution, including the geographic variation of pollution and population as well as underlying economic conditions.

  5. Improvement of the edge method for on-orbit MTF measurement.

    PubMed

    Viallefont-Robinet, Françoise; Léger, Dominique

    2010-02-15

    The edge method is a widely used way to assess the on-orbit Modulation Transfer Function (MTF). Since good edge quality is required, the higher the spatial resolution, the better the results; at high resolution, an artificial target can be built and used to ensure good edge quality. For moderate spatial resolutions, only natural targets are available, so the edge quality is unknown and generally rather poor. Improvements of the method have been investigated in order to compensate for the poor quality of natural edges, through the use of symmetry and/or a transfer function model, which enables the elimination of noise. The same approach has also been used for artificial targets, where the model overcomes incomplete sampling when the target is too small, or gives the opportunity to assess the defocus of the sensor. This paper begins with a review of the method, followed by a presentation of the changes, which rely on a parametric transfer function model. The transfer function model and the corresponding processing are described. Applications of these changes to several satellites of the French space agency are presented: for SPOT 1, the method enables assessment of the XS MTF with natural edges; for SPOT 5, it enables use of the Salon-de-Provence artificial target for MTF assessment in the HM mode; and for the upcoming Pleiades, it enables estimation of the defocus.
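    The core of the edge method can be sketched on a synthetic edge: differentiate the edge spread function to get the line spread function, then take its Fourier transform to obtain the MTF. This is a noise-free illustration; the paper's contribution is precisely the parametric modelling that copes with noisy natural edges.

```python
# Sketch of the basic edge method: ESF -> LSF (derivative) -> MTF (FFT),
# on a synthetic, noise-free edge profile.
import numpy as np

x = np.linspace(-8, 8, 257)                 # sample positions (pixels)
esf = 0.5 * (1 + np.tanh(x / 1.2))          # synthetic blurred edge profile
lsf = np.gradient(esf, x)                   # differentiate the ESF
lsf /= lsf.sum()                            # normalise so that MTF(0) = 1
mtf = np.abs(np.fft.rfft(lsf))
freqs = np.fft.rfftfreq(lsf.size, d=x[1] - x[0])
nyquist = 0.5                               # cycles/pixel
print("MTF at Nyquist ~", np.interp(nyquist, freqs, mtf))
```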

  6. Milestone-specific, Observed data points for evaluating levels of performance (MODEL) assessment strategy for anesthesiology residency programs.

    PubMed

    Nagy, Christopher J; Fitzgerald, Brian M; Kraus, Gregory P

    2014-01-01

    Anesthesiology residency programs will be expected to have Milestones-based evaluation systems in place by July 2014 as part of the Next Accreditation System. The San Antonio Uniformed Services Health Education Consortium (SAUSHEC) anesthesiology residency program developed and implemented a Milestones-based feedback and evaluation system a year ahead of schedule. It has been named the Milestone-specific, Observed Data points for Evaluating Levels of performance (MODEL) assessment strategy. The "MODEL Menu" and the "MODEL Blueprint" are tools that other anesthesiology residency programs can use in developing their own Milestones-based feedback and evaluation systems prior to ACGME-required implementation. Data from our early experience with the streamlined MODEL blueprint assessment strategy showed substantially improved faculty compliance with reporting requirements. The MODEL assessment strategy provides programs with a workable assessment method for residents, and important Milestones data points to programs for ACGME reporting.

  7. Partial corrosion casting to assess cochlear vasculature in mouse models of presbycusis and CMV infection.

    PubMed

    Carraro, Mattia; Park, Albert H; Harrison, Robert V

    2016-02-01

    Some forms of sensorineural hearing loss involve damage or degenerative changes to the stria vascularis and/or other vascular structures in the cochlea. In animal models, many methods for anatomical assessment of cochlear vasculature exist, each with advantages and limitations. One methodology, corrosion casting, has proved useful in some species, however in the mouse model this technique is difficult to achieve because digestion of non vascular tissue results in collapse of the delicate cast specimen. We have developed a partial corrosion cast method that allows visualization of vasculature along much of the cochlear length but maintains some structural integrity of the specimen. We provide a detailed step-by-step description of this novel technique. We give some illustrative examples of the use of the method in mouse models of presbycusis and cytomegalovirus (CMV) infection. Copyright © 2015 Elsevier B.V. All rights reserved.

  8. Perspectives to performance of environment and health assessments and models--from outputs to outcomes?

    PubMed

    Pohjola, Mikko V; Pohjola, Pasi; Tainio, Marko; Tuomisto, Jouni T

    2013-06-26

    The calls for knowledge-based policy and policy-relevant research invoke a need to evaluate and manage environment and health assessments and models according to their societal outcomes. This review explores how well the existing approaches to assessment and model performance serve this need. The perspectives on assessment and model performance in the scientific literature can be called: (1) quality assurance/control, (2) uncertainty analysis, (3) technical assessment of models, (4) effectiveness and (5) other perspectives, according to what is primarily seen to constitute the goodness of assessments and models. The categorization is not strict, and methods, tools and frameworks in different perspectives may overlap. However, altogether it seems that most approaches to assessment and model performance are relatively narrow in scope. The focus in most approaches is on the outputs and the making of assessments and models; practical application of the outputs and the consequential outcomes is often left unaddressed. It appears that more comprehensive approaches, combining the essential characteristics of the different perspectives, are needed. This necessitates a better account of the mechanisms of collective knowledge creation and the relations between knowledge and practical action. Some new approaches to assessment, modeling and their evaluation and management span the chain from knowledge creation to societal outcomes, but the complexity of evaluating societal outcomes remains a challenge.

  9. A model for the rapid assessment of the impact of aviation noise near airports.

    PubMed

    Torija, Antonio J; Self, Rod H; Flindell, Ian H

    2017-02-01

    This paper introduces a simplified model [Rapid Aviation Noise Evaluator (RANE)] for the calculation of aviation noise within the context of multi-disciplinary strategic environmental assessment where input data are both limited and constrained by compatibility requirements against other disciplines. RANE relies upon the concept of noise cylinders around defined flight-tracks with the Noise Radius determined from publicly available Noise-Power-Distance curves rather than the computationally intensive multiple point-to-point grid calculation with subsequent ISO-contour interpolation methods adopted in the FAA's Integrated Noise Model (INM) and similar models. Preliminary results indicate that for simple single runway scenarios, changes in airport noise contour areas can be estimated with minimal uncertainty compared against grid-point calculation methods such as INM. In situations where such outputs are all that is required for preliminary strategic environmental assessment, there are considerable benefits in reduced input data and computation requirements. Further development of the noise-cylinder-based model (such as the incorporation of lateral attenuation, engine-installation-effects or horizontal track dispersion via the assumption of more complex noise surfaces formed around the flight-track) will allow for more complex assessment to be carried out. RANE is intended to be incorporated into technology evaluators for the noise impact assessment of novel aircraft concepts.
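    A minimal sketch of the noise-cylinder idea as described: invert a Noise-Power-Distance table to a Noise Radius for a target level, then treat the contour around a straight track as a cylinder-like footprint. The NPD values, target level and track length below are all hypothetical.

```python
# Sketch: Noise Radius from a hypothetical Noise-Power-Distance table, and a
# cylinder-style contour area around a straight flight track.
import numpy as np

npd_dist = np.array([200.0, 400.0, 800.0, 1600.0, 3200.0])  # slant distance (m)
npd_level = np.array([92.0, 86.0, 79.0, 71.0, 62.0])        # level (dB) at fixed power

target = 75.0   # contour level of interest
# Interpolate level vs. log-distance (levels fall roughly linearly in log distance);
# np.interp needs increasing x, so both arrays are reversed.
radius = np.exp(np.interp(target, npd_level[::-1], np.log(npd_dist)[::-1]))
track_length = 10_000.0                                     # m of flight track
area = 2.0 * radius * track_length + np.pi * radius ** 2    # stadium-shaped footprint
print(f"noise radius ~ {radius:.0f} m, contour area ~ {area / 1e6:.2f} km^2")
```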

  10. Sabotage at Nuclear Power Plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Purvis, James W.

    1999-07-21

    Recently there has been a noted worldwide increase in violent actions, including attempted sabotage, at nuclear power plants. Several organizations, such as the International Atomic Energy Agency and the US Nuclear Regulatory Commission, have guidelines, recommendations, and formal threat- and risk-assessment processes for the protection of nuclear assets. Another example is the former Defense Special Weapons Agency, which used a risk-assessment model to evaluate force-protection security requirements for terrorist incidents at DOD military bases. The US DOE uses a graded approach to protect its assets based on risk and vulnerability assessments. The Federal Aviation Administration and Federal Bureau of Investigation conduct joint threat and vulnerability assessments on high-risk US airports. Several private companies under contract to government agencies use formal risk-assessment models and methods to identify security requirements. The purpose of this paper is to survey these methods and present an overview of all potential types of sabotage at nuclear power plants. The paper discusses emerging threats and current methods of choice for sabotage, especially vehicle bombs and chemical attacks. Potential consequences of sabotage acts, including economic and political consequences, not just those that may result in unacceptable radiological exposure to the public, are also discussed. The applicability of risk-assessment methods and mitigation techniques is also presented.

  11. IN-RESIDENCE, MULTIPLE ROUTE EXPOSURES TO CHLORPYRIFOS AND DIAZINON ESTIMATED BY INDIRECT METHOD MODELS

    EPA Science Inventory

    One of the objectives of the National Human Exposure Assessment Survey (NHEXAS) is to estimate exposures to several pollutants in multiple media and determine their distributions for the population of Arizona. This paper presents modeling methods used to estimate exposure dist...

  12. A time series modeling approach in risk appraisal of violent and sexual recidivism.

    PubMed

    Bani-Yaghoub, Majid; Fedoroff, J Paul; Curry, Susan; Amundsen, David E

    2010-10-01

    For over half a century, various clinical and actuarial methods have been employed to assess the likelihood of violent recidivism. Yet there is a need for new methods that can improve the accuracy of recidivism predictions. This study proposes a new time series modeling approach that generates high levels of predictive accuracy over short and long periods of time. The proposed approach outperformed two widely used actuarial instruments (i.e., the Violence Risk Appraisal Guide and the Sex Offender Risk Appraisal Guide). Furthermore, analysis of temporal risk variations based on specific time series models can add valuable information to the risk assessment and management of violent offenders.
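
    The abstract does not specify the form of the time series model, so the sketch below only illustrates the general idea: a simple AR(1) model fitted by least squares to a hypothetical sequence of periodic risk scores, then iterated to produce short- and longer-horizon forecasts. The study's actual models are likely richer.

    ```python
    import numpy as np

    # Hypothetical sequence of periodic risk scores for one offender
    # (e.g., one standardized assessment per supervision interval).
    scores = np.array([0.62, 0.58, 0.66, 0.71, 0.69, 0.75, 0.78, 0.74])

    # Fit an AR(1) model  x[t] = c + phi * x[t-1] + noise  by least squares.
    X = np.column_stack([np.ones(len(scores) - 1), scores[:-1]])
    c, phi = np.linalg.lstsq(X, scores[1:], rcond=None)[0]

    def forecast(last_value: float, steps: int) -> list:
        """Iterate the fitted AR(1) recursion to forecast future risk levels."""
        out, x = [], last_value
        for _ in range(steps):
            x = c + phi * x
            out.append(x)
        return out

    print(forecast(scores[-1], steps=3))  # next three forecast risk levels
    ```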

  13. Introduction: Special issue on advances in topobathymetric mapping, models, and applications

    USGS Publications Warehouse

    Gesch, Dean B.; Brock, John C.; Parrish, Christopher E.; Rogers, Jeffrey N.; Wright, C. Wayne

    2016-01-01

    Detailed knowledge of near-shore topography and bathymetry is required for many geospatial data applications in the coastal environment. New data sources and processing methods are facilitating development of seamless, regional-scale topobathymetric digital elevation models. These elevation models integrate disparate multi-sensor, multi-temporal topographic and bathymetric datasets to provide a coherent base layer for coastal science applications such as wetlands mapping and monitoring, sea-level rise assessment, benthic habitat mapping, erosion monitoring, and storm impact assessment. The focus of this special issue is on recent advances in the source data, data processing and integration methods, and applications of topobathymetric datasets.

  14. Landslide hazard assessment: recent trends and techniques.

    PubMed

    Pardeshi, Sudhakar D; Autade, Sumant E; Pardeshi, Suchitra S

    2013-01-01

    Landslide hazard assessment is an important step towards landslide hazard and risk management. There are several methods of Landslide Hazard Zonation (LHZ), viz. heuristic, semi-quantitative, quantitative, probabilistic and multi-criteria decision-making processes. However, no single method is universally accepted for effective assessment of landslide hazards. In recent years, several attempts have been made to apply different methods of LHZ and to compare the results in order to find the best-suited model. This paper reviews research on landslide hazard mapping published in recent years. Advanced multivariate techniques have proved effective in the spatial prediction of landslides with a high degree of accuracy, and physical process-based models also perform well in LHZ mapping, even in areas with a poor database. Multi-criteria decision-making approaches likewise play a significant role in determining the relative importance of landslide causative factors in the slope instability process. Remote sensing and Geographical Information Systems (GIS) are powerful tools for assessing landslide hazards and have been used extensively in landslide research over the last decade. Aerial photographs and high-resolution satellite data are useful for detecting, mapping and monitoring landslide processes, and GIS-based LHZ models help not only to map and monitor landslides but also to predict future slope failures. These advances in geospatial technologies have opened the door to detailed and accurate assessment of landslide hazards.

  15. Personality Assessment in the Schools: Issues and Procedures for School Psychologists.

    ERIC Educational Resources Information Center

    Knoff, Howard M.

    1983-01-01

    A conceptual model for school-based personality assessment, methods to integrate behavioral and projective assessment procedures, and issues surrounding the use of projective tests are presented. Ways to maximize the personality assessment process for use in placement and programing decisions are suggested. (Author/DWH)

  16. Mathematical modelling and quantitative methods.

    PubMed

    Edler, L; Poirier, K; Dourson, M; Kleiner, J; Mileson, B; Nordmann, H; Renwick, A; Slob, W; Walton, K; Würtzen, G

    2002-01-01

    The present review reports on the mathematical methods and statistical techniques presently available for hazard characterisation. The state of the art of mathematical modelling and quantitative methods used currently for regulatory decision-making in Europe and additional potential methods for risk assessment of chemicals in food and diet are described. Existing practices of JECFA, FDA, EPA, etc., are examined for their similarities and differences. A framework is established for the development of new and improved quantitative methodologies. Areas for refinement, improvement and increase of efficiency of each method are identified in a gap analysis. Based on this critical evaluation, needs for future research are defined. It is concluded from our work that mathematical modelling of the dose-response relationship would improve the risk assessment process. An adequate characterisation of the dose-response relationship by mathematical modelling clearly requires the use of a sufficient number of dose groups to achieve a range of different response levels. This need not necessarily lead to an increase in the total number of animals in the study if an appropriate design is used. Chemical-specific data relating to the mode or mechanism of action and/or the toxicokinetics of the chemical should be used for dose-response characterisation whenever possible. It is concluded that a single method of hazard characterisation would not be suitable for all kinds of risk assessments, and that a range of different approaches is necessary so that the method used is the most appropriate for the data available and for the risk characterisation issue. Future refinements to dose-response characterisation should incorporate more clearly the extent of uncertainty and variability in the resulting output.
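
    As a hedged illustration of dose-response characterisation, the sketch below fits a Hill-type (log-logistic) curve to hypothetical dose-group data and solves the fitted curve for a 10% benchmark response. The review itself does not prescribe this particular functional form; it is one common choice.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Hypothetical dose groups (mg/kg) and observed response fractions; note the
    # sufficient number of dose groups spanning a range of response levels.
    dose = np.array([0.0, 1.0, 3.0, 10.0, 30.0, 100.0])
    resp = np.array([0.02, 0.05, 0.11, 0.30, 0.62, 0.91])

    def hill(d, top, ed50, n):
        """Hill-type (log-logistic) dose-response curve."""
        return top * d**n / (ed50**n + d**n)

    params, _ = curve_fit(hill, dose, resp, p0=[1.0, 10.0, 1.0],
                          bounds=([0.0, 0.1, 0.1], [1.5, 1000.0, 10.0]))
    top, ed50, n = params

    # Benchmark dose for a 10% response, solved from the fitted curve:
    # top * d^n / (ed50^n + d^n) = bmr  =>  d = ed50 * (bmr/(top-bmr))^(1/n)
    bmr = 0.10
    bmd = ed50 * (bmr / (top - bmr)) ** (1.0 / n)
    print(f"ED50 = {ed50:.1f} mg/kg, BMD10 = {bmd:.2f} mg/kg")
    ```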

  17. Development of Human Posture Simulation Method for Assessing Posture Angles and Spinal Loads

    PubMed Central

    Lu, Ming-Lun; Waters, Thomas; Werren, Dwight

    2015-01-01

    Video-based posture analysis employing a biomechanical model is gaining popularity for ergonomic assessments. A human posture simulation method for estimating multiple body postural angles and spinal loads from a video record was developed to expedite ergonomic assessments. The method was evaluated using a repeated measures study design with three trunk flexion levels, two lift asymmetry levels, three viewing angles and three trial repetitions as experimental factors. The study comprised two phases evaluating the accuracy of simulating one's own and other people's lifting posture via a proxy of a computer-generated humanoid. The mean accuracies of simulating one's own and the humanoid's posture were 12° and 15°, respectively. The repeatability of the method for the same lifting condition was excellent (~2°), and the least simulation error was associated with the side viewing angle. The estimated back compressive force and moment, calculated by a three-dimensional biomechanical model, exhibited underestimation in the range of 5%. The posture simulation method enables researchers to quantify body posture angles and spinal loading variables simultaneously, with accuracy and precision comparable to on-screen posture-matching methods. PMID:26361435

  18. Development of a lumbar EMG-based coactivation index for the assessment of complex dynamic tasks.

    PubMed

    Le, Peter; Aurand, Alexander; Walter, Benjamin A; Best, Thomas M; Khan, Safdar N; Mendel, Ehud; Marras, William S

    2018-03-01

    The objective of this study was to develop and test an EMG-based coactivation index and to compare it with a coactivation index defined by a biologically assisted lumbar spine model, in order to differentiate between tasks. The purpose was to provide a universal approach for assessing coactivation of a multi-muscle system when a computational model is not accessible. The EMG-based index utilised anthropometrically defined muscle characteristics driven by torso kinematics and EMG. Muscles were classified as agonists/antagonists based upon 'simulated' moments of the muscles relative to the total 'simulated' moment. Different tasks, including lifting, pushing and the Valsalva manoeuvre, were used to test the range of the index. Results showed that the EMG-based index was comparable to the index defined by the biologically assisted model (r² = 0.78). Overall, the EMG-based index provides a universal, usable method to assess the neuromuscular effort associated with coactivation for complex dynamic tasks when the benefit of a biomechanical model is not available. Practitioner Summary: A universal coactivation index for the lumbar spine was developed to assess complex dynamic tasks. This method was validated relative to a model-based index for use when a high-end computational model is not available. Its simplicity allows for fewer inputs and usability for assessment of task ergonomics and rehabilitation.
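
    A minimal sketch of the idea behind such an index, with hypothetical moments and activations (the paper's actual muscle model and weighting are not reproduced here): muscles whose 'simulated' moment opposes the net moment are treated as antagonists, and the index is the antagonist share of activation-weighted effort.

    ```python
    import numpy as np

    def coactivation_index(moments_nm: np.ndarray, activations: np.ndarray) -> float:
        """Illustrative coactivation index for one time sample.

        moments_nm  : per-muscle 'simulated' sagittal moments (N*m), signed
        activations : per-muscle normalized EMG activation (0..1)
        Muscles whose moment opposes the net moment count as antagonists.
        """
        net = moments_nm.sum()
        antagonist = np.sign(moments_nm) != np.sign(net)
        effort = activations * np.abs(moments_nm)        # activation-weighted effort
        return effort[antagonist].sum() / effort.sum()   # antagonist share, 0..1

    # Example: extensors dominate; flexor activity raises the index.
    moments = np.array([45.0, 38.0, -12.0, -9.0])   # + extensor, - flexor
    emg = np.array([0.50, 0.42, 0.20, 0.15])
    print(f"coactivation index = {coactivation_index(moments, emg):.2f}")
    ```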

  19. Economic Assessment: A Model for Assessing Ability to Pay.

    ERIC Educational Resources Information Center

    Andre, Patricia; And Others

    1978-01-01

    Accurate assessment of the client's ability to pay is the cornerstone to fee collections in any service organization. York County Counseling Services implemented a new method of fee assessment and collection based on the principles of providing a service worth paying for, accurate assessment of ability to pay, and a budget-payment system. (Author)

  20. Qualitative approaches to use of the RE-AIM framework: rationale and methods.

    PubMed

    Holtrop, Jodi Summers; Rabin, Borsika A; Glasgow, Russell E

    2018-03-13

    There have been over 430 publications using the RE-AIM model for planning and evaluation of health programs and policies, as well as numerous applications of the model in grant proposals and national programs. Full use of the model includes qualitative methods to understand why and how results were obtained on different RE-AIM dimensions; however, recent reviews have revealed that qualitative methods have been used infrequently. Having quantitative and qualitative methods and results iteratively inform each other should enhance understanding and lessons learned. Because there have been few published examples of qualitative approaches and methods using RE-AIM for planning or assessment, and no guidance on how qualitative approaches can inform these processes, we provide guidance on qualitative methods to address the RE-AIM model and its various dimensions. The intended audience is researchers interested in applying RE-AIM or similar implementation models, but the methods discussed should also be relevant to those in community or clinical settings. We present directions for, examples of, and guidance on how qualitative methods can be used to address each of the five RE-AIM dimensions. Formative qualitative methods can be helpful in planning interventions and designing for dissemination. Summative qualitative methods are useful when used in an iterative, mixed-methods approach for understanding how and why different patterns of results occur. In summary, qualitative and mixed-methods approaches to RE-AIM help researchers understand complex situations and results, why and how outcomes were obtained, and contextual factors not easily assessed using quantitative measures.

  1. Modeling fate and transport of "Contaminants of Emerging Concern" (CECs): is the Soil Water Assessment Tool (SWAT) the appropriate model?

    USDA-ARS?s Scientific Manuscript database

    Background/Question/Methods As the scientific and regulatory communities realize the significant environmental impacts and ubiquity of “contaminants of emerging concern” (CECs), it is increasingly imperative to develop quantitative assessment tools to evaluate and predict the fate and transport of...

  2. Evaluating Rater Accuracy in Rater-Mediated Assessments Using an Unfolding Model

    ERIC Educational Resources Information Center

    Wang, Jue; Engelhard, George, Jr.; Wolfe, Edward W.

    2016-01-01

    The number of performance assessments continues to increase around the world, and it is important to explore new methods for evaluating the quality of ratings obtained from raters. This study describes an unfolding model for examining rater accuracy. Accuracy is defined as the difference between observed and expert ratings. Dichotomous accuracy…

  3. A Conceptual Model and Assessment Template for Capacity Evaluation in Adult Guardianship

    ERIC Educational Resources Information Center

    Moye, Jennifer; Butz, Steven W.; Marson, Daniel C.; Wood, Erica

    2007-01-01

    Purpose: We develop a conceptual model and associated assessment template that is usable across state jurisdictions for evaluating the independent-living capacity of older adults in guardianship proceedings. Design and Methods: We used an iterative process in which legal provisions for guardianship and prevailing clinical practices for capacity…

  4. Integrating Individual Differences in Career Assessment: The Atlas Model of Individual Differences and the Strong Ring

    ERIC Educational Resources Information Center

    Armstrong, Patrick Ian; Rounds, James

    2010-01-01

    Career assessment methods often include measures of individual differences constructs, such as interests, personality, abilities, and values. Although many researchers have recently called for the development of integrated models, career counseling professionals have long faced the challenge of integrating this information into their practice. The…

  5. Using the Madeline Hunter Direct Instruction Model to Improve Outcomes Assessments in Marketing Programs

    ERIC Educational Resources Information Center

    Steward, Michelle D.; Martin, Gregory S.; Burns, Alvin C.; Bush, Ronald F.

    2010-01-01

    This study introduces marketing educators to the Madeline Hunter Direct Instruction Model (HDIM) as an approach to significantly and substantially improve student learning through course-embedded assessment. The effectiveness of the method is illustrated in three different marketing courses taught by three different marketing professors. The…

  6. Assessing the New Competencies for Resident Education: A Model from an Emergency Medicine Program.

    ERIC Educational Resources Information Center

    Reisdorff, Earl J.; Hayes, Oliver W.; Carlson, Dale J.; Walker, Gregory L.

    2001-01-01

    Based on the experience of Michigan State University's emergency medicine residency program, proposes a practical method for modifying an existing student evaluation format. The model provides a template other programs could use in assessing residents' acquisition of the knowledge, skills, and attitudes reflected in the six general competencies…

  7. Assessment of Computational Fluid Dynamics (CFD) Models for Shock Boundary-Layer Interaction

    NASA Technical Reports Server (NTRS)

    DeBonis, James R.; Oberkampf, William L.; Wolf, Richard T.; Orkwis, Paul D.; Turner, Mark G.; Babinsky, Holger

    2011-01-01

    A workshop on the computational fluid dynamics (CFD) prediction of shock boundary-layer interactions (SBLIs) was held at the 48th AIAA Aerospace Sciences Meeting. As part of the workshop numerous CFD analysts submitted solutions to four experimentally measured SBLIs. This paper describes the assessment of the CFD predictions. The assessment includes an uncertainty analysis of the experimental data, the definition of an error metric and the application of that metric to the CFD solutions. The CFD solutions provided very similar levels of error and in general it was difficult to discern clear trends in the data. For the Reynolds-averaged Navier-Stokes (RANS) methods the choice of turbulence model appeared to be the largest factor in solution accuracy. Large-eddy simulation methods produced error levels similar to RANS methods but provided superior predictions of normal stresses.

  8. A robust method to forecast volcanic ash clouds

    USGS Publications Warehouse

    Denlinger, Roger P.; Pavolonis, Mike; Sieglaff, Justin

    2012-01-01

    Ash clouds emanating from volcanic eruption columns often form trails of ash extending thousands of kilometers through the Earth's atmosphere, disrupting air traffic and posing a significant hazard to air travel. To mitigate such hazards, the community charged with reducing flight risk must accurately assess risk of ash ingestion for any flight path and provide robust forecasts of volcanic ash dispersal. In response to this need, a number of different transport models have been developed for this purpose and applied to recent eruptions, providing a means to assess uncertainty in forecasts. Here we provide a framework for optimal forecasts and their uncertainties given any model and any observational data. This involves random sampling of the probability distributions of input (source) parameters to a transport model and iteratively running the model with different inputs, each time assessing the predictions that the model makes about ash dispersal by direct comparison with satellite data. The results of these comparisons are embodied in a likelihood function whose maximum corresponds to the minimum misfit between model output and observations. Bayes theorem is then used to determine a normalized posterior probability distribution and from that a forecast of future uncertainty in ash dispersal. The nature of ash clouds in heterogeneous wind fields creates a strong maximum likelihood estimate in which most of the probability is localized to narrow ranges of model source parameters. This property is used here to accelerate probability assessment, producing a method to rapidly generate a prediction of future ash concentrations and their distribution based upon assimilation of satellite data as well as model and data uncertainties. Applying this method to the recent eruption of Eyjafjallajökull in Iceland, we show that the 3 and 6 h forecasts of ash cloud location probability encompassed the location of observed satellite-determined ash cloud loads, providing an efficient means to assess all of the hazards associated with these ash clouds.
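
    The following toy sketch illustrates the described workflow: sample the prior distributions of the source parameters, compute a likelihood for each ensemble member from its misfit to satellite data, apply Bayes' theorem to obtain normalized posterior weights, and form a weighted forecast. The transport model here is a trivial stand-in, not a real dispersion code.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def transport_model(source_height_km, mass_rate_kgs, grid):
        """Stand-in for an ash transport model: returns ash load on a 1-D grid.
        A real application would call a full dispersion model here."""
        center = 10.0 * source_height_km          # plume drifts with height
        return mass_rate_kgs * np.exp(-0.5 * ((grid - center) / 30.0) ** 2)

    grid = np.linspace(0.0, 300.0, 61)            # downwind distance, km
    observed = transport_model(9.0, 2.0, grid)    # pretend satellite retrieval
    sigma = 0.2                                   # data + model uncertainty

    # 1. Randomly sample the prior distributions of the source parameters.
    heights = rng.uniform(5.0, 15.0, size=2000)
    rates = rng.uniform(0.5, 5.0, size=2000)

    # 2. Likelihood of each ensemble member from the misfit to satellite data.
    log_like = np.array([
        -0.5 * np.sum((transport_model(h, m, grid) - observed) ** 2) / sigma**2
        for h, m in zip(heights, rates)
    ])

    # 3. Bayes' theorem: normalized posterior weights over the ensemble.
    w = np.exp(log_like - log_like.max())
    w /= w.sum()

    # 4. Posterior-mean forecast of ash load (and parameter estimates).
    forecast = w @ np.array([transport_model(h, m, grid)
                             for h, m in zip(heights, rates)])
    print(f"posterior mean height = {w @ heights:.2f} km, rate = {w @ rates:.2f} kg/s")
    ```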

  9. Sensitivity of effective rainfall amount to land use description using GIS tool. Case of a small mediterranean catchment

    NASA Astrophysics Data System (ADS)

    Payraudeau, S.; Tournoud, M. G.; Cernesson, F.

    Distributed hydrological modelling requires catchment subdivision to take physical characteristics into account. In this paper, we test the effect of the land use aggregation scheme on the catchment hydrological response. The evolution of intra-subcatchment land use is studied using statistical and entropy methods. The SCS-CN method is used to calculate effective rainfall, which is here taken as the hydrological response. Our purpose is to determine whether a critical threshold area exists that is appropriate for the application of hydrological modelling. Land use aggregation effects on effective rainfall are assessed on a small Mediterranean catchment. The results show that the land use aggregation scheme and the land use classification type have significant effects on hydrological modelling, and in particular on effective rainfall modelling.
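
    The SCS-CN computation itself is standard; a minimal sketch (in mm, with the usual initial abstraction ratio of 0.2) shows how an aggregated curve number can give a different effective rainfall than averaging the runoff of the underlying land use classes, which is the crux of the aggregation effect studied here:

    ```python
    def scs_cn_runoff(p_mm: float, cn: float) -> float:
        """Effective rainfall (direct runoff) Q from the SCS Curve Number method.

        S  = 25400 / CN - 254          potential maximum retention (mm)
        Ia = 0.2 * S                   initial abstraction (standard assumption)
        Q  = (P - Ia)^2 / (P - Ia + S) for P > Ia, else 0
        """
        s = 25400.0 / cn - 254.0
        ia = 0.2 * s
        if p_mm <= ia:
            return 0.0
        return (p_mm - ia) ** 2 / (p_mm - ia + s)

    # A curve number aggregated over mixed land use yields a different response
    # than averaging the runoff of each land use class separately:
    print(scs_cn_runoff(60.0, 85.0))                       # one aggregated CN
    print(0.5 * scs_cn_runoff(60.0, 70.0) + 0.5 * scs_cn_runoff(60.0, 95.0))
    ```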

  10. Warfighter decision making performance analysis as an investment priority driver

    NASA Astrophysics Data System (ADS)

    Thornley, David J.; Dean, David F.; Kirk, James C.

    2010-04-01

    Estimating the relative value of alternative tactics, techniques and procedures (TTPs) and information systems requires measures of the costs and benefits of each, and methods for combining and comparing those measures. The NATO Code of Best Practice for Command and Control Assessment explains that decision-making quality would ideally be assessed on outcomes. Lessons learned in practice can be assessed statistically to support this, but experimentation with alternative measures in live conflict is undesirable. To this end, practical experimentation to parameterize effective constructive simulation and analytic modelling for system utility prediction is desirable. The Land Battlespace Systems Department of Dstl has modeled human development of situational awareness to support constructive simulation by empirically discovering how evidence is weighed according to circumstance, personality, training and briefing. The human decision maker (DM) provides the backbone of the information-processing activity associated with military engagements because of the inherent uncertainty associated with combat operations. To represent this process, and thereby assess equipment as well as non-technological interventions such as training and TTPs, we are developing componentized, timed, analytic, stochastic model components and instruments as part of a framework to support quantitative assessment of intelligence production and consumption methods in a decision-maker-centric mission space. In this paper, we formulate an abstraction of the human intelligence fusion process from the Defence Science and Technology Laboratory's (Dstl's) INCIDER model to include in our framework, and synthesize relevant cost and benefit characteristics.

  11. 75 FR 2523 - Office of Innovation and Improvement; Overview Information; Arts in Education Model Development...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-15

    ... that is based on rigorous scientifically based research methods to assess the effectiveness of a...) Relies on measurements or observational methods that provide reliable and valid data across evaluators... of innovative, cohesive models that are based on research and have demonstrated that they effectively...

  12. The SIETTE Automatic Assessment Environment

    ERIC Educational Resources Information Center

    Conejo, Ricardo; Guzmán, Eduardo; Trella, Monica

    2016-01-01

    This article describes the evolution and current state of the domain-independent Siette assessment environment. Siette supports different assessment methods--including classical test theory, item response theory, and computer adaptive testing--and integrates them with multidimensional student models used by intelligent educational systems.…

  13. Comparison of prediction methods for octanol-air partition coefficients of diverse organic compounds.

    PubMed

    Fu, Zhiqiang; Chen, Jingwen; Li, Xuehua; Wang, Ya'nan; Yu, Haiying

    2016-04-01

    The octanol-air partition coefficient (KOA) is needed for assessing the multimedia transport and bioaccumulation potential of organic chemicals in the environment. As the experimental determination of KOA for various chemicals is costly and laborious, the development of KOA estimation methods is necessary. We investigated three methods for KOA prediction: conventional quantitative structure-activity relationship (QSAR) models based on molecular structural descriptors, group contribution models based on atom-centered fragments, and a novel model that predicts KOA via the solvation free energy from the air to the octanol phase (ΔGO(0)). A collection of 939 experimental KOA values for 379 compounds at different temperatures (263.15-323.15 K) served as validation or training sets. The developed models were evaluated following the OECD guidelines on QSAR model validation and applicability domain (AD) description. The results showed that although the ΔGO(0) model is theoretically sound and has a broad AD, its prediction accuracy is the poorest of the three. The QSAR models perform better than the group contribution models, and have predictability and accuracy similar to the conventional method that estimates KOA from the octanol-water partition coefficient and the Henry's law constant. One QSAR model, which can predict KOA at different temperatures, was recommended for assessing the long-range transport potential of chemicals. Copyright © 2016 Elsevier Ltd. All rights reserved.
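
    The conventional method mentioned above follows from K_OA = K_OW / K_AW, with the dimensionless air-water partition coefficient K_AW = H / (R T); a minimal sketch with hypothetical inputs:

    ```python
    import numpy as np

    R = 8.314  # J/(mol*K)

    def log_koa_from_kow_hlc(log_kow: float, h_pa_m3_mol: float, t_k: float) -> float:
        """Conventional estimate: K_OA = K_OW / K_AW, where the dimensionless
        air-water partition coefficient is K_AW = H / (R * T)."""
        log_kaw = np.log10(h_pa_m3_mol / (R * t_k))
        return log_kow - log_kaw

    # Hypothetical chemical: log KOW = 5.2, Henry's law constant 2.0 Pa m3/mol.
    print(f"log KOA at 298 K: {log_koa_from_kow_hlc(5.2, 2.0, 298.15):.2f}")
    ```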

  14. Assessing disease stress and modeling yield losses in alfalfa

    NASA Astrophysics Data System (ADS)

    Guan, Jie

    Alfalfa is the most important forage crop in the U.S. and worldwide. Fungal foliar diseases are believed to cause significant yield losses in alfalfa, yet little quantitative information exists regarding the amount of crop loss. Different fungicides and application frequencies were used as tools to generate a range of foliar disease intensities in Ames and Nashua, IA. Visual disease assessments (disease incidence, disease severity, and percentage defoliation) were obtained weekly for each alfalfa growth cycle (two to three growth cycles per season). Remote sensing assessments were performed using a hand-held, multispectral radiometer to measure the amount and quality of sunlight reflected from alfalfa canopies. Factors such as incident radiation, sun angle, sensor height, and leaf wetness were all found to significantly affect the percentage of sunlight reflected from alfalfa canopies. The precision of the visual and remote sensing assessment methods was quantified, where precision was defined as the intra-rater repeatability and inter-rater reliability of an assessment method. F-tests, slopes, intercepts, and coefficients of determination (R²) were used to compare the precision of the assessment methods. Results showed that among the three visual disease assessment methods, percentage defoliation had the highest intra-rater repeatability and inter-rater reliability, and the remote sensing method in turn had better precision than percentage defoliation. Significant linear relationships between canopy reflectance (810 nm), percentage defoliation and yield were detected using linear regression, and percentage reflectance (810 nm) assessments had a stronger relationship with yield than percentage defoliation assessments. There were also significant linear relationships between percentage defoliation, dry weight, percentage reflectance (810 nm), and green leaf area index (GLAI); percentage reflectance (810 nm) assessments had a stronger relationship with dry weight and GLAI than percentage defoliation assessments. This research demonstrates that percentage reflectance measurements can be used to nondestructively assess GLAI, a direct measure of plant health and an indirect measure of productivity, and that remote sensing is superior to visual assessment for assessing alfalfa stress and for modeling yield and GLAI in the alfalfa foliar disease pathosystem.
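
    As a small illustration of the regression analyses described (with made-up plot-level numbers, not the study's data), a linear fit of yield on 810 nm canopy reflectance:

    ```python
    from scipy.stats import linregress

    # Hypothetical plot-level data: canopy reflectance at 810 nm (%) and yield (t/ha).
    reflectance_810 = [32.1, 35.4, 38.0, 41.2, 44.5, 47.8, 50.3]
    yield_t_ha = [2.1, 2.4, 2.9, 3.1, 3.6, 3.9, 4.2]

    fit = linregress(reflectance_810, yield_t_ha)
    print(f"yield = {fit.slope:.3f} * R810 + {fit.intercept:.3f}, "
          f"R^2 = {fit.rvalue**2:.3f}")
    ```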

  15. Risk assessment of vector-borne diseases for public health governance.

    PubMed

    Sedda, L; Morley, D W; Braks, M A H; De Simone, L; Benz, D; Rogers, D J

    2014-12-01

    In the context of public health, risk governance (or risk analysis) is a framework for the assessment and subsequent management and/or control of the danger posed by an identified disease threat. Generic frameworks in which to carry out risk assessment, encompassing monitoring, data collection, statistical analysis and dissemination, have been developed by various agencies. Due to the inherent complexity of disease systems, however, the generic approach must be modified into individual, disease-specific risk assessment frameworks. The analysis presented here was based on a review of the current risk assessments of vector-borne diseases adopted by the main public health organisations (OIE, WHO, ECDC, FAO, CDC, etc.), covering the literature, the legislation and the statistical assessment of the risk analysis frameworks. This review outlines the need for a general public health risk assessment method for vector-borne diseases, in order to guarantee that sufficient information is gathered to apply robust risk assessment models. Stochastic (especially spatial) methods, often in Bayesian frameworks, are now gaining prominence in standard risk assessment procedures because of their ability to accurately assess model uncertainties. Risk assessment needs to be addressed quantitatively wherever possible, and submitted together with its quality assessment, to enable successful public health measures to be adopted. In current practice, a series of different models and analyses are often applied to the same problem, with results and outcomes that are difficult to compare because of unknown model and data uncertainties. The risk assessment areas in need of further research are therefore identified in this article. Copyright © 2014 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.

  16. Traffic-Related Air Pollution and Childhood Asthma: Recent Advances and Remaining Gaps in the Exposure Assessment Methods.

    PubMed

    Khreis, Haneen; Nieuwenhuijsen, Mark J

    2017-03-17

    Background: Current levels of traffic-related air pollution (TRAP) are associated with the development of childhood asthma, although some inconsistencies and heterogeneity remain. An important part of the uncertainty in studies of TRAP-associated asthma originates from uncertainties in the TRAP exposure assessment and assignment methods. In this work, we aim to systematically review the exposure assessment methods used in the epidemiology of TRAP and childhood asthma, highlight recent advances, remaining research gaps and make suggestions for further research. Methods: We systematically reviewed epidemiological studies published up until 8 September 2016 and available in Embase, Ovid MEDLINE (R), and "Transport database". We included studies which examined the association between children's exposure to TRAP metrics and their risk of "asthma" incidence or lifetime prevalence, from birth to the age of 18 years old. Results: We found 42 studies which examined the associations between TRAP and subsequent childhood asthma incidence or lifetime prevalence, published since 1999. Land-use regression modelling was the most commonly used method and nitrogen dioxide (NO₂) was the most commonly used pollutant in the exposure assessments. Most studies estimated TRAP exposure at the residential address and only a few considered the participants' mobility. TRAP exposure was mostly assessed at the birth year and only a few studies considered different and/or multiple exposure time windows. We recommend that further work is needed including e.g., the use of new exposure metrics such as the composition of particulate matter, oxidative potential and ultra-fine particles, improved modelling e.g., by combining different exposure assessment models, including mobility of the participants, and systematically investigating different exposure time windows. Conclusions: Although our previous meta-analysis found statistically significant associations for various TRAP exposures and subsequent childhood asthma, further refinement of the exposure assessment may improve the risk estimates, and shed light on critical exposure time windows, putative agents, underlying mechanisms and drivers of heterogeneity.

  17. Traffic-Related Air Pollution and Childhood Asthma: Recent Advances and Remaining Gaps in the Exposure Assessment Methods

    PubMed Central

    Khreis, Haneen; Nieuwenhuijsen, Mark J.

    2017-01-01

    Background: Current levels of traffic-related air pollution (TRAP) are associated with the development of childhood asthma, although some inconsistencies and heterogeneity remain. An important part of the uncertainty in studies of TRAP-associated asthma originates from uncertainties in the TRAP exposure assessment and assignment methods. In this work, we aim to systematically review the exposure assessment methods used in the epidemiology of TRAP and childhood asthma, highlight recent advances, remaining research gaps and make suggestions for further research. Methods: We systematically reviewed epidemiological studies published up until 8 September 2016 and available in Embase, Ovid MEDLINE (R), and “Transport database”. We included studies which examined the association between children’s exposure to TRAP metrics and their risk of “asthma” incidence or lifetime prevalence, from birth to the age of 18 years old. Results: We found 42 studies which examined the associations between TRAP and subsequent childhood asthma incidence or lifetime prevalence, published since 1999. Land-use regression modelling was the most commonly used method and nitrogen dioxide (NO2) was the most commonly used pollutant in the exposure assessments. Most studies estimated TRAP exposure at the residential address and only a few considered the participants’ mobility. TRAP exposure was mostly assessed at the birth year and only a few studies considered different and/or multiple exposure time windows. We recommend that further work is needed including e.g., the use of new exposure metrics such as the composition of particulate matter, oxidative potential and ultra-fine particles, improved modelling e.g., by combining different exposure assessment models, including mobility of the participants, and systematically investigating different exposure time windows. Conclusions: Although our previous meta-analysis found statistically significant associations for various TRAP exposures and subsequent childhood asthma, further refinement of the exposure assessment may improve the risk estimates, and shed light on critical exposure time windows, putative agents, underlying mechanisms and drivers of heterogeneity. PMID:28304360

  18. Towards the optimal fusion of high-resolution Digital Elevation Models for detailed urban flood assessment

    NASA Astrophysics Data System (ADS)

    Leitão, J. P.; de Sousa, L. M.

    2018-06-01

    Newly available, more detailed and accurate elevation data sets, such as Digital Elevation Models (DEMs) generated on the basis of imagery from terrestrial LiDAR (Light Detection and Ranging) systems or Unmanned Aerial Vehicles (UAVs), can be used to improve flood-model input data and consequently increase the accuracy of flood modelling results. This paper presents the first application of the MBlend merging method and assesses the impact of combining different DEMs on flood modelling results. It was demonstrated that different raster merging methods can have different and substantial impacts on these results. In addition to the influence of the method used to merge the original DEMs, the magnitude of the impact also depends on (i) the systematic horizontal and vertical differences of the DEMs, and (ii) the orientation between the DEM boundary and the terrain slope. The largest water depth and flow velocity differences between the flood modelling results obtained using the reference DEM and the merged DEMs ranged from -9.845 to 0.002 m and from 0.003 to 0.024 m s⁻¹, respectively; differences of this size can have a significant impact on flood hazard estimates. In most of the cases investigated in this study, the differences from the reference DEM results were smaller for the MBlend method than for the two conventional methods. This study highlights the importance of DEM merging when conducting flood modelling and provides hints on the best DEM merging methods to use.

  19. The heritability of cluster A personality disorders assessed by both personal interview and questionnaire.

    PubMed

    Kendler, Kenneth S; Myers, John; Torgersen, Svenn; Neale, Michael C; Reichborn-Kjennerud, Ted

    2007-05-01

    Personality disorders (PDs) as assessed by questionnaires and personal interviews are heritable. However, we know neither how much unreliability of measurement impacts heritability estimates nor whether the genetic and environmental risk factors assessed by these two methods are the same; that is, whether the two methods index the same set of PD vulnerability factors. A total of 3334 young adult twin pairs from the Norwegian Institute of Public Health Twin Panel (NIPHTP) completed a questionnaire containing 91 PD items. One to six years later, 1386 of these pairs were interviewed with the Structured Interview for DSM-IV Personality (SIDP-IV). Self-report items predicting interview results were selected by regression, and measurement models were fitted using Mx. In the best-fit models, the latent liabilities to paranoid personality disorder (PPD), schizoid personality disorder (SPD) and schizotypal personality disorder (STPD) were all highly heritable, with no evidence of shared environmental effects. For PPD and STPD, only unique environmental effects were specific to the interview measure, whereas both environmental and genetic effects were specific to the questionnaire assessment. For SPD, the best-fit model contained genetic and environmental effects specific to both forms of assessment. The latent liabilities to the cluster A PDs are highly heritable but are assessed by current methods with only moderate reliability. The personal interviews assessed the genetic risk for the latent trait with excellent specificity for PPD and STPD and good specificity for SPD. However, for all three PDs, the questionnaires were less specific, also indexing an independent set of genetic risk factors.

  20. Review of early assessment models of innovative medical technologies.

    PubMed

    Fasterholdt, Iben; Krahn, Murray; Kidholm, Kristian; Yderstræde, Knud Bonnet; Pedersen, Kjeld Møller

    2017-08-01

    Hospitals increasingly make decisions regarding the early development of and investment in technologies, but a formal evaluation model for assisting hospitals early on in assessing the potential of innovative medical technologies is lacking. This article provides an overview of models for early assessment in different health organisations and discusses which models hold most promise for hospital decision makers. A scoping review of studies published between 1996 and 2015 was performed using nine databases. The following information was collected: decision context, decision problem, and a description of the early assessment model. 2362 articles were identified, of which 12 studies fulfilled the inclusion criteria; an additional 12 studies were identified and included in the review by searching reference lists. The majority of the 24 early assessment studies were variants of traditional cost-effectiveness analysis. Around one fourth of the studies presented an evaluation model with a broader focus than cost-effectiveness. Uncertainty was mostly handled by simple sensitivity or scenario analysis. This review shows that evaluation models using known cost-effectiveness methods are most prevalent in early assessment but seem ill-suited for early assessment in hospitals. Four models provided some usable elements for the development of a hospital-based model. Crown Copyright © 2017. Published by Elsevier B.V. All rights reserved.

  1. Computational assessment of model-based wave separation using a database of virtual subjects.

    PubMed

    Hametner, Bernhard; Schneider, Magdalena; Parragh, Stephanie; Wassertheurer, Siegfried

    2017-11-07

    The quantification of arterial wave reflection is an important area of interest in arterial pulse wave analysis. It can be achieved by wave separation analysis (WSA) if both the aortic pressure waveform and the aortic flow waveform are known. For better applicability, several mathematical models have been established to estimate aortic flow solely from pressure waveforms. The aim of this study is to investigate and verify the model-based wave separation of the ARCSolver method on virtual pulse wave measurements. The study is based on an open-access virtual database generated via simulations. Seven cardiac and arterial parameters were varied within physiologically healthy ranges, leading to a total of 3325 virtual healthy subjects. To assess the model-based ARCSolver method computationally, it was used to perform WSA based on the aortic root pressure waveforms of the virtual patients. As a reference, the values of WSA using both the pressure and flow waveforms provided by the virtual database were taken. The investigated parameters showed a good overall agreement between the model-based method and the reference. Mean differences and standard deviations were -0.05 ± 0.02 AU for characteristic impedance, -3.93 ± 1.79 mmHg for forward pressure amplitude, 1.37 ± 1.56 mmHg for backward pressure amplitude and 12.42 ± 4.88% for reflection magnitude. The results indicate that the mathematical blood flow model of the ARCSolver method is a feasible surrogate for a measured flow waveform and provides a reasonable way to assess arterial wave reflection non-invasively in healthy subjects. Copyright © 2017 Elsevier Ltd. All rights reserved.
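
    Classical linear WSA splits pressure into forward and backward components using the characteristic impedance, Pf = (P + Zc*Q)/2 and Pb = (P - Zc*Q)/2. The sketch below applies this to toy waveforms; model-based methods such as ARCSolver would substitute a synthesized flow wave for the measured Q, and the peak-to-peak reflection magnitude used here is only one common convention.

    ```python
    import numpy as np

    def wave_separation(p: np.ndarray, q: np.ndarray, zc: float):
        """Classical linear wave separation analysis (WSA).

        p  : aortic pressure waveform (mmHg)
        q  : aortic flow waveform (consistent units)
        zc : characteristic impedance of the proximal aorta
        """
        pf = (p + zc * q) / 2.0          # forward-travelling pressure wave
        pb = (p - zc * q) / 2.0          # backward (reflected) pressure wave
        rm = (pb.max() - pb.min()) / (pf.max() - pf.min())  # reflection magnitude
        return pf, pb, rm

    # Toy waveforms over one cardiac cycle, for illustration only.
    t = np.linspace(0.0, 1.0, 200)
    p = 80.0 + 40.0 * np.sin(np.pi * t) ** 2
    q = np.clip(np.sin(2.0 * np.pi * (t - 0.05)), 0.0, None)
    pf, pb, rm = wave_separation(p, q, zc=60.0)
    print(f"reflection magnitude = {rm:.2f}")
    ```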

  2. Assessing Consumer Responses to PREPs: A Review of Tobacco Industry and Independent Research Methods

    PubMed Central

    Rees, Vaughan W.; Kreslake, Jennifer M.; Cummings, K. Michael; O'Connor, Richard J.; Hatsukami, Dorothy K.; Parascandola, Mark; Shields, Peter G.; Connolly, Gregory N.

    2009-01-01

    Objective: Internal tobacco industry documents and the mainstream literature are reviewed to identify methods and measures for evaluating tobacco consumer response. The review aims to outline areas in which established methods exist, identify gaps in current methods for assessing consumer response, and consider how these methods might be applied to evaluate PREPs and new products. Methods: Internal industry research, including published papers, manuscript drafts, presentations, protocols, and instruments relating to consumer response measures, was identified and analyzed. Peer-reviewed research was identified using PubMed and Scopus. Results: Industry research on consumer response focuses on product development and marketing. To develop and refine new products, the tobacco industry has developed notable strategies for assessing consumers' sensory and subjective responses to product design characteristics. Independent research is often conducted to gauge the likelihood of future product adoption by measuring consumers' risk perceptions, responses to product, and product acceptability. Conclusions: A model which conceptualizes consumer response as comprising the separate but interacting domains of product perceptions and response to product is outlined. Industry and independent research supports this dual-domain model and provides a wide range of methods for assessing the construct components of consumer response. Further research is needed to validate consumer response constructs, determine the relationship between consumer response and tobacco user behavior, and improve the reliability of consumer response measures. Scientifically rigorous consumer response assessment methods will provide a needed empirical basis for future regulation of PREPs and new products, to counteract tobacco industry influence on consumers, and to enhance public health. PMID:19959675

  3. The CanMEDS role of Collaborator: How is it taught and assessed according to faculty and residents?

    PubMed Central

    Berger, Elizabeth; Chan, Ming-Ka; Kuper, Ayelet; Albert, Mathieu; Jenkins, Deirdre; Harrison, Megan; Harris, Ilene

    2012-01-01

    OBJECTIVE: To explore the perspectives of paediatric residents and faculty regarding how the Collaborator role is taught and assessed. METHODS: Using a constructivist grounded theory approach, focus groups at four Canadian universities were conducted. Data were analyzed iteratively for emergent themes. RESULTS: Residents reported learning about collaboration through faculty role modelling but did not perceive that it was part of the formal curriculum. Faculty reported that they were not trained in how to effectively model this role. Both groups reported a need for training in conflict management, particularly as it applies to intraprofessional (physician-to-physician) relationships. Finally, the participants asserted that current methods to assess residents on their performance as collaborators are suboptimal. CONCLUSIONS: The Collaborator role should be a formal part of the residency curriculum. Residents need to be better educated with regard to managing conflict and handling intraprofessional relationships. Finally, innovative methods of assessing residents on this non-medical expert role need to be created. PMID:24294063

  4. A method to assess the situation of air combat based on the missile attack zone

    NASA Astrophysics Data System (ADS)

    Shi, Zhenqing; Liang, Xiao Long; Zhang, Jiaqiang; Liu, Liu

    2018-04-01

    Traditional situation assessment rarely considers the impact of the target's missile attack zone, so the resulting assessment is not comprehensive; this paper presents a method that takes the target's attack zone into account. The attack zone and the non-escape zone are obtained as the basis for quantitative analysis, using a rapid simulation method and an air-to-air missile mathematical model. The situation of air combat is assessed by the ratio of the advantage function values of both sides, where the advantage function is constructed from influential factors such as height, speed, distance and angle. Simulation results show the effectiveness of this method.
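
    A hedged sketch of such a ratio-based assessment: the factor functions and weights below are hypothetical placeholders, not the paper's exact forms, but they show how an attack-zone term can be folded into a weighted advantage function.

    ```python
    import math

    def advantage(height_m, speed_ms, distance_m, angle_rad, in_attack_zone,
                  in_no_escape_zone):
        """Illustrative advantage function; all factor forms and weights
        are hypothetical placeholders."""
        f_height = 1.0 / (1.0 + math.exp(-height_m / 1000.0))   # relative height
        f_speed = min(speed_ms / 400.0, 1.0)
        f_dist = math.exp(-distance_m / 20000.0)
        f_angle = (1.0 + math.cos(angle_rad)) / 2.0             # 1 = nose-on
        f_zone = 1.0 if in_no_escape_zone else (0.6 if in_attack_zone else 0.2)
        w = [0.15, 0.15, 0.20, 0.20, 0.30]                      # zone weighted highest
        return sum(wi * fi for wi, fi in
                   zip(w, [f_height, f_speed, f_dist, f_angle, f_zone]))

    # Situation assessed as the ratio of both sides' advantage values.
    own = advantage(500.0, 300.0, 12000.0, 0.3, True, False)
    tgt = advantage(-200.0, 250.0, 12000.0, 2.6, False, False)
    print(f"situation ratio (own/target) = {own / tgt:.2f}")
    ```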

  5. Life cycle impact assessment modeling for particulate matter: A new approach based on physico-chemical particle properties.

    PubMed

    Notter, Dominic A

    2015-09-01

    Particulate matter (PM) causes severe damage to human health globally. Airborne PM is a mixture of solid particles and liquid droplets suspended in air. It consists of organic and inorganic components, and the particles of concern range in size from a few nanometers to approximately 10 μm. The complexity of PM is considered to be the reason for the poor understanding of PM, and may also be the reason why PM is poorly defined in environmental impact assessment. Currently, life cycle impact assessment is unable to differentiate highly toxic soot particles from relatively harmless sea salt. The aim of this article is to present a new impact assessment for PM in which the impact of PM is modeled based on the particles' physico-chemical properties. With the new method, 2781 characterization factors that account for particle mass, particle number concentration, particle size, chemical composition and solubility were calculated. Because particle sizes vary over four orders of magnitude, a sound assessment of PM requires that the exposure model include deposition of particles in the lungs and that the fate model include coagulation as a removal mechanism for ultrafine particles. The effects model combines effects from particle size, solubility and chemical composition. The first results from case studies suggest that for PM stemming from emissions generally assumed to be highly toxic (e.g. biomass combustion and fossil fuel combustion), the new method yields results similar to those of established methods; for relatively harmless PM emissions, however, established methods enormously overestimate the damage. The new impact assessment allows a high resolution of the damage allocatable to different size fractions or chemical components. This feature supports a more efficient optimization of processes and products when combating air pollution. Copyright © 2015 Elsevier Ltd. All rights reserved.
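
    As a generic illustration of how composition-resolved characterization factors change a damage score, the sketch below uses the standard LCIA pattern damage = emission × intake fraction × effect factor; all numbers are hypothetical placeholders, not the article's 2781 factors.

    ```python
    # Minimal sketch of damage calculation with size/composition-resolved
    # characterization factors (CF = intake fraction x effect factor).
    characterization = {
        # fraction of emitted mass inhaled, harm per unit inhaled (DALY/kg)
        "diesel_soot_ultrafine": {"intake_fraction": 1.2e-5, "effect_factor": 80.0},
        "sea_salt_coarse":       {"intake_fraction": 4.0e-6, "effect_factor": 0.5},
    }

    emissions_kg = {"diesel_soot_ultrafine": 120.0, "sea_salt_coarse": 900.0}

    damage_daly = sum(
        emissions_kg[s] * cf["intake_fraction"] * cf["effect_factor"]
        for s, cf in characterization.items()
    )
    print(f"total damage: {damage_daly:.4f} DALY")  # soot dominates despite less mass
    ```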

  6. VoroMQA: Assessment of protein structure quality using interatomic contact areas.

    PubMed

    Olechnovič, Kliment; Venclovas, Česlovas

    2017-06-01

    In the absence of experimentally determined protein structure many biological questions can be addressed using computational structural models. However, the utility of protein structural models depends on their quality. Therefore, the estimation of the quality of predicted structures is an important problem. One of the approaches to this problem is the use of knowledge-based statistical potentials. Such methods typically rely on the statistics of distances and angles of residue-residue or atom-atom interactions collected from experimentally determined structures. Here, we present VoroMQA (Voronoi tessellation-based Model Quality Assessment), a new method for the estimation of protein structure quality. Our method combines the idea of statistical potentials with the use of interatomic contact areas instead of distances. Contact areas, derived using Voronoi tessellation of protein structure, are used to describe and seamlessly integrate both explicit interactions between protein atoms and implicit interactions of protein atoms with solvent. VoroMQA produces scores at atomic, residue, and global levels, all in the fixed range from 0 to 1. The method was tested on the CASP data and compared to several other single-model quality assessment methods. VoroMQA showed strong performance in the recognition of the native structure and in the structural model selection tests, thus demonstrating the efficacy of interatomic contact areas in estimating protein structure quality. The software implementation of VoroMQA is freely available as a standalone application and as a web server at http://bioinformatics.lt/software/voromqa. Proteins 2017; 85:1131-1145. © 2017 Wiley Periodicals, Inc. © 2017 Wiley Periodicals, Inc.
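
    A toy score in the spirit of contact-area-weighted statistical potentials follows; this is not the actual VoroMQA scoring function, and the contact categories and log-odds values are hypothetical placeholders. Each contact contributes a log-odds term weighted by its contact area, including contacts with solvent, and the result is mapped to the fixed range 0 to 1.

    ```python
    import math

    # Hypothetical log-odds pseudo-energies for atom-type contact categories,
    # standing in for statistics collected from experimental structures.
    contact_logodds = {
        ("C", "C"): 0.30, ("C", "N"): 0.05, ("C", "O"): -0.10,
        ("N", "O"): 0.20, ("C", "solvent"): -0.25, ("N", "solvent"): 0.15,
        ("O", "solvent"): 0.25,
    }

    def contact_area_score(contacts):
        """Area-weighted knowledge-based score: each (type_a, type_b, area)
        contact contributes its log-odds weighted by contact area; the
        normalized sum is squashed into (0, 1)."""
        total_area = sum(a for _, _, a in contacts)
        raw = sum(contact_logodds[tuple(sorted((ta, tb)))] * a
                  for ta, tb, a in contacts) / total_area
        return 1.0 / (1.0 + math.exp(-raw))   # map to the fixed range 0..1

    contacts = [("C", "C", 12.5), ("C", "O", 4.2), ("N", "O", 3.1),
                ("C", "solvent", 9.8), ("O", "solvent", 6.4)]
    print(f"model score = {contact_area_score(contacts):.3f}")
    ```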

  7. Comparison of Dissolution Similarity Assessment Methods for Products with Large Variations: f2 Statistics and Model-Independent Multivariate Confidence Region Procedure for Dissolution Profiles of Multiple Oral Products.

    PubMed

    Yoshida, Hiroyuki; Shibata, Hiroko; Izutsu, Ken-Ichi; Goda, Yukihiro

    2017-01-01

    The current Japanese Ministry of Health, Labour and Welfare (MHLW) Guideline for Bioequivalence Studies of Generic Products uses averaged dissolution rates for the assessment of dissolution similarity between test and reference formulations. This study clarifies how applying the model-independent multivariate confidence region procedure (Method B), described in the European Medicines Agency and U.S. Food and Drug Administration guidelines, affects similarity outcomes obtained empirically from dissolution profiles with large variations in individual dissolution rates. Sixty-one datasets of dissolution profiles for immediate-release, oral generic and corresponding innovator products that showed large variation in individual dissolution rates in the generic products were assessed for similarity using the f2 statistics defined in the MHLW guidelines (MHLW f2 method) and two different Method B procedures: a bootstrap method applied with f2 statistics (BS method) and a multivariate analysis method using the Mahalanobis distance (MV method). The MHLW f2 and BS methods provided similar dissolution similarities between reference and generic products. Although a small difference in the similarity assessment may be due to the decrease in the lower confidence interval for expected f2 values derived from the large variation in individual dissolution rates, the MV method provided results different from those obtained through the MHLW f2 and BS methods. Analysis of actual dissolution data for products with large individual variations would provide valuable information towards an enhanced understanding of these methods and their possible incorporation in the MHLW guidelines.
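
    The f2 statistic itself is standard: f2 = 50·log10(100 / sqrt(1 + mean squared difference between the mean profiles)). The sketch below computes it for hypothetical individual profiles and adds a simplified percentile bootstrap; the guideline BS method uses a bias-corrected bootstrap of expected f2, which is more involved than shown here.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def f2(ref_mean: np.ndarray, test_mean: np.ndarray) -> float:
        """Similarity factor f2 from mean dissolution profiles (% dissolved)."""
        msd = np.mean((ref_mean - test_mean) ** 2)
        return 50.0 * np.log10(100.0 / np.sqrt(1.0 + msd))

    # Hypothetical individual profiles: 12 units x 4 time points, % dissolved;
    # the test product is given larger unit-to-unit variation.
    ref = 5.0 * rng.standard_normal((12, 4)) + np.array([35.0, 55.0, 75.0, 88.0])
    test = 8.0 * rng.standard_normal((12, 4)) + np.array([30.0, 52.0, 73.0, 87.0])

    print(f"f2 of mean profiles: {f2(ref.mean(0), test.mean(0)):.1f}")

    # Simplified percentile bootstrap: resample individual units and judge
    # similarity on a lower confidence bound of f2, not the point value.
    boots = [
        f2(ref[rng.integers(0, 12, 12)].mean(0),
           test[rng.integers(0, 12, 12)].mean(0))
        for _ in range(2000)
    ]
    print(f"5th percentile of bootstrap f2: {np.percentile(boots, 5):.1f}")
    ```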

  8. Assessing and evaluating multidisciplinary translational teams: a mixed methods approach.

    PubMed

    Wooten, Kevin C; Rose, Robert M; Ostir, Glenn V; Calhoun, William J; Ameredes, Bill T; Brasier, Allan R

    2014-03-01

    A case report illustrates how multidisciplinary translational teams can be assessed using outcome, process, and developmental types of evaluation using a mixed-methods approach. Types of evaluation appropriate for teams are considered in relation to relevant research questions and assessment methods. Logic models are applied to scientific projects and team development to inform choices between methods within a mixed-methods design. Use of an expert panel is reviewed, culminating in consensus ratings of 11 multidisciplinary teams and a final evaluation within a team-type taxonomy. Based on team maturation and scientific progress, teams were designated as (a) early in development, (b) traditional, (c) process focused, or (d) exemplary. Lessons learned from data reduction, use of mixed methods, and use of expert panels are explored.

  9. Operation Reliability Assessment for Cutting Tools by Applying a Proportional Covariate Model to Condition Monitoring Information

    PubMed Central

    Cai, Gaigai; Chen, Xuefeng; Li, Bing; Chen, Baojia; He, Zhengjia

    2012-01-01

    The reliability of cutting tools is critical to machining precision and production efficiency. The conventional statistic-based reliability assessment method aims at providing a general and overall estimation of reliability for a large population of identical units under given and fixed conditions. However, it has limited effectiveness in depicting the operational characteristics of a cutting tool. To overcome this limitation, this paper proposes an approach to assess the operation reliability of cutting tools. A proportional covariate model is introduced to construct the relationship between operation reliability and condition monitoring information. The wavelet packet transform and an improved distance evaluation technique are used to extract sensitive features from vibration signals, and a covariate function is constructed based on the proportional covariate model. Ultimately, the failure rate function of the cutting tool being assessed is calculated using the baseline covariate function obtained from a small sample of historical data. Experimental results and a comparative study show that the proposed method is effective for assessing the operation reliability of cutting tools. PMID:23201980

  10. Assessment of a hybrid finite element-transfer matrix model for flat structures with homogeneous acoustic treatments.

    PubMed

    Alimonti, Luca; Atalla, Noureddine; Berry, Alain; Sgard, Franck

    2014-05-01

    Modeling complex vibroacoustic systems including poroelastic materials using finite element based methods can be infeasible for practical applications. For this reason, analytical approaches such as the transfer matrix method are often preferred to obtain a quick estimation of the vibroacoustic parameters. However, the strong assumptions inherent in the transfer matrix method lead to a lack of accuracy in the description of the geometry of the system. As a result, the transfer matrix method is inherently limited to the high frequency range. Nowadays, hybrid substructuring procedures have become quite popular: different modeling techniques are typically sought to describe complex vibroacoustic systems over the widest possible frequency range. Thus, the flexibility and accuracy of the finite element method and the efficiency of the transfer matrix method can be coupled in a hybrid technique to reduce the computational burden. In this work, such a hybrid methodology is proposed. The performance of the method in predicting the vibroacoustic indicators of flat structures with attached homogeneous acoustic treatments is assessed. The results prove that, under certain conditions, the hybrid model allows for a reduction of the computational effort while preserving sufficient accuracy with respect to the full finite element solution.

  11. Calibration of groundwater vulnerability mapping using the generalized reduced gradient method.

    PubMed

    Elçi, Alper

    2017-12-01

    Groundwater vulnerability assessment studies are essential in water resources management. Overlay-and-index methods such as DRASTIC are widely used for mapping groundwater vulnerability; however, these methods suffer from a subjective selection of model parameters. The objective of this study is to introduce a calibration procedure that results in a more accurate assessment of groundwater vulnerability. The improvement of the assessment is formulated as a parameter optimization problem, using an objective function based on the correlation between actual groundwater contamination and vulnerability index values. The non-linear optimization problem is solved with the generalized reduced gradient (GRG) method, a numerical optimization algorithm. To demonstrate the applicability of the procedure, a vulnerability map for the Tahtali stream basin is calibrated using nitrate concentration data. The calibration procedure is easy to implement and aims to maximize the correlation between observed pollutant concentrations and groundwater vulnerability index values. The influence of each vulnerability parameter on the calculated vulnerability index is assessed by performing a single-parameter sensitivity analysis. Results of the sensitivity analysis show that all factors affect the final vulnerability index. Calibration of the vulnerability map improves the correlation between index values and measured nitrate concentrations by 19%; the regression coefficient increases from 0.280 to 0.485. It is evident that the spatial distribution and the proportions of vulnerability class areas are significantly altered by the calibration process. Although the calibration method is demonstrated on the DRASTIC model, the approach is not specific to a certain model and can easily be applied to other overlay-and-index methods. Copyright © 2017 Elsevier B.V. All rights reserved.
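
    A minimal sketch of the calibration idea with synthetic data follows; SciPy has no GRG solver, so SLSQP serves here as a stand-in for the generalized reduced gradient method on the same bounded, smooth problem.

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import pearsonr

    rng = np.random.default_rng(2)

    # Synthetic data: DRASTIC factor ratings at 50 wells (7 factors) and
    # nitrate concentrations measured at the same wells.
    ratings = rng.uniform(1.0, 10.0, size=(50, 7))
    true_w = np.array([5.0, 4.0, 3.0, 2.0, 1.0, 5.0, 3.0])
    nitrate = ratings @ true_w + rng.normal(0.0, 5.0, size=50)

    def neg_corr(weights: np.ndarray) -> float:
        """Objective: negative Pearson correlation between the vulnerability
        index (weighted sum of factor ratings) and observed nitrate."""
        index = ratings @ weights
        return -pearsonr(index, nitrate)[0]

    # SLSQP stands in for GRG; both handle bounded, smooth, non-linear problems.
    res = minimize(neg_corr, x0=np.full(7, 3.0), method="SLSQP",
                   bounds=[(1.0, 5.0)] * 7)
    print(f"calibrated weights: {np.round(res.x, 2)}, r = {-res.fun:.3f}")
    ```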

  12. Calibration of groundwater vulnerability mapping using the generalized reduced gradient method

    NASA Astrophysics Data System (ADS)

    Elçi, Alper

    2017-12-01

    Groundwater vulnerability assessment studies are essential in water resources management. Overlay-and-index methods such as DRASTIC are widely used for mapping of groundwater vulnerability; however, these methods mainly suffer from a subjective selection of model parameters. The objective of this study is to introduce a calibration procedure that results in a more accurate assessment of groundwater vulnerability. The improvement of the assessment is formulated as a parameter optimization problem using an objective function that is based on the correlation between actual groundwater contamination and vulnerability index values. The non-linear optimization problem is solved with the generalized-reduced-gradient (GRG) method, which is a gradient-based numerical optimization method. To demonstrate the applicability of the procedure, a vulnerability map for the Tahtali stream basin is calibrated using nitrate concentration data. The calibration procedure is easy to implement and aims to maximize the correlation between observed pollutant concentrations and groundwater vulnerability index values. The influence of each vulnerability parameter in the calculation of the vulnerability index is assessed by performing a single-parameter sensitivity analysis. Results of the sensitivity analysis show that all factors influence the final vulnerability index. Calibration of the vulnerability map improves the correlation between index values and measured nitrate concentrations by 19%: the regression coefficient increases from 0.280 to 0.485. It is evident that the spatial distribution and the proportions of vulnerability class areas are significantly altered by the calibration process. Although the calibration method is demonstrated on the DRASTIC model, the approach is not specific to a certain model and can also be easily applied to other overlay-and-index methods.

  13. Environmental impact assessment for alternative-energy power plants in México.

    PubMed

    González-Avila, María E; Beltrán-Morales, Luis Felipe; Braker, Elizabeth; Ortega-Rubio, Alfredo

    2006-07-01

    Ten Environmental Impact Assessment Reports (EIAR) were reviewed for projects involving alternative power plants in Mexico developed during the last twelve years. Our analysis focused on the methods used to assess the impacts produced by hydroelectric and geothermal power projects. The methods used to assess impacts in EIARs ranged from the most simple, descriptive criteria to quantitative models. These methods are not concordant with the level of the EIAR required by the environmental authority or even with the kind of project developed. It is concluded that there is no correlation between the tools used to assess impacts and the assigned type of the EIAR. Because the methods used to assess the impacts produced by these power projects have not changed over this period, we propose a quantitative method, based on ecological criteria and tools, to assess the impacts produced by hydroelectric and geothermal plants according to the specific characteristics of the project. The proposed method is supported by environmental norms, and can assist environmental authorities in assigning the correct level and tools to be applied to hydroelectric and geothermal projects. The proposed method can be adapted to other production activities in Mexico and to other countries.

  14. Short assessment of the Big Five: robust across survey methods except telephone interviewing.

    PubMed

    Lang, Frieder R; John, Dennis; Lüdtke, Oliver; Schupp, Jürgen; Wagner, Gert G

    2011-06-01

    We examined measurement invariance and age-related robustness of a short 15-item Big Five Inventory (BFI-S) of personality dimensions, which is well suited for applications in large-scale multidisciplinary surveys. The BFI-S was assessed in three different interviewing conditions: computer-assisted or paper-assisted face-to-face interviewing, computer-assisted telephone interviewing, and a self-administered questionnaire. Randomized probability samples from a large-scale German panel survey and a related probability telephone study were used in order to test method effects on self-report measures of personality characteristics across early, middle, and late adulthood. Exploratory structural equation modeling was used in order to test for measurement invariance of the five-factor model of personality trait domains across different assessment methods. For the short inventory, findings suggest strong robustness of self-report measures of personality dimensions among young and middle-aged adults. In old age, telephone interviewing was associated with greater distortions in reliable personality assessment. It is concluded that the greater mental workload of telephone interviewing limits the reliability of self-report personality assessment. Face-to-face surveys and self-administered questionnaire completion are clearly better suited than phone surveys when personality traits in age-heterogeneous samples are assessed.

  15. Utility of Social Modeling for Proliferation Assessment - Preliminary Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coles, Garill A.; Gastelum, Zoe N.; Brothers, Alan J.

    2009-06-01

    This Preliminary Assessment draft report will present the results of a literature search and preliminary assessment of the body of research, analysis methods, models and data deemed to be relevant to the Utility of Social Modeling for Proliferation Assessment research. This report will provide: 1) a description of the problem space and the kinds of information pertinent to the problem space, 2) a discussion of key relevant or representative literature, 3) a discussion of models and modeling approaches judged to be potentially useful to the research, and 4) the next steps of this research that will be pursued based on this preliminary assessment. This draft report represents a technical deliverable for the NA-22 Simulations, Algorithms, and Modeling (SAM) program. Specifically this draft report is the Task 1 deliverable for project PL09-UtilSocial-PD06, Utility of Social Modeling for Proliferation Assessment. This project investigates non-traditional use of social and cultural information to improve nuclear proliferation assessment, including nonproliferation assessment, proliferation resistance assessments, safeguards assessments and other related studies. These assessments often use and create technical information about the State’s posture towards proliferation, the vulnerability of a nuclear energy system to an undesired event, and the effectiveness of safeguards. This project will find and fuse social and technical information by explicitly considering the role of cultural, social and behavioral factors relevant to proliferation. The aim of this research is to describe and demonstrate if and how social science modeling has utility in proliferation assessment.

  16. Are Imaging and Lesioning Convergent Methods for Assessing Functional Specialisation? Investigations Using an Artificial Neural Network

    ERIC Educational Resources Information Center

    Thomas, Michael S. C.; Purser, Harry R. M.; Tomlinson, Simon; Mareschal, Denis

    2012-01-01

    This article presents an investigation of the relationship between lesioning and neuroimaging methods of assessing functional specialisation, using synthetic brain imaging (SBI) and lesioning of a connectionist network of past-tense formation. The model comprised two processing "routes": one was a direct route between layers of input and output…

  17. Comparability of fish-based ecological quality assessments for geographically distinct Iberian regions.

    PubMed

    Segurado, P; Caiola, N; Pont, D; Oliveira, J M; Delaigue, O; Ferreira, M T

    2014-04-01

    In this work we compare two Iberian fish-based methods and one pan-European method for assessing ecological quality in rivers: the Fish-based Index of Biotic Integrity for Portuguese Wadeable Streams (F-IBIP), the Mediterranean Index of Biotic Integrity (IBIMED) and the pan-European Fish Index (EFI+). The results presented herein were developed in the context of the 2nd phase of the Intercalibration Exercise (IC), as required by the Water Framework Directive (WFD). The IC is aimed at ensuring comparability of the quality boundaries among the different WFD assessment methods developed by the Member States for each biological quality element. Although the two national assessment methods were developed for very distinct regions of Iberia (the western and eastern Iberian Peninsula), they share the same methodological background: both are type-specific and guild-based multimetric indices. EFI+ is a multimetric guild-based model, but it is site-specific and uses a predictive modelling approach. The three indices were computed for all sites included in the Iberian Intercalibration database to allow the direct comparison, by means of linear regressions, of the resulting three quality values per site. The quality boundary harmonization between the two Iberian methods was only possible through an indirect comparison between the two indices, using EFI+ as a common metric. The three indices were also shown to be responsive to a common set of human-induced pressures. This study highlights the need to develop general assessment methods adapted to wide geographical ranges with high species turnover to help intercalibrate assessment methods tailored for geographically more restricted regions. © 2013.

  18. Sediment transport in forested head water catchments - Calibration and validation of a soil erosion and landscape evolution model

    NASA Astrophysics Data System (ADS)

    Hancock, G. R.; Webb, A. A.; Turner, L.

    2017-11-01

    Sediment transport and soil erosion can be determined by a variety of field and modelling approaches. Computer-based soil erosion and landscape evolution models (LEMs) offer the potential to be reliable assessment and prediction tools. An advantage of such models is that they provide both erosion and deposition patterns as well as total catchment sediment output. However, like all models, they require calibration and validation before use. In recent years LEMs have been used for a variety of both natural and disturbed landscape assessments. However, these models have not been evaluated for their reliability in steep forested catchments. Here, the SIBERIA LEM is calibrated and evaluated for its reliability for two steep forested catchments in south-eastern Australia. The model is independently calibrated using two methods: firstly, hydrology and sediment transport parameters are inferred from catchment geomorphology and soil properties, and secondly from catchment sediment transport and discharge data. The results demonstrate that both calibration methods provide similar parameters and reliable modelled sediment transport output. A sensitivity study of the input parameters demonstrates the model's sensitivity to correct parameterisation and also how the model could be used to assess potential timber harvesting as well as the removal of vegetation by fire.

  19. Area 2: Inexpensive Monitoring and Uncertainty Assessment of CO2 Plume Migration using Injection Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Srinivasan, Sanjay

    2014-09-30

    In-depth understanding of the long-term fate of CO₂ in the subsurface requires study and analysis of the reservoir formation, the overlying caprock formation, and adjacent faults. Because there is significant uncertainty in predicting the location and extent of geologic heterogeneity that can impact the future migration of CO₂ in the subsurface, there is a need to develop algorithms that can reliably quantify this uncertainty in plume migration. This project is focused on the development of a model selection algorithm that refines an initial suite of subsurface models representing the prior uncertainty to create a posterior set of subsurface models that reflect injection performance consistent with that observed. Such posterior models can be used to represent uncertainty in the future migration of the CO₂ plume. Because only injection data are required, this provides a very inexpensive way to map the migration of the plume and the associated uncertainty in migration paths. The model selection method developed as part of this project mainly consists of assessing the connectivity/dynamic characteristics of a large prior ensemble of models, grouping the models on the basis of their expected dynamic response, selecting the subgroup of models that yield dynamic responses closest to the observed dynamic data, and finally quantifying the uncertainty in plume migration using the selected subset of models. The main accomplishment of the project is the development of a software module within the SGEMS earth modeling software package that implements the model selection methodology. This software module was subsequently applied to analyze CO₂ plume migration in two field projects – the In Salah CO₂ Injection project in Algeria and CO₂ injection into the Utsira formation in Norway. These applications of the software revealed that the proxies developed in this project for quickly assessing the dynamic characteristics of the reservoir were highly efficient and yielded accurate grouping of reservoir models. The plume migration paths probabilistically assessed by the method were confirmed by field observations and auxiliary data. The report also documents the application of the software to answer practical questions such as the optimum location of monitoring wells to reliably assess the migration of the CO₂ plume, the effect of CO₂-rock interactions on plume migration and the ability to detect the plume under those conditions, and the effect of a slow, unresolved leak on the predictions of plume migration.
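
    A minimal sketch of the model selection idea under stated assumptions: a prior ensemble of synthetic dynamic responses is ranked by misfit to the observed injection response, the closest subgroup is retained as the posterior set, and its spread quantifies the remaining uncertainty. All arrays, sizes, and thresholds below are hypothetical.

        import numpy as np

        rng = np.random.default_rng(1)
        n_models, n_times = 500, 24
        prior = np.cumsum(rng.lognormal(0, 0.3, (n_models, n_times)), axis=1)  # proxy responses
        observed = np.cumsum(np.exp(rng.normal(0, 0.3, n_times)))              # observed data

        # rank models by misfit to the observed response and keep the closest subgroup
        misfit = np.linalg.norm(prior - observed, axis=1)
        posterior = prior[np.argsort(misfit)[:50]]     # posterior set: 50 best-matching models

        # spread of the selected subset quantifies remaining uncertainty in plume behaviour
        p10, p50, p90 = np.percentile(posterior[:, -1], [10, 50, 90])
        print(f"late-time response P10/P50/P90: {p10:.1f}/{p50:.1f}/{p90:.1f}")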

  20. An overview of data integration methods for regional assessment.

    PubMed

    Locantore, Nicholas W; Tran, Liem T; O'Neill, Robert V; McKinnis, Peter W; Smith, Elizabeth R; O'Connell, Michael

    2004-06-01

    The U.S. Environmental Protection Agency's (U.S. EPA) Regional Vulnerability Assessment (ReVA) program has focused much of its research over the last five years on developing and evaluating integration methods for spatial data. An initial strategic priority was to use existing data from monitoring programs, model results, and other spatial data. Because most of these data were not collected with the intention of being integrated into a regional assessment of conditions and vulnerabilities, issues exist that may preclude the use of some methods or require some sort of data preparation. Additionally, to support multi-criteria decision-making, methods need to be able to address a series of assessment questions that provide insights into where environmental risks are a priority. This paper provides an overview of twelve spatial integration methods that can be applied to regional assessment, along with preliminary results as to how sensitive each method is to data issues that will likely be encountered with the use of existing data.

  1. Assessment of active methods for removal of LEO debris

    NASA Astrophysics Data System (ADS)

    Hakima, Houman; Emami, M. Reza

    2018-03-01

    This paper investigates the applicability of five active methods for removal of large low Earth orbit debris. The removal methods, namely net, laser, electrodynamic tether, ion beam shepherd, and robotic arm, are selected based on a set of high-level space mission constraints. Mission-level criteria are then utilized to assess the performance of each removal method in light of the results obtained from a Monte Carlo simulation. The simulation provides an insight into the removal time, performance robustness, and propellant mass criteria for the targeted debris range. The remaining attributes are quantified based on the models provided in the literature, which take into account several important parameters pertaining to each removal method. The means of assigning attributes to each assessment criterion is discussed in detail. A systematic comparison is performed using two different assessment schemes: the Analytical Hierarchy Process and a utility-based approach. A third assessment technique, namely potential-loss analysis, is utilized to highlight the effect of risks in each removal method.
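
    For the Analytical Hierarchy Process step, criterion weights follow from the principal eigenvector of a pairwise comparison matrix, checked with a consistency ratio. The sketch below uses an illustrative three-criterion matrix, not the study's actual judgments.

        import numpy as np

        # illustrative pairwise comparisons (removal time, robustness, propellant mass)
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                               # principal eigenvector -> criterion weights

        n = A.shape[0]
        ci = (eigvals[k].real - n) / (n - 1)       # consistency index
        cr = ci / 0.58                             # Saaty's random index for n = 3 is 0.58
        print("weights:", w.round(3), " consistency ratio:", round(cr, 3))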

  2. A Comparison of Two Models for Cognitive Diagnosis. Research Report. ETS RR-04-02

    ERIC Educational Resources Information Center

    Yan, Duanli; Almond, Russell; Mislevy, Robert

    2004-01-01

    Diagnostic score reports linking assessment outcomes to instructional interventions are one of the most requested features of assessment products. There is a body of interesting work done in the last 20 years including Tatsuoka's rule space method (Tatsuoka, 1983), Haertal and Wiley's binary skills model (Haertal, 1984; Haertal & Wiley, 1993),…

  3. Alternative Methods for Assessing Mediation in Multilevel Data: The Advantages of Multilevel SEM

    ERIC Educational Resources Information Center

    Preacher, Kristopher J.; Zhang, Zhen; Zyphur, Michael J.

    2011-01-01

    Multilevel modeling (MLM) is a popular way of assessing mediation effects with clustered data. Two important limitations of this approach have been identified in prior research and a theoretical rationale has been provided for why multilevel structural equation modeling (MSEM) should be preferred. However, to date, no empirical evidence of MSEM's…

  4. In vivo Assessment and Potential Diagnosis of Xenobiotics that Perturb the Thyroid Pathway: Proteomic Analysis of Xenopus laevis Brain Tissue following Exposure to Model T4 Inhibitors

    EPA Science Inventory

    As part of a multi-endpoint systems approach to develop comprehensive methods for assessing endocrine stressors in vertebrates, differential protein profiling was used to investigate expression profiles in the brain of an amphibian model (Xenopus laevis) following in vivo exposur...

  5. A Causal Inference Analysis of the Effect of Wildland Fire ...

    EPA Pesticide Factsheets

    Wildfire smoke is a major contributor to ambient air pollution levels. In this talk, we develop a spatio-temporal model to estimate the contribution of fire smoke to overall air pollution in different regions of the country. We combine numerical model output with observational data within a causal inference framework. Our methods account for aggregation and potential bias of the numerical model simulation, and address uncertainty in the causal estimates. We apply the proposed method to estimation of ozone and fine particulate matter from wildland fires and the impact on health burden assessment. We develop a causal inference framework to assess contributions of fire to ambient PM in the presence of spatial interference.

  6. Quality assessment of protein model-structures based on structural and functional similarities.

    PubMed

    Konopka, Bogumil M; Nebel, Jean-Christophe; Kotulska, Malgorzata

    2012-09-21

    Experimental determination of protein 3D structures is expensive, time consuming and sometimes impossible. The gap between the number of protein structures deposited in the Worldwide Protein Data Bank and the number of sequenced proteins constantly broadens. Computational modeling is deemed to be one of the ways to deal with the problem. Although protein 3D structure prediction is a difficult task, many tools are available. These tools can build a model from a sequence or from partial structural information, e.g. contact maps. Consequently, biologists are able to automatically generate a putative 3D structure model of any protein. However, the main issue becomes evaluation of the model quality, which is one of the most important challenges of structural biology. GOBA (Gene Ontology-Based Assessment) is a novel Protein Model Quality Assessment Program. It estimates the compatibility between a model-structure and its expected function. GOBA is based on the assumption that a high quality model is expected to be structurally similar to proteins functionally similar to the prediction target. Whereas DALI is used to measure structure similarity, protein functional similarity is quantified using the standardized and hierarchical description of proteins provided by Gene Ontology combined with Wang's algorithm for calculating semantic similarity. Two approaches are proposed to express the quality of protein model-structures. One is a single-model quality assessment method; the other is its modification, which provides a relative measure of model quality. Exhaustive evaluation is performed on data sets of model-structures submitted to the CASP8 and CASP9 contests. The validation shows that the method is able to discriminate between good and bad model-structures. The best of the tested GOBA scores achieved mean Pearson correlations of 0.74 and 0.80 with the observed quality of models in our CASP8- and CASP9-based validation sets. GOBA also obtained the best result for two targets of CASP8, and one of CASP9, compared to the contest participants. Consequently, GOBA offers a novel single-model quality assessment program that addresses the practical needs of biologists. In conjunction with other Model Quality Assessment Programs (MQAPs), it would prove useful for the evaluation of single protein models.
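
    For the functional-similarity component, Wang's measure propagates semantic contributions up the Gene Ontology graph and compares the aggregated values of two terms. The sketch below implements that measure on a tiny hypothetical DAG (the 0.8 edge weight is the value Wang et al. use for is_a relations); it omits the DALI structural component.

        # toy GO-style DAG: term -> parents; edge weight 0.8 per Wang et al. for is_a
        parents = {"GO:5": ["GO:3"], "GO:4": ["GO:2", "GO:3"],
                   "GO:3": ["GO:1"], "GO:2": ["GO:1"], "GO:1": []}
        W = 0.8

        def s_values(term):
            # semantic contribution of each ancestor: max product of edge weights on a path
            s = {term: 1.0}
            stack = [term]
            while stack:
                t = stack.pop()
                for p in parents[t]:
                    v = W * s[t]
                    if v > s.get(p, 0.0):
                        s[p] = v
                        stack.append(p)
            return s

        def wang_sim(a, b):
            sa, sb = s_values(a), s_values(b)
            common = set(sa) & set(sb)
            return sum(sa[t] + sb[t] for t in common) / (sum(sa.values()) + sum(sb.values()))

        print(round(wang_sim("GO:4", "GO:5"), 3))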

  7. Can bias correction and statistical downscaling methods improve the skill of seasonal precipitation forecasts?

    NASA Astrophysics Data System (ADS)

    Manzanas, R.; Lucero, A.; Weisheimer, A.; Gutiérrez, J. M.

    2018-02-01

    Statistical downscaling methods are popular post-processing tools which are widely used in many sectors to adapt the coarse-resolution biased outputs from global climate simulations to the regional-to-local scale typically required by users. They range from simple and pragmatic Bias Correction (BC) methods, which directly adjust the model outputs of interest (e.g. precipitation) according to the available local observations, to more complex Perfect Prognosis (PP) ones, which indirectly derive local predictions (e.g. precipitation) from appropriate upper-air large-scale model variables (predictors). Statistical downscaling methods have been extensively used and critically assessed in climate change applications; however, their advantages and limitations in seasonal forecasting are not well understood yet. In particular, a key problem in this context is whether they serve to improve the forecast quality/skill of raw model outputs beyond the adjustment of their systematic biases. In this paper we analyze this issue by applying two state-of-the-art BC and two PP methods to downscale precipitation from a multimodel seasonal hindcast in a challenging tropical region, the Philippines. To properly assess the potential added value beyond the reduction of model biases, we consider two validation scores which are not sensitive to changes in the mean (correlation and reliability categories). Our results show that, whereas BC methods maintain or worsen the skill of the raw model forecasts, PP methods can yield significant skill improvement (worsening) in cases for which the large-scale predictor variables considered are better (worse) predicted by the model than precipitation. For instance, PP methods are found to increase (decrease) model reliability in nearly 40% of the stations considered in boreal summer (autumn). Therefore, the choice of a convenient downscaling approach (either BC or PP) depends on the region and the season.
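
    A minimal sketch of the BC idea discussed above, assuming synthetic data: empirical quantile mapping transfers each model value to the observed value at the same cumulative probability, adjusting the distribution (but, as the study stresses, not the temporal skill) of the forecast.

        import numpy as np

        def quantile_map(model_hist, obs_hist, model_new):
            # probability of each new value under the historical model distribution ...
            p = np.searchsorted(np.sort(model_hist), model_new) / len(model_hist)
            p = np.clip(p, 0.001, 0.999)
            # ... mapped to the same quantile of the observed distribution
            return np.quantile(obs_hist, p)

        rng = np.random.default_rng(2)
        obs = rng.gamma(2.0, 4.0, 3000)    # synthetic observed daily precipitation
        mod = rng.gamma(1.5, 7.0, 3000)    # synthetic biased model precipitation
        print(quantile_map(mod, obs, np.array([0.0, 5.0, 25.0, 60.0])).round(1))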

  8. Potential for Integrating Diffusion of Innovation Principles into Life Cycle Assessment of Emerging Technologies.

    PubMed

    Sharp, Benjamin E; Miller, Shelie A

    2016-03-15

    Life cycle assessment (LCA) measures cradle-to-grave environmental impacts of a product. To assess impacts of an emerging technology, LCA should be coupled with additional methods that estimate how that technology might be deployed. The extent and manner in which an emerging technology diffuses throughout a region shape the magnitude and type of environmental impacts. Diffusion of innovation is an established field of research that analyzes the adoption of new innovations, and its principles can be used to construct scenario models that enhance LCA of emerging technologies. Integrating diffusion modeling techniques with an LCA of emerging technology can provide estimates for the extent of market penetration, the displacement of existing systems, and the rate of adoption. Two general perspectives of application are macro-level diffusion models that use a function of time to represent adoption, and micro-level diffusion models that simulate adoption through interactions of individuals. Incorporating diffusion of innovation concepts complements existing methods within LCA to inform proactive environmental management of emerging technologies.
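
    As an example of a macro-level diffusion model that can feed an LCA deployment scenario, the Bass model expresses cumulative adoption as a function of time with an innovation coefficient p and an imitation coefficient q. The parameter values and the per-unit impact below are illustrative assumptions, not figures from the study.

        import numpy as np

        def bass_cumulative(t, p=0.03, q=0.38, m=1_000_000):
            # Bass model: cumulative adopters at time t for an ultimate market size m
            e = np.exp(-(p + q) * t)
            return m * (1 - e) / (1 + (q / p) * e)

        years = np.arange(0, 16)
        adopters = bass_cumulative(years)              # deployment trajectory
        new_units = np.diff(adopters, prepend=0.0)     # annual sales -> production impacts
        impact_per_unit = 120.0                        # assumed kg CO2e per unit produced
        print((new_units * impact_per_unit)[:5].round(0))  # first five years of impacts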

  9. TOPICAL REVIEW: Modelling the interaction of electromagnetic fields (10 MHz–10 GHz) with the human body: methods and applications

    NASA Astrophysics Data System (ADS)

    Hand, J. W.

    2008-08-01

    Numerical modelling of the interaction between electromagnetic fields (EMFs) and the dielectrically inhomogeneous human body provides a unique way of assessing the resulting spatial distributions of internal electric fields, currents and rate of energy deposition. Knowledge of these parameters is of importance in understanding such interactions and is a prerequisite when assessing EMF exposure or when assessing or optimizing therapeutic or diagnostic medical applications that employ EMFs. In this review, computational methods that provide this information through full time-dependent solutions of Maxwell's equations are summarized briefly. This is followed by an overview of safety- and medical-related applications where modelling has contributed significantly to development and understanding of the techniques involved. In particular, applications in the areas of mobile communications, magnetic resonance imaging, hyperthermia therapy and microwave radiometry are highlighted. Finally, examples of modelling the potentially new medical applications of recent technologies such as ultra-wideband microwaves are discussed.

  10. Issues in Benchmarking Human Reliability Analysis Methods: A Literature Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ronald L. Boring; Stacey M. L. Hendrickson; John A. Forester

    There is a diversity of human reliability analysis (HRA) methods available for use in assessing human performance within probabilistic risk assessments (PRA). Due to the significant differences in the methods, including the scope, approach, and underlying models, there is a need for an empirical comparison investigating the validity and reliability of the methods. To accomplish this empirical comparison, a benchmarking study comparing and evaluating HRA methods in assessing operator performance in simulator experiments is currently underway. In order to account for as many effects as possible in the construction of this benchmarking study, a literature review was conducted, reviewing past benchmarking studies in the areas of psychology and risk assessment. A number of lessons learned through these studies are presented in order to aid in the design of future HRA benchmarking endeavors.

  11. Predictive Capability Maturity Model for computational modeling and simulation.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oberkampf, William Louis; Trucano, Timothy Guy; Pilch, Martin M.

    2007-10-01

    The Predictive Capability Maturity Model (PCMM) is a new model that can be used to assess the level of maturity of computational modeling and simulation (M&S) efforts. The development of the model is based on both the authors' experience and their analysis of similar investigations in the past. The perspective taken in this report is one of judging the usefulness of a predictive capability that relies on the numerical solution to partial differential equations to better inform and improve decision making. The review of past investigations, such as the Software Engineering Institute's Capability Maturity Model Integration and the National Aeronautics and Space Administration and Department of Defense Technology Readiness Levels, indicates that a more restricted, more interpretable method is needed to assess the maturity of an M&S effort. The PCMM addresses six contributing elements to M&S: (1) representation and geometric fidelity, (2) physics and material model fidelity, (3) code verification, (4) solution verification, (5) model validation, and (6) uncertainty quantification and sensitivity analysis. For each of these elements, attributes are identified that characterize four increasing levels of maturity. Importantly, the PCMM is a structured method for assessing the maturity of an M&S effort that is directed toward an engineering application of interest. The PCMM does not assess whether the M&S effort, the accuracy of the predictions, or the performance of the engineering system satisfies or does not satisfy specified application requirements.

  12. An Assessment of Phylogenetic Tools for Analyzing the Interplay Between Interspecific Interactions and Phenotypic Evolution.

    PubMed

    Drury, J P; Grether, G F; Garland, T; Morlon, H

    2018-05-01

    Much ecological and evolutionary theory predicts that interspecific interactions often drive phenotypic diversification and that species phenotypes in turn influence species interactions. Several phylogenetic comparative methods have been developed to assess the importance of such processes in nature; however, the statistical properties of these methods have gone largely untested. Focusing mainly on scenarios of competition between closely-related species, we assess the performance of available comparative approaches for analyzing the interplay between interspecific interactions and species phenotypes. We find that many currently used statistical methods often fail to detect the impact of interspecific interactions on trait evolution, that sister-taxa analyses are particularly unreliable in general, and that recently developed process-based models have more satisfactory statistical properties. Methods for detecting predictors of species interactions are generally more reliable than methods for detecting character displacement. In weighing the strengths and weaknesses of different approaches, we hope to provide a clear guide for empiricists testing hypotheses about the reciprocal effect of interspecific interactions and species phenotypes and to inspire further development of process-based models.

  13. A Probabilistic Tsunami Hazard Study of the Auckland Region, Part II: Inundation Modelling and Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Lane, E. M.; Gillibrand, P. A.; Wang, X.; Power, W.

    2013-09-01

    Regional source tsunamis pose a potentially devastating hazard to communities and infrastructure on the New Zealand coast, but major events are very uncommon. This dichotomy of infrequent but potentially devastating hazards makes realistic assessment of the risk challenging. Here, we describe a method to determine a probabilistic assessment of the tsunami hazard posed by regional source tsunamis with an "Average Recurrence Interval" of 2,500 years. The method is applied to the east Auckland region of New Zealand. From an assessment of potential regional tsunamigenic events over 100,000 years, the inundation of the Auckland region from the worst 100 events is modelled using a hydrodynamic model, and probabilistic inundation depths on a 2,500-year time scale are determined. Tidal effects on the potential inundation were included by coupling the predicted wave heights with the probability density function of tidal heights at the inundation site. Results show that the more exposed northern section of the east coast and the outer islands in the Hauraki Gulf face the greatest hazard from regional tsunamis in the Auckland region. Incorporating tidal effects into predictions of inundation reduced the predicted hazard compared to modelling all the tsunamis as arriving at high tide, giving a more accurate hazard assessment on the specified time scale. This study presents the first probabilistic analysis of dynamic modelling of tsunami inundation for the New Zealand coast and as such provides the most comprehensive assessment of tsunami inundation of the Auckland region from regional source tsunamis available to date.
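
    The tidal coupling described above amounts to weighting conditional exceedance by the tidal probability density, roughly P(depth > d) = sum over tide levels of P(depth > d | tide) p(tide). A toy discretized version, with an assumed Gaussian-like tidal PDF and a made-up depth response, is sketched below.

        import numpy as np

        # assumed tidal-height PDF (discretized) about mean sea level
        tide = np.linspace(-1.5, 1.5, 61)              # tide level [m]
        pdf = np.exp(-0.5 * (tide / 0.8) ** 2)
        pdf /= pdf.sum()                               # discrete probability of each level

        def depth_given_tide(t, wave=2.0):
            # toy inundation-depth response for one modelled event
            return np.maximum(wave + t - 1.0, 0.0)

        threshold = 1.0                                # inundation depth of interest [m]
        p_exceed = pdf[depth_given_tide(tide) > threshold].sum()
        print(f"P(depth > {threshold} m) over the tidal cycle: {p_exceed:.2f}")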

  14. Applying a weed risk assessment approach to GM crops.

    PubMed

    Keese, Paul K; Robold, Andrea V; Myers, Ruth C; Weisman, Sarah; Smith, Joe

    2014-12-01

    Current approaches to environmental risk assessment of genetically modified (GM) plants are modelled on chemical risk assessment methods, which have a strong focus on toxicity. There are additional types of harms posed by plants that have been extensively studied by weed scientists and incorporated into weed risk assessment methods. Weed risk assessment uses robust, validated methods that are widely applied to regulatory decision-making about potentially problematic plants. They are designed to encompass a broad variety of plant forms and traits in different environments, and can provide reliable conclusions even with limited data. The knowledge and experience that underpin weed risk assessment can be harnessed for environmental risk assessment of GM plants. A case study illustrates the application of the Australian post-border weed risk assessment approach to a representative GM plant. This approach is a valuable tool to identify potential risks from GM plants.

  15. Comparing exposure assessment methods for traffic-related air pollution in an adverse pregnancy outcome study.

    PubMed

    Wu, Jun; Wilhelm, Michelle; Chung, Judith; Ritz, Beate

    2011-07-01

    Previous studies reported adverse impacts of traffic-related air pollution exposure on pregnancy outcomes. Yet, little information exists on how effect estimates are impacted by the different exposure assessment methods employed in these studies. To compare effect estimates for traffic-related air pollution exposure and preeclampsia, preterm birth (gestational age less than 37 weeks), and very preterm birth (gestational age less than 30 weeks), four commonly used exposure assessment methods were examined. We identified 81,186 singleton births during 1997-2006 at four hospitals in Los Angeles and Orange Counties, California. Exposures were assigned to individual subjects based on residential address at delivery using the nearest ambient monitoring station data [carbon monoxide (CO), nitrogen dioxide (NO(2)), nitric oxide (NO), nitrogen oxides (NO(x)), ozone (O(3)), and particulate matter less than 2.5 μm (PM(2.5)) or less than 10 μm (PM(10)) in aerodynamic diameter], both unadjusted and temporally adjusted land-use regression (LUR) model estimates (NO, NO(2), and NO(x)), CALINE4 line-source air dispersion model estimates (NO(x) and PM(2.5)), and a simple traffic-density measure. We employed unconditional logistic regression to analyze preeclampsia in our birth cohort, while for gestational age-matched risk sets with preterm and very preterm birth we employed conditional logistic regression. We observed elevated risks for preeclampsia, preterm birth, and very preterm birth from maternal exposures to traffic air pollutants measured at ambient stations (CO, NO, NO(2), and NO(x)) and modeled through CALINE4 (NO(x) and PM(2.5)) and LUR (NO(2) and NO(x)). Increased risks of preterm birth and very preterm birth were also positively associated with PM(10) and PM(2.5) air pollution measured at ambient stations. For LUR-modeled NO(2) and NO(x) exposures, elevated risks for all the outcomes were observed in Los Angeles only, the region for which the LUR models were initially developed. Unadjusted LUR models often produced odds ratios somewhat larger in size than temporally adjusted models. The size of effect estimates was smaller for exposures based on simpler traffic-density measures than for the other exposure assessment methods. We generally confirmed that traffic-related air pollution was associated with adverse reproductive outcomes regardless of the exposure assessment method employed, yet the size of the estimated effect depended on how both temporal and spatial variations were incorporated into exposure assessment. The LUR model was not transferable even between two contiguous areas within the same large metropolitan area in Southern California. Copyright © 2011 Elsevier Inc. All rights reserved.

  16. The Path to Graduation: A Model Interactive Web Site Design Supporting Doctoral Students

    ERIC Educational Resources Information Center

    Simmons-Johnson, Nicole

    2012-01-01

    Objective. This 2-phase mixed method study assessed 2nd-year doctoral students' and dissertation students' perceptions of the current Graduate School of Education dissertation support Web site, with implications for designing a model dissertation support Web site. Methods. Phase 1 collected quantitative and qualitative data through an…

  17. Modeling Peer Assessment as Agent Negotiation in a Computer Supported Collaborative Learning Environment

    ERIC Educational Resources Information Center

    Lai, K. Robert; Lan, Chung Hsien

    2006-01-01

    This work presents a novel method for modeling collaborative learning as multi-issue agent negotiation using fuzzy constraints. Agent negotiation is an iterative process, through which, the proposed method aggregates student marks to reduce personal bias. In the framework, students define individual fuzzy membership functions based on their…

  18. An improved method to represent DEM uncertainty in glacial lake outburst flood propagation using stochastic simulations

    NASA Astrophysics Data System (ADS)

    Watson, Cameron S.; Carrivick, Jonathan; Quincey, Duncan

    2015-10-01

    Modelling glacial lake outburst floods (GLOFs) or 'jökulhlaups' necessarily involves the propagation of large and often stochastic uncertainties throughout the source-to-impact process chain. Since flood routing is primarily a function of underlying topography, communication of digital elevation model (DEM) uncertainty should accompany such modelling efforts. Here, a new stochastic first-pass assessment technique was evaluated against an existing GIS-based model and an existing 1D hydrodynamic model, using three DEMs with different spatial resolution. The analysis revealed the effect of DEM uncertainty and model choice on several flood parameters and on the prediction of socio-economic impacts. Our new model, which we call MC-LCP (Monte Carlo Least Cost Path) and which is distributed in the supplementary information, demonstrated enhanced 'stability' when compared to the two existing methods, and this 'stability' was independent of DEM choice. The MC-LCP model outputs an uncertainty continuum within its extent, from which relative socio-economic risk can be evaluated. In a comparison of all DEM and model combinations, results with the Shuttle Radar Topography Mission (SRTM) DEM exhibited fewer artefacts than those with the Advanced Spaceborne Thermal Emission and Reflection Radiometer Global Digital Elevation Model (ASTER GDEM), and were comparable to those with a finer-resolution Advanced Land Observing Satellite Panchromatic Remote-sensing Instrument for Stereo Mapping (ALOS PRISM) derived DEM. Overall, we contend that the variability we find between flood routing model results suggests that consideration of DEM uncertainty and pre-processing methods is important when assessing flow routing and when evaluating potential socio-economic implications of a GLOF event. Incorporation of a stochastic variable provides an illustration of uncertainty that is important when modelling and communicating assessments of an inherently complex process.
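
    A minimal sketch of the Monte Carlo least-cost-path idea, under stated assumptions: the DEM is perturbed with an assumed vertical error, a least-cost path is routed across each realization (plain Dijkstra on an elevation-derived cost, standing in for the paper's routing scheme), and a traversal-frequency map expresses the uncertainty continuum. The grid, error magnitude, and start/end cells are hypothetical.

        import heapq
        import numpy as np

        def least_cost_path(cost, start, end):
            # Dijkstra over a 4-connected grid; returns the cells on the best path
            ny, nx = cost.shape
            dist = np.full((ny, nx), np.inf)
            prev = {}
            dist[start] = 0.0
            pq = [(0.0, start)]
            while pq:
                d, (y, x) = heapq.heappop(pq)
                if (y, x) == end:
                    break
                if d > dist[y, x]:
                    continue
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    n = (y + dy, x + dx)
                    if 0 <= n[0] < ny and 0 <= n[1] < nx and d + cost[n] < dist[n]:
                        dist[n] = d + cost[n]
                        prev[n] = (y, x)
                        heapq.heappush(pq, (dist[n], n))
            path, node = {start}, end
            while node != start:
                path.add(node)
                node = prev[node]
            return path

        rng = np.random.default_rng(3)
        dem = np.add.outer(np.linspace(50, 0, 60), np.zeros(80))  # synthetic down-sloping DEM
        hits = np.zeros_like(dem)
        for _ in range(100):                                      # Monte Carlo realizations
            noisy = dem + rng.normal(0, 2.0, dem.shape)           # assumed +/-2 m DEM error
            # elevation shifted positive serves as a crude flow-resistance proxy here
            for cell in least_cost_path(noisy - noisy.min() + 0.1, (0, 40), (59, 40)):
                hits[cell] += 1
        print("cells crossed by >80% of realizations:", int((hits > 80).sum()))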

  19. Assessment of eutrophication in estuaries: Pressure-state-response and source apportionment

    Treesearch

    David Whitall; Suzanne Bricker

    2006-01-01

    The National Estuarine Eutrophication Assessment (NEEA) Update Program is a management oriented program designed to improve monitoring and assessment efforts through the development of type specific classification of estuaries that will allow improved assessment methods and development of analytical and research models and tools for managers which will help guide and...

  20. New Directions in Assessing Mental Competence

    PubMed Central

    Silberfeld, Michel

    1992-01-01

    As guardianship laws have been reformed and the concept of partial guardianship has emerged, traditional definitions of competency have changed and traditional methods of competency assessment have become inadequate. Ways are needed to assess capacity for particular functions, or decision-specific competence. This article explores ethical, conceptual, and practical difficulties and outlines an assessment model. PMID:21221296

  1. Superitem Test: An Alternative Assessment Tool to Assess Students' Algebraic Solving Ability

    ERIC Educational Resources Information Center

    Lian, Lim Hooi; Yew, Wun Thiam; Idris, Noraini

    2010-01-01

    Superitem test based on the SOLO model (Structure of the Observing Learning Outcome) has become a powerful alternative assessment tool for monitoring the growth of students' cognitive ability in solving mathematics problems. This article focused on developing a superitem test to assess students' algebraic solving ability through interview method.…

  2. Assessment of an Unstructured-Grid Method for Predicting 3-D Turbulent Viscous Flows

    NASA Technical Reports Server (NTRS)

    Frink, Neal T.

    1996-01-01

    A method is presented for solving turbulent flow problems on three-dimensional unstructured grids. Spatial discretization is accomplished by a cell-centered finite-volume formulation using an accurate linear reconstruction scheme and upwind flux differencing. Time is advanced by an implicit backward-Euler time-stepping scheme. Flow turbulence effects are modeled by the Spalart-Allmaras one-equation model, which is coupled with a wall function to reduce the number of cells in the sublayer region of the boundary layer. A systematic assessment of the method is presented to devise guidelines for more strategic application of the technology to complex problems. The assessment includes the accuracy in predictions of skin-friction coefficient, law-of-the-wall behavior, and surface pressure for a flat-plate turbulent boundary layer, and for the ONERA M6 wing under a high Reynolds number, transonic, separated flow condition.

  3. Assessment of an Unstructured-Grid Method for Predicting 3-D Turbulent Viscous Flows

    NASA Technical Reports Server (NTRS)

    Frink, Neal T.

    1996-01-01

    A method is presented for solving turbulent flow problems on three-dimensional unstructured grids. Spatial discretization is accomplished by a cell-centered finite-volume formulation using an accurate linear reconstruction scheme and upwind flux differencing. Time is advanced by an implicit backward-Euler time-stepping scheme. Flow turbulence effects are modeled by the Spalart-Allmaras one-equation model, which is coupled with a wall function to reduce the number of cells in the sublayer region of the boundary layer. A systematic assessment of the method is presented to devise guidelines for more strategic application of the technology to complex problems. The assessment includes the accuracy in predictions of skin-friction coefficient, law-of-the-wall behavior, and surface pressure for a flat-plate turbulent boundary layer, and for the ONERA M6 wing under a high Reynolds number, transonic, separated flow condition.
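
    The wall-function idea mentioned in the abstract can be illustrated in a few lines: rather than resolving the viscous sublayer, the velocity of the first near-wall cell is tied to the friction velocity through the law of the wall, which is then solved for the wall shear. The log-law constants are the standard values; the flow numbers are illustrative assumptions.

        import math
        from scipy.optimize import brentq

        kappa, B = 0.41, 5.0                  # standard log-law constants

        def u_plus(y_plus):
            # law of the wall: linear viscous sublayer below y+ ~ 11, log law above
            return y_plus if y_plus < 11.0 else math.log(y_plus) / kappa + B

        # a wall function solves u/u_tau = u_plus(y*u_tau/nu) for the friction velocity
        nu, y1, u1 = 1.5e-5, 1e-3, 10.0       # assumed viscosity, cell height [m], velocity [m/s]
        u_tau = brentq(lambda ut: u1 / ut - u_plus(y1 * ut / nu), 1e-3, 10.0)
        print(f"u_tau = {u_tau:.4f} m/s, first-cell y+ = {y1 * u_tau / nu:.1f}")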

  4. Modelling Complex Fenestration Systems using physical and virtual models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thanachareonkit, Anothai; Scartezzini, Jean-Louis

    2010-04-15

    Physical or virtual models are commonly used to visualize the conceptual ideas of architects, lighting designers and researchers; they are also employed to assess the daylighting performance of buildings, particularly in cases where Complex Fenestration Systems (CFS) are considered. Recent studies have however revealed a general tendency of physical models to over-estimate this performance, compared to those of real buildings; these discrepancies can be attributed to several reasons. In order to identify the main error sources, a series of comparisons in-between a real building (a single office room within a test module) and the corresponding physical and virtual models was undertaken. The physical model was placed in outdoor conditions, which were strictly identical to those of the real building, as well as underneath a scanning sky simulator. The virtual model simulations were carried out by way of the Radiance program using the GenSky function; an alternative evaluation method, named Partial Daylight Factor method (PDF method), was also employed with the physical model together with sky luminance distributions acquired by a digital sky scanner during the monitoring of the real building. The overall daylighting performance of physical and virtual models were assessed and compared. The causes of discrepancies between the daylighting performance of the real building and the models were analysed. The main identified sources of errors are the reproduction of building details, the CFS modelling and the mocking-up of the geometrical and photometrical properties. To study the impact of these errors on daylighting performance assessment, computer simulation models created using the Radiance program were also used to carry out a sensitivity analysis of modelling errors. The study of the models showed that large discrepancies can occur in daylighting performance assessment. In case of improper mocking-up of the glazing for instance, relative divergences of 25-40% can be found in different room locations, suggesting that more light is entering than actually monitored in the real building. All these discrepancies can however be reduced by making an effort to carefully mock up the geometry and photometry of the real building. A synthesis is presented in this article which can be used as guidelines for daylighting designers to avoid or estimate errors during CFS daylighting performance assessment. (author)

  5. An integrated environmental modeling framework for performing Quantitative Microbial Risk Assessments

    EPA Science Inventory

    Standardized methods are often used to assess the likelihood of a human-health effect from exposure to a specified hazard, and inform opinions and decisions about risk management and communication. A Quantitative Microbial Risk Assessment (QMRA) is specifically adapted to detail ...

  6. Assessment and Learning of Mathematics.

    ERIC Educational Resources Information Center

    Leder, Gilah C., Ed.

    This book addresses the link between student learning of mathematics, the teaching method adopted in the mathematics classroom, and the assessment procedures used to determine and measure student knowledge. Fifteen chapters address issues that include a review of different models of mathematics learning and assessment practices, three contrasting…

  7. Impacts of correcting the inter-variable correlation of climate model outputs on hydrological modeling

    NASA Astrophysics Data System (ADS)

    Chen, Jie; Li, Chao; Brissette, François P.; Chen, Hua; Wang, Mingna; Essou, Gilles R. C.

    2018-05-01

    Bias correction is usually implemented prior to using climate model outputs for impact studies. However, bias correction methods that are commonly used treat climate variables independently and often ignore inter-variable dependencies. The effects of ignoring such dependencies on impact studies need to be investigated. This study aims to assess the impacts of correcting the inter-variable correlation of climate model outputs on hydrological modeling. To this end, a joint bias correction (JBC) method, which corrects the joint distribution of two variables as a whole, is compared with an independent bias correction (IBC) method; the comparison is made by correcting simulations of precipitation and temperature from 26 climate models for hydrological modeling over 12 watersheds located in various climate regimes. The results show that the simulated precipitation and temperature are considerably biased not only in the individual distributions, but also in their correlations, which in turn result in biased hydrological simulations. In addition to reducing the biases of the individual characteristics of precipitation and temperature, the JBC method can also reduce the bias in precipitation-temperature (P-T) correlations. In terms of hydrological modeling, the JBC method performs significantly better than the IBC method for 11 out of the 12 watersheds over the calibration period. For the validation period, the advantages of the JBC method are greatly reduced as the performance becomes dependent on the watershed, GCM and hydrological metric considered. For arid/tropical and snowfall-rainfall-mixed watersheds, JBC performs better than IBC. For snowfall- or rainfall-dominated watersheds, however, the two methods behave similarly, with IBC performing somewhat better than JBC. Overall, the results emphasize the advantages of correcting the P-T correlation when using climate model-simulated precipitation and temperature to assess the impact of climate change on watershed hydrology. However, a thorough validation and a comparison with other methods are recommended before using the JBC method, since it may perform worse than the IBC method in some cases due to bias nonstationarity of climate model outputs.
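
    One simple way to restore an inter-variable dependence after independent correction is a rank re-pairing step in the spirit of the Schaake shuffle: re-order one corrected series so that its rank arrangement against the other matches the observed pairing. This sketch illustrates that generic idea on synthetic data; it is not the JBC algorithm of the study.

        import numpy as np

        def restore_dependence(p_corr, t_corr, p_obs, t_obs):
            # rank of each observed T among days ordered by observed P
            obs_t_rank_given_p = np.argsort(np.argsort(t_obs))[np.argsort(p_obs)]
            out = np.empty_like(t_corr)
            # ordered by corrected P, corrected T values follow the observed rank pattern
            out[np.argsort(p_corr)] = np.sort(t_corr)[obs_t_rank_given_p]
            return p_corr, out

        rng = np.random.default_rng(4)
        p_obs = rng.gamma(2, 3, 1000)
        t_obs = rng.normal(10, 5, 1000) - 0.3 * p_obs   # give observations a P-T dependence
        p_corr = rng.gamma(2, 3, 1000)                  # independently bias-corrected series
        t_corr = rng.normal(10, 5, 1000)
        _, t_shuffled = restore_dependence(p_corr, t_corr, p_obs, t_obs)
        print("P-T corr before:", round(np.corrcoef(p_corr, t_corr)[0, 1], 2),
              " after:", round(np.corrcoef(p_corr, t_shuffled)[0, 1], 2))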

  8. Application of the cognitive therapy model to initial crisis assessment.

    PubMed

    Calvert, Patricia; Palmer, Christine

    2003-03-01

    This article provides a background to the development of cognitive therapy and cognitive therapeutic skills with a specific focus on the treatment of a depressive episode. It discusses the utility of cognitive therapeutic strategies to the model of crisis theory and initial crisis assessment currently used by the Community Assessment & Treatment Team of Waitemata District Health Board on the North Shore of Auckland, New Zealand. A brief background to cognitive therapy is provided, followed by a comprehensive example of the use of the Socratic questioning method in guiding collaborative assessment and treatment of suicidality by nurses during the initial crisis assessment.

  9. The implementation of assessment model based on character building to improve students’ discipline and achievement

    NASA Astrophysics Data System (ADS)

    Rusijono; Khotimah, K.

    2018-01-01

    The purpose of this research was to investigate the effect of implementing an assessment model based on character building on students' discipline and achievement. The assessment model based on character building includes three components: the behaviour of students, their efforts, and their achievement. This assessment model based on character building was implemented in the philosophy of science and educational assessment courses in the Graduate Program of the Educational Technology Department, Educational Faculty, Universitas Negeri Surabaya. This research used a control-group pre-test and post-test design. The data collection methods used in this research were observation and testing. Observation was used to collect data about the discipline of the students in the instructional process, while the test was used to collect data about student achievement. Moreover, the study applied a t-test in the data analysis. The results of this research showed that the assessment model based on character building improved discipline and student achievement.

  10. On the assessment of the added value of new predictive biomarkers.

    PubMed

    Chen, Weijie; Samuelson, Frank W; Gallas, Brandon D; Kang, Le; Sahiner, Berkman; Petrick, Nicholas

    2013-07-29

    The surge in biomarker development calls for research on statistical evaluation methodology to rigorously assess emerging biomarkers and classification models. Recently, several authors reported the puzzling observation that, in assessing the added value of new biomarkers to existing ones in a logistic regression model, statistical significance of new predictor variables does not necessarily translate into a statistically significant increase in the area under the ROC curve (AUC). Vickers et al. concluded that this inconsistency is because AUC "has vastly inferior statistical properties," i.e., it is extremely conservative. This statement is based on simulations that misuse the DeLong et al. method. Our purpose is to provide a fair comparison of the likelihood ratio (LR) test and the Wald test versus diagnostic accuracy (AUC) tests. We present a test to compare ideal AUCs of nested linear discriminant functions via an F test. We compare it with the LR test and the Wald test for the logistic regression model. The null hypotheses of these three tests are equivalent; however, the F test is an exact test whereas the LR test and the Wald test are asymptotic tests. Our simulation shows that the F test has the nominal type I error even with a small sample size. Our results also indicate that the LR test and the Wald test have inflated type I errors when the sample size is small, while the type I error converges to the nominal value asymptotically with increasing sample size as expected. We further show that the DeLong et al. method tests a different hypothesis and has the nominal type I error when it is used within its designed scope. Finally, we summarize the pros and cons of all four methods we consider in this paper. We show that there is nothing inherently less powerful or disagreeable about ROC analysis for showing the usefulness of new biomarkers or characterizing the performance of classification models. Each statistical method for assessing biomarkers and classification models has its own strengths and weaknesses. Investigators need to choose methods based on the assessment purpose, the biomarker development phase at which the assessment is being performed, the available patient data, and the validity of assumptions behind the methodologies.
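
    The small-sample point can be checked with a short simulation, sketched below under assumed settings: data are generated under the null (the new biomarker carries no information), nested logistic models are fitted, and the likelihood-ratio test's empirical type I error is compared with its nominal level.

        import numpy as np
        from scipy.stats import chi2
        import statsmodels.api as sm

        rng = np.random.default_rng(5)
        n, reps, alpha = 30, 2000, 0.05
        rejections, valid = 0, 0

        for _ in range(reps):
            x_old = rng.normal(size=n)               # established biomarker
            x_new = rng.normal(size=n)               # new biomarker, no true effect (null)
            y = (rng.random(n) < 1.0 / (1.0 + np.exp(-x_old))).astype(float)
            try:
                full = sm.Logit(y, sm.add_constant(np.column_stack([x_old, x_new]))).fit(disp=0)
                base = sm.Logit(y, sm.add_constant(x_old)).fit(disp=0)
            except Exception:                        # skip separated/non-converged replicates
                continue
            valid += 1
            lr = 2.0 * (full.llf - base.llf)         # LR statistic, asymptotically chi2(1)
            rejections += lr > chi2.ppf(1.0 - alpha, df=1)

        print(f"empirical type I error at n={n}: {rejections / valid:.3f} (nominal {alpha})")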

  11. Status of turbulence modeling for hypersonic propulsion flowpaths

    NASA Astrophysics Data System (ADS)

    Georgiadis, Nicholas J.; Yoder, Dennis A.; Vyas, Manan A.; Engblom, William A.

    2014-06-01

    This report provides an assessment of current turbulent flow calculation methods for hypersonic propulsion flowpaths, particularly the scramjet engine. Emphasis is placed on Reynolds-averaged Navier-Stokes (RANS) methods, but some discussion of newer methods such as large eddy simulation (LES) is also provided. The report is organized by considering technical issues throughout the scramjet-powered vehicle flowpath, including laminar-to-turbulent boundary layer transition, shock wave/turbulent boundary layer interactions, scalar transport modeling (specifically the significance of turbulent Prandtl and Schmidt numbers), and compressible mixing. Unit problems are primarily used to conduct the assessment. In the combustor, results from calculations of a direct connect supersonic combustion experiment are also used to address the effects of turbulence model selection and in particular settings for the turbulent Prandtl and Schmidt numbers. It is concluded that RANS turbulence modeling shortfalls are still a major limitation to the accuracy of hypersonic propulsion simulations, whether considering individual components or an overall system. Newer methods such as LES-based techniques may be promising, but are not yet at a maturity to be used routinely by the hypersonic propulsion community. The need for fundamental experiments to provide data for turbulence model development and validation is discussed.

  12. Data accuracy assessment using enterprise architecture

    NASA Astrophysics Data System (ADS)

    Närman, Per; Holm, Hannes; Johnson, Pontus; König, Johan; Chenine, Moustafa; Ekstedt, Mathias

    2011-02-01

    Errors in business processes result in poor data accuracy. This article proposes an architecture analysis method which utilises ArchiMate and the Probabilistic Relational Model formalism to model and analyse data accuracy. Since the resources available for architecture analysis are usually quite scarce, the method advocates interviews as the primary data collection technique. A case study demonstrates that the method yields correct data accuracy estimates and is more resource-efficient than a competing sampling-based data accuracy estimation method.

  13. Recent developments in imaging system assessment methodology, FROC analysis and the search model.

    PubMed

    Chakraborty, Dev P

    2011-08-21

    A frequent problem in imaging is assessing whether a new imaging system is an improvement over an existing standard. Observer performance methods, in particular the receiver operating characteristic (ROC) paradigm, are widely used in this context. In ROC analysis lesion location information is not used and consequently scoring ambiguities can arise in tasks, such as nodule detection, involving finding localized lesions. This paper reviews progress in the free-response ROC (FROC) paradigm in which the observer marks and rates suspicious regions and the location information is used to determine whether lesions were correctly localized. Reviewed are FROC data analysis, a search-model for simulating FROC data, predictions of the model and a method for estimating the parameters. The search model parameters are physically meaningful quantities that can guide system optimization.

  14. Advantages of multigrid methods for certifying the accuracy of PDE modeling

    NASA Technical Reports Server (NTRS)

    Forester, C. K.

    1981-01-01

    Numerical techniques for assessing and certifying the accuracy of the modeling of partial differential equations (PDE) to the user's specifications are analyzed. Examples of the certification process with conventional techniques are summarized for the three dimensional steady state full potential and the two dimensional steady Navier-Stokes equations using fixed grid methods (FG). The advantages of the Full Approximation Storage (FAS) scheme of the multigrid (MG) technique of A. Brandt compared with the conventional certification process of modeling PDE are illustrated in one dimension with the transformed potential equation. Inferences are drawn about how MG will improve the certification process of the numerical modeling of two and three dimensional PDE systems. Elements of the error assessment process that are common to FG and MG are analyzed.

  15. Assessing consumer responses to potential reduced-exposure tobacco products: a review of tobacco industry and independent research methods.

    PubMed

    Rees, Vaughan W; Kreslake, Jennifer M; Cummings, K Michael; O'Connor, Richard J; Hatsukami, Dorothy K; Parascandola, Mark; Shields, Peter G; Connolly, Gregory N

    2009-12-01

    Internal tobacco industry documents and the mainstream literature are reviewed to identify methods and measures for evaluating tobacco consumer response. The review aims to outline areas in which established methods exist, identify gaps in current methods for assessing consumer response, and consider how these methods might be applied to evaluate potentially reduced exposure tobacco products and new products. The internal industry research reviewed included published articles, manuscript drafts, presentations, protocols, and instruments relating to consumer response measures. Peer-reviewed research was identified using PubMed and Scopus. Industry research on consumer response focuses on product development and marketing. To develop and refine new products, the tobacco industry has developed notable strategies for assessing consumers' sensory and subjective responses to product design characteristics. Independent research is often conducted to gauge the likelihood of future product adoption by measuring consumers' risk perceptions, responses to product, and product acceptability. A model that conceptualizes consumer response as comprising the separate, but interacting, domains of product perceptions and response to product is outlined. Industry and independent research supports the dual domain model and provides a wide range of methods for assessment of the construct components of consumer response. Further research is needed to validate consumer response constructs, determine the relationship between consumer response and tobacco user behavior, and improve the reliability of consumer response measures. Scientifically rigorous consumer response assessment methods will provide a needed empirical basis for future regulation of potentially reduced-exposure tobacco products and new products, counteract tobacco industry influence on consumers, and enhance public health.

  16. Assessment of sustainable urban transport development based on entropy and unascertained measure

    PubMed Central

    Li, Yancang; Yang, Jing; Li, Yijie

    2017-01-01

    To find a more effective method for the assessment of sustainable urban transport development, a comprehensive assessment model of sustainable urban transport development was established based on the unascertained measure. On the basis of the factors influencing urban transport development, the comprehensive assessment indexes were selected, including urban economic development, transport demand, environmental quality and energy consumption, and an assessment system of sustainable urban transport development was proposed. In view of the different influencing factors of urban transport development, the index weights were calculated through the entropy weight coefficient method. Qualitative and quantitative analyses were conducted according to the actual conditions. Then, the grade was obtained by using the credible degree recognition criterion, from which the urban transport development level can be determined. Finally, a comprehensive assessment method for urban transport development was introduced. The application practice showed that the method can be used reasonably and effectively for the comprehensive assessment of urban transport development. PMID:29084281
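
    The entropy weight coefficient method mentioned above is compact enough to sketch directly. The following is a hedged illustration with an invented decision matrix (rows could be cities or scenarios; columns the indexes named in the abstract); it is not the authors' data or code:

```python
# Entropy weight coefficient method: indexes whose values vary more across
# alternatives (lower entropy) receive higher weight.
import numpy as np

# Hypothetical index values: columns = economic development, transport
# demand, environmental quality, energy consumption.
X = np.array([[0.82, 0.55, 0.70, 0.45],
              [0.64, 0.80, 0.52, 0.61],
              [0.71, 0.62, 0.88, 0.58]])

# 1. Normalise each column so entries behave like proportions.
P = X / X.sum(axis=0)

# 2. Entropy of each index; k keeps e_j within [0, 1].
m = X.shape[0]
k = 1.0 / np.log(m)
E = -k * (P * np.log(P)).sum(axis=0)

# 3. Degree of divergence and the resulting entropy weights.
d = 1.0 - E
w = d / d.sum()
print("entropy weights:", np.round(w, 3))
```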

  17. A new CFD based non-invasive method for functional diagnosis of coronary stenosis.

    PubMed

    Xie, Xinzhou; Zheng, Minwen; Wen, Didi; Li, Yabing; Xie, Songyun

    2018-03-22

    Accurate functional diagnosis of coronary stenosis is vital for decision making in coronary revascularization. With recent advances in computational fluid dynamics (CFD), fractional flow reserve (FFR) can be derived non-invasively from coronary computed tomography angiography images (FFRCT) for functional measurement of stenosis. However, the accuracy of FFRCT is limited due to the approximate modeling of maximal hyperemia conditions. To overcome this problem, a new CFD-based non-invasive method is proposed. Instead of modeling the maximal hyperemia condition, a series of boundary conditions are specified and the simulated results are combined to provide a pressure-flow curve for a stenosis. Functional diagnosis of the stenosis is then assessed based on parameters derived from the obtained pressure-flow curve. The proposed method is applied to both idealized and patient-specific models, and validated against invasive FFR in six patients. Results show that additional hemodynamic information about the flow resistance of a stenosis is provided, which cannot be directly obtained from anatomical information. Parameters derived from the simulated pressure-flow curve show linear and significant correlations with invasive FFR (r > 0.95, P < 0.05). The proposed method can assess flow resistance through the pressure-flow curve derived parameters without modeling the maximal hyperemia condition, which is a promising new approach for non-invasive functional assessment of coronary stenosis.

  18. Research on assessment methods for urban public transport development in China.

    PubMed

    Zou, Linghong; Dai, Hongna; Yao, Enjian; Jiang, Tian; Guo, Hongwei

    2014-01-01

    In recent years, with the rapid increase in urban population, urban travel demands in Chinese cities have been increasing dramatically. As a result, developing comprehensive urban transport systems becomes an inevitable choice to meet the growing urban travel demands. In urban transport systems, public transport plays the leading role in promoting sustainable urban development. This paper aims to establish an assessment index system for the development level of urban public transport consisting of a target layer, a criterion layer, and an index layer. A review of the existing literature shows that methods used in evaluating urban public transport structure are predominantly qualitative. To overcome this shortcoming, the fuzzy mathematics method is used for describing qualitative issues quantitatively, and AHP (analytic hierarchy process) is used to quantify experts' subjective judgments. The assessment model is established based on the fuzzy AHP. The weight of each index is determined through the AHP and the degree of membership of each index through the fuzzy assessment method to obtain the fuzzy synthetic assessment matrix. Finally, a case study is conducted to verify the rationality and practicability of the assessment system and the proposed assessment method.
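
    The two computational steps in the abstract, AHP weighting and fuzzy synthetic assessment, can be sketched in a few lines. The pairwise judgments, membership values, and grade labels below are invented for illustration and are not the paper's data:

```python
# Fuzzy AHP sketch: AHP pairwise comparisons give index weights; a fuzzy
# membership matrix gives the synthetic assessment vector B = w . R.
import numpy as np

# Pairwise comparison matrix for three indexes (hypothetical judgments).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# AHP weights = principal eigenvector, normalised to sum to 1.
vals, vecs = np.linalg.eig(A)
w = np.real(vecs[:, np.argmax(np.real(vals))])
w = w / w.sum()

# Consistency ratio (random index RI = 0.58 for a 3x3 matrix).
lam = np.max(np.real(vals))
CI = (lam - A.shape[0]) / (A.shape[0] - 1)
print("weights:", np.round(w, 3), " CR:", round(CI / 0.58, 3))

# Fuzzy membership of each index in grades (good, fair, poor) - invented.
R = np.array([[0.6, 0.3, 0.1],
              [0.2, 0.5, 0.3],
              [0.1, 0.4, 0.5]])

# Fuzzy synthetic assessment, then pick the max-membership grade.
B = w @ R
print("grade memberships:", np.round(B, 3), "-> grade", int(np.argmax(B)))
```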

  19. Deep Learning to Predict Falls in Older Adults Based on Daily-Life Trunk Accelerometry.

    PubMed

    Nait Aicha, Ahmed; Englebienne, Gwenn; van Schooten, Kimberley S; Pijnappels, Mirjam; Kröse, Ben

    2018-05-22

    Early detection of high fall risk is an essential component of fall prevention in older adults. Body-worn sensors such as accelerometers can provide valuable insight into daily-life activities, and biomechanical features extracted from such inertial data have been shown to be of added value for the assessment of fall risk. Here, we studied whether deep learning methods are suited to automatically derive features from raw accelerometer data that assess fall risk. We used an existing dataset of 296 older adults. We compared the performance of three deep learning model architectures (convolutional neural network (CNN), long short-term memory (LSTM) and a combination of the two (ConvLSTM)) to each other and to a baseline model with biomechanical features on the same dataset. The results show that the deep learning models in a single-task learning mode are strong in recognizing the identity of the subject, but only slightly outperform the baseline method on fall risk assessment. When using multi-task learning, with gender and age as auxiliary tasks, the deep learning models perform better. We also found that preprocessing of the data resulted in the best performance (AUC = 0.75). We conclude that deep learning models, and in particular multi-task learning, effectively assess fall risk on the basis of wearable sensor data.

  20. Deep Learning to Predict Falls in Older Adults Based on Daily-Life Trunk Accelerometry

    PubMed Central

    Englebienne, Gwenn; Pijnappels, Mirjam

    2018-01-01

    Early detection of high fall risk is an essential component of fall prevention in older adults. Body-worn sensors such as accelerometers can provide valuable insight into daily-life activities, and biomechanical features extracted from such inertial data have been shown to be of added value for the assessment of fall risk. Here, we studied whether deep learning methods are suited to automatically derive features from raw accelerometer data that assess fall risk. We used an existing dataset of 296 older adults. We compared the performance of three deep learning model architectures (convolutional neural network (CNN), long short-term memory (LSTM) and a combination of the two (ConvLSTM)) to each other and to a baseline model with biomechanical features on the same dataset. The results show that the deep learning models in a single-task learning mode are strong in recognizing the identity of the subject, but only slightly outperform the baseline method on fall risk assessment. When using multi-task learning, with gender and age as auxiliary tasks, the deep learning models perform better. We also found that preprocessing of the data resulted in the best performance (AUC = 0.75). We conclude that deep learning models, and in particular multi-task learning, effectively assess fall risk on the basis of wearable sensor data. PMID:29786659
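
    A schematic sketch of the multi-task idea described in this abstract is shown below. This is not the authors' architecture; the layer sizes, window length, and loss weights are invented, and the network here is a CNN feeding an LSTM rather than a literal ConvLSTM cell:

```python
# Shared CNN + LSTM trunk with one main head (fall risk) and two auxiliary
# heads (gender, age), trained with a weighted multi-task loss.
import torch
import torch.nn as nn

class MultiTaskFallNet(nn.Module):
    def __init__(self, n_channels=3, hidden=64):
        super().__init__()
        self.cnn = nn.Sequential(                 # local motion features
            nn.Conv1d(n_channels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        self.lstm = nn.LSTM(32, hidden, batch_first=True)
        self.fall_head = nn.Linear(hidden, 1)     # main task: fall risk
        self.gender_head = nn.Linear(hidden, 1)   # auxiliary classification
        self.age_head = nn.Linear(hidden, 1)      # auxiliary regression

    def forward(self, x):                         # x: (batch, channels, time)
        h = self.cnn(x).transpose(1, 2)           # -> (batch, time, features)
        _, (h_n, _) = self.lstm(h)
        z = h_n[-1]
        return self.fall_head(z), self.gender_head(z), self.age_head(z)

model = MultiTaskFallNet()
x = torch.randn(8, 3, 512)                        # 8 tri-axial acceleration windows
fall_logit, gender_logit, age_pred = model(x)

# Multi-task loss: weighted sum of main and auxiliary objectives (weights assumed).
bce, mse = nn.BCEWithLogitsLoss(), nn.MSELoss()
y_fall = torch.rand(8, 1).round()
y_gender = torch.rand(8, 1).round()
y_age = torch.randn(8, 1)
loss = bce(fall_logit, y_fall) + 0.3 * bce(gender_logit, y_gender) + 0.3 * mse(age_pred, y_age)
loss.backward()
```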

  1. Development of Flight-Test Performance Estimation Techniques for Small Unmanned Aerial Systems

    NASA Astrophysics Data System (ADS)

    McCrink, Matthew Henry

    This dissertation provides a flight-testing framework for assessing the performance of fixed-wing, small-scale unmanned aerial systems (sUAS) by leveraging sub-system models of components unique to these vehicles. The development of the sub-system models, and their links to broader impacts on sUAS performance, is the key contribution of this work. The sub-system modeling and analysis focuses on the vehicle's propulsion, navigation and guidance, and airframe components. Quantification of the uncertainty in the vehicle's power available and control states is essential for assessing the validity of both the methods and results obtained from flight-tests. Therefore, detailed propulsion and navigation system analyses are presented to validate the flight testing methodology. Propulsion system analysis required the development of an analytic model of the propeller in order to predict the power available over a range of flight conditions. The model is based on the blade element momentum (BEM) method. Additional corrections are added to the basic model in order to capture the Reynolds-dependent scale effects unique to sUAS. The model was experimentally validated using a ground based testing apparatus. The BEM predictions and experimental analysis allow for a parameterized model relating the electrical power, measurable during flight, to the power available required for vehicle performance analysis. Navigation system details are presented with a specific focus on the sensors used for state estimation, and the resulting uncertainty in vehicle state. Uncertainty quantification is provided by detailed calibration techniques validated using quasi-static and hardware-in-the-loop (HIL) ground based testing. The HIL methods introduced use a soft real-time flight simulator to provide inertial quality data for assessing overall system performance. Using this tool, the uncertainty in vehicle state estimation based on a range of sensors and vehicle operational environments is presented. The propulsion and navigation system models are used to evaluate flight-testing methods for evaluating fixed-wing sUAS performance. A brief airframe analysis is presented to provide a foundation for assessing the efficacy of the flight-test methods. The flight-testing presented in this work is focused on validating the aircraft drag polar, zero-lift drag coefficient, and span efficiency factor. Three methods are detailed and evaluated for estimating these design parameters. Specific focus is placed on the influence of propulsion and navigation system uncertainty on the resulting performance data. Performance estimates are used in conjunction with the propulsion model to estimate the impact of sensor and measurement uncertainty on the endurance and range of a fixed-wing sUAS. Endurance and range results for a simplistic power available model are compared to the Reynolds-dependent model presented in this work. Additional parameter sensitivity analyses related to state estimation uncertainties encountered in flight-testing are presented. Results from these analyses indicate that the sub-system models introduced in this work are of first-order importance, on the order of a 5-10% change in range and endurance, in assessing the performance of a fixed-wing sUAS.
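
    The kind of parameterized model described above, mapping in-flight electrical power to power available, can be sketched very roughly. The efficiency curve, peak advance ratio, and all numbers below are illustrative assumptions, not the dissertation's validated BEM-based model:

```python
# Toy power-available model: measured electrical power times assumed motor
# efficiency and an advance-ratio-dependent propeller efficiency.
import numpy as np

def power_available(p_electric_w, airspeed_ms, rpm, prop_diameter_m=0.4,
                    eta_motor=0.85):
    """Estimate power available from electrical power measured in flight."""
    n = rpm / 60.0                                # revolutions per second
    J = airspeed_ms / (n * prop_diameter_m)       # advance ratio
    # Toy propeller efficiency curve peaking near J = 0.6 (assumption).
    eta_prop = np.clip(0.8 - 2.0 * (J - 0.6) ** 2, 0.1, 0.8)
    return p_electric_w * eta_motor * eta_prop

# Propagate a +/-5% electrical-power measurement uncertainty (illustrative).
p_meas = 180.0                                    # W, measured in flight
for p in (0.95 * p_meas, p_meas, 1.05 * p_meas):
    print(round(power_available(p, airspeed_ms=18.0, rpm=7000), 1), "W")
```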

  2. Mixture of autoregressive modeling orders and its implication on single trial EEG classification

    PubMed Central

    Atyabi, Adham; Shic, Frederick; Naples, Adam

    2016-01-01

    Autoregressive (AR) models are among the most commonly utilized feature types in electroencephalogram (EEG) studies because they offer better resolution, smoother spectra, and applicability to short segments of data. Identifying the correct AR modeling order is an open challenge: lower model orders poorly represent the signal, while higher orders increase noise. Conventional methods for estimating the modeling order include the Akaike Information Criterion (AIC), Bayesian Information Criterion (BIC) and Final Prediction Error (FPE). This article assesses the hypothesis that an appropriate mixture of multiple AR orders is likely to better represent the true signal than any single order. Better spectral representation of underlying EEG patterns can increase the utility of AR features in Brain Computer Interface (BCI) systems by making such systems respond to the operator's thoughts more quickly and accurately. Two mechanisms, evolutionary-based fusion and ensemble-based mixture, are utilized for identifying such an appropriate mixture of modeling orders. The classification performance of the resultant AR mixtures is assessed against several conventional approaches utilized by the community, including 1) a well-known set of commonly used orders suggested by the literature, 2) conventional order estimation approaches (e.g., AIC, BIC and FPE), and 3) a blind mixture of AR features originating from a range of well-known orders. Five datasets from BCI competition III that contain 2, 3 and 4 motor imagery tasks are considered for the assessment. The results indicate the superiority of the ensemble-based modeling order mixture and evolutionary-based order fusion methods across all datasets. PMID:28740331
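
    A minimal sketch of AR order selection and a blind mixture of orders follows. It is a simplified stand-in for the ensemble mixture studied above (least-squares fitting, a synthetic signal, and hand-picked candidate orders are all assumptions):

```python
# AR order selection by AIC, plus a blind mixture of coefficient vectors
# across several candidate orders instead of committing to one.
import numpy as np

rng = np.random.default_rng(1)
# Synthetic "EEG-like" signal: an AR(4) process driven by white noise.
y = np.zeros(1000)
for t in range(4, 1000):
    y[t] = 0.6*y[t-1] - 0.3*y[t-2] + 0.2*y[t-3] - 0.1*y[t-4] + rng.normal()

def fit_ar(y, p):
    """Least-squares AR(p) fit; returns coefficients and residual variance."""
    X = np.column_stack([y[p - k - 1:-k - 1] for k in range(p)])
    target = y[p:]
    coef, *_ = np.linalg.lstsq(X, target, rcond=None)
    resid = target - X @ coef
    return coef, resid.var()

orders = range(1, 13)
aic = []
for p in orders:
    _, s2 = fit_ar(y, p)
    aic.append(len(y) * np.log(s2) + 2 * p)   # usual N*log(sigma^2) + 2p form
print("AIC-selected order:", list(orders)[int(np.argmin(aic))])

# Blind mixture: average zero-padded coefficient vectors across orders.
candidates = (4, 6, 8)
feats = [np.pad(fit_ar(y, p)[0], (0, max(candidates) - p)) for p in candidates]
mixture_feature = np.mean(feats, axis=0)
print("mixed AR feature vector:", np.round(mixture_feature, 2))
```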

  3. Method for assessing coal-floor water-inrush risk based on the variable-weight model and unascertained measure theory

    NASA Astrophysics Data System (ADS)

    Wu, Qiang; Zhao, Dekang; Wang, Yang; Shen, Jianjun; Mu, Wenping; Liu, Honglei

    2017-11-01

    Water inrush from coal-seam floors greatly threatens mining safety in North China and is a complex process controlled by multiple factors. This study presents a mathematical assessment system for coal-floor water-inrush risk based on the variable-weight model (VWM) and unascertained measure theory (UMT). In contrast to the traditional constant-weight model (CWM), which assigns a fixed weight to each factor, the VWM varies the weights with the factor-state values. The UMT employs the confidence principle, which is more effective in ordered partition problems than the maximum membership principle adopted in earlier mathematical theories. The method is applied to the Datang Tashan Coal Mine in North China. First, eight main controlling factors are selected to construct the comprehensive evaluation index system. Subsequently, an incentive-penalty variable-weight model is built to calculate the variable weights of each factor. Then, the VWM-UMT model is established using the quantitative risk-grade division of each factor according to the UMT. On this basis, the risk of coal-floor water inrush in Tashan Mine No. 8 is divided into five grades. For comparison, the CWM is also adopted for the risk assessment, and a map of the differences between the two methods' results is obtained. Finally, verification against recorded water-inrush points indicates that the VWM-UMT model is more feasible and reasonable. The model has great potential and practical significance in future engineering applications.

  4. Cross-Participant EEG-Based Assessment of Cognitive Workload Using Multi-Path Convolutional Recurrent Neural Networks.

    PubMed

    Hefron, Ryan; Borghetti, Brett; Schubert Kabban, Christine; Christensen, James; Estepp, Justin

    2018-04-26

    Applying deep learning methods to electroencephalograph (EEG) data for cognitive state assessment has yielded improvements over previous modeling methods. However, research focused on cross-participant cognitive workload modeling using these techniques is underrepresented. We study the problem of cross-participant state estimation in a non-stimulus-locked task environment, where a trained model is used to make workload estimates on a new participant who is not represented in the training set. Using experimental data from the Multi-Attribute Task Battery (MATB) environment, a variety of deep neural network models are evaluated in the trade-space of computational efficiency, model accuracy, variance and temporal specificity yielding three important contributions: (1) The performance of ensembles of individually-trained models is statistically indistinguishable from group-trained methods at most sequence lengths. These ensembles can be trained for a fraction of the computational cost compared to group-trained methods and enable simpler model updates. (2) While increasing temporal sequence length improves mean accuracy, it is not sufficient to overcome distributional dissimilarities between individuals’ EEG data, as it results in statistically significant increases in cross-participant variance. (3) Compared to all other networks evaluated, a novel convolutional-recurrent model using multi-path subnetworks and bi-directional, residual recurrent layers resulted in statistically significant increases in predictive accuracy and decreases in cross-participant variance.

  5. Cross-Participant EEG-Based Assessment of Cognitive Workload Using Multi-Path Convolutional Recurrent Neural Networks

    PubMed Central

    Hefron, Ryan; Borghetti, Brett; Schubert Kabban, Christine; Christensen, James; Estepp, Justin

    2018-01-01

    Applying deep learning methods to electroencephalograph (EEG) data for cognitive state assessment has yielded improvements over previous modeling methods. However, research focused on cross-participant cognitive workload modeling using these techniques is underrepresented. We study the problem of cross-participant state estimation in a non-stimulus-locked task environment, where a trained model is used to make workload estimates on a new participant who is not represented in the training set. Using experimental data from the Multi-Attribute Task Battery (MATB) environment, a variety of deep neural network models are evaluated in the trade-space of computational efficiency, model accuracy, variance and temporal specificity yielding three important contributions: (1) The performance of ensembles of individually-trained models is statistically indistinguishable from group-trained methods at most sequence lengths. These ensembles can be trained for a fraction of the computational cost compared to group-trained methods and enable simpler model updates. (2) While increasing temporal sequence length improves mean accuracy, it is not sufficient to overcome distributional dissimilarities between individuals’ EEG data, as it results in statistically significant increases in cross-participant variance. (3) Compared to all other networks evaluated, a novel convolutional-recurrent model using multi-path subnetworks and bi-directional, residual recurrent layers resulted in statistically significant increases in predictive accuracy and decreases in cross-participant variance. PMID:29701668
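
    Contribution (1) above, an ensemble of individually trained models, reduces at prediction time to averaging per-participant model outputs. The tiny sketch below uses stand-in callables in place of trained networks; everything in it is illustrative:

```python
# Ensemble of individually-trained per-participant models: average their
# predicted probabilities for data from a new, unseen participant.
import numpy as np

def ensemble_predict(models, X):
    """Average per-participant model probabilities for new-participant data."""
    probs = np.stack([m(X) for m in models])   # (n_models, n_samples)
    return probs.mean(axis=0)                  # ensemble workload estimate

# Stand-ins for three individually trained models (illustrative only).
models = [lambda X, b=b: 1 / (1 + np.exp(-(X.sum(axis=1) + b)))
          for b in (-0.2, 0.0, 0.3)]
X_new = np.random.default_rng(2).normal(size=(5, 4))
print(np.round(ensemble_predict(models, X_new), 3))
```

    The appeal noted in the abstract is practical: each member is trained on one participant's data only, so adding a participant means training one new small model rather than retraining a group model.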

  6. Quantum chemical approach for condensed-phase thermochemistry (V): Development of rigid-body type harmonic solvation model

    NASA Astrophysics Data System (ADS)

    Tarumi, Moto; Nakai, Hiromi

    2018-05-01

    This letter proposes an approximate treatment of the harmonic solvation model (HSM) assuming the solute to be a rigid body (RB-HSM). The HSM method can appropriately estimate the Gibbs free energy for condensed phases even where an ideal gas model used by standard quantum chemical programs fails. The RB-HSM method eliminates calculations for intra-molecular vibrations in order to reduce the computational costs. Numerical assessments indicated that the RB-HSM method can evaluate entropies and internal energies with the same accuracy as the HSM method but with lower calculation costs.

  7. Methods for assessing wall interference in the 2- by 2-foot adaptive-wall wind tunnel

    NASA Technical Reports Server (NTRS)

    Schairer, E. T.

    1986-01-01

    Discussed are two methods for assessing two-dimensional wall interference in the adaptive-wall test section of the NASA Ames 2 x 2-Foot Transonic Wind Tunnel: (1) a method for predicting free-air conditions near the walls of the test section (adaptive-wall methods); and (2) a method for estimating wall-induced velocities near the model (correction methods). Both types of method are based on measurements of either one or two components of flow velocity near the walls of the test section. Each method is demonstrated using simulated wind tunnel data and is compared with other methods of the same type. The two-component adaptive-wall and correction methods were found to be preferable to the corresponding one-component methods because: (1) they are more sensitive to, and give a more complete description of, wall interference; (2) they require measurements at fewer locations; (3) they can be used to establish free-stream conditions; and (4) they are independent of a description of the model and constants of integration.

  8. Perspectives to Performance of Environment and Health Assessments and Models—From Outputs to Outcomes?

    PubMed Central

    Pohjola, Mikko V.; Pohjola, Pasi; Tainio, Marko; Tuomisto, Jouni T.

    2013-01-01

    The calls for knowledge-based policy and policy-relevant research invoke a need to evaluate and manage environment and health assessments and models according to their societal outcomes. This review explores how well the existing approaches to assessment and model performance serve this need. The perspectives on assessment and model performance in the scientific literature can be labeled: (1) quality assurance/control, (2) uncertainty analysis, (3) technical assessment of models, (4) effectiveness and (5) other perspectives, according to what is primarily seen to constitute the goodness of assessments and models. The categorization is not strict, and methods, tools and frameworks in different perspectives may overlap. Altogether, however, it seems that most approaches to assessment and model performance are relatively narrow in scope. The focus in most approaches is on the outputs and the making of assessments and models. Practical application of the outputs and the consequential outcomes are often left unaddressed. It appears that more comprehensive approaches that combine the essential characteristics of different perspectives are needed. This necessitates a better account of the mechanisms of collective knowledge creation and the relations between knowledge and practical action. Some new approaches to assessment, modeling and their evaluation and management span the chain from knowledge creation to societal outcomes, but the complexity of evaluating societal outcomes remains a challenge. PMID:23803642

  9. A comparison of different functions for predicted protein model quality assessment.

    PubMed

    Li, Juan; Fang, Huisheng

    2016-07-01

    In protein structure prediction, a considerable number of models are usually produced by either the template-based method (TBM) or ab initio prediction. The purpose of this study is to find the critical parameter for assessing the quality of the predicted models. A non-redundant template library was developed and 138 target sequences were modeled. The target sequences were all distant from the proteins in the template library and were aligned with template library proteins on the basis of the transformation matrix. The quality of each model was first assessed with QMEAN and its six parameters, which are C_β interaction energy (C_beta), all-atom pairwise energy (PE), solvation energy (SE), torsion angle energy (TAE), secondary structure agreement (SSA), and solvent accessibility agreement (SAE). Finally, the alignment score (score) was also used to assess the quality of the model. Hence, a total of eight parameters (i.e., QMEAN, C_beta, PE, SE, TAE, SSA, SAE, score) were independently used to assess the quality of each model. The results indicate that SSA is the best parameter for estimating the quality of the model.

  10. Real-time flood forecasts & risk assessment using a possibility-theory based fuzzy neural network

    NASA Astrophysics Data System (ADS)

    Khan, U. T.

    2016-12-01

    Globally, floods are one of the most devastating natural disasters, and improved flood forecasting methods are essential for better flood protection in urban areas. Given the availability of high resolution real-time datasets for flood variables (e.g. streamflow and precipitation) in many urban areas, data-driven models have been effectively used to predict peak flow rates in rivers; however, the selection of input parameters for these types of models is often subjective. Additionally, the inherent uncertainty associated with data-driven models, along with errors in extreme event observations, means that uncertainty quantification is essential. Addressing these concerns will enable improved flood forecasting methods and provide more accurate flood risk assessments. In this research, a new type of data-driven model, a quasi-real-time updating fuzzy neural network, is developed to predict peak flow rates in urban riverine watersheds. A possibility-to-probability transformation is first used to convert observed data into fuzzy numbers. A possibility theory based training regime is then used to construct the fuzzy parameters and the outputs. A new entropy-based optimisation criterion is used to train the network. Two existing methods to select the optimum input parameters are modified to account for fuzzy number inputs, and compared. These methods are: Entropy-Wavelet-based Artificial Neural Network (EWANN) and Combined Neural Pathway Strength Analysis (CNPSA). Finally, an automated algorithm designed to select the optimum structure of the neural network is implemented. The overall effect of each component of training this network is to replace the traditional ad hoc network configuration methods with one based on objective criteria. Ten years of data from the Bow River in Calgary, Canada (including two major floods in 2005 and 2013) are used to calibrate and test the network. The EWANN method selected lagged peak flow as a candidate input, whereas the CNPSA method selected lagged precipitation and lagged mean daily flow as candidate inputs. Model performance metrics show that the CNPSA method had higher performance (with an efficiency of 0.76). Model output was used to assess the risk of extreme peak flows for a given day using an inverse possibility-to-probability transformation.

  11. Assessing and Evaluating Multidisciplinary Translational Teams: A Mixed Methods Approach

    PubMed Central

    Wooten, Kevin C.; Rose, Robert M.; Ostir, Glenn V.; Calhoun, William J.; Ameredes, Bill T.; Brasier, Allan R.

    2014-01-01

    A case report illustrates how multidisciplinary translational teams can be assessed using outcome, process, and developmental types of evaluation using a mixed methods approach. Types of evaluation appropriate for teams are considered in relation to relevant research questions and assessment methods. Logic models are applied to scientific projects and team development to inform choices between methods within a mixed methods design. Use of an expert panel is reviewed, culminating in consensus ratings of 11 multidisciplinary teams and a final evaluation within a team type taxonomy. Based on team maturation and scientific progress, teams were designated as: a) early in development, b) traditional, c) process focused, or d) exemplary. Lessons learned from data reduction, use of mixed methods, and use of expert panels are explored. PMID:24064432

  12. Comparison of different uncertainty techniques in urban stormwater quantity and quality modelling.

    PubMed

    Dotto, Cintia B S; Mannina, Giorgio; Kleidorfer, Manfred; Vezzaro, Luca; Henrichs, Malte; McCarthy, David T; Freni, Gabriele; Rauch, Wolfgang; Deletic, Ana

    2012-05-15

    Urban drainage models are important tools used by both practitioners and scientists in the field of stormwater management. These models are often conceptual and usually require calibration using local datasets. The quantification of the uncertainty associated with the models is a must, although it is rarely practiced. The International Working Group on Data and Models, which works under the IWA/IAHR Joint Committee on Urban Drainage, has been working on the development of a framework for defining and assessing uncertainties in the field of urban drainage modelling. A part of that work is the assessment and comparison of different techniques generally used in the uncertainty assessment of the parameters of water models. This paper compares a number of these techniques: the Generalized Likelihood Uncertainty Estimation (GLUE), the Shuffled Complex Evolution Metropolis algorithm (SCEM-UA), an approach based on multi-objective auto-calibration (a multialgorithm, genetically adaptive multi-objective method, AMALGAM) and a Bayesian approach based on a simplified Markov Chain Monte Carlo method (implemented in the software MICA). To allow a meaningful comparison among the different uncertainty techniques, common criteria have been set for the likelihood formulation, the number of simulations, and the measure of uncertainty bounds. Moreover, all the uncertainty techniques were implemented for the same case study, in which the same stormwater quantity and quality model was used alongside the same dataset. The comparison results for a well-posed rainfall/runoff model showed that the four methods provide similar probability distributions of model parameters and model prediction intervals. For the ill-posed water quality model, the differences between the results were much wider, and the paper discusses the specific advantages and disadvantages of each method. In relation to computational efficiency (i.e. the number of iterations required to generate the probability distribution of parameters), it was found that SCEM-UA and AMALGAM produce results more quickly than GLUE in terms of the required number of simulations. However, GLUE requires the least modelling skill and is easy to implement. All non-Bayesian methods have problems with the way they accept behavioural parameter sets: GLUE, SCEM-UA and AMALGAM have subjective acceptance thresholds, while MICA usually has problems with its hypothesis of normally distributed residuals. It is concluded that modellers should select the method which is most suitable for the system they are modelling (e.g. the complexity of the model's structure, including the number of parameters), their skill/knowledge level, the available information, and the purpose of their study. Copyright © 2012 Elsevier Ltd. All rights reserved.
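
    Of the four techniques compared above, GLUE is the simplest to illustrate. The sketch below is a compact toy version (one-parameter runoff model, synthetic data, a Nash-Sutcliffe-type likelihood, and an arbitrary 0.5 acceptance threshold are all assumptions), showing the subjective-threshold step the abstract criticises:

```python
# GLUE-style sketch: Monte Carlo parameter sampling, a likelihood measure,
# a subjective behavioural threshold, and percentile prediction bounds.
import numpy as np

rng = np.random.default_rng(3)
rain = rng.gamma(2.0, 2.0, size=100)            # synthetic rainfall series
q_obs = 0.6 * rain + rng.normal(0, 0.5, 100)    # "observed" runoff

def model(c):                                   # toy runoff-coefficient model
    return c * rain

samples = rng.uniform(0.0, 1.5, size=5000)      # sampling from a uniform prior
ns = np.array([1 - np.sum((q_obs - model(c))**2)
               / np.sum((q_obs - q_obs.mean())**2) for c in samples])

behavioural = samples[ns > 0.5]                 # subjective acceptance threshold
sims = np.array([model(c) for c in behavioural])
lower, upper = np.percentile(sims, [5, 95], axis=0)
print(f"{behavioural.size} behavioural sets; "
      f"90% bounds on day 0: [{lower[0]:.2f}, {upper[0]:.2f}]")
```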

  13. FINDING A METHOD FOR THE MADNESS: A COMPARATIVE ANALYSIS OF STRATEGIC DESIGN METHODOLOGIES

    DTIC Science & Technology

    2017-06-01

    FINDING A METHOD FOR THE MADNESS: A COMPARATIVE ANALYSIS OF STRATEGIC DESIGN METHODOLOGIES, by Amanda Donnelly, a thesis... This work develops a comparative model for strategic design methodologies, focusing on the primary elements of vision, time, process, communication and collaboration, and risk assessment. My analysis dissects and compares three potential design methodologies including net assessment, scenarios and...

  14. The Case for Dynamic Assessment in Speech and Language Therapy

    ERIC Educational Resources Information Center

    Hasson, Natalie

    2007-01-01

    This paper highlights the appeal of dynamic assessment (DA) for speech and language therapists (SLTs), and describes the usefulness of various DA models and methods. It describes the background to DA, and the uses to which DA has been put, by educational psychologists in the UK, and by SLTs in the USA. The research and development of methods of DA…

  15. The performance evaluation model of mining project founded on the weight optimization entropy value method

    NASA Astrophysics Data System (ADS)

    Mao, Chao; Chen, Shou

    2017-01-01

    Because the traditional entropy value method still has low evaluation accuracy when evaluating the performance of mining projects, a performance evaluation model for mining projects founded on an improved entropy value method is proposed. First, a new weight assignment model is established, founded on the compatibility matrix analysis of the analytic hierarchy process (AHP) and the entropy value method: once the compatibility matrix analysis achieves the consistency requirements, any differences between the subjective weights and the objective weights are reconciled by moderately adjusting the proportion of each; on this basis, the fuzzy evaluation matrix is constructed for performance evaluation. Simulation experiments show that, compared with the traditional entropy value and compatibility matrix analysis methods, the proposed performance evaluation model of mining projects based on the improved entropy value method has higher assessment accuracy.

  16. An assessment of the DORT method on simple scatterers using boundary element modelling.

    PubMed

    Gélat, P; Ter Haar, G; Saffari, N

    2015-05-07

    The ability to focus through ribs overcomes an important limitation of a high-intensity focused ultrasound (HIFU) system for the treatment of liver tumours. Whilst it is important to generate high enough acoustic pressures at the treatment location for tissue lesioning, it is also paramount to ensure that the resulting ultrasonic dose on the ribs remains below a specified threshold, since ribs both strongly absorb and reflect ultrasound. The DORT (décomposition de l'opérateur de retournement temporel) method has the ability to focus on and through scatterers immersed in an acoustic medium selectively without requiring prior knowledge of their location or geometry. The method requires a multi-element transducer and is implemented via a singular value decomposition of the measured matrix of inter-element transfer functions. The efficacy of a method of focusing through scatterers is often assessed by comparing the specific absorption rate (SAR) at the surface of the scatterer, and at the focal region. The SAR can be obtained from a knowledge of the acoustic pressure magnitude and the acoustic properties of the medium and scatterer. It is well known that measuring acoustic pressures with a calibrated hydrophone at or near a hard surface presents experimental challenges, potentially resulting in increased measurement uncertainties. Hence, the DORT method is usually assessed experimentally by measuring the SAR at locations on the surface of the scatterer after the latter has been removed from the acoustic medium. This is also likely to generate uncertainties in the acoustic pressure measurement. There is therefore a strong case for assessing the efficacy of the DORT method through a validated theoretical model. The boundary element method (BEM) applied to exterior acoustic scattering problems is well-suited for such an assessment. In this study, BEM was used to implement the DORT method theoretically on locally reacting spherical scatterers, and to assess its focusing capability relative to the spherical focusing case, binarised apodisation based on geometric ray tracing and the phase conjugation method.
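
    The core DORT step, a singular value decomposition of the inter-element transfer matrix, is easy to sketch numerically. The toy transfer matrix below (two point scatterers, invented geometry, free-field Green's functions) is our illustration, not the paper's boundary element model:

```python
# DORT sketch: SVD of the inter-element transfer matrix K. Each significant
# singular value is associated with one point-like scatterer, and the
# corresponding singular vector gives an excitation that focuses on it.
import numpy as np

rng = np.random.default_rng(4)
n_elem = 32

def green(dists, k=2 * np.pi / 1.5e-3):        # wavenumber ~1 MHz in water
    return np.exp(1j * k * dists) / dists

x = np.linspace(-0.02, 0.02, n_elem)           # array element positions (m)
g1 = green(np.hypot(x - 0.005, 0.06))          # scatterer 1 at (5 mm, 60 mm)
g2 = green(np.hypot(x + 0.008, 0.05))          # scatterer 2 at (-8 mm, 50 mm)

# Toy transfer matrix for two weak point scatterers: K = sum_i s_i g_i g_i^T.
K = 1.0 * np.outer(g1, g1) + 0.4 * np.outer(g2, g2)

U, s, Vh = np.linalg.svd(K)
print("normalised singular values:", np.round(s[:4] / s[0], 3))
# Applying np.conj(U[:, 0]) as element weights would steer the array onto
# the strongest scatterer without prior knowledge of its position.
```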

  17. Evaluating Downscaling Methods for Seasonal Climate Forecasts over East Africa

    NASA Technical Reports Server (NTRS)

    Roberts, J. Brent; Robertson, Franklin R.; Bosilovich, Michael; Lyon, Bradfield; Funk, Chris

    2013-01-01

    The U.S. National Multi-Model Ensemble seasonal forecasting system is providing hindcast and real-time data streams to be used in assessing and improving seasonal predictive capacity. The NASA / USAID SERVIR project, which leverages satellite and modeling-based resources for environmental decision making in developing nations, is focusing on the evaluation of NMME forecasts specifically for use in impact modeling within hub regions including East Africa, the Hindu Kush-Himalayan (HKH) region and Mesoamerica. One of the participating models in NMME is the NASA Goddard Earth Observing System (GEOS5). This work will present an intercomparison of downscaling methods using the GEOS5 seasonal forecasts of temperature and precipitation over East Africa. The current seasonal forecasting system provides monthly averaged forecast anomalies. These anomalies must be spatially downscaled and temporally disaggregated for use in application modeling (e.g. hydrology, agriculture). There are several available downscaling methodologies that can be implemented to accomplish this goal. Selected methods include both a non-homogeneous hidden Markov model and an analogue-based approach. A particular emphasis will be placed on quantifying the ability of different methods to capture the intermittency of precipitation within both the short and long rain seasons. Further, the ability to capture spatial covariances will be assessed. Both probabilistic and deterministic skill measures will be evaluated over the hindcast period.

  18. Evaluating Downscaling Methods for Seasonal Climate Forecasts over East Africa

    NASA Technical Reports Server (NTRS)

    Robertson, Franklin R.; Roberts, J. Brent; Bosilovich, Michael; Lyon, Bradfield

    2013-01-01

    The U.S. National Multi-Model Ensemble seasonal forecasting system is providing hindcast and real-time data streams to be used in assessing and improving seasonal predictive capacity. The NASA / USAID SERVIR project, which leverages satellite and modeling-based resources for environmental decision making in developing nations, is focusing on the evaluation of NMME forecasts specifically for use in impact modeling within hub regions including East Africa, the Hindu Kush-Himalayan (HKH) region and Mesoamerica. One of the participating models in NMME is the NASA Goddard Earth Observing System (GEOS5). This work will present an intercomparison of downscaling methods using the GEOS5 seasonal forecasts of temperature and precipitation over East Africa. The current seasonal forecasting system provides monthly averaged forecast anomalies. These anomalies must be spatially downscaled and temporally disaggregated for use in application modeling (e.g. hydrology, agriculture). There are several available downscaling methodologies that can be implemented to accomplish this goal. Selected methods include both a non-homogeneous hidden Markov model and an analogue-based approach. A particular emphasis will be placed on quantifying the ability of different methods to capture the intermittency of precipitation within both the short and long rain seasons. Further, the ability to capture spatial covariances will be assessed. Both probabilistic and deterministic skill measures will be evaluated over the hindcast period.
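
    The analogue-based approach mentioned in this abstract can be sketched simply: match the forecast monthly anomaly to its closest historical analogue, then reuse that month's observed daily sequence for temporal disaggregation. All data and the anomaly-to-total mapping below are invented for illustration:

```python
# Analogue downscaling sketch: nearest historical anomaly pattern supplies
# the daily rainfall sequence, rescaled to the forecast monthly total.
import numpy as np

rng = np.random.default_rng(5)
n_hist = 120                                         # months of historical record
hist_monthly = rng.normal(size=(n_hist, 4))          # monthly anomaly patterns
hist_daily = rng.gamma(0.4, 5.0, size=(n_hist, 30))  # daily rainfall (mm)

def analogue_downscale(forecast_anomaly):
    """Pick the historical analogue month with the closest anomaly pattern."""
    d = np.linalg.norm(hist_monthly - forecast_anomaly, axis=1)
    best = int(np.argmin(d))
    daily = hist_daily[best]
    # Rescale the analogue's daily totals to match the forecast monthly
    # total (an assumed mapping from anomaly to total, for illustration).
    target_total = 100.0 + 20.0 * forecast_anomaly[0]
    return daily * target_total / daily.sum()

daily_forecast = analogue_downscale(np.array([0.5, -0.2, 0.1, 0.0]))
print("wet days in downscaled month:", int((daily_forecast > 1.0).sum()))
```

    A strength of this approach, relevant to the intermittency question raised above, is that the downscaled series inherits realistic wet/dry day sequencing from the observed analogue.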

  19. Assessing Dimensionality of Noncompensatory Multidimensional Item Response Theory with Complex Structures

    ERIC Educational Resources Information Center

    Svetina, Dubravka

    2013-01-01

    The purpose of this study was to investigate the effect of complex structure on dimensionality assessment in noncompensatory multidimensional item response models using dimensionality assessment procedures based on DETECT (dimensionality evaluation to enumerate contributing traits) and NOHARM (normal ogive harmonic analysis robust method). Five…

  20. Chapter 4: Assessing the Air Pollution, Greenhouse Gas, Air Quality, and Health Benefits of Clean Energy Initiatives

    EPA Pesticide Factsheets

    Chapter 4 of Assessing the Multiple Benefits of Clean Energy helps states understand the methods, models, opportunities, and issues associated with assessing the GHG, air pollution, air quality, and human health benefits of clean energy options.

  1. Standardised Library Instruction Assessment: An Institution-Specific Approach

    ERIC Educational Resources Information Center

    Staley, Shannon M.; Branch, Nicole A.; Hewitt, Tom L.

    2010-01-01

    Introduction: We explore the use of a psychometric model for locally relevant information literacy assessment, using an online tool for standardised assessment of student learning during discipline-based library instruction sessions. Method: A quantitative approach to data collection and analysis was used, employing standardised multiple-choice…

  2. An integrated environmental modeling framework for performing quantitative microbial risk assessments

    USDA-ARS?s Scientific Manuscript database

    Standardized methods are often used to assess the likelihood of a human-health effect from exposure to a specified hazard, and inform opinions and decisions about risk management and communication. A Quantitative Microbial Risk Assessment (QMRA) is specifically adapted to detail potential human-heal...

  3. Assessing the fit of site-occupancy models

    USGS Publications Warehouse

    MacKenzie, D.I.; Bailey, L.L.

    2004-01-01

    Few species are likely to be so evident that they will always be detected at a site when present. Recently a model has been developed that enables estimation of the proportion of area occupied when the target species is not detected with certainty. Here we apply this modeling approach to data collected on terrestrial salamanders in the Plethodon glutinosus complex in the Great Smoky Mountains National Park, USA, and wish to address the question 'how accurately does the fitted model represent the data?' The goodness-of-fit of the model needs to be assessed in order to make accurate inferences. This article presents a method in which a simple Pearson chi-square statistic is calculated and a parametric bootstrap procedure is used to determine whether the observed statistic is unusually large. We found evidence that the most global model considered provides a poor fit to the data, and hence estimated an overdispersion factor to adjust model selection procedures and inflate standard errors. Two hypothetical datasets with known assumption violations are also analyzed, illustrating that the method may be used to guide researchers toward appropriate inferences. The results of a simulation study are presented to provide a broader view of the method's properties.
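
    The Pearson chi-square plus parametric bootstrap recipe above can be sketched for a deliberately simplified occupancy model (constant occupancy psi and detection p, K repeat visits, and detection counts pooled into K+1 cells). This is our toy stand-in, not the article's analysis:

```python
# Parametric-bootstrap goodness-of-fit sketch for a basic occupancy model.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import binom

rng = np.random.default_rng(6)
K, n_sites = 5, 100

def simulate(psi, p):
    z = rng.binomial(1, psi, n_sites)            # true occupancy state
    return rng.binomial(K, p * z)                # detections per site

def nll(theta, counts):
    psi, p = 1 / (1 + np.exp(-theta))            # logit-scale parameters
    pm = binom.pmf(counts, K, p)
    lik = psi * pm + (1 - psi) * (counts == 0)   # zero counts may be absences
    return -np.log(lik).sum()

def fit(counts):
    res = minimize(nll, x0=[0.0, 0.0], args=(counts,), method="Nelder-Mead")
    return 1 / (1 + np.exp(-res.x))

def pearson(counts, psi, p):
    obs = np.bincount(counts, minlength=K + 1)
    exp = n_sites * (psi * binom.pmf(np.arange(K + 1), K, p)
                     + (1 - psi) * (np.arange(K + 1) == 0))
    return ((obs - exp) ** 2 / exp).sum()

counts = simulate(0.6, 0.3)
psi_hat, p_hat = fit(counts)
t_obs = pearson(counts, psi_hat, p_hat)
t_boot = []
for _ in range(200):                             # parametric bootstrap
    c = simulate(psi_hat, p_hat)
    t_boot.append(pearson(c, *fit(c)))
print("bootstrap p-value:", np.mean(np.array(t_boot) >= t_obs))
```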

  4. An observational assessment method for aging laboratory rats

    EPA Science Inventory

    The growth of the aging population highlights the need for laboratory animal models to study the basic biological processes of aging and susceptibility to toxic chemicals and disease. Methods to evaluate health of aging animals over time are needed, especially efficient methods for...

  5. Assessment of Automated Measurement and Verification (M&V) Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Granderson, Jessica; Touzani, Samir; Custodio, Claudine

    This report documents the application of a general statistical methodology to assess the accuracy of baseline energy models, focusing on its application to Measurement and Verification (M&V) of whole-building energy savings.
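
    Two accuracy metrics widely used for M&V baseline energy models, CV(RMSE) and NMBE, are easy to sketch. The implementation below is illustrative (the report's own statistical methodology is more extensive, and the metered values are invented):

```python
# CV(RMSE) and NMBE for a baseline energy model, both in percent.
import numpy as np

def cv_rmse(measured, predicted):
    """Coefficient of variation of the root-mean-square error."""
    resid = measured - predicted
    return 100.0 * np.sqrt(np.mean(resid ** 2)) / measured.mean()

def nmbe(measured, predicted):
    """Normalised mean bias error; sign shows over/under-prediction."""
    return 100.0 * (measured - predicted).sum() / (len(measured) * measured.mean())

y = np.array([520.0, 610.0, 480.0, 655.0, 590.0])     # metered daily kWh
yhat = np.array([540.0, 600.0, 470.0, 640.0, 615.0])  # baseline model output
print(f"CV(RMSE) = {cv_rmse(y, yhat):.1f}%  NMBE = {nmbe(y, yhat):+.1f}%")
```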

  6. The lack of theoretical support for using person trade-offs in QALY-type models.

    PubMed

    Østerdal, Lars Peter

    2009-10-01

    Considerable support for the use of person trade-off methods to assess the quality-adjustment factor in quality-adjusted life years (QALY) models has been expressed in the literature. The WHO has occasionally used similar methods to assess the disability weights for calculation of disability-adjusted life years (DALYs). This paper discusses the theoretical support for the use of person trade-offs in QALY-type measurement of (changes in) population health. It argues that measures of this type based on such quality-adjustment factors almost always violate the Pareto principle, and so lack normative justification.

  7. Exploring the Implications of N Measurement and Model Choice on Using Data for Policy and Land Management Decisions

    NASA Astrophysics Data System (ADS)

    Bell, M. D.; Walker, J. T.

    2017-12-01

    Atmospheric deposition of nitrogen compounds is determined using a variety of measurement and modeling methods. These values are then used to calculate fluxes to the ecosystem, which can in turn be linked to ecological responses. But for these data to be used outside of the system in which they were developed, it is necessary to understand how the deposition estimates relate to one another. Therefore, we first identified sources of "bulk" deposition data and compared methods, reliability of data, and consistency of results to one another. Then we looked at the variation within photochemical models that are used by federal agencies to evaluate national trends. Finally, we identified some best practices for researchers to consider if their assessment is intended for use at broader scales. Empirical measurements used in this assessment include passive collection of atmospheric molecules, throughfall deposition of precipitation, snowpack measurements, and the use of biomonitors such as lichen. The three most common photochemical models used to model deposition within the United States are CMAQ, CAMx, and TDep (which uses empirical data to refine modeled values). These models all use meteorological and emission data to estimate deposition at local, regional, or national scales. We identified the range of uncertainty that exists within the types of deposition measurements and how these vary over space and time. Uncertainty is assessed by comparing deposition estimates from differing collection methods and comparing modeled estimates to empirical deposition data. Each collection method has benefits and drawbacks that need to be taken into account if the results are to be extended outside of the research area. Comparing field-measured values to modeled values highlights the importance of each in the greater goals of understanding current conditions and trends within deposition patterns in the US. While models work well at larger scales, they cannot replicate the local heterogeneity that exists at a site. Often, each researcher has a favorite method of analysis, but if the data cannot be related to other efforts then it becomes harder to apply them to broader policy considerations.

  8. Comparative Assessment of Particulate Air Pollution Exposure from Municipal Solid Waste Incinerator Emissions

    PubMed Central

    Ashworth, Danielle C.; Fuller, Gary W.; Toledano, Mireille B.; Font, Anna; Elliott, Paul; Hansell, Anna L.; de Hoogh, Kees

    2013-01-01

    Background. Research to date on health effects associated with incineration has found limited evidence of health risks, but many previous studies have been constrained by poor exposure assessment. This paper provides a comparative assessment of atmospheric dispersion modelling and distance from source (a commonly used proxy for exposure) as exposure assessment methods for pollutants released from incinerators. Methods. Distance from source and the atmospheric dispersion model ADMS-Urban were used to characterise ambient exposures to particulates from two municipal solid waste incinerators (MSWIs) in the UK. Additionally an exploration of the sensitivity of the dispersion model simulations to input parameters was performed. Results. The model output indicated extremely low ground level concentrations of PM10, with maximum concentrations of <0.01 μg/m3. Proximity and modelled PM10 concentrations for both MSWIs at postcode level were highly correlated when using continuous measures (Spearman correlation coefficients ~ 0.7) but showed poor agreement for categorical measures (deciles or quintiles, Cohen's kappa coefficients ≤ 0.5). Conclusion. To provide the most appropriate estimate of ambient exposure from MSWIs, it is essential that incinerator characteristics, magnitude of emissions, and surrounding meteorological and topographical conditions are considered. Reducing exposure misclassification is particularly important in environmental epidemiology to aid detection of low-level risks. PMID:23935644
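
    The agreement comparison reported above (high Spearman correlation for continuous measures, poor kappa for quintiles) can be reproduced in spirit on synthetic data. The dispersion relationship and noise level below are invented, not the study's model output:

```python
# Continuous vs categorical agreement between two exposure metrics:
# distance from source and a toy modelled concentration.
import numpy as np
from scipy.stats import spearmanr
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(7)
distance = rng.uniform(100, 10000, 500)                   # m to incinerator
conc = 5e3 / distance * np.exp(rng.normal(0, 0.4, 500))   # toy PM10 proxy

rho, _ = spearmanr(-distance, conc)      # negate so closer = higher exposure

def quintile(v):
    """Assign each value to a quintile category 0-4."""
    return np.digitize(v, np.quantile(v, [0.2, 0.4, 0.6, 0.8]))

kappa = cohen_kappa_score(quintile(-distance), quintile(conc))
print(f"Spearman rho = {rho:.2f}, quintile kappa = {kappa:.2f}")
```

    The point this illustrates is the one made in the conclusion above: two metrics can rank areas very similarly overall yet disagree substantially once exposure is forced into categories, which matters for detecting low-level risks.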

  9. Fast Reliability Assessing Method for Distribution Network with Distributed Renewable Energy Generation

    NASA Astrophysics Data System (ADS)

    Chen, Fan; Huang, Shaoxiong; Ding, Jinjin; Ding, Jinjin; Gao, Bo; Xie, Yuguang; Wang, Xiaoming

    2018-01-01

    This paper proposes a fast reliability assessment method for distribution grids with distributed renewable energy generation. First, the Weibull distribution and the Beta distribution are used to describe the probability distribution characteristics of wind speed and solar irradiance, respectively, and models of the wind farm, solar park and local load are built for reliability assessment. Then, based on power system production cost simulation with probability discretization and linearized power flow, an optimal power flow problem with the objective of minimizing the cost of conventional power generation is solved. Thus, a reliability assessment for the distribution grid is implemented quickly and accurately. The Loss Of Load Probability (LOLP) and Expected Energy Not Supplied (EENS) are selected as the reliability indices; a simulation of the IEEE RBTS BUS6 system in MATLAB indicates that the fast reliability assessment method calculates the reliability indices much faster than the Monte Carlo method while preserving accuracy.
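
    The indices and resource models named above can be illustrated with a plain Monte Carlo baseline, the very approach whose sampling cost the proposed method avoids. All capacities, distribution parameters, and the load model below are invented:

```python
# LOLP and EENS with Weibull wind and Beta solar models, computed by plain
# Monte Carlo for clarity (the paper's method replaces this sampling).
import numpy as np

rng = np.random.default_rng(8)
N = 50_000                                    # sampled system states

v = rng.weibull(2.0, N) * 8.0                 # wind speed, scale 8 m/s
wind_mw = np.clip((v - 3.0) / (12.0 - 3.0), 0, 1) * 2.0   # 2 MW turbine
solar_mw = rng.beta(2.0, 2.0, N) * 1.5        # 1.5 MW PV park
conv_mw = 4.0 * (rng.random(N) > 0.05)        # 4 MW unit, 5% forced outage
load_mw = rng.normal(5.5, 0.6, N)             # local load

shortfall = np.maximum(load_mw - (wind_mw + solar_mw + conv_mw), 0.0)
lolp = np.mean(shortfall > 0)                 # Loss Of Load Probability
eens = shortfall.mean() * 8760                # MWh/yr, treating draws as hours
print(f"LOLP = {lolp:.3f}, EENS = {eens:.0f} MWh/yr")
```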

  10. Are We Teaching Them Anything?: A Model for Measuring Methodology Skills in the Political Science Major

    ERIC Educational Resources Information Center

    Siver, Christi; Greenfest, Seth W.; Haeg, G. Claire

    2016-01-01

    While the literature emphasizes the importance of teaching political science students methods skills, there currently exists little guidance for how to assess student learning over the course of their time in the major. To address this gap, we develop a model set of assessment tools that may be adopted and adapted by political science departments…

  11. Assessing crown fire potential by linking models of surface and crown fire behavior

    Treesearch

    Joe H. Scott; Elizabeth D. Reinhardt

    2001-01-01

    Fire managers are increasingly concerned about the threat of crown fires, yet only now are quantitative methods for assessing crown fire hazard being developed. Links among existing mathematical models of fire behavior are used to develop two indices of crown fire hazard: the Torching Index and the Crowning Index. These indices can be used to ordinate different forest...

  12. High-resolution spatial modeling of daily weather elements for a catchment in the Oregon Cascade Mountains, United States

    Treesearch

    Christopher Daly; Jonathan W. Smith; Joseph I. Smith; Robert B. McKane

    2007-01-01

    High-quality daily meteorological data at high spatial resolution are essential for a variety of hydrologic and ecological modeling applications that support environmental risk assessments and decisionmaking. This paper describes the development, application, and assessment of methods to construct daily high resolution (~50-m cell size) meteorological grids for the...

  13. Cost-effective water quality assessment through the integration of monitoring data and modeling results

    NASA Astrophysics Data System (ADS)

    Lobuglio, Joseph N.; Characklis, Gregory W.; Serre, Marc L.

    2007-03-01

    Sparse monitoring data and error inherent in water quality models make the identification of waters not meeting regulatory standards uncertain. Additional monitoring can be implemented to reduce this uncertainty, but it is often expensive. These costs are currently a major concern, since developing total maximum daily loads, as mandated by the Clean Water Act, will require assessing tens of thousands of water bodies across the United States. This work uses the Bayesian maximum entropy (BME) method of modern geostatistics to integrate water quality monitoring data together with model predictions to provide improved estimates of water quality in a cost-effective manner. This information includes estimates of uncertainty and can be used to aid probabilistic decisions concerning the status of a water body (i.e., impaired or not impaired) and the level of monitoring needed to characterize the water for regulatory purposes. This approach is applied to the Catawba River reservoir system in western North Carolina as a means of estimating seasonal chlorophyll a concentration. Mean concentration and confidence intervals for chlorophyll a are estimated for 66 reservoir segments over an 11-year period (726 values) based on 219 measured seasonal averages and 54 model predictions. Although the model predictions had a high degree of uncertainty, integration of modeling results via BME methods reduced the uncertainty associated with chlorophyll estimates compared with estimates made solely with information from monitoring efforts. Probabilistic predictions of future chlorophyll levels on one reservoir are used to illustrate the cost savings that can be achieved by less extensive and rigorous monitoring methods within the BME framework. While BME methods have been applied in several environmental contexts, employing these methods as a means of integrating monitoring and modeling results, and applying this approach to the assessment of surface water monitoring networks, represent unexplored areas of research.
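
    The data-model integration idea above can be caricatured with inverse-variance weighting: the more certain source dominates the fused estimate, and the fused uncertainty is smaller than either input's. This is a much-simplified stand-in for BME, which is a far more general geostatistical framework; all numbers are invented:

```python
# Inverse-variance fusion of a monitoring estimate and a model prediction
# of chlorophyll-a (a toy stand-in for BME-style data/model integration).
import numpy as np

chl_monitor, var_monitor = 8.2, 1.0 ** 2    # ug/L from sparse sampling
chl_model, var_model = 11.0, 3.0 ** 2       # ug/L from watershed model

w = (1 / var_monitor) / (1 / var_monitor + 1 / var_model)
chl_fused = w * chl_monitor + (1 - w) * chl_model
var_fused = 1.0 / (1 / var_monitor + 1 / var_model)
print(f"fused estimate: {chl_fused:.1f} ug/L "
      f"(sd {np.sqrt(var_fused):.2f}, vs monitoring sd 1.00)")
```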

  14. Risk management modeling and its application in maritime safety

    NASA Astrophysics Data System (ADS)

    Qin, Ting-Rong; Chen, Wei-Jiong; Zeng, Xiang-Kun

    2008-12-01

    Quantified risk assessment (QRA) requires the mathematical formalization of risk theory. However, attention has been paid almost exclusively to applications of assessment methods, which has led to neglect of research into fundamental theories, such as the relationships among risk, safety, and danger. To address this problem, fundamental theoretical relationships of risk and risk management were first analyzed mathematically and illustrated with charts. Second, man-machine-environment-management (MMEM) theory was introduced into risk theory to analyze some properties of risk. On this basis, a three-dimensional model of risk management was established that includes a goal dimension, a management dimension, and an operation dimension. This goal-management-operation (GMO) model was explained, with emphasis on the risk flowchart (operation dimension), which lays the groundwork for further study of risk management and of qualitative and quantitative assessment. Next, the relationship between Formal Safety Assessment (FSA) and risk management was examined, revealing that the FSA method, which the International Maritime Organization (IMO) is actively promoting, derives from risk management theory. Finally, conclusions were drawn about how to apply this risk management method efficiently and conveniently to concrete fields, as well as about areas where further research is required.

  15. An Evaluation of the Decision-Making Capacity Assessment Model

    PubMed Central

    Brémault-Phillips, Suzette C.; Parmar, Jasneet; Friesen, Steven; Rogers, Laura G.; Pike, Ashley; Sluggett, Bryan

    2016-01-01

    Background The Decision-Making Capacity Assessment (DMCA) Model includes a best-practice process and tools to assess DMCA, and implementation strategies at the organizational and assessor levels to support provision of DMCAs across the care continuum. A Developmental Evaluation of the DMCA Model was conducted. Methods A mixed methods approach was used. Survey (N = 126) and focus group (N = 49) data were collected from practitioners utilizing the Model. Results Strengths of the Model include its best-practice and implementation approach, applicability to independent practitioners and inter-professional teams, focus on training/mentoring to enhance knowledge/skills, and provision of tools/processes. Post-training, participants agreed that they followed the Model’s guiding principles (90%), used problem-solving (92%), understood discipline-specific roles (87%), were confident in their knowledge of DMCAs (75%) and pertinent legislation (72%), accessed consultative services (88%), and received management support (64%). Model implementation is impeded when role clarity, physician engagement, inter-professional buy-in, accountability, dedicated resources, information sharing systems, and remuneration are lacking. Dedicated resources, job descriptions inclusive of DMCAs, ongoing education/mentoring supports, access to consultative services, and appropriate remuneration would support implementation. Conclusions The DMCA Model offers practitioners, inter-professional teams, and organizations a best-practice and implementation approach to DMCAs. Addressing barriers and further contextualizing the Model would be warranted. PMID:27729947

  16. An integrated approach coupling physically based models and probabilistic method to assess quantitatively landslide susceptibility at different scale: application to different geomorphological environments

    NASA Astrophysics Data System (ADS)

    Vandromme, Rosalie; Thiéry, Yannick; Sedan, Olivier; Bernardie, Séverine

    2016-04-01

    Landslide hazard assessment is the estimation of a target area where landslides of a particular type, volume, runout and intensity may occur within a given period. The first step in analyzing landslide hazard consists in assessing the spatial and temporal failure probability (when the information is available, i.e., susceptibility assessment). Two types of approach are generally recommended to achieve this goal: (i) qualitative approaches (i.e., inventory-based and knowledge-driven methods) and (ii) quantitative approaches (i.e., data-driven methods or deterministic physically based methods). Among quantitative approaches, deterministic physically based methods (PBM) are generally used at local and/or site-specific scales (1:5,000-1:25,000 and >1:5,000, respectively). The main advantage of these methods is the calculation of the probability of failure (safety factor) under specific environmental conditions, and for some models it is possible to integrate land-use and climatic change. Their major drawbacks are the large amounts of reliable and detailed data required (especially material types, their thickness, and the heterogeneity of geotechnical parameters over a large area) and the fact that only shallow landslides are taken into account; this is why they are often used at site-specific scales (>1:5,000). Thus, to take into account (i) material heterogeneity, (ii) spatial variation of physical parameters, and (iii) different landslide types, the French Geological Survey (BRGM) has developed a physically based model implemented in a GIS environment. This PBM couples a global hydrological model (GARDENIA®), including a transient unsaturated/saturated hydrological component, with a physically based model computing the stability of slopes (ALICE®, Assessment of Landslides Induced by Climatic Events) based on the Morgenstern-Price method for any slip surface. The variability of mechanical parameters is handled by a Monte Carlo approach. The probability of obtaining a safety factor below 1 represents the probability of occurrence of a landslide for a given triggering event, and the dispersion of the distribution gives the uncertainty of the result. Finally, a map is created displaying a probability of occurrence for each computing cell of the studied area. To take land-use change into account, a complementary module integrating the effects of vegetation on soil properties has recently been developed. In recent years, the model has been applied at different scales in different geomorphological environments: (i) at regional scale (1:50,000-1:25,000) in the French West Indies and French Polynesian islands; (ii) at local scale (i.e., 1:10,000) for two complex mountainous areas; and (iii) at site-specific scale (1:2,000) for one landslide. For each study the 3D geotechnical model was adapted. The different studies have made it possible (i) to discuss the different factors included in the model, especially the initial 3D geotechnical models; (ii) to pinpoint the locations of probable failure under different hydrological scenarios; and (iii) to test the effects of climatic change and land use on slopes for two cases. In that way, future changes in temperature, precipitation and vegetation cover can be analyzed, making it possible to address the impacts of global change on landslides. The results show that an integrated approach can yield reliable information about future slope failures at different scales of work and for different scenarios. The final information about landslide susceptibility (i.e., probability of failure) can be integrated into landslide hazard assessment and could be an essential information source for future land planning. As performed in the ANR project SAMCO (Society Adaptation for coping with Mountain risks in a global change COntext), this analysis constitutes a first step in the risk assessment chain for different climate and economic development scenarios, used to evaluate the resilience of mountainous areas.
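
    The Monte Carlo step described above (sampling mechanical parameters and reporting the fraction of samples with a safety factor below 1) can be sketched as follows. The dry infinite-slope formula stands in for the Morgenstern-Price analysis performed by ALICE®, and all parameter values and distributions are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000

    # Hypothetical slope geometry and soil properties (illustrative values)
    beta = np.radians(35.0)      # slope angle
    gamma = 19.0                 # unit weight of soil (kN/m^3)
    z = 2.0                      # depth of the potential slip surface (m)

    # Variability of mechanical parameters handled by Monte Carlo sampling,
    # as in the ALICE approach (these distributions are assumptions)
    cohesion = rng.normal(8.0, 2.0, n)              # effective cohesion c' (kPa)
    phi = np.radians(rng.normal(30.0, 3.0, n))      # friction angle (deg -> rad)

    # Dry infinite-slope factor of safety, a simple stand-in for the
    # Morgenstern-Price stability computation used by ALICE
    fs = cohesion / (gamma * z * np.sin(beta) * np.cos(beta)) + np.tan(phi) / np.tan(beta)

    p_failure = np.mean(fs < 1.0)
    print(f"P(FS < 1) = {p_failure:.3f}")   # probability of occurrence per cell
    ```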

  17. Exposure assessment models for elemental components of particulate matter in an urban environment: A comparison of regression and random forest approaches

    NASA Astrophysics Data System (ADS)

    Brokamp, Cole; Jandarov, Roman; Rao, M. B.; LeMasters, Grace; Ryan, Patrick

    2017-02-01

    Exposure assessment for elemental components of particulate matter (PM) using land use modeling is a complex problem due to the high spatial and temporal variations in pollutant concentrations at the local scale. Land use regression (LUR) models may fail to capture complex interactions and non-linear relationships between pollutant concentrations and land use variables. The increasing availability of big spatial data and machine learning methods present an opportunity for improvement in PM exposure assessment models. In this manuscript, our objective was to develop a novel land use random forest (LURF) model and compare its accuracy and precision to a LUR model for elemental components of PM in the urban city of Cincinnati, Ohio. PM smaller than 2.5 μm (PM2.5) and eleven elemental components were measured at 24 sampling stations from the Cincinnati Childhood Allergy and Air Pollution Study (CCAAPS). Over 50 different predictors associated with transportation, physical features, community socioeconomic characteristics, greenspace, land cover, and emission point sources were used to construct LUR and LURF models. Cross validation was used to quantify and compare model performance. LURF and LUR models were created for aluminum (Al), copper (Cu), iron (Fe), potassium (K), manganese (Mn), nickel (Ni), lead (Pb), sulfur (S), silicon (Si), vanadium (V), zinc (Zn), and total PM2.5 in the CCAAPS study area. LURF utilized a more diverse and greater number of predictors than LUR, and LURF models for Al, K, Mn, Pb, Si, Zn, TRAP, and PM2.5 all showed a decrease in fractional predictive error of at least 5% compared to their LUR models. LURF models for Al, Cu, Fe, K, Mn, Pb, Si, Zn, TRAP, and PM2.5 all had a cross-validated fractional predictive error less than 30%. Furthermore, LUR models showed a differential exposure assessment bias and had a higher prediction error variance. Random forest and other machine learning methods may provide more accurate exposure assessment.
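
    The contrast drawn above between a linear land use regression and a land use random forest can be illustrated with a minimal sketch on synthetic data. The predictors, the non-linear response surface, and the cross-validation setup below are assumptions for illustration, not the CCAAPS data or the authors' exact pipeline.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n = 24  # number of sampling stations, as in CCAAPS

    # Synthetic land-use predictors: e.g. traffic density, industrial area, greenspace
    X = rng.uniform(0, 1, size=(n, 3))
    # Non-linear, interacting "true" concentration surface plus noise (assumption)
    y = 2.0 * X[:, 0] ** 2 + 1.5 * X[:, 0] * X[:, 1] - X[:, 2] + rng.normal(0, 0.1, n)

    lur = LinearRegression()                                        # land use regression
    lurf = RandomForestRegressor(n_estimators=500, random_state=0)  # land use random forest

    for name, model in [("LUR", lur), ("LURF", lurf)]:
        # Cross-validated error mirrors the paper's model comparison
        scores = cross_val_score(model, X, y, cv=8, scoring="neg_mean_absolute_error")
        print(f"{name}: CV mean absolute error = {-scores.mean():.3f}")
    ```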

  18. Exposure assessment models for elemental components of particulate matter in an urban environment: A comparison of regression and random forest approaches.

    PubMed

    Brokamp, Cole; Jandarov, Roman; Rao, M B; LeMasters, Grace; Ryan, Patrick

    2017-02-01

    Exposure assessment for elemental components of particulate matter (PM) using land use modeling is a complex problem due to the high spatial and temporal variations in pollutant concentrations at the local scale. Land use regression (LUR) models may fail to capture complex interactions and non-linear relationships between pollutant concentrations and land use variables. The increasing availability of big spatial data and machine learning methods present an opportunity for improvement in PM exposure assessment models. In this manuscript, our objective was to develop a novel land use random forest (LURF) model and compare its accuracy and precision to a LUR model for elemental components of PM in the urban city of Cincinnati, Ohio. PM smaller than 2.5 μm (PM2.5) and eleven elemental components were measured at 24 sampling stations from the Cincinnati Childhood Allergy and Air Pollution Study (CCAAPS). Over 50 different predictors associated with transportation, physical features, community socioeconomic characteristics, greenspace, land cover, and emission point sources were used to construct LUR and LURF models. Cross validation was used to quantify and compare model performance. LURF and LUR models were created for aluminum (Al), copper (Cu), iron (Fe), potassium (K), manganese (Mn), nickel (Ni), lead (Pb), sulfur (S), silicon (Si), vanadium (V), zinc (Zn), and total PM2.5 in the CCAAPS study area. LURF utilized a more diverse and greater number of predictors than LUR, and LURF models for Al, K, Mn, Pb, Si, Zn, TRAP, and PM2.5 all showed a decrease in fractional predictive error of at least 5% compared to their LUR models. LURF models for Al, Cu, Fe, K, Mn, Pb, Si, Zn, TRAP, and PM2.5 all had a cross-validated fractional predictive error less than 30%. Furthermore, LUR models showed a differential exposure assessment bias and had a higher prediction error variance. Random forest and other machine learning methods may provide more accurate exposure assessment.

  19. Exposure assessment models for elemental components of particulate matter in an urban environment: A comparison of regression and random forest approaches

    PubMed Central

    Brokamp, Cole; Jandarov, Roman; Rao, M.B.; LeMasters, Grace; Ryan, Patrick

    2017-01-01

    Exposure assessment for elemental components of particulate matter (PM) using land use modeling is a complex problem due to the high spatial and temporal variations in pollutant concentrations at the local scale. Land use regression (LUR) models may fail to capture complex interactions and non-linear relationships between pollutant concentrations and land use variables. The increasing availability of big spatial data and machine learning methods present an opportunity for improvement in PM exposure assessment models. In this manuscript, our objective was to develop a novel land use random forest (LURF) model and compare its accuracy and precision to a LUR model for elemental components of PM in the urban city of Cincinnati, Ohio. PM smaller than 2.5 μm (PM2.5) and eleven elemental components were measured at 24 sampling stations from the Cincinnati Childhood Allergy and Air Pollution Study (CCAAPS). Over 50 different predictors associated with transportation, physical features, community socioeconomic characteristics, greenspace, land cover, and emission point sources were used to construct LUR and LURF models. Cross validation was used to quantify and compare model performance. LURF and LUR models were created for aluminum (Al), copper (Cu), iron (Fe), potassium (K), manganese (Mn), nickel (Ni), lead (Pb), sulfur (S), silicon (Si), vanadium (V), zinc (Zn), and total PM2.5 in the CCAAPS study area. LURF utilized a more diverse and greater number of predictors than LUR, and LURF models for Al, K, Mn, Pb, Si, Zn, TRAP, and PM2.5 all showed a decrease in fractional predictive error of at least 5% compared to their LUR models. LURF models for Al, Cu, Fe, K, Mn, Pb, Si, Zn, TRAP, and PM2.5 all had a cross-validated fractional predictive error less than 30%. Furthermore, LUR models showed a differential exposure assessment bias and had a higher prediction error variance. Random forest and other machine learning methods may provide more accurate exposure assessment. PMID:28959135

  20. Teaching genetics using hands-on models, problem solving, and inquiry-based methods

    NASA Astrophysics Data System (ADS)

    Hoppe, Stephanie Ann

    Teaching genetics can be challenging because of the difficulty of the content and the misconceptions students might hold. This thesis focused on using hands-on model activities, problem solving, and inquiry-based teaching/learning methods to increase student understanding of genetics in an introductory biology class. Various activities using these three methods were implemented in the classes to address misconceptions and increase student learning of the difficult concepts. The activities were shown to be successful based on pre-post assessment score comparison. The students were assessed on the subjects of inheritance patterns, meiosis, and protein synthesis and demonstrated growth in all of these areas. Hands-on models, problem solving, and inquiry-based activities were found to be more successful for learning genetics concepts, and students were more engaged, than with traditional lecture formats.

  1. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system components, part 2

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The technical effort and computer code enhancements performed during the sixth year of the Probabilistic Structural Analysis Methods program are summarized. Various capabilities are described to probabilistically combine structural response and structural resistance to compute component reliability. A library of structural resistance models was implemented in the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) code that includes fatigue, fracture, creep, multi-factor interaction, and other important effects. In addition, a user interface was developed for user-defined resistance models. An accurate and efficient reliability method was developed and successfully implemented in the NESSUS code to compute component reliability based on user-selected response and resistance models. A risk module was developed to compute component risk with respect to cost, performance, or user-defined criteria. The new component risk assessment capabilities were validated and demonstrated using several examples. Various supporting methodologies were also developed in support of component risk assessment.
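
    The central computation named here, combining a structural response distribution with a resistance model to obtain component reliability, reduces in the simplest case to stress-strength interference. Below is a minimal sketch under independent-normal assumptions; it is not the NESSUS implementation, whose response and resistance models are far richer, and the component values are hypothetical.

    ```python
    from math import sqrt
    from statistics import NormalDist

    def failure_probability(mu_resist, sd_resist, mu_response, sd_response):
        """P(resistance < response) for independent normal resistance and response.

        A minimal stand-in for the step that combines a structural response
        distribution with a resistance model to yield component reliability.
        """
        beta = (mu_resist - mu_response) / sqrt(sd_resist**2 + sd_response**2)
        return NormalDist().cdf(-beta)   # beta is the reliability (safety) index

    # Hypothetical component: resistance 500 +/- 40 MPa, response 350 +/- 30 MPa
    pf = failure_probability(500, 40, 350, 30)
    print(f"failure probability = {pf:.2e}, reliability = {1 - pf:.6f}")
    ```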

  2. Assessment of risk due to the use of carbon fiber composites in commercial and general aviation

    NASA Technical Reports Server (NTRS)

    Fiksel, J.; Rosenfield, D.; Kalelkar, A.

    1980-01-01

    The development of a national risk profile for the total annual aircraft losses due to carbon fiber composite (CFC) usage through 1993 is discussed. The profile was developed using separate simulation methods for commercial and general aviation aircraft. A Monte Carlo method which was used to assess the risk in commercial aircraft is described. The method projects the potential usage of CFC through 1993, investigates the incidence of commercial aircraft fires, models the potential release and dispersion of carbon fibers from a fire, and estimates potential economic losses due to CFC damaging electronic equipment. The simulation model for the general aviation aircraft is described. The model emphasizes variations in facility locations and release conditions, estimates distribution of CFC released in general aviation aircraft accidents, and tabulates the failure probabilities and aggregate economic losses in the accidents.

  3. Using random forest for the risk assessment of coal-floor water inrush in Panjiayao Coal Mine, northern China

    NASA Astrophysics Data System (ADS)

    Zhao, Dekang; Wu, Qiang; Cui, Fangpeng; Xu, Hua; Zeng, Yifan; Cao, Yufei; Du, Yuanze

    2018-04-01

    Coal-floor water-inrush incidents account for a large proportion of coal mine disasters in northern China, and accurate risk assessment is crucial for safe coal production. A novel and promising assessment model for water inrush is proposed based on random forest (RF), a powerful intelligent machine-learning algorithm. RF has considerable advantages, including high classification accuracy and the capability to evaluate the importance of variables; in particular, it is robust in dealing with the complicated and non-linear problems inherent in risk assessment. In this study, the proposed model is applied to Panjiayao Coal Mine, northern China. Eight factors were selected as evaluation indices according to systematic analysis of the geological conditions and a field survey of the study area. Risk assessment maps were generated based on RF, and the probabilistic neural network (PNN) model was also used for risk assessment as a comparison. The results demonstrate that the two methods are consistent in the risk assessment of water inrush at the mine, and that RF performs better than PNN, with an overall accuracy 6.67% higher. It is concluded that RF is more practicable than PNN for assessing water-inrush risk. The presented method will be helpful in avoiding water inrush and can also be extended to various engineering applications.

  4. Physically-Based Assessment of Intrinsic Groundwater Resource Vulnerability in AN Urban Catchment

    NASA Astrophysics Data System (ADS)

    Graf, T.; Therrien, R.; Lemieux, J.; Molson, J. W.

    2013-12-01

    Several methods exist to assess the intrinsic vulnerability of groundwater (re)sources for the purpose of sustainable groundwater management and protection. However, many of these methods are empirical and limited in their application to specific types of hydrogeological systems. Recent studies suggest that a physically based approach could be better suited to provide a general, conceptual and operational basis for groundwater vulnerability assessment. A novel method for the physically based assessment of intrinsic aquifer vulnerability is currently under development and is being tested to explore the potential of an integrated modelling approach, combining groundwater travel time probability and future scenario modelling in conjunction with the fully integrated HydroGeoSphere model. To determine the intrinsic groundwater resource vulnerability, a fully coupled 2D surface water and 3D variably saturated groundwater flow model, in conjunction with a 3D geological model (GoCAD), has been developed for a case study of the Rivière Saint-Charles (Québec, Canada) regional-scale urban watershed. The model has been calibrated under transient flow conditions for the hydrogeological, variably saturated subsurface system, coupled with the overland flow zone, taking into account monthly recharge variation and evapotranspiration. To better determine the intrinsic groundwater vulnerability, two independent approaches are considered and subsequently combined in a simple, holistic multi-criteria decision analysis. Most data for the model come from an extensive hydrogeological database for the watershed, and data gaps have been filled through field tests and literature review. The subsurface is composed of nine hydrofacies, ranging from unconsolidated fluvioglacial sediments to low-permeability bedrock. The overland flow zone is divided into five major zones (urban, rural, forest, river and lake) to simulate differences in land use, whereas the unsaturated zone is represented via the model-integrated van Genuchten function. The model setup and optimisation turned out to be the most challenging part because of the non-trivial nature (due to the highly non-linear PDEs) of the coupling procedure between the surface and subsurface domains, while keeping realistic parameter ranges and obtaining realistic simulation results in both domains. The model calibration is based on water-level monitoring as well as daily mean river discharge measured at different gauge stations within the catchment. It is intended to create multiple model outcomes for the numerical modelling of groundwater vulnerability, to take into account uncertainty in the model input data. The next step of the overall vulnerability assessment consists in modelling future vulnerability scenarios, applying realistic changes to the model and using PEST with SENSAN for subsequent sensitivity analysis. The PEST model could also potentially be used for a model recalibration as a function of model parameter sensitivity (simple perturbation method). Preliminary results show a good fit between observed and simulated water levels and hydrographs; however, the simulated water depth in the overland flow domain, as well as the simulated saturation distribution in the porous media domain, still show room for improvement of the numerical model.
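
    The record represents the unsaturated zone with the van Genuchten function; the standard van Genuchten (1980) retention curve is sketched below. The parameter values, chosen as roughly plausible for a sandy hydrofacies, are assumptions for illustration.

    ```python
    import numpy as np

    def van_genuchten_theta(h, alpha, n, theta_r, theta_s):
        """Volumetric water content from pressure head h (m; negative when
        unsaturated) using the standard van Genuchten (1980) retention model."""
        m = 1.0 - 1.0 / n
        se = np.where(h < 0, (1.0 + (alpha * np.abs(h)) ** n) ** (-m), 1.0)
        return theta_r + se * (theta_s - theta_r)

    # Illustrative parameters for a sandy hydrofacies (assumed values)
    heads = np.array([0.0, -0.1, -0.5, -1.0, -5.0])
    theta = van_genuchten_theta(heads, alpha=3.5, n=2.0, theta_r=0.05, theta_s=0.40)
    print(np.round(theta, 3))   # water content declines as the head becomes more negative
    ```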

  5. Constructing a Grounded Theory of E-Learning Assessment

    ERIC Educational Resources Information Center

    Alonso-Díaz, Laura; Yuste-Tosina, Rocío

    2015-01-01

    This study traces the development of a grounded theory of assessment in e-learning environments, a field in need of research to establish the parameters of an assessment that is both reliable and worthy of higher learning accreditation. Using grounded theory as a research method, we studied an e-assessment model that does not require physical…

  6. Summarization as the base for text assessment

    NASA Astrophysics Data System (ADS)

    Karanikolas, Nikitas N.

    2015-02-01

    We present a model that applies shallow text summarization as a cheap (in terms of resources needed) process for automatic (machine-based) free-text answer assessment (AA). Evaluation of the proposed method leads to the inference that conventional assessment (CA, human assessment of free-text answers) has no obvious mechanical replacement. However, this remains a research challenge.
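
    One cheap mechanical scoring rule in the spirit of this record, though not necessarily the author's exact model, is to compare a student's free-text answer against a shallow summary of the reference material via bag-of-words cosine similarity. The texts and the scoring rule below are illustrative assumptions.

    ```python
    import re
    from collections import Counter
    from math import sqrt

    def bow_cosine(text_a, text_b):
        """Cosine similarity of bag-of-words vectors: a minimal stand-in for
        comparing a free-text answer against a shallow reference summary."""
        tokenize = lambda s: Counter(re.findall(r"[a-z]+", s.lower()))
        a, b = tokenize(text_a), tokenize(text_b)
        dot = sum(a[w] * b[w] for w in a)
        norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
        return dot / norm if norm else 0.0

    reference_summary = "photosynthesis converts light energy into chemical energy in plants"
    student_answer = "plants turn light energy into chemical energy during photosynthesis"
    print(f"answer score = {bow_cosine(student_answer, reference_summary):.2f}")
    ```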

  7. Assessment and Continuous Quality Improvement: A North American Case Study.

    ERIC Educational Resources Information Center

    Tait, Jo; Knight, Peter

    1995-01-01

    Examines a five-stage assessment model used at James Madison University (Virginia) to assess student learning and to guide policy. Includes identification of program objectives, selection or design of methods that measure those outcomes, analysis of assessment data, application of data for decision making, and use of data to bid for state funding.…

  8. On the Latent Regression Model of Item Response Theory. Research Report. ETS RR-07-12

    ERIC Educational Resources Information Center

    Antal, Tamás

    2007-01-01

    A full account of the latent regression model for the National Assessment of Educational Progress is given. The treatment includes derivation of the EM algorithm, the Newton-Raphson method, and the asymptotic standard errors. The paper also features the use of the adaptive Gauss-Hermite numerical integration method as a basic tool to evaluate…
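
    The Gauss-Hermite integration mentioned here evaluates expectations over the latent ability distribution. A minimal (non-adaptive) sketch follows; the 2PL item parameters are hypothetical.

    ```python
    import numpy as np
    from numpy.polynomial.hermite import hermgauss

    def normal_expectation(f, n_points=20):
        """E[f(theta)] for theta ~ N(0, 1) via Gauss-Hermite quadrature.

        The change of variables theta = sqrt(2) * x maps the physicists'
        Hermite weight exp(-x^2) onto the standard normal density.
        """
        nodes, weights = hermgauss(n_points)
        return np.sum(weights * f(np.sqrt(2.0) * nodes)) / np.sqrt(np.pi)

    # Example: marginal probability of a correct response for a 2PL item,
    # integrated over the latent ability distribution (a, b are hypothetical)
    a, b = 1.2, 0.5
    p_correct = normal_expectation(lambda th: 1.0 / (1.0 + np.exp(-a * (th - b))))
    print(f"marginal probability of a correct response = {p_correct:.4f}")
    ```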

  9. Assessing the Reliability of Curriculum-Based Measurement: An Application of Latent Growth Modeling

    ERIC Educational Resources Information Center

    Yeo, Seungsoo; Kim, Dong-Il; Branum-Martin, Lee; Wayman, Miya Miura; Espin, Christine A.

    2012-01-01

    The purpose of this study was to demonstrate the use of Latent Growth Modeling (LGM) as a method for estimating the reliability of Curriculum-Based Measurement (CBM) progress-monitoring data. The LGM approach permits the error associated with each measure to differ at each time point, thus providing an alternative method for examining the…

  10. Methods and Models of the Hanford Internal Dosimetry Program, PNNL-MA-860

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carbaugh, Eugene H.; Bihl, Donald E.; Maclellan, Jay A.

    2003-01-03

    This manual describes the technical basis for the design of the routine radiobioassay monitoring program and assessments of internal dose. Its purpose is to provide a historical record of the methods, models, and assumptions used for internal dosimetry at Hanford, and serve as a technical reference for radiation protection and dosimetry staff.

  11. Statistical power of intervention analyses: simulation and empirical application to treated lumber prices

    Treesearch

    Jeffrey P. Prestemon

    2009-01-01

    Timber product markets are subject to large shocks deriving from natural disturbances and policy shifts. Statistical modeling of shocks is often done to assess their economic importance. In this article, I simulate the statistical power of univariate and bivariate methods of shock detection using time series intervention models. Simulations show that bivariate methods...

  12. Analytical resource assessment method for continuous (unconventional) oil and gas accumulations - The "ACCESS" Method

    USGS Publications Warehouse

    Crovelli, Robert A.; revised by Charpentier, Ronald R.

    2012-01-01

    The U.S. Geological Survey (USGS) periodically assesses petroleum resources of areas within the United States and the world. The purpose of this report is to explain the development of an analytic probabilistic method and spreadsheet software system called the Analytic Cell-Based Continuous Energy Spreadsheet System (ACCESS). The ACCESS method is based upon mathematical equations derived from probability theory. The ACCESS spreadsheet can be used to calculate estimates of the undeveloped oil, gas, and NGL (natural gas liquids) resources in a continuous-type assessment unit. An assessment unit is a mappable volume of rock in a total petroleum system. In this report, the geologic assessment model is defined first, the analytic probabilistic method is described second, and the ACCESS spreadsheet is described third. In this revised version of Open-File Report 00-044, the text has been updated to reflect modifications that were made to the ACCESS program. Two versions of the program are added as appendixes.

  13. An airport community noise-impact assessment model

    NASA Technical Reports Server (NTRS)

    Deloach, R.

    1980-01-01

    A computer model was developed to assess the noise impact of an airport on the community it serves. Assessments are made using the Fractional Impact Method, by which a single number describes the community aircraft noise environment in terms of exposed population and multiple-event noise level. The model comprises three elements: a conventional noise footprint model, a site-specific population distribution model, and a dose-response transfer function. The footprint model provides the noise distribution for a given aircraft operating scenario. This is combined with the site-specific population distribution, obtained from a national census data base, to yield the number of residents exposed to a given level of noise. The dose-response relationship relates noise exposure levels to the percentage of individuals highly annoyed by those levels.
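
    The three elements combine into a single impact number as a population-weighted sum of a dose-response function over exposure levels. The sketch below illustrates this Fractional Impact arithmetic; the logistic dose-response curve is patterned on published FICON-style annoyance curves and the exposure bands are hypothetical, so neither is the report's exact function.

    ```python
    import numpy as np

    def fraction_highly_annoyed(dnl):
        """Dose-response transfer function: fraction of residents highly annoyed
        at a given day-night average sound level (dB). The logistic form is an
        assumption patterned on published FICON-style curves."""
        return 1.0 / (1.0 + np.exp(11.13 - 0.141 * dnl))

    # Hypothetical footprint: residents exposed in each DNL band around an airport
    dnl_levels = np.array([55.0, 60.0, 65.0, 70.0, 75.0])
    population = np.array([40_000, 15_000, 6_000, 1_500, 300])

    # Fractional Impact Method: weight each exposed resident by the
    # dose-response fraction and sum to a single community impact number
    impact = np.sum(population * fraction_highly_annoyed(dnl_levels))
    print(f"equivalent number of highly annoyed residents = {impact:,.0f}")
    ```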

  14. REGIONAL VULNERABILITY ASSESSMENT (REVA) IMPROVING ENVIRONMENTAL DECISION MAKING THROUGH CLIENT PARTNERSHIPS

    EPA Science Inventory

    The Regional Vulnerability Assessment (ReVA) Program is an applied research program that focuses on using spatial information and model results to support environmental decision-making at regional down to local scales. ReVA has developed analysis and assessment methods to...

  15. Students Explaining Science—Assessment of Science Communication Competence

    NASA Astrophysics Data System (ADS)

    Kulgemeyer, Christoph; Schecker, Horst

    2013-12-01

    Science communication competence (SCC) is an important educational goal in the school science curricula of several countries. However, there is a lack of research about the structure and the assessment of SCC. This paper specifies the theoretical framework of SCC by a competence model. We developed a qualitative assessment method for SCC that is based on an expert-novice dialog: an older student (explainer, expert) explains a physics phenomenon to a younger peer (addressee, novice) in a controlled test setting. The explanations are video-recorded and analysed by qualitative content analysis. The method was applied in a study with 46 secondary school students as explainers. Our aims were (a) to evaluate whether our model covers the relevant features of SCC, (b) to validate the assessment method and (c) to find characteristics of addressee-adequate explanations. A performance index was calculated to quantify the explainers' levels of competence on an ordinal scale. We present qualitative and quantitative evidence that the index is adequate for assessment purposes. It correlates with results from a written SCC test and a perspective taking test (convergent validity). Addressee-adequate explanations can be characterized by use of graphical representations and deliberate switches between scientific and everyday language.

  16. Analytical Solutions for Rumor Spreading Dynamical Model in a Social Network

    NASA Astrophysics Data System (ADS)

    Fallahpour, R.; Chakouvari, S.; Askari, H.

    2015-03-01

    In this paper, the Laplace Adomian decomposition method (LADM) is utilized to evaluate a rumor-spreading model. First, a succinct review is given of the use of analytical methods, such as the Adomian decomposition method, the variational iteration method, and the homotopy analysis method, for epidemic models and biomathematics. Next, a rumor-spreading model incorporating a forgetting mechanism is formulated, and LADM is applied to solve it. By means of this method, a general solution is obtained that can readily be employed to assess the rumor model without recourse to any computer program. The results are discussed for different cases and parameters. The method is shown to be straightforward and fruitful for analyzing equations with complicated terms, such as the rumor model. Comparison with numerical methods reveals that LADM is powerful and accurate in eliciting solutions of this model. It is concluded that the method is well suited to this problem and can provide researchers with a powerful vehicle for scrutinizing rumor models in diverse kinds of social networks such as Facebook, YouTube, Flickr, LinkedIn and Twitter.
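
    LADM produces closed-form series solutions, which the record compares against numerical results. A quick numerical check of a generic ignorant-spreader-stifler model with a forgetting term is sketched below; the equations and parameter values are assumptions standing in for the paper's exact system.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    def rumor_rhs(t, y, lam, sigma, delta):
        """Generic ignorant (I) - spreader (S) - stifler (R) rumor model with a
        forgetting mechanism: spreaders spontaneously become stiflers at rate
        delta. This formulation is an assumption, not the paper's exact system."""
        I, S, R = y
        N = I + S + R
        dI = -lam * I * S / N                                # ignorants meet spreaders
        dS = lam * I * S / N - sigma * S * (S + R) / N - delta * S
        dR = sigma * S * (S + R) / N + delta * S             # stifling plus forgetting
        return [dI, dS, dR]

    sol = solve_ivp(rumor_rhs, t_span=(0, 50), y0=[0.99, 0.01, 0.0],
                    args=(0.8, 0.3, 0.05), dense_output=True)
    t = np.linspace(0, 50, 6)
    print(np.round(sol.sol(t), 3))   # rows: (I, S, R) fractions at the sampled times
    ```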

  17. Invited review: A position on the Global Livestock Environmental Assessment Model (GLEAM).

    PubMed

    MacLeod, M J; Vellinga, T; Opio, C; Falcucci, A; Tempio, G; Henderson, B; Makkar, H; Mottet, A; Robinson, T; Steinfeld, H; Gerber, P J

    2018-02-01

    The livestock sector is one of the fastest growing subsectors of the agricultural economy and, while it makes a major contribution to global food supply and economic development, it also consumes significant amounts of natural resources and alters the environment. In order to improve our understanding of the global environmental impact of livestock supply chains, the Food and Agriculture Organization of the United Nations has developed the Global Livestock Environmental Assessment Model (GLEAM). The purpose of this paper is to provide a review of GLEAM. Specifically, it explains the model architecture, methods and functionality, that is, the types of analysis the model can perform. The model focuses primarily on the quantification of greenhouse gas emissions arising from the production of the 11 main livestock commodities. The model inputs and outputs are managed and produced as raster data sets, with a spatial resolution of 0.05 decimal degrees. GLEAM v1.0 consists of five distinct modules: (a) the Herd Module; (b) the Manure Module; (c) the Feed Module; (d) the System Module; and (e) the Allocation Module. In terms of the modelling approach, GLEAM has several advantages. For example, spatial information on livestock distributions and crop yields enables rations to be derived that reflect the local availability of feed resources in developing countries. GLEAM also contains a herd model that enables livestock statistics to be disaggregated and variation in livestock performance and management to be captured. Priorities for future development of GLEAM include: improving data quality and the methods used to perform emissions calculations; extending the scope of the model to include selected additional environmental impacts and to enable predictive modelling; and improving the utility of GLEAM output.

  18. MODELS AND METHODS FOR PETROLEUM HYDROCARBON RISK ASSESSMENT: ONSITE, LUSTRISK, AND HSSM

    EPA Science Inventory

    U.S. EPA has developed three tiers of models for analysis of fuel releases from underground storage tank (UST) systems: 1) OnSite, 2) LUSTRisk, and 3) the Hydrocarbon Spill Screening Model (HSSM). The tiered approach to modeling allows users to select a model based upon the amoun...

  19. Linking Air Quality and Watershed Models for Environmental Assessments: Analysis of the Effects of Model-Specific Precipitation Estimates on Calculated Water Flux

    EPA Science Inventory

    Directly linking air quality and watershed models could provide an effective method for estimating spatially-explicit inputs of atmospheric contaminants to watershed biogeochemical models. However, to adequately link air and watershed models for wet deposition estimates, each mod...

  20. Evapotranspiration Calculations for an Alpine Marsh Meadow Site in Three-river Headwater Region

    NASA Astrophysics Data System (ADS)

    Zhou, B.; Xiao, H.

    2016-12-01

    Daily radiation and meteorological data were collected at an alpine marsh meadow site in the Three-River Headwater Region (THR). These data were used to assess radiation models, selected after comparing the performance of the Zuo model and the model recommended by FAO56 P-M. Four methods, FAO56 P-M, Priestley-Taylor, Hargreaves, and Makkink, were applied to determine daily reference evapotranspiration (ETr) for the growing season, and empirical models for estimating daily actual evapotranspiration (ETa) were built relating ETr derived from the four methods to evapotranspiration derived from the Bowen ratio method for alpine marsh meadow in this region. Comparison of the performance of the four empirical models by RMSE, MAE and AI showed that all of them provide good estimates of daily ETa for alpine marsh meadow in this region, and that the FAO56 P-M and Makkink empirical models performed better than the Priestley-Taylor and Hargreaves models.
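
    Of the four reference-ET methods compared, Priestley-Taylor is the most compact; a sketch using the standard FAO-56 constants follows. The radiation, soil heat flux and temperature inputs are hypothetical values for a mid-season day.

    ```python
    import numpy as np

    def priestley_taylor_et(rn, g, temp_c, alpha=1.26):
        """Daily reference evapotranspiration (mm/day) by the Priestley-Taylor
        method. rn, g: net radiation and soil heat flux (MJ m-2 day-1);
        temp_c: mean air temperature (deg C). Constants follow FAO-56 conventions."""
        lam = 2.45                     # latent heat of vaporization (MJ/kg)
        gamma = 0.066                  # psychrometric constant (kPa/degC), approximate
        es = 0.6108 * np.exp(17.27 * temp_c / (temp_c + 237.3))
        delta = 4098.0 * es / (temp_c + 237.3) ** 2   # slope of the saturation curve
        return alpha * (delta / (delta + gamma)) * (rn - g) / lam

    # Hypothetical mid-season day at an alpine marsh meadow site
    print(f"ETr = {priestley_taylor_et(rn=14.0, g=1.0, temp_c=8.0):.2f} mm/day")
    ```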

  1. Assessment of the Casualty Risk of Multiple Meteorological Hazards in China

    PubMed Central

    Xu, Wei; Zhuo, Li; Zheng, Jing; Ge, Yi; Gu, Zhihui; Tian, Yugang

    2016-01-01

    A study of the frequency, intensity, and risk of extreme climatic events or natural hazards is important for assessing the impacts of climate change. Many models have been developed to assess the risk of multiple hazards; however, most of the existing approaches can only model the relative levels of risk. This paper reports the development of a method for the quantitative assessment of the risk of multiple hazards based on information diffusion. This method was used to assess the risks of loss of human lives from 11 types of meteorological hazards in China at the prefectural and provincial levels. Risk curves of multiple hazards were obtained for each province and the risks of 10-year, 20-year, 50-year, and 100-year return periods were mapped. The results show that the provinces (municipalities, autonomous regions) in southeastern China are at higher risk of multiple meteorological hazards as a result of their geographical location and topography. The results of this study can be used as references for the management of meteorological disasters in China. The model can be used to quantitatively calculate the risks of casualty, direct economic losses, building collapse, and agricultural losses for any hazards at different spatial scales. PMID:26901210

  2. Assessment of the Casualty Risk of Multiple Meteorological Hazards in China.

    PubMed

    Xu, Wei; Zhuo, Li; Zheng, Jing; Ge, Yi; Gu, Zhihui; Tian, Yugang

    2016-02-17

    A study of the frequency, intensity, and risk of extreme climatic events or natural hazards is important for assessing the impacts of climate change. Many models have been developed to assess the risk of multiple hazards; however, most of the existing approaches can only model the relative levels of risk. This paper reports the development of a method for the quantitative assessment of the risk of multiple hazards based on information diffusion. This method was used to assess the risks of loss of human lives from 11 types of meteorological hazards in China at the prefectural and provincial levels. Risk curves of multiple hazards were obtained for each province and the risks of 10-year, 20-year, 50-year, and 100-year return periods were mapped. The results show that the provinces (municipalities, autonomous regions) in southeastern China are at higher risk of multiple meteorological hazards as a result of their geographical location and topography. The results of this study can be used as references for the management of meteorological disasters in China. The model can be used to quantitatively calculate the risks of casualty, direct economic losses, building collapse, and agricultural losses for any hazards at different spatial scales.

  3. Augmented assessment as a means to augmented reality.

    PubMed

    Bergeron, Bryan

    2006-01-01

    Rigorous scientific assessment of educational technologies typically lags behind the availability of the technologies by years because of the lack of validated instruments and benchmarks. Even when the appropriate assessment instruments are available, they may not be applied because of time and monetary constraints. Work in augmented reality, instrumented mannequins, serious gaming, and similar promising educational technologies that haven't undergone timely, rigorous evaluation, highlights the need for assessment methodologies that address the limitations of traditional approaches. The most promising augmented assessment solutions incorporate elements of rapid prototyping used in the software industry, simulation-based assessment techniques modeled after methods used in bioinformatics, and object-oriented analysis methods borrowed from object oriented programming.

  4. Model-based registration for assessment of spinal deformities in idiopathic scoliosis

    NASA Astrophysics Data System (ADS)

    Forsberg, Daniel; Lundström, Claes; Andersson, Mats; Knutsson, Hans

    2014-01-01

    Detailed analysis of spinal deformity is important within orthopaedic healthcare, in particular for assessment of idiopathic scoliosis. This paper addresses this challenge by proposing an image analysis method, capable of providing a full three-dimensional spine characterization. The proposed method is based on the registration of a highly detailed spine model to image data from computed tomography. The registration process provides an accurate segmentation of each individual vertebra and the ability to derive various measures describing the spinal deformity. The derived measures are estimated from landmarks attached to the spine model and transferred to the patient data according to the registration result. Evaluation of the method provides an average point-to-surface error of 0.9 mm ± 0.9 (comparing segmentations), and an average target registration error of 2.3 mm ± 1.7 (comparing landmarks). Comparing automatic and manual measurements of axial vertebral rotation provides a mean absolute difference of 2.5° ± 1.8, which is on a par with other computerized methods for assessing axial vertebral rotation. A significant advantage of our method, compared to other computerized methods for rotational measurements, is that it does not rely on vertebral symmetry for computing the rotational measures. The proposed method is fully automatic and computationally efficient, only requiring three to four minutes to process an entire image volume covering vertebrae L5 to T1. Given the use of landmarks, the method can be readily adapted to estimate other measures describing a spinal deformity by changing the set of employed landmarks. In addition, the method has the potential to be utilized for accurate segmentations of the vertebrae in routine computed tomography examinations, given the relatively low point-to-surface error.
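
    The target registration error reported above is simply the mean Euclidean distance between landmarks mapped through the registration and their reference positions; a minimal sketch with hypothetical landmark coordinates follows.

    ```python
    import numpy as np

    def target_registration_error(registered, reference):
        """Mean Euclidean distance (same units as the input, e.g. mm) between
        landmarks mapped through the registration and their true positions."""
        return np.mean(np.linalg.norm(registered - reference, axis=1))

    # Hypothetical landmark sets (mm) on one vertebra after model registration
    registered = np.array([[10.2, 4.9, 33.1], [15.8, 6.2, 40.3], [12.0, 9.1, 37.7]])
    reference  = np.array([[10.0, 5.0, 33.0], [16.0, 6.0, 40.0], [12.5, 9.0, 38.0]])
    print(f"TRE = {target_registration_error(registered, reference):.2f} mm")
    ```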

  5. Expectation Maximization Algorithm for Box-Cox Transformation Cure Rate Model and Assessment of Model Misspecification Under Weibull Lifetimes.

    PubMed

    Pal, Suvra; Balakrishnan, Narayanaswamy

    2018-05-01

    In this paper, we develop likelihood inference based on the expectation maximization algorithm for the Box-Cox transformation cure rate model, assuming the lifetimes follow a Weibull distribution. A simulation study is carried out to demonstrate the performance of the proposed estimation method. Through Monte Carlo simulations, we also study the effect of model misspecification on the estimate of the cure rate. Finally, we analyze a well-known melanoma data set with the model and the inferential method developed here.

  6. An overview of methods to identify and manage uncertainty for modelling problems in the water-environment-agriculture cross-sector

    DOE PAGES

    Jakeman, Anthony J.; Jakeman, John Davis

    2018-03-14

    Uncertainty pervades the representation of systems in the water–environment–agriculture cross-sector. Successful methods to address uncertainties have largely focused on standard mathematical formulations of biophysical processes in a single sector, such as partial or ordinary differential equations. More attention to integrated models of such systems is warranted. Model components representing the different sectors of an integrated model can have less standard, and different, formulations to one another, as well as different levels of epistemic knowledge and data informativeness. Thus, uncertainty is not only pervasive but also crosses boundaries and propagates between system components. Uncertainty assessment (UA) cries out for more eclectic treatment in these circumstances, some of it being more qualitative and empirical. In this paper, we discuss the various sources of uncertainty in such a cross-sectoral setting and ways to assess and manage them. We outline a fast-growing set of methodologies, particularly in the computational mathematics literature on uncertainty quantification (UQ), that seem highly pertinent for uncertainty assessment. There appears to be considerable scope for advancing UA by integrating relevant UQ techniques into cross-sectoral problem applications. Of course this will entail considerable collaboration between domain specialists who often take first ownership of the problem and computational methods experts.

  7. Integrating info-gap decision theory with robust population management: a case study using the Mountain Plover.

    PubMed

    van der Burg, Max Post; Tyre, Andrew J

    2011-01-01

    Wildlife managers often make decisions under considerable uncertainty. In the most extreme case, a complete lack of data leads to uncertainty that is unquantifiable. Information-gap decision theory deals with assessing management decisions under extreme uncertainty, but it is not widely used in wildlife management. So too, robust population management methods were developed to deal with uncertainties in multiple-model parameters. However, the two methods have not, as yet, been used in tandem to assess population management decisions. We provide a novel combination of the robust population management approach for matrix models with the information-gap decision theory framework for making conservation decisions under extreme uncertainty. We applied our model to the problem of nest survival management in an endangered bird species, the Mountain Plover (Charadrius montanus). Our results showed that matrix sensitivities suggest that nest management is unlikely to have a strong effect on population growth rate, confirming previous analyses. However, given the amount of uncertainty about adult and juvenile survival, our analysis suggested that maximizing nest marking effort was a more robust decision to maintain a stable population. Focusing on the twin concepts of opportunity and robustness in an information-gap model provides a useful method of assessing conservation decisions under extreme uncertainty.

  8. Assessment of the GHG Reduction Potential from Energy Crops Using a Combined LCA and Biogeochemical Process Models: A Review

    PubMed Central

    Jiang, Dong; Hao, Mengmeng; Wang, Qiao; Huang, Yaohuan; Fu, Xinyu

    2014-01-01

    The main purpose for developing biofuel is to reduce GHG (greenhouse gas) emissions, but the comprehensive environmental impact of such fuels is not clear. Life cycle analysis (LCA), as a complete comprehensive analysis method, has been widely used in bioenergy assessment studies. Great efforts have been directed toward establishing an efficient method for comprehensively estimating the greenhouse gas (GHG) emission reduction potential from the large-scale cultivation of energy plants by combining LCA with ecosystem/biogeochemical process models. LCA presents a general framework for evaluating the energy consumption and GHG emission from energy crop planting, yield acquisition, production, product use, and postprocessing. Meanwhile, ecosystem/biogeochemical process models are adopted to simulate the fluxes and storage of energy, water, carbon, and nitrogen in the soil-plant (energy crops) soil continuum. Although clear progress has been made in recent years, some problems still exist in current studies and should be addressed. This paper reviews the state-of-the-art method for estimating GHG emission reduction through developing energy crops and introduces in detail a new approach for assessing GHG emission reduction by combining LCA with biogeochemical process models. The main achievements of this study along with the problems in current studies are described and discussed. PMID:25045736

  9. An overview of methods to identify and manage uncertainty for modelling problems in the water-environment-agriculture cross-sector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jakeman, Anthony J.; Jakeman, John Davis

    Uncertainty pervades the representation of systems in the water–environment–agriculture cross-sector. Successful methods to address uncertainties have largely focused on standard mathematical formulations of biophysical processes in a single sector, such as partial or ordinary differential equations. More attention to integrated models of such systems is warranted. Model components representing the different sectors of an integrated model can have less standard, and different, formulations to one another, as well as different levels of epistemic knowledge and data informativeness. Thus, uncertainty is not only pervasive but also crosses boundaries and propagates between system components. Uncertainty assessment (UA) cries out for more eclectic treatment in these circumstances, some of it being more qualitative and empirical. In this paper, we discuss the various sources of uncertainty in such a cross-sectoral setting and ways to assess and manage them. We outline a fast-growing set of methodologies, particularly in the computational mathematics literature on uncertainty quantification (UQ), that seem highly pertinent for uncertainty assessment. There appears to be considerable scope for advancing UA by integrating relevant UQ techniques into cross-sectoral problem applications. Of course this will entail considerable collaboration between domain specialists who often take first ownership of the problem and computational methods experts.

  10. Can mixed assessment methods make biology classes more equitable?

    PubMed

    Cotner, Sehoya; Ballen, Cissy J

    2017-01-01

    Many factors have been proposed to explain the attrition of women in science, technology, engineering and math fields, among them the lower performance of women in introductory courses resulting from deficits in incoming preparation. We focus on the impact of mixed methods of assessment, which minimize the impact of high-stakes exams and reward other methods of assessment such as group participation, low-stakes quizzes and assignments, and in-class activities. We hypothesized that these mixed methods would benefit individuals who otherwise underperform on high-stakes tests. Here, we analyze gender-based performance trends in nine large (N > 1000 students) introductory biology courses in fall 2016. Females underperformed on exams compared to their male counterparts, a difference that does not exist with other methods of assessment that compose course grade. Further, we analyzed three case studies of courses that transitioned their grading schemes to either de-emphasize or emphasize exams as a proportion of total course grade. We demonstrate that the shift away from an exam emphasis consequently benefits female students, thereby closing gaps in overall performance. Further, the exam performance gap itself is reduced when the exams contribute less to overall course grade. We discuss testable predictions that follow from our hypothesis, and advocate for the use of mixed methods of assessments (possibly as part of an overall shift to active learning techniques). We conclude by challenging the student deficit model, and suggest a course deficit model as explanatory of these performance gaps, whereby the microclimate of the classroom can either raise or lower barriers to success for underrepresented groups in STEM.

  11. Can mixed assessment methods make biology classes more equitable?

    PubMed Central

    Ballen, Cissy J.

    2017-01-01

    Many factors have been proposed to explain the attrition of women in science, technology, engineering and math fields, among them the lower performance of women in introductory courses resulting from deficits in incoming preparation. We focus on the impact of mixed methods of assessment, which minimize the impact of high-stakes exams and reward other methods of assessment such as group participation, low-stakes quizzes and assignments, and in-class activities. We hypothesized that these mixed methods would benefit individuals who otherwise underperform on high-stakes tests. Here, we analyze gender-based performance trends in nine large (N > 1000 students) introductory biology courses in fall 2016. Females underperformed on exams compared to their male counterparts, a difference that does not exist with other methods of assessment that compose course grade. Further, we analyzed three case studies of courses that transitioned their grading schemes to either de-emphasize or emphasize exams as a proportion of total course grade. We demonstrate that the shift away from an exam emphasis consequently benefits female students, thereby closing gaps in overall performance. Further, the exam performance gap itself is reduced when the exams contribute less to overall course grade. We discuss testable predictions that follow from our hypothesis, and advocate for the use of mixed methods of assessments (possibly as part of an overall shift to active learning techniques). We conclude by challenging the student deficit model, and suggest a course deficit model as explanatory of these performance gaps, whereby the microclimate of the classroom can either raise or lower barriers to success for underrepresented groups in STEM. PMID:29281676

  12. TWINTAN: A program for transonic wall interference assessment in two-dimensional wind tunnels

    NASA Technical Reports Server (NTRS)

    Kemp, W. B., Jr.

    1980-01-01

    A method for assessing the wall interference in transonic two-dimensional wind tunnel tests was developed and implemented in a computer program. The method involves three successive solutions of the transonic small-disturbance potential equation to define the wind tunnel flow, the perturbation attributable to the model, and the equivalent free-air flow around the model. Input includes pressure distributions on the model and along the top and bottom tunnel walls, which are used as boundary conditions for the wind tunnel flow. The wall-induced perturbation field is determined as the difference between the perturbation in the tunnel flow solution and the perturbation attributable to the model. The methodology used in the program is described, and detailed descriptions of the computer program input and output are presented. Input and output for a sample case are given.

  13. [Criteria for assessing severely hot environments: from the WBGT index to the PHS (predicted heat strain) model].

    PubMed

    d'Ambrosio, Francesca Romana; Palella, B I; Riccio, G; Alfano, G

    2004-01-01

    The present study deals with the main methods for the assessment of hot environments: WBGT, SWreq and PHS. It is stressed that the WBGT index, although strictly empirical and a very practical tool for the assessment of hot environments, can only be used for a rough evaluation of heat stress, and especially for moderate metabolic rates (M < 175 W/m2). By contrast, the SWreq method, which is based on both subject-environment heat exchange and the effect of clothing, allows a better assessment of the work situation, with a general reduction of the exposure limits with respect to WBGT, especially in non-uniform environments (ta not equal to tr). It should be noted that application of SWreq is required by ISO standard 7243 when the WBGT limit values are exceeded. This study focused in particular on the Predicted Heat Strain (PHS) method, highlighting via dedicated software the differences in heat stress assessment related to this new approach, which will be adopted by the ISO in the next revision of standard 7933. The PHS method, unlike SWreq, allows prediction of the time course of the main physiological variables of interest (i.e., skin temperature, core temperature and sweat rate). Moreover, thanks to better modelling of heat exchanges, the PHS method takes account of both movement and clothing effects, resulting in even lower exposure limits.
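
    The WBGT index criticized above for its empiricism is itself a two-line computation; the sketch below uses the widely published ISO 7243 weightings for the indoor (no solar load) and outdoor cases, with hypothetical temperatures.

    ```python
    def wbgt(t_nwb, t_g, t_a=None):
        """Wet Bulb Globe Temperature (deg C) per the standard ISO 7243 weightings.

        t_nwb: natural wet-bulb temperature; t_g: globe temperature;
        t_a: dry-bulb air temperature (only needed outdoors with solar load).
        """
        if t_a is None:                        # indoors / no solar radiation
            return 0.7 * t_nwb + 0.3 * t_g
        return 0.7 * t_nwb + 0.2 * t_g + 0.1 * t_a   # outdoors with solar load

    print(f"indoor WBGT  = {wbgt(25.0, 35.0):.1f} degC")
    print(f"outdoor WBGT = {wbgt(25.0, 35.0, 30.0):.1f} degC")
    ```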

  14. Distribution of stress on TMJ disc induced by use of chincup therapy: assessment by the finite element method.

    PubMed

    Calçada, Flávio Siqueira; Guimarães, Antônio Sérgio; Teixeira, Marcelo Lucchesi; Takamatsu, Flávio Atsushi

    2017-01-01

    To assess the distribution of stresses produced on the TMJ disc by chincup therapy, by means of the finite element method. A simplified three-dimensional TMJ disc model was developed using Rhinoceros 3D software and exported to ANSYS software. A 4.9 N load was applied to the inferior surface of the model at inclinations of 30, 40, and 50 degrees to the mandibular plane (GoMe). ANSYS was used to analyze stress distribution on the TMJ disc for the different angulations by means of the finite element method. The results showed that tensile and compressive stress concentrations were higher on the inferior surface of the model. Tensile stress was found mostly in the middle-anterior region of the model, and its location did not change across the three load directions. Compressive stress was found mostly in the middle and mid-posterior regions, but when the load was applied at 50°, it was concentrated predominantly in the middle region. Tensile and compressive stress intensities diminished progressively as the load was applied more vertically. Stress induced by chincup therapy is located mainly on the inferior surface of the model. Loads at greater angles to the mandibular plane produced stress distributions of lower intensity, with compressive stresses concentrated in the middle region. The simplified three-dimensional model proved useful for assessing the distribution of stresses induced on the TMJ disc by chincup therapy.

  15. Estimating, Testing, and Comparing Specific Effects in Structural Equation Models: The Phantom Model Approach

    ERIC Educational Resources Information Center

    Macho, Siegfried; Ledermann, Thomas

    2011-01-01

    The phantom model approach for estimating, testing, and comparing specific effects within structural equation models (SEMs) is presented. The rationale underlying this novel method consists in representing the specific effect to be assessed as a total effect within a separate latent variable model, the phantom model that is added to the main…

  16. Enhanced Decision Analysis Support System.

    DTIC Science & Technology

    1981-03-01

    …the method for determining preferences when multiple and competing attributes are involved. Worth assessment is used as the model which… 1967 as a method for determining preference when multiple and competing attributes are involved (Ref 10). The term worth can be equated to other… competing objectives. After some discussion, the group decided that the problem could best be decided using the worth assessment procedure. They

  17. Assessing the Utility of Item Response Theory Models: Differential Item Functioning.

    ERIC Educational Resources Information Center

    Scheuneman, Janice Dowd

    The current status of item response theory (IRT) is discussed. Several IRT methods exist for assessing whether an item is biased. Focus is on methods proposed by L. M. Rudner (1975), F. M. Lord (1977), D. Thissen et al. (1988) and R. L. Linn and D. Harnisch (1981). Rudner suggested a measure of the area lying between the two item characteristic…

  18. Annual Research Review: Embracing Not Erasing Contextual Variability in Children's Behavior--Theory and Utility in the Selection and Use of Methods and Informants in Developmental Psychopathology

    ERIC Educational Resources Information Center

    Dirks, Melanie A.; De Los Reyes, Andres; Briggs-Gowan, Margaret; Cella, David; Wakschlag, Lauren S.

    2012-01-01

    This paper examines the selection and use of multiple methods and informants for the assessment of disruptive behavior syndromes and attention deficit/hyperactivity disorder, providing a critical discussion of (a) the bidirectional linkages between theoretical models of childhood psychopathology and current assessment techniques; and (b) current…

  19. Bladder Cancer Treatment Response Assessment in CT using Radiomics with Deep-Learning.

    PubMed

    Cha, Kenny H; Hadjiiski, Lubomir; Chan, Heang-Ping; Weizer, Alon Z; Alva, Ajjai; Cohan, Richard H; Caoili, Elaine M; Paramagul, Chintana; Samala, Ravi K

    2017-08-18

    Cross-sectional X-ray imaging has become the standard for staging most solid organ malignancies. However, for some malignancies such as urinary bladder cancer, the ability to accurately assess the local extent of disease and the response to systemic chemotherapy is limited with current imaging approaches. In this study, we explored the feasibility that radiomics-based predictive models using pre- and post-treatment computed tomography (CT) images might distinguish between bladder cancers with and without complete chemotherapy responses. We assessed three radiomics-based predictive models with different fundamental design principles, ranging from a pattern recognition method based on a deep-learning convolutional neural network (DL-CNN), to a more deterministic radiomics feature-based approach, to a bridging method between the two that extracts radiomics features from the image patterns. Our study indicates that computerized assessment using radiomics information from the pre- and post-treatment CT of bladder cancer patients has the potential to assist in the assessment of treatment response.

  20. An integrated environmental and health performance quantification model for pre-occupancy phase of buildings in China

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Xiaodong, E-mail: eastdawn@tsinghua.edu.cn; Su, Shu, E-mail: sushuqh@163.com; Zhang, Zhihui, E-mail: zhzhg@tsinghua.edu.cn

    To comprehensively pre-evaluate the damages to both the environment and human health due to construction activities in China, this paper presents an integrated building environmental and health performance (EHP) assessment model based on the Building Environmental Performance Analysis System (BEPAS) and the Building Health Impact Analysis System (BHIAS) models and offers a new inventory data estimation method. The new model follows the life cycle assessment (LCA) framework and the inventory analysis step involves bill of quantity (BOQ) data collection, consumption data formation, and environmental profile transformation. The consumption data are derived from engineering drawings and quotas to conduct the assessment before construction for pre-evaluation. The new model classifies building impacts into three safeguard areas: ecosystems, natural resources and human health. Thus, this model considers environmental impacts as well as damage to human wellbeing. The monetization approach, distance-to-target method and panel method are considered as optional weighting approaches. Finally, nine residential buildings of different structural types are taken as case studies to test the operability of the integrated model through application. The results indicate that the new model can effectively pre-evaluate building EHP and the structure type significantly affects the performance of residential buildings.

  1. Integrating Machine Learning into a Crowdsourced Model for Earthquake-Induced Damage Assessment

    NASA Technical Reports Server (NTRS)

    Rebbapragada, Umaa; Oommen, Thomas

    2011-01-01

    On January 12th, 2010, a catastrophic 7.0M earthquake devastated the country of Haiti. In the aftermath of an earthquake, it is important to rapidly assess damaged areas in order to mobilize the appropriate resources. The Haiti damage assessment effort introduced a promising model that uses crowdsourcing to map damaged areas in freely available remotely-sensed data. This paper proposes the application of machine learning methods to improve this model. Specifically, we apply work on learning from multiple, imperfect experts to the assessment of volunteer reliability, and propose the use of image segmentation to automate the detection of damaged areas. We wrap both tasks in an active learning framework in order to shift volunteer effort from mapping a full catalog of images to the generation of high-quality training data. We hypothesize that the integration of machine learning into this model improves its reliability, maintains the speed of damage assessment, and allows the model to scale to higher data volumes.

  2. Ultrasound viscoelasticity assessment using an adaptive torsional shear wave propagation method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ouared, Abderrahmane; Kazemirad, Siavash; Montagnon, Emmanuel

    2016-04-15

    Purpose: Different approaches have been used in dynamic elastography to assess mechanical properties of biological tissues. Most techniques are based on a simple inversion based on the measurement of the shear wave speed to assess elasticity, whereas some recent strategies use more elaborated analytical or finite element method (FEM) models. In this study, a new method is proposed for the quantification of both shear storage and loss moduli of confined lesions, in the context of breast imaging, using adaptive torsional shear waves (ATSWs) generated remotely with radiation pressure. Methods: A FEM model was developed to solve the inverse wave propagation problem and obtain viscoelastic properties of interrogated media. The inverse problem was formulated and solved in the frequency domain and its robustness to noise and geometric constraints was evaluated. The proposed model was validated in vitro with two independent rheology methods on several homogeneous and heterogeneous breast tissue-mimicking phantoms over a broad range of frequencies (up to 400 Hz). Results: Viscoelastic properties matched benchmark rheology methods with discrepancies of 8%–38% for the shear modulus G′ and 9%–67% for the loss modulus G″. The robustness study indicated good estimations of storage and loss moduli (maximum mean errors of 19% on G′ and 32% on G″) for signal-to-noise ratios between 19.5 and 8.5 dB. Larger errors were noticed in the case of biases in lesion dimension and position. Conclusions: The ATSW method revealed that it is possible to estimate the viscoelasticity of biological tissues with torsional shear waves when small biases in lesion geometry exist.

  3. Multi-type sensor placement and response reconstruction for building structures: Experimental investigations

    NASA Astrophysics Data System (ADS)

    Hu, Rong-Pan; Xu, You-Lin; Zhan, Sheng

    2018-01-01

    Estimation of lateral displacement and acceleration responses is essential to assess safety and serviceability of high-rise buildings under dynamic loadings including earthquake excitations. However, the measurement information from the limited number of sensors installed in a building structure is often insufficient for the complete structural performance assessment. An integrated multi-type sensor placement and response reconstruction method has thus been proposed by the authors to tackle this problem. To validate the feasibility and effectiveness of the proposed method, an experimental investigation using a cantilever beam with multi-type sensors is performed and reported in this paper. The experimental setup is first introduced. The finite element modelling and model updating of the cantilever beam are then performed. The optimal sensor placement for the best response reconstruction is determined by the proposed method based on the updated FE model of the beam. After the sensors are installed on the physical cantilever beam, a number of experiments are carried out. The responses at key locations are reconstructed and compared with the measured ones. The reconstructed responses achieve a good match with the measured ones, manifesting the feasibility and effectiveness of the proposed method. Besides, the proposed method is also examined for the cases of different excitations and unknown excitation, and the results prove the proposed method to be robust and effective. The superiority of the optimized sensor placement scheme is finally demonstrated through comparison with two other different sensor placement schemes: the accelerometer-only scheme and non-optimal sensor placement scheme. The proposed method can be applied to high-rise buildings for seismic performance assessment.

  4. Multiple flood vulnerability assessment approach based on fuzzy comprehensive evaluation method and coordinated development degree model.

    PubMed

    Yang, Weichao; Xu, Kui; Lian, Jijian; Bin, Lingling; Ma, Chao

    2018-05-01

    Flooding is a serious challenge that increasingly affects residents as well as policymakers. Flood vulnerability assessment is becoming increasingly relevant worldwide. The purpose of this study is to develop an approach that reveals the relationship between exposure, sensitivity and adaptive capacity for better flood vulnerability assessment, based on the fuzzy comprehensive evaluation method (FCEM) and the coordinated development degree model (CDDM). The approach is organized into three parts: establishment of the index system; assessment of exposure, sensitivity and adaptive capacity; and multiple flood vulnerability assessment. A hydrodynamic model and statistical data are employed for the establishment of the index system; FCEM is used to evaluate exposure, sensitivity and adaptive capacity; and CDDM is applied to express the relationship of the three components of vulnerability. Six multiple flood vulnerability types and four levels are proposed to assess flood vulnerability from multiple perspectives. The approach is then applied to assess the spatial pattern of flood vulnerability in the eastern area of Hainan, China. Based on the results of the multiple flood vulnerability assessment, a decision-making process for the rational allocation of limited resources is proposed and applied to the study area. The study shows that multiple flood vulnerability assessment can evaluate vulnerability more completely and help decision makers act in a more comprehensive and better-informed way. In summary, this study provides a new way for flood vulnerability assessment and disaster prevention decision-making. Copyright © 2018 Elsevier Ltd. All rights reserved.
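
    A minimal sketch of the fuzzy synthesis step underlying an FCEM-style evaluation, assuming a weighted-average fuzzy operator; the index names, weights and membership values are illustrative, not the paper's.

    ```python
    import numpy as np

    # Rows of R: membership degrees of each index in four grades (low/medium/high/very high).
    R = np.array([
        [0.1, 0.3, 0.4, 0.2],   # exposure index
        [0.2, 0.4, 0.3, 0.1],   # sensitivity index
        [0.3, 0.4, 0.2, 0.1],   # adaptive-capacity index
    ])
    w = np.array([0.4, 0.35, 0.25])   # index weights (sum to 1)

    B = w @ R                          # weighted-average fuzzy operator M(*, +)
    grade = B.argmax()                 # maximum-membership principle
    print(B, "-> grade", grade)
    ```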

  5. The role of modelling in prioritising and planning clinical trials.

    PubMed

    Chilcott, J; Brennan, A; Booth, A; Karnon, J; Tappenden, P

    2003-01-01

    To identify the role of modelling in planning and prioritising trials. The review focuses on modelling methods used in the construction of disease models and on methods for their analysis and interpretation. Searches were initially developed in MEDLINE and then translated into other databases. Systematic reviews of the methodological and case study literature were undertaken. Search strategies focused on the intersection between three domains: modelling, health technology assessment and prioritisation. The review found that modelling can extend the validity of trials by: generalising from trial populations to specific target groups; generalising to other settings and countries; extrapolating trial outcomes to the longer term; linking intermediate outcome measures to final outcomes; extending analysis to the relevant comparators; adjusting for prognostic factors in trials; and synthesising research results. The review suggested that modelling may offer greatest benefits where the impact of a technology occurs over a long duration, where disease/technology characteristics are not observable, where there are long lead times in research, or for rapidly changing technologies. It was also found that modelling can inform the key parameters for research: sample size, trial duration and population characteristics. One-way, multi-way and threshold sensitivity analysis have been used in informing these aspects but are flawed. The payback approach has been piloted and while there have been weaknesses in its implementation, the approach does have potential. Expected value of information analysis is the only existing methodology that has been applied in practice and can address all these issues. The potential benefit of this methodology is that the value of research is directly related to its impact on technology commissioning decisions, and is demonstrated in real and absolute rather than relative terms; it assesses the technical efficiency of different types of research. Modelling is not a substitute for data collection. However, modelling can identify trial designs of low priority in informing health technology commissioning decisions. Good practice in undertaking and reporting economic modelling studies requires further dissemination and support, specifically in sensitivity analyses, model validation and the reporting of assumptions. Case studies of the payback approach using stochastic sensitivity analyses should be developed. Use of overall expected value of perfect information should be encouraged in modelling studies seeking to inform prioritisation and planning of health technology assessments. Research is required to assess if the potential benefits of value of information analysis can be realised in practice; on the definition of an adequate objective function; on methods for analysing computationally expensive models; and on methods for updating prior probability distributions.
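
    The expected value of perfect information mentioned above has a simple Monte Carlo form: the expected net benefit of deciding with perfect knowledge of the parameters, minus the net benefit of the best decision under current uncertainty. A minimal sketch with invented net-benefit distributions:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    # Monte Carlo draws of net benefit for two technologies under parameter uncertainty.
    nb = np.column_stack([
        rng.normal(10000, 3000, 10000),   # comparator
        rng.normal(11000, 4000, 10000),   # new technology
    ])

    # EVPI = E[max over decisions] - max over decisions of E[net benefit].
    evpi = nb.max(axis=1).mean() - nb.mean(axis=0).max()
    print(round(evpi, 1))                 # expected value of perfect information per patient
    ```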

  6. Authentic assessment based showcase portfolio on learning of mathematical problem solving in senior high school

    NASA Astrophysics Data System (ADS)

    Sukmawati, Zuhairoh, Faihatuz

    2017-05-01

    The purpose of this research was to develop authentic assessment model based on showcase portfolio on learning of mathematical problem solving. This research used research and development Method (R & D) which consists of four stages of development that: Phase I, conducting a preliminary study. Phase II, determining the purpose of developing and preparing the initial model. Phase III, trial test of instrument for the initial draft model and the initial product. The respondents of this research are the students of SMAN 8 and SMAN 20 Makassar. The collection of data was through observation, interviews, documentation, student questionnaire, and instrument tests mathematical solving abilities. The data were analyzed with descriptive and inferential statistics. The results of this research are authentic assessment model design based on showcase portfolio which involves: 1) Steps in implementing the authentic assessment based Showcase, assessment rubric of cognitive aspects, assessment rubric of affective aspects, and assessment rubric of skill aspect. 2) The average ability of the students' problem solving which is scored by using authentic assessment based on showcase portfolio was in high category and the students' response in good category.

  7. Introducing the fit-criteria assessment plot - A visualisation tool to assist class enumeration in group-based trajectory modelling.

    PubMed

    Klijn, Sven L; Weijenberg, Matty P; Lemmens, Paul; van den Brandt, Piet A; Lima Passos, Valéria

    2017-10-01

    Background and objective: Group-based trajectory modelling is a model-based clustering technique applied for the identification of latent patterns of temporal change. Despite its manifold applications in clinical and health sciences, potential problems in the model selection procedure are often overlooked. The choice of the number of latent trajectories (class enumeration), for instance, is to a large degree based on statistical criteria that are not fail-safe, and the process as a whole is not transparent. To facilitate class enumeration, we introduce a graphical summary display of several fit and model adequacy criteria, the fit-criteria assessment plot. Methods: An R program that accepts universal data input is presented. The programme condenses the relevant group-based trajectory modelling output on model fit indices into automated graphical displays. Examples based on real and simulated data are provided to illustrate, assess and validate the fit-criteria assessment plot's utility. Results: The fit-criteria assessment plot provides an overview of fit criteria on a single page, placing users in an informed position to make a decision. It does not automatically select the most appropriate model but eases the model assessment procedure. Conclusions: The fit-criteria assessment plot is an exploratory visualisation tool that can be employed to assist decisions in the initial and decisive phases of a group-based trajectory modelling analysis. Considering group-based trajectory modelling's widespread resonance in medical and epidemiological sciences, a more comprehensive, easily interpretable and transparent display of the iterative process of class enumeration may foster its adequate use.
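
    The original tool is an R program; as a rough Python analogue of the underlying idea (not the authors' code), one can overlay fit criteria across candidate class numbers on a single display. The values below are invented:

    ```python
    import matplotlib.pyplot as plt

    # Hypothetical fit indices from models with 1-6 latent trajectories.
    classes = [1, 2, 3, 4, 5, 6]
    bic = [5400, 5210, 5150, 5138, 5141, 5149]
    aic = [5385, 5180, 5105, 5078, 5066, 5059]

    fig, ax = plt.subplots()
    ax.plot(classes, bic, marker="o", label="BIC")
    ax.plot(classes, aic, marker="s", label="AIC")
    ax.set_xlabel("number of latent classes")
    ax.set_ylabel("information criterion (lower is better)")
    ax.legend()
    plt.show()   # one page, all criteria side by side, as the FACP advocates
    ```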

  8. A synthetic method for atmospheric diffusion simulation and environmental impact assessment of accidental pollution in the chemical industry in a WEBGIS context.

    PubMed

    Ni, Haochen; Rui, Yikang; Wang, Jiechen; Cheng, Liang

    2014-09-05

    The chemical industry poses a potential security risk to factory personnel and neighboring residents. In order to mitigate prospective damage, a synthetic method must be developed for emergency response. With the development of environmental numeric simulation models, model integration methods, and modern information technology, many Decision Support Systems (DSSs) have been established. However, existing systems still have limitations in terms of synthetic simulation and network interoperation. In order to resolve these limitations, mature simulation models for chemical accidents were integrated into a WEB Geographic Information System (WEBGIS) platform. The complete workflow of the emergency response was achieved, including raw data (meteorology information and accident information) management, numeric simulation of different kinds of accidents, environmental impact assessment, and representation of the simulation results. This allowed comprehensive and real-time simulation of acute accidents in the chemical industry. The main contributions of this paper are: an organizational mechanism for the model set based on accident type and pollutant substance; a scheduling mechanism for the parallel processing of multiple accident types, accident substances, and simulation models; and a presentation method for scalar and vector data in the web browser, all integrated on the WEBGIS platform. The outcomes demonstrated that this method can provide effective support for deciding emergency responses to acute chemical accidents.

  9. A Synthetic Method for Atmospheric Diffusion Simulation and Environmental Impact Assessment of Accidental Pollution in the Chemical Industry in a WEBGIS Context

    PubMed Central

    Ni, Haochen; Rui, Yikang; Wang, Jiechen; Cheng, Liang

    2014-01-01

    The chemical industry poses a potential security risk to factory personnel and neighboring residents. In order to mitigate prospective damage, a synthetic method must be developed for emergency response. With the development of environmental numeric simulation models, model integration methods, and modern information technology, many Decision Support Systems (DSSs) have been established. However, existing systems still have limitations in terms of synthetic simulation and network interoperation. In order to resolve these limitations, mature simulation models for chemical accidents were integrated into a WEB Geographic Information System (WEBGIS) platform. The complete workflow of the emergency response was achieved, including raw data (meteorology information and accident information) management, numeric simulation of different kinds of accidents, environmental impact assessment, and representation of the simulation results. This allowed comprehensive and real-time simulation of acute accidents in the chemical industry. The main contributions of this paper are: an organizational mechanism for the model set based on accident type and pollutant substance; a scheduling mechanism for the parallel processing of multiple accident types, accident substances, and simulation models; and a presentation method for scalar and vector data in the web browser, all integrated on the WEBGIS platform. The outcomes demonstrated that this method can provide effective support for deciding emergency responses to acute chemical accidents. PMID:25198686

  10. Accuracy assessment of high resolution satellite imagery orientation by leave-one-out method

    NASA Astrophysics Data System (ADS)

    Brovelli, Maria Antonia; Crespi, Mattia; Fratarcangeli, Francesca; Giannone, Francesca; Realini, Eugenio

    Interest in high-resolution satellite imagery (HRSI) is spreading in several application fields, at both scientific and commercial levels. Fundamental and critical goals for the geometric use of this kind of imagery are its orientation and orthorectification, processes able to georeference the imagery and correct the geometric deformations it undergoes during acquisition. In order to exploit the actual potential of orthorectified imagery in Geomatics applications, the definition of a methodology to assess the spatial accuracy achievable from oriented imagery is a crucial topic. In this paper we propose a new method for accuracy assessment based on Leave-One-Out Cross-Validation (LOOCV), a model validation method already applied in fields such as machine learning, bioinformatics and, more generally, any field requiring an evaluation of the performance of a learning algorithm (e.g. geostatistics), but never applied to HRSI orientation accuracy assessment. The proposed method exhibits interesting features which are able to overcome the most remarkable drawbacks of the commonly used method (Hold-Out Validation — HOV), based on partitioning the known ground points into two sets: the first is used in the orientation-orthorectification model (GCPs — Ground Control Points) and the second is used to validate the model itself (CPs — Check Points). In fact, the HOV is generally not reliable, and it is not applicable when a low number of ground points is available. To test the proposed method we implemented a new routine that performs the LOOCV in the software SISAR, developed by the Geodesy and Geomatics Team at the Sapienza University of Rome to perform the rigorous orientation of HRSI; this routine was tested on some EROS-A and QuickBird images. Moreover, these images were also oriented using the widely recognized commercial software OrthoEngine v. 10 (included in the Geomatica suite by PCI), performing the LOOCV manually since only the HOV is implemented there. The software comparison confirmed the overall correctness and good performance of the SISAR model, whereas the results showed the good features of the LOOCV method.
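
    A minimal, generic sketch of the LOOCV loop described above, with the orientation model abstracted behind hypothetical fit/predict callables (SISAR's actual interface is not reproduced here):

    ```python
    import numpy as np

    def loocv_rmse(points, fit, predict):
        """Leave-one-out cross-validation over ground points.

        points  : array of shape (n, k) holding ground-point observations
        fit     : callable building an orientation model from n-1 points
        predict : callable returning the model residual at the held-out point
        """
        residuals = []
        for i in range(len(points)):
            train = np.delete(points, i, axis=0)   # drop one ground point
            model = fit(train)                     # re-estimate the orientation
            residuals.append(predict(model, points[i]))
        return float(np.sqrt(np.mean(np.square(residuals))))
    ```

    Unlike hold-out validation, every ground point serves in turn as a check point, which is why the scheme remains usable when only a handful of ground points is available.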

  11. Assessment of Professional Development for Teachers in the Vocational Education and Training Sector: An Examination of the Concerns Based Adoption Model

    ERIC Educational Resources Information Center

    Saunders, Rebecca

    2012-01-01

    The purpose of this article is to describe the use of the Concerns Based Adoption Model (Hall & Hord, 2006) as a conceptual lens and practical methodology for professional development program assessment in the vocational education and training (VET) sector. In this sequential mixed-methods study, findings from the first two phases (two of…

  12. Health impact assessment – A survey on quantifying tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fehr, Rainer, E-mail: rainer.fehr@uni-bielefeld.de; Mekel, Odile C.L., E-mail: odile.mekel@lzg.nrw.de; Fintan Hurley, J., E-mail: fintan.hurley@iom-world.org

    Integrating human health into prospective impact assessments is known to be challenging. This is true for both approaches: dedicated health impact assessments (HIA) as well as inclusion of health into more general impact assessments. Acknowledging the full range of participatory, qualitative, and quantitative approaches, this study focuses on the latter, especially on computational tools for quantitative health modelling. We conducted a survey among tool developers concerning the status quo of development and availability of such tools; experiences made with model usage in real-life situations; and priorities for further development. Responding toolmaker groups described 17 such tools, most of them being maintained and reported as ready for use and covering a wide range of topics, including risk & protective factors, exposures, policies, and health outcomes. In recent years, existing models have been improved and were applied in new ways, and completely new models emerged. There was high agreement among respondents on the need to further develop methods for assessment of inequalities and uncertainty. The contribution of quantitative modeling to health foresight would benefit from building joint strategies of further tool development, improving the visibility of quantitative tools and methods, and engaging continuously with actual and potential users. - Highlights: • A survey investigated computational tools for health impact quantification. • Formal evaluation of such tools has been rare. • Handling inequalities and uncertainties are priority areas for further development. • Health foresight would benefit from tool developers and users forming a community. • Joint development strategies across computational tools are needed.

  13. Vulnerability Assessment of Water Supply Systems: Status, Gaps and Opportunities

    NASA Astrophysics Data System (ADS)

    Wheater, H. S.

    2015-12-01

    Conventional frameworks for assessing the impacts of climate change on water resource systems use cascades of climate and hydrological models to provide 'top-down' projections of future water availability, but these are subject to high uncertainty and are model and scenario-specific. Hence there has been recent interest in 'bottom-up' frameworks, which aim to evaluate system vulnerability to change in the context of possible future climate and/or hydrological conditions. Such vulnerability assessments are generic, and can be combined with updated information from top-down assessments as they become available. While some vulnerability methods use hydrological models to estimate water availability, fully bottom-up schemes have recently been proposed that directly map system vulnerability as a function of feasible changes in water supply characteristics. These use stochastic algorithms, based on reconstruction or reshuffling methods, by which multiple water supply realizations can be generated under feasible ranges of change in water supply conditions. The paper reports recent successes, and points to areas of future improvement. Advances in stochastic modeling and optimization can address some technical limitations in flow reconstruction, while various data mining and system identification techniques can provide possibilities to better condition realizations for consistency with top-down scenarios. Finally, we show that probabilistic and Bayesian frameworks together can provide a potential basis to combine information obtained from fully bottom-up analyses with projections available from climate and/or hydrological models in a fully integrated risk assessment framework for deep uncertainty.

  14. Misleading prioritizations from modelling range shifts under climate change

    USGS Publications Warehouse

    Sofaer, Helen R.; Jarnevich, Catherine S.; Flather, Curtis H.

    2018-01-01

    Aim: Conservation planning requires the prioritization of a subset of taxa and geographical locations to focus monitoring and management efforts. Integration of the threats and opportunities posed by climate change often relies on predictions from species distribution models, particularly for assessments of vulnerability or invasion risk for multiple taxa. We evaluated whether species distribution models could reliably rank changes in species range size under climate and land use change. Location: Conterminous U.S.A. Time period: 1977–2014. Major taxa studied: Passerine birds. Methods: We estimated ensembles of species distribution models based on historical North American Breeding Bird Survey occurrences for 190 songbirds, and generated predictions to recent years given c. 35 years of observed land use and climate change. We evaluated model predictions using standard metrics of discrimination performance and a more detailed assessment of the ability of models to rank species vulnerability to climate change based on predicted range loss, range gain, and overall change in range size. Results: Species distribution models yielded unreliable and misleading assessments of relative vulnerability to climate and land use change. Models could not accurately predict range expansion or contraction, and therefore failed to anticipate patterns of range change among species. These failures occurred despite excellent overall discrimination ability and transferability to the validation time period, which reflected strong performance at the majority of locations that were either always or never occupied by each species. Main conclusions: Models failed for the questions and at the locations of greatest interest to conservation and management. This highlights potential pitfalls of multi-taxa impact assessments under global change; in our case, models provided misleading rankings of the most impacted species, and spatial information about range changes was not credible. As modelling methods and frameworks continue to be refined, performance assessments and validation efforts should focus on the measures of risk and vulnerability useful for decision-making.

  15. Towards an integrated approach to natural hazards risk assessment using GIS: with reference to bushfires.

    PubMed

    Chen, Keping; Blong, Russell; Jacobson, Carol

    2003-04-01

    This paper develops a GIS-based integrated approach to risk assessment in natural hazards, with reference to bushfires. The challenges for undertaking this approach have three components: data integration, risk assessment tasks, and risk decision-making. First, data integration in GIS is a fundamental step for subsequent risk assessment tasks and risk decision-making. A series of spatial data integration issues within GIS such as geographical scales and data models are addressed. Particularly, the integration of both physical environmental data and socioeconomic data is examined with an example linking remotely sensed data and areal census data in GIS. Second, specific risk assessment tasks, such as hazard behavior simulation and vulnerability assessment, should be undertaken in order to understand complex hazard risks and provide support for risk decision-making. For risk assessment tasks involving heterogeneous data sources, the selection of spatial analysis units is important. Third, risk decision-making concerns spatial preferences and/or patterns, and a multicriteria evaluation (MCE)-GIS typology for risk decision-making is presented that incorporates three perspectives: spatial data types, data models, and methods development. Both conventional MCE methods and artificial intelligence-based methods with GIS are identified to facilitate spatial risk decision-making in a rational and interpretable way. Finally, the paper concludes that the integrated approach can be used to assist risk management of natural hazards, in theory and in practice.
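
    As a hedged illustration of the conventional MCE step the paper mentions, a weighted linear combination of normalised criterion rasters might look as follows; the layer names and weights are invented:

    ```python
    import numpy as np

    # Normalised (0-1) criterion scores on a toy 2x2 raster grid.
    slope      = np.array([[0.2, 0.6], [0.9, 0.4]])
    fuel_load  = np.array([[0.7, 0.5], [0.8, 0.3]])
    population = np.array([[0.1, 0.9], [0.6, 0.2]])

    weights = {"slope": 0.3, "fuel": 0.45, "pop": 0.25}   # must sum to 1
    risk = (weights["slope"] * slope
            + weights["fuel"] * fuel_load
            + weights["pop"] * population)
    print(risk)   # composite bushfire risk surface for decision-making
    ```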

  16. A hybrid model to assess the impact of climate variability on streamflow for an ungauged mountainous basin

    NASA Astrophysics Data System (ADS)

    Wang, Chong; Xu, Jianhua; Chen, Yaning; Bai, Ling; Chen, Zhongsheng

    2018-04-01

    Quantitatively assessing the impact of climate variability on streamflow in an ungauged mountainous basin is a difficult and challenging task. In this study, a hybrid model combining a downscaling method based on earth data products, back propagation artificial neural networks (BPANN) and a weights connection method was developed to explore an approach for solving this problem. To validate the applicability of the hybrid model, the Kumarik River and Toshkan River, two headwaters of the Aksu River, were employed to assess the impact of climate variability on streamflow using this hybrid model. The hybrid model performed well, and the quantitative assessment results for the two headwaters are: (1) precipitation increased by 48.5 and 41.0 mm per decade in the Kumarik and Toshkan catchments, respectively, and the average annual temperature increased by 0.1 °C per decade in both catchments from 1980 to 2012; (2) with the warming and wetting climate, streamflow increased by 1.5 × 10⁸ and 3.3 × 10⁸ m³ per decade in the Kumarik River and the Toshkan River, respectively; and (3) the contributions of temperature and precipitation to the streamflow change were 64.01 ± 7.34% and 35.99 ± 7.34% in the Kumarik catchment, and 47.72 ± 8.10% and 52.26 ± 8.10% in the Toshkan catchment, respectively. Our study introduces a feasible hybrid model for assessing the impact of climate variability on streamflow, which can be used in ungauged mountainous basins of Northwest China.
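
    The paper's "weights connection method" for attributing streamflow change to temperature and precipitation is plausibly akin to Garson's connection-weights algorithm for neural networks; under that assumption, a minimal sketch with invented weights:

    ```python
    import numpy as np

    def garson(w_ih, w_ho):
        """Relative importance of inputs from MLP weights (connection-weights approach).

        w_ih : input-to-hidden weight matrix, shape (n_inputs, n_hidden)
        w_ho : hidden-to-output weights, shape (n_hidden,)
        """
        c = np.abs(w_ih) * np.abs(w_ho)   # contribution of each input via each hidden node
        r = c / c.sum(axis=0)             # share of each input within a hidden node
        importance = r.sum(axis=1)
        return importance / importance.sum()

    # Illustrative weights for 2 inputs (temperature, precipitation) and 3 hidden nodes.
    w_ih = np.array([[0.8, -0.4, 0.3],
                     [0.5, 0.9, -0.7]])
    w_ho = np.array([0.6, -0.3, 0.5])
    print(garson(w_ih, w_ho))             # input contributions summing to 1
    ```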

  17. A General Model for Testing Mediation and Moderation Effects

    PubMed Central

    MacKinnon, David P.

    2010-01-01

    This paper describes methods for testing mediation and moderation effects in a dataset, both together and separately. Investigations of this kind are especially valuable in prevention research to obtain information on the process by which a program achieves its effects and whether the program is effective for subgroups of individuals. A general model that simultaneously estimates mediation and moderation effects is presented, and the utility of combining the effects into a single model is described. Possible effects of interest in the model are explained, as are statistical methods to assess these effects. The methods are further illustrated in a hypothetical prevention program example. PMID:19003535
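
    A minimal sketch of the standard product-of-coefficients mediation test that models of this kind build on (a simplified special case with a Sobel standard error, not the paper's full combined mediation-moderation model):

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 500
    x = rng.normal(size=n)                      # program exposure
    m = 0.5 * x + rng.normal(size=n)            # mediator
    y = 0.4 * m + 0.1 * x + rng.normal(size=n)  # outcome

    a_fit = sm.OLS(m, sm.add_constant(x)).fit()                         # M = i1 + a*X
    b_fit = sm.OLS(y, sm.add_constant(np.column_stack([x, m]))).fit()   # Y = i2 + c'X + b*M

    a, b = a_fit.params[1], b_fit.params[2]
    sa, sb = a_fit.bse[1], b_fit.bse[2]
    indirect = a * b                                       # mediated (indirect) effect
    sobel_se = np.sqrt(a**2 * sb**2 + b**2 * sa**2)        # first-order Sobel SE
    print(indirect, indirect / sobel_se)                   # effect and z statistic
    ```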

  18. Integrated Method for Personal Thermal Comfort Assessment and Optimization through Users' Feedback, IoT and Machine Learning: A Case Study †.

    PubMed

    Salamone, Francesco; Belussi, Lorenzo; Currò, Cristian; Danza, Ludovico; Ghellere, Matteo; Guazzi, Giulia; Lenzi, Bruno; Megale, Valentino; Meroni, Italo

    2018-05-17

    Thermal comfort has become a key issue in building performance assessment as well as energy efficiency. Three methods are mainly recognized for its assessment. Two of them, based on standardized methodologies, approach the problem by considering the indoor environment in steady-state conditions (PMV and PPD); the third considers users as active subjects whose thermal perception is influenced by outdoor climatic conditions (adaptive approach). The latter method is the starting point for investigating thermal comfort from an overall perspective by considering endogenous variables besides the traditional physical and environmental ones. Following this perspective, the paper describes the results of an in-field investigation of thermal conditions through the use of nearable and wearable solutions, parametric models and machine learning techniques. The aim of the research is to explore the reliability of IoT-based solutions combined with advanced algorithms, in order to create a replicable framework for the assessment and improvement of user thermal satisfaction. For this purpose, an experimental test in real offices was carried out involving eight workers. Parametric models are applied for the assessment of thermal comfort; IoT solutions are used to monitor the environmental variables and the users' parameters; and the machine learning CART method allows prediction of the users' profiles and thermal comfort perception with respect to the indoor environment.
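
    A minimal sketch of a CART classifier of the kind mentioned, mapping monitored variables to reported thermal sensation; the feature set, values and labels are illustrative, not the study's:

    ```python
    from sklearn.tree import DecisionTreeClassifier

    # Illustrative features: air temperature (C), relative humidity (%),
    # heart rate (bpm), skin temperature (C).
    X = [[24.1, 45, 62, 33.5],
         [27.8, 55, 71, 34.6],
         [21.3, 38, 58, 32.1],
         [29.5, 60, 75, 35.0]]
    y = ["neutral", "warm", "cool", "warm"]   # user-reported thermal sensation

    clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
    print(clf.predict([[25.0, 50, 65, 33.9]]))   # predicted sensation for a new reading
    ```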

  19. Integrated Method for Personal Thermal Comfort Assessment and Optimization through Users’ Feedback, IoT and Machine Learning: A Case Study †

    PubMed Central

    Currò, Cristian; Danza, Ludovico; Ghellere, Matteo; Guazzi, Giulia; Lenzi, Bruno; Megale, Valentino; Meroni, Italo

    2018-01-01

    Thermal comfort has become a key issue in building performance assessment as well as energy efficiency. Three methods are mainly recognized for its assessment. Two of them, based on standardized methodologies, approach the problem by considering the indoor environment in steady-state conditions (PMV and PPD); the third considers users as active subjects whose thermal perception is influenced by outdoor climatic conditions (adaptive approach). The latter method is the starting point for investigating thermal comfort from an overall perspective by considering endogenous variables besides the traditional physical and environmental ones. Following this perspective, the paper describes the results of an in-field investigation of thermal conditions through the use of nearable and wearable solutions, parametric models and machine learning techniques. The aim of the research is to explore the reliability of IoT-based solutions combined with advanced algorithms, in order to create a replicable framework for the assessment and improvement of user thermal satisfaction. For this purpose, an experimental test in real offices was carried out involving eight workers. Parametric models are applied for the assessment of thermal comfort; IoT solutions are used to monitor the environmental variables and the users' parameters; and the machine learning CART method allows prediction of the users' profiles and thermal comfort perception with respect to the indoor environment. PMID:29772818

  20. Use of the Attribute Hierarchy Method for Development of Student Cognitive Models and Diagnostic Assessments in Geoscience Education

    NASA Astrophysics Data System (ADS)

    Corrigan, S.; Brodsky, L. M.; Loper, S.; Brown, N.; Curley, J.; Baker, J.; Goss, M.; Castek, J.; Barber, J.

    2010-12-01

    There is a recognized need to better understand student learning in the geosciences (Stofflet, 1994; Zalles, Quallmalz, Gobert and Pallant, 2007). Educators, cognitive psychologists and practicing scientists have also called for instructional approaches that support deep conceptual development (Manduca, Mogk and Stillings, 2004, Libarkin and Kurdziel, 2006). In both cases there is an important role for educational measures that can generate descriptions of how student understanding develops over time and inform instruction. The presenters will suggest one way of responding to these needs by describing the Attribute Hierarchy Method (AHM) of assessment (Leighton, Gierl and Hunka, 2004; Gierl, Cui, Wang and Zhou, 2008) as enacted in a large-scale earth science curriculum development project funded by the Bill and Melinda Gates Foundation. The AHM is one approach to criterion referenced, diagnostic assessment that ties measure design to cognitive models of student learning in order to support justified inferences about students’ understanding and the knowledge required for continued development. The Attribute Hierarchy Method bears potential for researchers and practitioners interested in learning progressions and solves many problems associated with making meaningful, justified inferences about students’ understanding based on their assessment performances. The process followed to design and develop the project’s cognitive models as well as a description of how they are used in subsequent assessment task design will be emphasized in order to demonstrate how the AHM may be applied in the context of geoscience education. Results from over twenty student cognitive interviews, and two hypothesized cognitive models -- one describing a student pathway for understanding rock formation and a second describing a student pathway for increasingly sophisticated use of maps and models in the geosciences - are also described. Sample assessment items will be provided as indications of the final assessment measures. The project’s efforts to create an on-line geoscience curriculum for use in the middle school grades that adapts to student performances by customizing whole lessons, grouping assignments or student feedback will provide a broader context for the discussion.

  1. Theory-based interventions in physical activity: a systematic review of literature in Iran.

    PubMed

    Abdi, Jalal; Eftekhar, Hassan; Estebsari, Fatemeh; Sadeghi, Roya

    2014-11-30

    Lack of physical activity is ranked fourth among the causes of death and chronic disease worldwide. Using models and theories to design, implement, and evaluate health education and health promotion interventions has many advantages. Focusing on models and theories of physical activity, we systematically studied the educational and promotional interventions carried out in Iran from 2003 to 2013. Three information databases were used to systematically select papers using key words: the Iranian Magazine Database (MAGIRAN), the Iran Medical Library (MEDLIB), and the Scientific Information Database (SID). Twenty papers were selected and studied. The Transtheoretical Model (TTM), applied in 9 studies, was the most widespread model in Iran (PENDER in 3 studies, BASNEF in 2, and the Theory of Planned Behavior in 2). With regard to educational methods, almost all studies used a combination of methods, the most widely used being group discussion. Only one integrated study was done. Behavior maintenance was not addressed in 75% of the studies. Almost all studies used self-reporting instruments, and the effectiveness of the educational methods was assessed in none of them. Most of the included studies had several methodological weaknesses, which hinder the validity and applicability of their results. According to the findings, we suggest the necessity of needs assessment when using models, consultation on epidemiology and methodology, addressing maintenance of physical activity, and the use of other theories and models (such as social marketing and social-cognitive theory) and other educational methods such as empirical and complementary ones.

  2. [Hybrid 3-D rendering of the thorax and surface-based virtual bronchoscopy in surgical and interventional therapy control].

    PubMed

    Seemann, M D; Gebicke, K; Luboldt, W; Albes, J M; Vollmar, J; Schäfer, J F; Beinert, T; Englmeier, K H; Bitzer, M; Claussen, C D

    2001-07-01

    The aim of this study was to demonstrate the possibilities of a hybrid rendering method, the combination of a color-coded surface and volume rendering method, together with the feasibility of performing surface-based virtual endoscopy with different representation models in the operative and interventional therapy control of the chest. In 6 consecutive patients with partial lung resection (n = 2) and lung transplantation (n = 4), thin-section spiral computed tomography of the chest was performed. The tracheobronchial system and the introduced metallic stents were visualized using a color-coded surface rendering method. The remaining thoracic structures were visualized using a volume rendering method. For virtual bronchoscopy, the tracheobronchial system was visualized using a triangle surface model, a shaded-surface model and a transparent shaded-surface model. The hybrid 3D visualization exploits the advantages of both the color-coded surface and volume rendering methods and facilitates a clear representation of the tracheobronchial system and the complex topographical relationship of morphological and pathological changes without loss of diagnostic information. Performing virtual bronchoscopy with the transparent shaded-surface model facilitates a reasonable to optimal simultaneous visualization and assessment of the surface structure of the tracheobronchial system and the surrounding mediastinal structures and lesions. Hybrid rendering eases the morphological assessment of anatomical and pathological changes without the need for time-consuming detailed analysis and presentation of source images. Performing virtual bronchoscopy with a transparent shaded-surface model offers a promising alternative to flexible fiberoptic bronchoscopy.

  3. A model-averaging method for assessing groundwater conceptual model uncertainty.

    PubMed

    Ye, Ming; Pohlmann, Karl F; Chapman, Jenny B; Pohll, Greg M; Reeves, Donald M

    2010-01-01

    This study evaluates alternative groundwater models with different recharge and geologic components at the northern Yucca Flat area of the Death Valley Regional Flow System (DVRFS), USA. Recharge over the DVRFS has been estimated using five methods, and five geological interpretations are available at the northern Yucca Flat area. Combining the recharge and geological components together with additional modeling components that represent other hydrogeological conditions yields a total of 25 groundwater flow models. As all the models are plausible given available data and information, evaluating model uncertainty becomes inevitable. On the other hand, hydraulic parameters (e.g., hydraulic conductivity) are uncertain in each model, giving rise to parametric uncertainty. Propagation of the uncertainty in the models and model parameters through groundwater modeling causes predictive uncertainty in model predictions (e.g., hydraulic head and flow). Parametric uncertainty within each model is assessed using Monte Carlo simulation, and model uncertainty is evaluated using the model averaging method. Two model-averaging techniques (on the basis of information criteria and GLUE) are discussed. This study shows that contribution of model uncertainty to predictive uncertainty is significantly larger than that of parametric uncertainty. For the recharge and geological components, uncertainty in the geological interpretations has more significant effect on model predictions than uncertainty in the recharge estimates. In addition, weighted residuals vary more for the different geological models than for different recharge models. Most of the calibrated observations are not important for discriminating between the alternative models, because their weighted residuals vary only slightly from one model to another.
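
    A minimal sketch of information-criterion-based model averaging, one of the two techniques discussed; the criterion values and per-model head predictions are invented:

    ```python
    import numpy as np

    def ic_weights(ic):
        """Model-averaging weights from information-criterion values (e.g. AIC, BIC, KIC)."""
        ic = np.asarray(ic, dtype=float)
        delta = ic - ic.min()              # differences from the best model
        w = np.exp(-0.5 * delta)
        return w / w.sum()

    # Hypothetical criterion values for four of the alternative groundwater models.
    ic = [112.4, 110.1, 115.9, 110.8]
    w = ic_weights(ic)
    prediction = w @ np.array([10.2, 9.8, 11.0, 9.9])   # averaged head prediction (m)
    print(w, prediction)
    ```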

  4. Advantages and limitations of the Five Domains model for assessing welfare impacts associated with vertebrate pest control.

    PubMed

    Beausoleil, N J; Mellor, D J

    2015-01-01

    Many pest control activities have the potential to impact negatively on the welfare of animals, and animal welfare is an important consideration in the development, implementation and evaluation of ethically defensible vertebrate pest control. Thus, reliable and accurate methods for assessing welfare impacts are required. The Five Domains model provides a systematic method for identifying potential or actual welfare impacts associated with an event or situation in four physical or functional domains (nutrition, environment, health or functional status, behaviour) and one mental domain (overall mental or affective state). Here we evaluate the advantages and limitations of the Five Domains model for this purpose and illustrate them using specific examples from a recent assessment of the welfare impacts of poisons used to lethally control possums in New Zealand. The model has a number of advantages which include the following: the systematic identification of a wide range of impacts associated with a variety of control tools; the production of relative rankings of tools in terms of their welfare impacts; the easy incorporation of new information into assessments; and the highlighting of additional information needed. For example, a recent analysis of sodium fluoroacetate (1080) poisoning in possums revealed the need for more information on the period from the onset of clinical signs to the point at which consciousness is lost, as well as on the level of consciousness during or after the occurrence of muscle spasms and seizures. The model is also valuable because it clearly separates physical or functional and affective impacts, encourages more comprehensive consideration of negative affective experiences than has occurred in the past, and allows development and evaluation of targeted mitigation strategies. Caution must be used in interpreting and applying the outputs of the model, most importantly because relative rankings or grades are fundamentally qualitative in nature. Certain domains are more useful for evaluating impacts associated with slower/longer-acting tools than for faster-acting methods, and it may be easier to identify impacts in some domains than others. Overall, we conclude that the Five Domains model advances evaluation of the animal welfare impacts of vertebrate pest control methods, provided users are cognisant of its limitations.

  5. FROM FOLDING THEORIES TO FOLDING PROTEINS: A Review and Assessment of Simulation Studies of Protein Folding and Unfolding

    NASA Astrophysics Data System (ADS)

    Shea, Joan-Emma; Brooks, Charles L., III

    2001-10-01

    Beginning with simplified lattice and continuum "minimalist" models and progressing to detailed atomic models, simulation studies have augmented and directed development of the modern landscape perspective of protein folding. In this review we discuss aspects of detailed atomic simulation methods applied to studies of protein folding free energy surfaces, using biased-sampling free energy methods and temperature-induced protein unfolding. We review studies from each on systems of particular experimental interest and assess the strengths and weaknesses of each approach in the context of "exact" results for both free energies and kinetics of a minimalist model for a beta-barrel protein. We illustrate in detail how each approach is implemented and discuss analysis methods that have been developed as components of these studies. We describe key insights into the relationship between protein topology and the folding mechanism emerging from folding free energy surface calculations. We further describe the determination of detailed "pathways" and models of folding transition states that have resulted from unfolding studies. Our assessment of the two methods suggests that both can provide, often complementary, details of folding mechanism and thermodynamics, but this success relies on (a) adequate sampling of diverse conformational regions for the biased-sampling free energy approach and (b) many trajectories at multiple temperatures for unfolding studies. Furthermore, we find that temperature-induced unfolding provides representatives of folding trajectories only when the topology and sequence (energy) provide a relatively funneled landscape and "off-pathway" intermediates do not exist.

  6. Automated antibody structure prediction using Accelrys tools: Results and best practices

    PubMed Central

    Fasnacht, Marc; Butenhof, Ken; Goupil-Lamy, Anne; Hernandez-Guzman, Francisco; Huang, Hongwei; Yan, Lisa

    2014-01-01

    We describe the methodology and results from our participation in the second Antibody Modeling Assessment experiment. During the experiment we predicted the structure of eleven unpublished antibody Fv fragments. Our prediction methods centered on template-based modeling; potential templates were selected from an antibody database based on their sequence similarity to the target in the framework regions. Depending on the quality of the templates, we constructed models of the antibody framework regions either using a single, chimeric or multiple template approach. The hypervariable loop regions in the initial models were rebuilt by grafting the corresponding regions from suitable templates onto the model. For the H3 loop region, we further refined models using ab initio methods. The final models were subjected to constrained energy minimization to resolve severe local structural problems. The analysis of the models submitted show that Accelrys tools allow for the construction of quite accurate models for the framework and the canonical CDR regions, with RMSDs to the X-ray structure on average below 1 Å for most of these regions. The results show that accurate prediction of the H3 hypervariable loops remains a challenge. Furthermore, model quality assessment of the submitted models show that the models are of quite high quality, with local geometry assessment scores similar to that of the target X-ray structures. Proteins 2014; 82:1583–1598. © 2014 The Authors. Proteins published by Wiley Periodicals, Inc. PMID:24833271

  7. Research on Assessment Methods for Urban Public Transport Development in China

    PubMed Central

    Zou, Linghong; Guo, Hongwei

    2014-01-01

    In recent years, with the rapid increase in urban population, the urban travel demands in Chinese cities have been increasing dramatically. As a result, developing comprehensive urban transport systems becomes an inevitable choice to meet the growing urban travel demands. In urban transport systems, public transport plays the leading role in promoting sustainable urban development. This paper aims to establish an assessment index system for the development level of urban public transport consisting of a target layer, a criterion layer, and an index layer. A review of the existing literature shows that methods used in evaluating urban public transport structure are dominantly qualitative. To overcome this shortcoming, a fuzzy mathematics method is used for describing qualitative issues quantitatively, and AHP (analytic hierarchy process) is used to quantify experts' subjective judgments. The assessment model is established based on the fuzzy AHP. The weight of each index is determined through the AHP and the degree of membership of each index through the fuzzy assessment method to obtain the fuzzy synthetic assessment matrix. Finally, a case study is conducted to verify the rationality and practicability of the assessment system and the proposed assessment method. PMID:25530756
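
    A minimal sketch of the fuzzy AHP machinery described above, assuming a hypothetical three-index pairwise comparison matrix and hypothetical membership grades (the paper's actual index system and expert judgments are not reproduced):

    import numpy as np

    def ahp_weights(pairwise):
        """Principal-eigenvector weights from an AHP pairwise comparison matrix."""
        vals, vecs = np.linalg.eig(pairwise)
        w = np.real(vecs[:, np.argmax(np.real(vals))])
        return w / w.sum()

    # hypothetical 3-index comparison, e.g. coverage vs punctuality vs accessibility
    A = np.array([[1,   3,   5],
                  [1/3, 1,   2],
                  [1/5, 1/2, 1]])
    w = ahp_weights(A)

    # hypothetical degrees of membership of each index in 4 rating grades
    R = np.array([[0.1, 0.3, 0.4, 0.2],
                  [0.2, 0.4, 0.3, 0.1],
                  [0.3, 0.4, 0.2, 0.1]])
    B = w @ R   # fuzzy synthetic assessment vector over the grades
    print(w.round(3), B.round(3))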

  8. Risk Assessment of Alzheimer's Disease using the Information Diffusion Model from Structural Magnetic Resonance Imaging.

    PubMed

    Beheshti, Iman; Olya, Hossain G T; Demirel, Hasan

    2016-04-05

    Recently, automatic risk assessment methods have been a target for the detection of Alzheimer's disease (AD) risk. This study aims to develop an automatic computer-aided technique for risk assessment of AD using information diffusion theory. Information diffusion is a set-valued fuzzy mathematics technique used for risk assessment of natural phenomena characterized by fuzziness (uncertainty) and incomplete data. Data were obtained from voxel-based morphometry analysis of structural magnetic resonance imaging. The information diffusion model results revealed that the risk of AD increases with a reduction of the normalized gray matter ratio (p > 0.5, normalized gray matter ratio <40%). The information diffusion model results were evaluated by calculation of the correlation of two traditional risk assessments of AD, the Mini-Mental State Examination and the Clinical Dementia Rating. The correlation results revealed that the information diffusion model findings were in line with Mini-Mental State Examination and Clinical Dementia Rating results. Application of the information diffusion model contributes to the computerization of risk assessment of AD, which has a practical implication for the early detection of AD.
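
    A minimal sketch of normal information diffusion, the estimator underlying this type of risk model; the sample values, monitoring points, and diffusion width h below are hypothetical:

    import numpy as np

    def information_diffusion(samples, u, h):
        """Normal information diffusion: spread each observation over the
        monitoring points u with a Gaussian kernel of width h, normalize each
        observation's contribution to unit mass, and return the estimated
        probability at each monitoring point."""
        samples = np.asarray(samples, dtype=float)[:, None]   # shape (n, 1)
        f = np.exp(-(samples - u[None, :]) ** 2 / (2 * h ** 2))
        f /= f.sum(axis=1, keepdims=True)    # each sample diffuses unit mass
        q = f.sum(axis=0)                    # accumulated information
        return q / q.sum()

    # hypothetical normalized gray-matter ratios (%) from a small sample
    ratios = [34.0, 36.5, 38.0, 41.0, 43.5, 45.0]
    u = np.linspace(30, 50, 21)              # monitoring points
    p = information_diffusion(ratios, u, h=2.0)
    risk_below_40 = p[u < 40].sum()          # estimated P(ratio < 40%)
    print(round(risk_below_40, 3))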

  9. Northwest Montana/North Idaho transmission corridor study: a computer-assisted corridor location and impact evaluation assessment

    Treesearch

    Timothy J. Murray; Daniel J. Bisenius; Jay G. Marcotte

    1979-01-01

    A computer-assisted method was used to locate and evaluate approximately 1,200 miles of alternative corridors within an 8,000 square mile study region. The method involved in-depth impact analyses for nine major location criteria or determinant models. Regional "experts" from the Rocky Mountain area participated with BPA in developing model structure....

  10. Estimation of Standard Error of Regression Effects in Latent Regression Models Using Binder's Linearization. Research Report. ETS RR-07-09

    ERIC Educational Resources Information Center

    Li, Deping; Oranje, Andreas

    2007-01-01

    Two versions of a general method for approximating standard error of regression effect estimates within an IRT-based latent regression model are compared. The general method is based on Binder's (1983) approach, accounting for complex samples and finite populations by Taylor series linearization. In contrast, the current National Assessment of…

  11. Methods for assessing the stability of slopes during earthquakes-A retrospective

    USGS Publications Warehouse

    Jibson, R.W.

    2011-01-01

    During the twentieth century, several methods to assess the stability of slopes during earthquakes were developed. Pseudostatic analysis was the earliest method; it involved simply adding a permanent body force representing the earthquake shaking to a static limit-equilibrium analysis. Stress-deformation analysis, a later development, involved much more complex modeling of slopes using a mesh in which the internal stresses and strains within elements are computed based on the applied external loads, including gravity and seismic loads. Stress-deformation analysis provided the most realistic model of slope behavior, but it is very complex and requires a high density of high-quality soil-property data as well as an accurate model of soil behavior. In 1965, Newmark developed a method that effectively bridges the gap between these two types of analysis. His sliding-block model is easy to apply and provides a useful index of co-seismic slope performance. Subsequent modifications to sliding-block analysis have made it applicable to a wider range of landslide types. Sliding-block analysis provides perhaps the greatest utility of all the types of analysis. It is far easier to apply than stress-deformation analysis, and it yields much more useful information than does pseudostatic analysis. © 2010.
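
    A minimal sketch of Newmark's rigid sliding-block analysis, assuming a toy acceleration pulse and critical acceleration (real applications integrate recorded strong-motion accelerograms):

    import numpy as np

    def newmark_displacement(accel, dt, a_crit):
        """Rigid sliding-block (Newmark, 1965) displacement: integrate the part
        of the ground acceleration exceeding the critical acceleration twice,
        letting the block decelerate back to zero relative velocity between
        pulses."""
        vel, disp = 0.0, 0.0
        for a in accel:
            if a > a_crit or vel > 0.0:       # slides while accel exceeds a_crit,
                vel += (a - a_crit) * dt      # then decelerates until it stops
                vel = max(vel, 0.0)
                disp += vel * dt
        return disp

    # toy acceleration history (m/s^2): a decaying sinusoidal pulse
    dt = 0.01
    t = np.arange(0, 10, dt)
    accel = 3.0 * np.sin(2 * np.pi * 1.0 * t) * np.exp(-0.3 * t)
    print(f"{newmark_displacement(accel, dt, a_crit=1.0):.3f} m")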

  12. Methods for Probabilistic Radiological Dose Assessment at a High-Level Radioactive Waste Repository.

    NASA Astrophysics Data System (ADS)

    Maheras, Steven James

    Methods were developed to assess and evaluate the uncertainty in offsite and onsite radiological dose at a high-level radioactive waste repository to show reasonable assurance that compliance with applicable regulatory requirements will be achieved. Uncertainty in offsite dose was assessed by employing a stochastic precode in conjunction with Monte Carlo simulation using an offsite radiological dose assessment code. Uncertainty in onsite dose was assessed by employing a discrete-event simulation model of repository operations in conjunction with an occupational radiological dose assessment model. Complementary cumulative distribution functions of offsite and onsite dose were used to illustrate reasonable assurance. Offsite dose analyses were performed for iodine -129, cesium-137, strontium-90, and plutonium-239. Complementary cumulative distribution functions of offsite dose were constructed; offsite dose was lognormally distributed with a two order of magnitude range. However, plutonium-239 results were not lognormally distributed and exhibited less than one order of magnitude range. Onsite dose analyses were performed for the preliminary inspection, receiving and handling, and the underground areas of the repository. Complementary cumulative distribution functions of onsite dose were constructed and exhibited less than one order of magnitude range. A preliminary sensitivity analysis of the receiving and handling areas was conducted using a regression metamodel. Sensitivity coefficients and partial correlation coefficients were used as measures of sensitivity. Model output was most sensitive to parameters related to cask handling operations. Model output showed little sensitivity to parameters related to cask inspections.
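
    A minimal sketch of building a complementary cumulative distribution function from Monte Carlo dose realizations; the lognormal parameters below are hypothetical, chosen only to span roughly two orders of magnitude:

    import numpy as np

    def ccdf(samples):
        """Empirical complementary cumulative distribution: P(dose > d)."""
        d = np.sort(samples)
        p = 1.0 - np.arange(1, len(d) + 1) / len(d)
        return d, p

    # hypothetical lognormal offsite-dose realizations (mSv)
    rng = np.random.default_rng(1)
    dose = rng.lognormal(mean=np.log(1e-3), sigma=1.15, size=10_000)
    d, p = ccdf(dose)
    idx = np.searchsorted(d, 1e-2)           # exceedance probability at 0.01 mSv
    print(f"P(dose > 1e-2 mSv) ~ {p[idx]:.3f}")
    print(f"dose exceeded with 5% probability: {np.quantile(dose, 0.95):.2e} mSv")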

  13. Comparison of the Current Center of Site Annual Neshap Dose Modeling at the Savannah River Site with Other Assessment Methods.

    PubMed

    Minter, Kelsey M; Jannik, G Timothy; Stagich, Brooke H; Dixon, Kenneth L; Newton, Joseph R

    2018-04-01

    The U.S. Environmental Protection Agency (EPA) requires the use of the model CAP88 to estimate the total effective dose (TED) to an offsite maximally exposed individual (MEI) for demonstrating compliance with 40 CFR 61, Subpart H: The National Emission Standards for Hazardous Air Pollutants (NESHAP) regulations. For NESHAP compliance at the Savannah River Site (SRS), the EPA, the U.S. Department of Energy (DOE), South Carolina's Department of Health and Environmental Control, and SRS approved a dose assessment method in 1991 that models all radiological emissions as if originating from a generalized center of site (COS) location at two allowable stack heights (0 m and 61 m). However, due to changes in SRS missions, radiological emissions are no longer evenly distributed about the COS. An area-specific simulation of the 2015 SRS radiological airborne emissions was conducted to compare to the current COS method. The results produced a slightly higher dose estimate (2.97 × 10 mSv vs. 2.22 × 10 mSv), marginally changed the overall MEI location, and noted that H-Area tritium emissions dominated the dose. Thus, an H-Area dose model was executed as a potential simplification of the area-specific simulation by adopting the COS methodology and modeling all site emissions from a single location in H-Area using six stack heights that reference stacks specific to the tritium production facilities within H-Area. This "H-Area Tritium Stacks" method produced a small increase in TED estimates (3.03 × 10 mSv vs. 2.97 × 10 mSv) when compared to the area-specific simulation. This suggests that the current COS method is still appropriate for demonstrating compliance with NESHAP regulations but that changing to the H-Area Tritium Stacks assessment method may now be a more appropriate representation of operations at SRS.

  14. Quality assessment of protein model-structures based on structural and functional similarities

    PubMed Central

    2012-01-01

    Background Experimental determination of protein 3D structures is expensive, time consuming and sometimes impossible. The gap between the number of protein structures deposited in the worldwide Protein Data Bank and the number of sequenced proteins constantly broadens. Computational modeling is deemed to be one of the ways to deal with the problem. Although protein 3D structure prediction is a difficult task, many tools are available. These tools can build a model from a sequence or from partial structural information, e.g. contact maps. Consequently, biologists have the ability to generate a putative 3D structure model of any protein automatically. However, the main issue becomes evaluation of the model quality, which is one of the most important challenges of structural biology. Results GOBA - Gene Ontology-Based Assessment is a novel Protein Model Quality Assessment Program. It estimates the compatibility between a model-structure and its expected function. GOBA is based on the assumption that a high quality model is expected to be structurally similar to proteins functionally similar to the prediction target. Whereas DALI is used to measure structure similarity, protein functional similarity is quantified using standardized and hierarchical description of proteins provided by Gene Ontology combined with Wang's algorithm for calculating semantic similarity. Two approaches are proposed to express the quality of protein model-structures. One is a single model quality assessment method, the other is its modification, which provides a relative measure of model quality. Exhaustive evaluation is performed on data sets of model-structures submitted to the CASP8 and CASP9 contests. Conclusions The validation shows that the method is able to discriminate between good and bad model-structures. The best of the tested GOBA scores achieved mean Pearson correlations of 0.74 and 0.8 to the observed quality of models in our CASP8- and CASP9-based validation sets. GOBA also obtained the best result for two targets of CASP8, and one of CASP9, compared to the contest participants. Consequently, GOBA offers a novel single model quality assessment program that addresses the practical needs of biologists. In conjunction with other Model Quality Assessment Programs (MQAPs), it would prove useful for the evaluation of single protein models. PMID:22998498
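
    A minimal sketch of Wang's semantic similarity, the Gene Ontology component of GOBA's functional comparison; the toy DAG and GO identifiers are placeholders, while the edge weights (0.8 for is_a, 0.6 for part_of) follow Wang et al.'s published scheme:

    # parents: term -> list of (parent, edge_weight)
    def s_values(term, parents):
        """Wang's S-values: semantic contribution of each ancestor to `term`,
        propagated multiplicatively along the best path."""
        S = {term: 1.0}
        stack = [term]
        while stack:
            t = stack.pop()
            for p, w in parents.get(t, []):
                contrib = w * S[t]
                if contrib > S.get(p, 0.0):   # keep the best path to each ancestor
                    S[p] = contrib
                    stack.append(p)
        return S

    def wang_similarity(a, b, parents):
        SA, SB = s_values(a, parents), s_values(b, parents)
        common = SA.keys() & SB.keys()
        return sum(SA[t] + SB[t] for t in common) / (sum(SA.values()) + sum(SB.values()))

    # toy GO fragment (hypothetical IDs)
    parents = {
        "GO:3": [("GO:1", 0.8)],
        "GO:4": [("GO:1", 0.8), ("GO:2", 0.6)],
        "GO:5": [("GO:4", 0.8)],
    }
    print(round(wang_similarity("GO:3", "GO:5", parents), 3))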

  15. Evaluating Differential Effects Using Regression Interactions and Regression Mixture Models

    ERIC Educational Resources Information Center

    Van Horn, M. Lee; Jaki, Thomas; Masyn, Katherine; Howe, George; Feaster, Daniel J.; Lamont, Andrea E.; George, Melissa R. W.; Kim, Minjung

    2015-01-01

    Research increasingly emphasizes understanding differential effects. This article focuses on understanding regression mixture models, which are relatively new statistical methods for assessing differential effects by comparing results to using an interactive term in linear regression. The research questions which each model answers, their…

  16. GUIDELINES TO ASSESSING REGIONAL VULNERABILITIES

    EPA Science Inventory

    Decision-makers today face increasingly complex environmental problems that require integrative and innovative approaches for analyzing, modeling, and interpreting various types of information. ReVA acknowledges this need and is designed to evaluate methods and models for synthe...

  17. a Fast Method for Measuring the Similarity Between 3d Model and 3d Point Cloud

    NASA Astrophysics Data System (ADS)

    Zhang, Zongliang; Li, Jonathan; Li, Xin; Lin, Yangbin; Zhang, Shanxin; Wang, Cheng

    2016-06-01

    This paper proposes a fast method for measuring the partial Similarity between 3D Model and 3D point Cloud (SimMC). It is crucial to measure SimMC for many point cloud-related applications such as 3D object retrieval and inverse procedural modelling. In our proposed method, the surface area of the model and the Distance from Model to point Cloud (DistMC) are exploited as measurements to calculate SimMC. Here, DistMC is defined as the weighted average of the distances between points sampled from the model and the point cloud. Similarly, the Distance from point Cloud to Model (DistCM) is defined as the average of the distances between points in the point cloud and the model. To avoid the heavy computational burden that calculating DistCM imposes on some traditional methods, we define SimMC as the ratio of the weighted surface area of the model to DistMC. Compared to those traditional SimMC measuring methods that are only able to measure global similarity, our method is capable of measuring partial similarity by employing a distance-weighted strategy. Moreover, our method is faster than other partial similarity assessment methods. We demonstrate the superiority of our method both on synthetic data and laser scanning data.
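
    A minimal sketch of the SimMC computation under stated assumptions (equal surface-patch areas and area-weighted nearest-neighbour distances via a k-d tree); the paper's exact distance-weighting scheme is not reproduced:

    import numpy as np
    from scipy.spatial import cKDTree

    def sim_mc(model_points, model_areas, cloud):
        """SimMC sketch: ratio of the model's surface area to DistMC, where
        DistMC is the area-weighted average nearest distance from points
        sampled on the model surface to the point cloud (avoiding the
        costlier cloud-to-model direction)."""
        d, _ = cKDTree(cloud).query(model_points)    # nearest cloud point per sample
        w = model_areas / model_areas.sum()
        dist_mc = float(np.dot(w, d))
        return model_areas.sum() / dist_mc

    # toy data: a unit-square "model" sampled on a grid vs. a noisy scan of it
    rng = np.random.default_rng(2)
    g = np.linspace(0, 1, 10)
    model_pts = np.array([(x, y, 0.0) for x in g for y in g])
    areas = np.full(len(model_pts), 1.0 / len(model_pts))   # equal patch areas
    cloud = model_pts + rng.normal(0, 0.01, model_pts.shape)
    print(round(sim_mc(model_pts, areas, cloud), 1))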

  18. Assessing the applicability of template-based protein docking in the twilight zone.

    PubMed

    Negroni, Jacopo; Mosca, Roberto; Aloy, Patrick

    2014-09-02

    The structural modeling of protein interactions in the absence of close homologous templates is a challenging task. Recently, template-based docking methods have emerged to exploit local structural similarities to help ab-initio protocols provide reliable 3D models for protein interactions. In this work, we critically assess the performance of template-based docking in the twilight zone. Our results show that, while it is possible to find templates for nearly all known interactions, the quality of the obtained models is rather limited. We can increase the precision of the models at the expense of coverage, but this drastically reduces the potential applicability of the method, as illustrated by the whole-interactome modeling of nine organisms. Template-based docking is likely to play an important role in the structural characterization of the interaction space, but we still need to improve the repertoire of structural templates onto which we can reliably model protein complexes. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. Determining blood and plasma volumes using bioelectrical response spectroscopy

    NASA Technical Reports Server (NTRS)

    Siconolfi, S. F.; Nusynowitz, M. L.; Suire, S. S.; Moore, A. D. Jr; Leig, J.

    1996-01-01

    We hypothesized that an electric field (inductance) produced by charged blood components passing through the many branches of arteries and veins could assess total blood volume (TBV) or plasma volume (PV). Individual (N = 29) electrical circuits (inductors, two resistors, and a capacitor) were determined from bioelectrical response spectroscopy (BERS) using a Hewlett Packard 4284A Precision LCR Meter. Inductance, capacitance, and resistance from the circuits of 19 subjects modeled TBV (sum of PV and computed red cell volume) and PV (based on 125I-albumin). Each model (N = 10, cross validation group) had good validity based on 1) mean differences (-2.3 to 1.5%) between the methods that were not significant and less than the propagated errors (+/- 5.2% for TBV and PV), 2) high correlations (r > 0.92) with low SEE (< 7.7%) between dilution and BERS assessments, and 3) Bland-Altman pairwise comparisons that indicated "clinical equivalency" between the methods. Given the limitation of this study (10 validity subjects), we concluded that BERS models accurately assessed TBV and PV. Further evaluations of the models' validities are needed before they are used in clinical or research settings.
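
    A minimal sketch of the Bland-Altman agreement check used in validity criterion 3; the reference and estimated plasma volumes below are hypothetical:

    import numpy as np

    def bland_altman(reference, test):
        """Bland-Altman agreement statistics: mean bias and 95% limits of
        agreement between a dilution reference and a model-based estimate."""
        diff = np.asarray(test, dtype=float) - np.asarray(reference, dtype=float)
        bias = diff.mean()
        loa = 1.96 * diff.std(ddof=1)
        return bias, bias - loa, bias + loa

    # hypothetical plasma volumes (mL): 125I-albumin reference vs. estimate
    ref = np.array([3100, 2850, 3300, 2950, 3550, 3050, 2700, 3400, 3150, 2900])
    est = ref + np.random.default_rng(3).normal(0, 90, ref.size)
    bias, lo, hi = bland_altman(ref, est)
    print(f"bias {bias:.0f} mL, limits of agreement [{lo:.0f}, {hi:.0f}] mL")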

  20. A modified precise integration method for transient dynamic analysis in structural systems with multiple damping models

    NASA Astrophysics Data System (ADS)

    Ding, Zhe; Li, Li; Hu, Yujin

    2018-01-01

    Sophisticated engineering systems are usually assembled by subcomponents with significantly different levels of energy dissipation. Therefore, these damping systems often contain multiple damping models and lead to great difficulties in analyzing. This paper aims at developing a time integration method for structural systems with multiple damping models. The dynamical system is first represented by a generally damped model. Based on this, a new extended state-space method for the damped system is derived. A modified precise integration method with Gauss-Legendre quadrature is then proposed. The numerical stability and accuracy of the proposed integration method are discussed in detail. It is verified that the method is conditionally stable and has inherent algorithmic damping, period error and amplitude decay. Numerical examples are provided to assess the performance of the proposed method compared with other methods. It is demonstrated that the method is more accurate than other methods with rather good efficiency and the stable condition is easy to be satisfied in practice.
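
    A minimal sketch of the precise integration idea for a state-space system, shown for the homogeneous (unforced) case with 2^N scaling of the increment matrix; the paper's Gauss-Legendre treatment of the forcing term is omitted:

    import numpy as np

    def precise_integration_matrix(A, dt, N=20):
        """Precise integration method: exp(A*dt) via 2^N scaling-and-squaring,
        accumulating only the increment part Ta = exp(A*tau) - I to preserve
        numerical precision (tau = dt / 2^N)."""
        tau = dt / 2 ** N
        At = A * tau
        Ta = At + At @ At / 2 + At @ At @ At / 6 + At @ At @ At @ At / 24
        for _ in range(N):
            Ta = 2 * Ta + Ta @ Ta          # (I + Ta)^2 = I + (2*Ta + Ta^2)
        return np.eye(A.shape[0]) + Ta

    # free vibration of a damped SDOF system in state-space form x' = A x
    m, c, k = 1.0, 0.4, 25.0
    A = np.array([[0.0, 1.0], [-k / m, -c / m]])
    T = precise_integration_matrix(A, dt=0.01)
    x = np.array([1.0, 0.0])               # initial displacement, zero velocity
    for _ in range(100):                   # march 1 s of response
        x = T @ x
    print(x.round(4))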

  1. Assessment of PWR Steam Generator modelling in RELAP5/MOD2. International Agreement Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Putney, J.M.; Preece, R.J.

    1993-06-01

    An assessment of Steam Generator (SG) modelling in the PWR thermal-hydraulic code RELAP5/MOD2 is presented. The assessment is based on a review of code assessment calculations performed in the UK and elsewhere, detailed calculations against a series of commissioning tests carried out on the Wolf Creek PWR and analytical investigations of the phenomena involved in normal and abnormal SG operation. A number of modelling deficiencies are identified and their implications for PWR safety analysis are discussed -- including methods for compensating for the deficiencies through changes to the input deck. Consideration is also given as to whether the deficiencies will still be present in the successor code RELAP5/MOD3.

  2. Toward Dynamic Ocean Management: Fisheries assessment and climate projections informed by community developed habitat models based on dynamic coastal oceanography

    NASA Astrophysics Data System (ADS)

    Kohut, J. T.; Manderson, J.; Palamara, L. J.; Saba, V. S.; Saba, G.; Hare, J. A.; Curchitser, E. N.; Moore, P.; Seibel, B.; DiDomenico, G.

    2016-12-01

    Through a multidisciplinary study group of experts in marine ecology, physical oceanography and stock assessment from the fishing industry, government and academia we developed a method to explicitly account for shifting habitat distributions in fish population assessments. We used data from field surveys throughout the Northwest Atlantic Ocean to develop a parametric thermal niche model for an important short-lived pelagic forage fish, Atlantic Butterfish. This niche model was coupled to a hindcast of daily bottom water temperature derived from a regional numerical ocean model in order to project daily thermal habitat suitability over the last 40 years. This ecological hindcast was used to estimate the proportion of thermal habitat suitability available on the U.S. Northeast Shelf that was sampled on fishery-independent surveys, accounting for the relative motions of thermal habitat and the trajectory of sampling on the survey. The method and the habitat-based estimates of availability were integrated into the catchability estimate used to scale population size in the butterfish stock assessment model accepted by the reviewers of the 59th NEFSC stock assessment review, as well as the mid-Atlantic Council's Scientific and Statistical Committee. The contribution of the availability estimate (along with an estimate of detectability) allowed for the development of fishery reference points, a change in stock status from unknown to known, and the establishment of a directed fishery with an allocation of 20,000 metric tons of quota. This presentation will describe how a community-based workgroup utilized ocean observing technologies combined with ocean models to better understand the physical ocean that structures marine ecosystems. Using these approaches we will discuss opportunities to inform ecological hindcasts and climate projections with mechanistic models that link species-specific physiology to climate-based thermal scenarios.
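
    A minimal sketch of the availability calculation described above, assuming, hypothetically, a Gaussian thermal niche; the paper's fitted parametric niche and hindcast temperature fields are replaced by toy values:

    import numpy as np

    def thermal_suitability(T, T_opt, sigma):
        """Hypothetical parametric thermal niche: Gaussian response to bottom
        temperature (the study's exact parametric form is not given here)."""
        return np.exp(-((T - T_opt) ** 2) / (2 * sigma ** 2))

    # toy daily bottom-temperature field over shelf grid cells (deg C)
    rng = np.random.default_rng(4)
    temps = rng.normal(12.0, 3.0, size=500)
    suit = thermal_suitability(temps, T_opt=14.0, sigma=2.5)

    surveyed = rng.random(500) < 0.3          # cells actually visited by the survey
    availability = suit[surveyed].sum() / suit.sum()
    print(f"fraction of thermal habitat sampled: {availability:.2f}")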

  4. Research on Condition Assessment Method of Transmission Tower Under the Action of Strong Wind

    NASA Astrophysics Data System (ADS)

    Huang, Ren-mou; An, Li-qiang; Zhang, Rong-lun; Wu, Jiong; Liang, Ya-feng

    2018-03-01

    Transmission towers are often damaged by severe weather such as strong wind, which can cause collapse through yielding and fracture of the tower members. To address this issue, this paper proposes a method for assessing the operating condition of transmission towers under strong wind. A reasonable assessment index system is first established; the internal forces in the tower members are then solved and their stability determined through mechanical analysis of a finite element model of the transmission tower. The condition risk level of the tower is finally determined with the analytic hierarchy process, accounting for the differing influences of other factors such as corrosion and looseness of members and terrain slope. The method was applied to assess wind-induced tower collapses on the 110 kV Bao Yi II line in Wenchang City, Hainan Province; the results show that the method can assess the condition of transmission towers under strong wind and can guide improvements to their windproof capability.

  5. Assessment of winter wheat loss risk impacted by climate change from 1982 to 2011

    NASA Astrophysics Data System (ADS)

    Du, Xin

    2017-04-01

    The world's farmers will face increasing pressure to grow more food on less land in the coming decades, as continued population growth and the diversion of agricultural products to biofuels are expected to extend well into the future. The increased demand for food supply worldwide therefore calls for improved accuracy of crop productivity estimation and assessment of grain production loss risk. Extensive studies have evaluated the impacts of climate change on crop production using various crop models driven by global or regional climate model (GCM/RCM) output. However, such assessments are plagued by uncertainty in future climate change scenarios and the complexity of crop models; given uncertain climate conditions and a lack of model parameters, these methods are strictly limited in application. In this study, an empirical approach for assessing crop loss risk due to water stress was established and used to evaluate the risk of winter wheat loss in China, the United States, Germany, France, and the United Kingdom. The average winter wheat loss attributable to water stress for the three European countries is about -931 kg/ha, which is markedly higher than that in China (-570 kg/ha) and the United States (-367 kg/ha). Our study has important implications for further application of operational assessment of crop loss risk at a country or region scale. Future studies should focus on using higher-spatial-resolution remote sensing data, combining actual evapotranspiration to estimate water stress, improving the method for downscaling statistical crop yield data, and establishing a more rational and elaborate zoning method.

  6. Comparison of numerical weather prediction based deterministic and probabilistic wind resource assessment methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Jie; Draxl, Caroline; Hopson, Thomas

    Numerical weather prediction (NWP) models have been widely used for wind resource assessment. Model runs with higher spatial resolution are generally more accurate, yet computationally expensive. An alternative approach is to use data generated by a low-resolution NWP model in conjunction with statistical methods. In order to analyze the accuracy and computational efficiency of different types of NWP-based wind resource assessment methods, this paper performs a comparison of three deterministic and probabilistic NWP-based wind resource assessment methodologies: (i) a coarse-resolution (0.5 degrees x 0.67 degrees) global reanalysis data set, the Modern-Era Retrospective Analysis for Research and Applications (MERRA); (ii) an analog ensemble methodology based on MERRA, which provides both deterministic and probabilistic predictions; and (iii) a fine-resolution (2-km) NWP data set, the Wind Integration National Dataset (WIND) Toolkit, based on the Weather Research and Forecasting model. Results show that: (i) as expected, the analog ensemble and WIND Toolkit perform significantly better than MERRA, confirming their ability to downscale coarse estimates; (ii) the analog ensemble provides the best estimate of the multi-year wind distribution at seven of the nine sites, while the WIND Toolkit is the best at one site; (iii) the WIND Toolkit is more accurate in estimating the distribution of hourly wind speed differences, which characterizes the wind variability, at five of the available sites, with the analog ensemble being best at the remaining four locations; and (iv) the analog ensemble computational cost is negligible, whereas the WIND Toolkit requires large computational resources. Future efforts could focus on the combination of the analog ensemble with intermediate-resolution (e.g., 10-15 km) NWP estimates, to considerably reduce the computational burden while providing accurate deterministic estimates and reliable probabilistic assessments.
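
    A minimal sketch of the analog ensemble step, using hypothetical predictors and observations; predictors are rescaled so that a Euclidean similarity match is meaningful:

    import numpy as np

    def analog_ensemble(train_pred, train_obs, target_pred, k=20):
        """Analog ensemble: for each coarse-NWP target prediction, find the k
        most similar historical predictions and return the matching
        observations as an ensemble (mean = deterministic estimate,
        spread = uncertainty)."""
        scale = train_pred.std(axis=0)       # put predictors on comparable scales
        ensembles = []
        for x in np.atleast_2d(target_pred):
            d = np.linalg.norm((train_pred - x) / scale, axis=1)
            ensembles.append(train_obs[np.argsort(d)[:k]])
        return np.array(ensembles)

    # toy data: reanalysis-like wind speed and direction vs. observed speed
    rng = np.random.default_rng(5)
    pred = rng.uniform([0, 0], [15, 360], size=(2000, 2))
    obs = 1.1 * pred[:, 0] + rng.normal(0, 0.8, 2000)    # hypothetical truth
    ens = analog_ensemble(pred, obs, np.array([[8.0, 120.0]]))
    print(f"mean {ens.mean():.2f} m/s, spread {ens.std():.2f} m/s")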

  7. Developing Mathematical Provisions for Assessment of Liquid Hydrocarbon Emissions in Emergency Situations

    NASA Astrophysics Data System (ADS)

    Zemenkova, M. Yu; Zemenkov, Yu D.; Shantarin, V. D.

    2016-10-01

    The paper reviews the development of a methodology for calculating hydrocarbon emissions during seepage and evaporation, used to monitor the reliability and safety of hydrocarbon storage and transportation. The authors have analyzed existing methods, models and techniques for assessing the amount of evaporated oil. Models used for predicting the material balance of multicomponent two-phase systems have been discussed. The results of modeling open-air hydrocarbon evaporation from an oil spill are provided and exemplified by an emergency pit. Dependences and systems of differential equations have been obtained to assess mass-transfer parameters from the open surface of a liquid multicomponent mixture.
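
    The paper's system of differential equations is not reproduced here, but a standard film-model form for evaporation from an open liquid surface, of the kind such methodologies build on, is:

    \[
      J = k_m \left( C_s - C_\infty \right), \qquad
      C_s = \frac{p_{\mathrm{sat}}(T)\, M}{R\, T}, \qquad
      \dot{m} = J\, A,
    \]

    where J is the evaporative mass flux, k_m the mass-transfer coefficient, C_s the saturated vapour concentration at the liquid surface (from the vapour pressure p_sat, molar mass M, gas constant R, and temperature T), C_inf the ambient vapour concentration, and A the spill or pit surface area.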

  8. Automatically quantifying the scientific quality and sensationalism of news records mentioning pandemics: validating a maximum entropy machine-learning model.

    PubMed

    Hoffman, Steven J; Justicz, Victoria

    2016-07-01

    To develop and validate a method for automatically quantifying the scientific quality and sensationalism of individual news records. After retrieving 163,433 news records mentioning the Severe Acute Respiratory Syndrome (SARS) and H1N1 pandemics, a maximum entropy model for inductive machine learning was used to identify relationships among 500 randomly sampled news records that correlated with systematic human assessments of their scientific quality and sensationalism. These relationships were then computationally applied to automatically classify 10,000 additional randomly sampled news records. The model was validated by randomly sampling 200 records and comparing human assessments of them to the computer assessments. The computer model correctly assessed the relevance of 86% of news records, the quality of 65% of records, and the sensationalism of 73% of records, as compared to human assessments. Overall, the scientific quality of SARS and H1N1 news media coverage had potentially important shortcomings, but coverage was not overly sensationalized. Coverage slightly improved between the two pandemics. Automated methods can evaluate news records faster, cheaper, and possibly better than humans. The specific procedure implemented in this study can at the very least identify subsets of news records that are far more likely to have particular scientific and discursive qualities. Copyright © 2016 Elsevier Inc. All rights reserved.
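
    A minimal sketch of a maximum entropy text classifier (equivalent to multinomial logistic regression) trained on hypothetical labeled snippets; the study's actual features and coded corpus are not reproduced:

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # hypothetical news snippets with human quality codes (1 = high quality)
    records = [
        "randomized trial data suggest the vaccine reduced transmission",
        "experts cite peer-reviewed evidence on H1N1 severity estimates",
        "killer virus set to wipe out city, doctors panic",
        "terrifying outbreak spreads unstoppable plague fear",
    ]
    quality = [1, 1, 0, 0]

    # bag-of-words features + logistic (maximum entropy) classifier
    model = make_pipeline(CountVectorizer(), LogisticRegression(max_iter=1000))
    model.fit(records, quality)
    print(model.predict(["study published in journal estimates infection rate"]))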

  9. Students Explaining Science--Assessment of Science Communication Competence

    ERIC Educational Resources Information Center

    Kulgemeyer, Christoph; Schecker, Horst

    2013-01-01

    Science communication competence (SCC) is an important educational goal in the school science curricula of several countries. However, there is a lack of research about the structure and the assessment of SCC. This paper specifies the theoretical framework of SCC by a competence model. We developed a qualitative assessment method for SCC that is…

  10. Integrating human health and ecological data into cumulative risk assessment through the Aggregate Exposure Pathway and Adverse Outcome Pathway frameworks

    EPA Science Inventory

    Cumulative risk assessment (CRA) methods promote the use of a conceptual site model (CSM) to apportion exposures and integrate risk from relevant stressors across different species. Integration is important to provide a more complete assessment of risk, but evaluating endpoints a...

  11. Critical Emergency Medicine Procedural Skills: A Comparative Study of Methods for Teaching and Assessment.

    ERIC Educational Resources Information Center

    Chapman, Dane M.; And Others

    Three critical procedural skills in emergency medicine were evaluated using three assessment modalities--written, computer, and animal model. The effects of computer practice and previous procedure experience on skill competence were also examined in an experimental sequential assessment design. Subjects were six medical students, six residents,…

  12. Training Staff to Implement Brief Stimulus Preference Assessments

    ERIC Educational Resources Information Center

    Weldy, Christina R.; Rapp, John T.; Capocasa, Kelli

    2014-01-01

    We trained 9 behavioral staff members to conduct 2 brief preference assessments using 30-min video presentations that contained instructions and modeling. After training, we evaluated each staff member's implementation of the assessments in situ. Results indicated that 1 or 2 training sessions for each method were sufficient for teaching each…

  13. Leveraging Educational Data Mining for Real-Time Performance Assessment of Scientific Inquiry Skills within Microworlds

    ERIC Educational Resources Information Center

    Gobert, Janice D.; Sao Pedro, Michael A.; Baker, Ryan S. J. D.; Toto, Ermal; Montalvo, Orlando

    2012-01-01

    We present "Science Assistments," an interactive environment, which assesses students' inquiry skills as they engage in inquiry using science microworlds. We frame our variables, tasks, assessments, and methods of analyzing data in terms of "evidence-centered design." Specifically, we focus on the "student model," the…

  14. Proposal of an environmental performance index to assess solid waste treatment technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goulart Coelho, Hosmanny Mauro, E-mail: hosmanny@hotmail.com; Lange, Lisete Celina; Coelho, Lineker Max Goulart

    2012-07-15

    Highlights: • Proposal of a new concept in waste management: Cleaner Treatment. • Development of an index to quantitatively assess waste treatment technologies. • The Delphi Method was carried out to define environmental indicators. • Environmental performance evaluation of waste-to-energy plants. - Abstract: Although concern with sustainable development and environmental protection has grown considerably in recent years, the majority of decision-making models and tools remain either excessively tied to economic aspects or geared to the production process. Moreover, existing models focus on the priority steps of solid waste management rather than on waste energy recovery and disposal. To address this lack of models and tools aimed at waste treatment and final disposal, a new concept is proposed: Cleaner Treatment, which is based on the Cleaner Production principles. This paper focuses on the development and validation of the Cleaner Treatment Index (CTI), which assesses the environmental performance of waste treatment technologies based on the Cleaner Treatment concept. The index is formed by aggregation (summation or product) of several indicators consisting of operational parameters. The weights of the indicators were established by the Delphi Method and Brazilian environmental laws. In addition, sensitivity analyses were carried out comparing both aggregation methods. Finally, index validation was carried out by applying the CTI to data from 10 waste-to-energy plants. From the sensitivity analysis and validation results it is possible to infer that the summation model is the most suitable aggregation method. For the summation method, CTI results were above 0.5 (on a scale from 0 to 1) for most facilities evaluated. This study demonstrates that the CTI is a simple and robust tool to assess and compare the environmental performance of different treatment plants and an excellent quantitative tool to support Cleaner Treatment implementation.
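
    A minimal sketch of the two aggregation forms, assuming normalized indicator scores and Delphi-style weights (the product form is interpreted here as a weighted geometric aggregation; the paper's actual indicators and weights are not reproduced):

    import numpy as np

    def cti(indicators, weights, method="summation"):
        """Cleaner Treatment Index sketch: aggregate normalized indicator
        scores (0-1) with Delphi-derived weights by summation or product."""
        x, w = np.asarray(indicators, float), np.asarray(weights, float)
        if method == "summation":
            return float(np.dot(w, x))
        return float(np.prod(x ** w))        # weighted geometric aggregation

    # hypothetical plant scores for, e.g., emissions, energy recovery, residues
    x = [0.7, 0.9, 0.5]
    w = [0.5, 0.3, 0.2]                      # Delphi weights, summing to 1
    print(round(cti(x, w), 3), round(cti(x, w, "product"), 3))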

  15. Comparison of 1-step and 2-step methods of fitting microbiological models.

    PubMed

    Jewell, Keith

    2012-11-15

    Previous conclusions that a 1-step fitting method gives more precise coefficients than the traditional 2-step method are confirmed by application to three different data sets. It is also shown that, in comparison to 2-step fits, the 1-step method gives better fits to the data (often substantially) with directly interpretable regression diagnostics and standard errors. The improvement is greatest at extremes of environmental conditions and it is shown that 1-step fits can indicate inappropriate functional forms when 2-step fits do not. 1-step fits are better at estimating primary parameters (e.g. lag, growth rate) as well as concentrations, and are much more data efficient, allowing the construction of more robust models on smaller data sets. The 1-step method can be straightforwardly applied to any data set for which the 2-step method can be used and additionally to some data sets where the 2-step method fails. A 2-step approach is appropriate for visual assessment in the early stages of model development, and may be a convenient way to generate starting values for a 1-step fit, but the 1-step approach should be used for any quantitative assessment. Copyright © 2012 Elsevier B.V. All rights reserved.
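
    A minimal sketch contrasting the two strategies on synthetic growth data, assuming a linear primary model and a Ratkowsky-type square-root secondary model (the paper's data sets and model forms are not reproduced):

    import numpy as np
    from scipy.optimize import curve_fit

    # synthetic curves log10 N(t) at several temperatures, generated from a
    # linear primary model with growth rate mu = (b * (T - Tmin))^2
    rng = np.random.default_rng(6)
    temps = np.array([5.0, 10.0, 15.0, 20.0])
    t = np.linspace(0, 48, 13)
    b_true, Tmin_true, y0_true = 0.015, 0.0, 2.0
    data = [(T, t, y0_true + (b_true * (T - Tmin_true)) ** 2 * t
             + rng.normal(0, 0.1, t.size)) for T in temps]

    # 2-step: fit a slope per temperature, then regress sqrt(slope) on T
    slopes = [np.polyfit(tt, y, 1)[0] for _, tt, y in data]
    b2, c2 = np.polyfit(temps, np.sqrt(slopes), 1)    # sqrt(mu) = b*(T - Tmin)
    Tmin2 = -c2 / b2

    # 1-step: fit all curves at once with the secondary model embedded
    T_all = np.concatenate([np.full(t.size, T) for T, _, _ in data])
    t_all = np.concatenate([tt for _, tt, _ in data])
    y_all = np.concatenate([y for _, _, y in data])

    def one_step(X, y0, b, Tmin):
        T, tt = X
        return y0 + (b * (T - Tmin)) ** 2 * tt

    p1, _ = curve_fit(one_step, (T_all, t_all), y_all, p0=[2, 0.01, -1])
    print("2-step:", round(b2, 4), round(Tmin2, 2))
    print("1-step:", [round(v, 4) for v in p1])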

  16. Forced-Choice Assessment of Work-Related Maladaptive Personality Traits: Preliminary Evidence From an Application of Thurstonian Item Response Modeling.

    PubMed

    Guenole, Nigel; Brown, Anna A; Cooper, Andrew J

    2018-06-01

    This article describes an investigation of whether Thurstonian item response modeling is a viable method for assessment of maladaptive traits. Forced-choice responses from 420 working adults to a broad-range personality inventory assessing six maladaptive traits were considered. The Thurstonian item response model's fit to the forced-choice data was adequate, while the fit of a counterpart item response model to responses to the same items but arranged in a single-stimulus design was poor. Monotrait heteromethod correlations indicated corresponding traits in the two formats overlapped substantially, although they did not measure equivalent constructs. A better goodness of fit and higher factor loadings for the Thurstonian item response model, coupled with a clearer conceptual alignment to the theoretical trait definitions, suggested that the single-stimulus item responses were influenced by biases that the independent clusters measurement model did not account for. Researchers may wish to consider forced-choice designs and appropriate item response modeling techniques such as Thurstonian item response modeling for personality questionnaire applications in industrial psychology, especially when assessing maladaptive traits. We recommend further investigation of this approach in actual selection situations and with different assessment instruments.

  17. Assessment of current atomic scale modelling methods for the investigation of nuclear fuels under irradiation: Example of uranium dioxide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bertolus, Marjorie; Krack, Matthias; Freyss, Michel

    Multiscale approaches are developed to build more physically based kinetic and mechanical mesoscale models to enhance the predictive capability of fuel performance codes and increase the efficiency of the development of the safer and more innovative nuclear materials needed in the future. Atomic scale methods, in particular electronic structure and empirical potential methods, form the basis of this multiscale approach. It is therefore essential to know the accuracy of the results computed at this scale if we want to feed them into higher scale models. We focus here on assessing the description of interatomic interactions in uranium dioxide using, on the one hand, electronic structure methods, in particular in the density functional theory (DFT) framework, and on the other hand, empirical potential methods. These two types of methods are complementary, the former yielding results from a minimal amount of input data and further insight into the electronic and magnetic properties, while the latter are irreplaceable for studies where a large number of atoms needs to be considered. We consider basic properties as well as specific ones that are important for the description of nuclear fuel under irradiation. These are especially energies, which are the main data passed to higher scale models. We limit ourselves to uranium dioxide.

  18. Participatory flood vulnerability assessment: a multi-criteria approach

    NASA Astrophysics Data System (ADS)

    Madruga de Brito, Mariana; Evers, Mariele; Delos Santos Almoradie, Adrian

    2018-01-01

    This paper presents a participatory multi-criteria decision-making (MCDM) approach for flood vulnerability assessment while considering the relationships between vulnerability criteria. The applicability of the proposed framework is demonstrated in the municipalities of Lajeado and Estrela, Brazil. The model was co-constructed by 101 experts from governmental organizations, universities, research institutes, NGOs, and private companies. Participatory methods such as the Delphi survey, focus groups, and workshops were applied. A participatory problem structuration, in which the modellers work closely with end users, was used to establish the structure of the vulnerability index. The preferences of each participant regarding the importance of the criteria were spatially modelled through the analytical hierarchy process (AHP) and analytical network process (ANP) multi-criteria methods. Experts were also involved at the end of the modelling exercise for validation. The final product is a set of individual and group flood vulnerability maps. Both AHP and ANP proved to be effective for flood vulnerability assessment; however, ANP is preferred as it considers the dependencies among criteria. The participatory approach enabled experts to learn from each other and acknowledge different perspectives towards social learning. The findings highlight that to enhance the credibility and deployment of model results, multiple viewpoints should be integrated without forcing consensus.

  19. Thermodynamic assessment and binary nucleation modeling of Sn-seeded InGaAs nanowires

    NASA Astrophysics Data System (ADS)

    Ghasemi, Masoomeh; Selleby, Malin; Johansson, Jonas

    2017-11-01

    We have performed a thermodynamic assessment of the As-Ga-In-Sn system based on the CALculation of PHAse Diagram (CALPHAD) method. This system is part of a comprehensive thermodynamic database that we are developing for nanowire materials. Specifically, the As-Ga-In-Sn system can be used in modeling the growth of GaAs, InAs, and InxGa1-xAs nanowires assisted by Sn liquid seeds. In this work, the As-Sn binary system and the As-Ga-Sn, As-In-Sn, and Ga-In-Sn ternary systems have been thermodynamically assessed using the CALPHAD method. We show the relevant phase diagrams and property diagrams, which all show good agreement with experimental data. Using our optimized description we have modeled the nucleation of InxGa1-xAs in the zinc blende phase from a Sn-based quaternary liquid alloy using binary nucleation modeling. We have linked the composition of the solid nucleus to the composition of the liquid phase. Finally, we have predicted the critical size of the nucleus that forms from InAs and GaAs pairs under various conditions. We believe that our modeling can guide future experimental realization of Sn-seeded InxGa1-xAs nanowires.
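
    For context, the classical nucleation relations on which such critical-size predictions rest are, for a spherical nucleus,

    \[
      \Delta G(r) = \frac{4}{3}\pi r^{3}\,\Delta g_v + 4\pi r^{2}\gamma,
      \qquad
      r^{*} = -\frac{2\gamma}{\Delta g_v},
      \qquad
      \Delta G^{*} = \frac{16\pi\gamma^{3}}{3\,\Delta g_v^{2}},
    \]

    where Delta g_v < 0 is the bulk driving force per unit volume (obtainable from a CALPHAD description of the liquid and solid phases) and gamma the nucleus-liquid interfacial energy. In binary nucleation the driving force depends on the nucleus composition as well, so the critical nucleus corresponds to a saddle point of Delta G over both size and composition, which is what links the solid composition to that of the liquid.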

  20. Soft sensor modelling by time difference, recursive partial least squares and adaptive model updating

    NASA Astrophysics Data System (ADS)

    Fu, Y.; Yang, W.; Xu, O.; Zhou, L.; Wang, J.

    2017-04-01

    To investigate time-variant and nonlinear characteristics in industrial processes, a soft sensor modelling method based on time difference, moving-window recursive partial least squares (PLS) and adaptive model updating is proposed. In this method, time difference values of input and output variables are used as training samples to construct the model, which can reduce the effects of the nonlinear characteristic on modelling accuracy and retain the advantages of the recursive PLS algorithm. To avoid an excessively high model-updating frequency, a confidence value is introduced, which can be updated adaptively according to the results of the model performance assessment; once the confidence value is updated, the model can be updated. The proposed method has been used to predict the 4-carboxy-benz-aldehyde (CBA) content in the purified terephthalic acid (PTA) oxidation reaction process. The results show that the proposed soft sensor modelling method can reduce computation effectively, improve prediction accuracy by making use of process information and reflect the process characteristics accurately.
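
    A minimal sketch of the time-difference, moving-window PLS scheme with error-triggered updating; the window size, error threshold, and process data are hypothetical stand-ins for the paper's recursive PLS and confidence-value rule:

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    def time_difference(X, y, lag=1):
        """Time-difference transform: model differences rather than levels to
        suppress slow drift (a stand-in for the paper's preprocessing)."""
        return X[lag:] - X[:-lag], y[lag:] - y[:-lag]

    # toy drifting process: y depends on 3 inputs plus a slow bias
    rng = np.random.default_rng(7)
    X = rng.normal(size=(600, 3))
    y = X @ np.array([1.0, -2.0, 0.5]) + np.linspace(0, 5, 600) \
        + rng.normal(0, 0.1, 600)

    dX, dy = time_difference(X, y)
    window, threshold, errs = 100, 0.3, []
    model = PLSRegression(n_components=2).fit(dX[:window], dy[:window])
    for i in range(window, len(dy)):
        pred = float(model.predict(dX[i:i + 1]).ravel()[0])
        err = abs(pred - dy[i])
        errs.append(err)
        if err > threshold:                  # confidence check fails:
            lo = i - window                  # refit on the latest window
            model = PLSRegression(n_components=2).fit(dX[lo:i], dy[lo:i])
    print(f"mean absolute error: {np.mean(errs):.3f}")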
