DOT National Transportation Integrated Search
1978-10-01
This report presents a method that may be used to evaluate the reliability of performance of individual subjects, particularly in applied laboratory research. The method is based on analysis of variance of a tasks-by-subjects data matrix, with all sc...
NASA Astrophysics Data System (ADS)
Ando, Yoshinobu; Eguchi, Yuya; Mizukawa, Makoto
In this research, we proposed and evaluated a management method for college mechatronics education. We applied project management techniques to college mechatronics education, practicing our management method in the seminar “Microcomputer Seminar” for 3rd-grade students in the Department of Electrical Engineering, Shibaura Institute of Technology. We succeeded in managing the Microcomputer Seminar in 2006, and a questionnaire survey gave our management method a good evaluation.
Applied Pharmacokinetics: Course Description and Retrospective Evaluation.
ERIC Educational Resources Information Center
Beck, Diane E.
1984-01-01
An applied course designed to allow students to formulate pharmacokinetic recommendations individually for actual patient data and compare their recommendations to those of a pharmacokinetic consulting service is described and evaluated, and an objective student evaluation method is outlined. (MSE)
2009-06-01
3. Previous Navy CRM Assessments … 4. Applying Kirkpatrick's Typology of Evaluation … development within each aviation community. Kirkpatrick's (1976) hierarchy of training evaluation was applied to examine three levels of … Applying methods and techniques used in previous CRM evaluation research, this thesis provided an updated evaluation of the Naval CRM program to fill …
ERIC Educational Resources Information Center
Dodd, Carol Ann
This study explores a technique for evaluating teacher education programs in terms of teaching competencies, as applied to the Indiana University Mathematics Methods Program (MMP). The evaluation procedures formulated for the study include a process-product design in combination with a modification of Popham's performance test paradigm and Gage's…
A usability evaluation toolkit for In-Vehicle Information Systems (IVISs).
Harvey, Catherine; Stanton, Neville A; Pickering, Carl A; McDonald, Mike; Zheng, Pengjun
2011-05-01
Usability must be defined specifically for the context of use of the particular system under investigation. This specific context of use should also be used to guide the definition of specific usability criteria and the selection of appropriate evaluation methods. There are four principles which can guide the selection of evaluation methods, relating to the information required in the evaluation, the stage at which to apply methods, the resources required and the people involved in the evaluation. This paper presents a framework for the evaluation of usability in the context of In-Vehicle Information Systems (IVISs). This framework guides designers through defining usability criteria for an evaluation, selecting appropriate evaluation methods and applying those methods. These stages form an iterative process of design-evaluation-redesign with the overall aim of improving the usability of IVISs and enhancing the driving experience, without compromising the safety of the driver. Copyright © 2010 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Quantification of the Barkhausen noise method for the evaluation of time-dependent degradation
NASA Astrophysics Data System (ADS)
Kim, Dong-Won; Kwon, Dongil
2003-02-01
The Barkhausen noise (BN) method has long been applied to measure the bulk magnetic properties of magnetic materials. Recently, this important nondestructive testing (NDT) method has been applied to evaluate microstructure, stress distribution, fatigue, creep and fracture characteristics. Until now, however, the BN method has been used only qualitatively, evaluating the variation of BN with variations in material properties; for this reason, it has seen little application in industrial plants and laboratories. The present investigation studied the coercive force and BN while varying the microstructure of ultrafine-grained steels and SA508 Cl.3 steels. This variation was carried out according to the second heat-treatment condition with rolling of the ultrafine-grained steels and the simulated time-dependent degradation of the SA508 Cl.3 steels. An attempt was also made to quantify BN from the relationship between the velocity of magnetic domain walls and the retarding force, using the coercive force of the domain wall movement. The microstructure variation was analyzed according to time-dependent degradation. Fracture toughness was evaluated quantitatively by measuring the BN through two intermediary parameters: grain size and the distribution of nonmagnetic particles. From these measurements, the variation of microstructure and fracture toughness can be directly evaluated by the BN method as an accurate in situ NDT method.
ERIC Educational Resources Information Center
Lin, Su-ching; Wu, Ming-sui
2016-01-01
This study was the first year of a two-year project which applied a program theory-driven approach to evaluating the impact of teachers' professional development interventions on students' learning by using a mix of methods, qualitative inquiry, and quasi-experimental design. The current study was to show the results of using the method of…
Fullerton, Birgit; Pöhlmann, Boris; Krohn, Robert; Adams, John L; Gerlach, Ferdinand M; Erler, Antje
2016-10-01
To present a case study on how to compare various matching methods applying different measures of balance and to point out some pitfalls involved in relying on such measures. Administrative claims data from a German statutory health insurance fund covering the years 2004-2008. We applied three different covariate balance diagnostics to a choice of 12 different matching methods used to evaluate the effectiveness of the German disease management program for type 2 diabetes (DMPDM2). We further compared the effect estimates resulting from applying these different matching techniques in the evaluation of the DMPDM2. The choice of balance measure leads to different results on the performance of the applied matching methods. Exact matching methods performed well across all measures of balance but resulted in the exclusion of many observations, leading to a change in the baseline characteristics of the study sample and also in the effect estimate of the DMPDM2. All PS-based methods showed similar effect estimates. Applying a higher matching ratio and using a larger variable set generally resulted in better balance. Using a generalized boosted model instead of a logistic regression model showed slightly better performance for balance diagnostics that take into account imbalances at higher moments. Best practice should include the application of several matching methods and thorough balance diagnostics. Applying matching techniques can provide a useful preprocessing step to reveal areas of the data that lack common support. The use of different balance diagnostics can be helpful for the interpretation of different effect estimates found with different matching methods. © Health Research and Educational Trust.
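As an illustrative aside (not code from the study above): the standardized-difference balance diagnostic it applies, with the <10% criterion, can be sketched in a few lines; the cohort values below are hypothetical.

```python
import math

def standardized_difference(treated, control):
    """Standardized difference (in %) of one covariate between cohorts:
    100 * (mean_t - mean_c) / sqrt((var_t + var_c) / 2)."""
    mean_t = sum(treated) / len(treated)
    mean_c = sum(control) / len(control)
    var_t = sum((x - mean_t) ** 2 for x in treated) / (len(treated) - 1)
    var_c = sum((x - mean_c) ** 2 for x in control) / (len(control) - 1)
    return 100.0 * (mean_t - mean_c) / math.sqrt((var_t + var_c) / 2)

# Hypothetical age values in matched treated/control cohorts.
treated = [60, 62, 65, 70, 68]
control = [50, 52, 55, 49, 51]
d = standardized_difference(treated, control)
print(abs(d) < 10)  # balanced under the <10% criterion?
```

A full diagnostic would compute this for every covariate in each matched dataset and flag any covariate exceeding the threshold.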
A survey of methods for the evaluation of tissue engineering scaffold permeability.
Pennella, F; Cerino, G; Massai, D; Gallo, D; Falvo D'Urso Labate, G; Schiavi, A; Deriu, M A; Audenino, A; Morbiducci, Umberto
2013-10-01
The performance of porous scaffolds for tissue engineering (TE) applications is evaluated, in general, in terms of porosity, pore size and distribution, and pore tortuosity. These descriptors are often confounding when they are applied to characterize transport phenomena within porous scaffolds. By contrast, permeability is a more effective parameter for (1) estimating mass and species transport through the scaffold and (2) describing its topological features, thus allowing a better evaluation of the overall scaffold performance. However, the evaluation of TE scaffold permeability suffers from a lack of uniformity and standards in measurement and testing procedures, which makes comparison of results obtained in different laboratories infeasible. In this review paper we summarize the most important features influencing TE scaffold permeability, linking them to the theoretical background. An overview of methods applied for TE scaffold permeability evaluation is given, presenting experimental test benches and computational methods applied (1) to integrate experimental measurements and (2) to support the TE scaffold design process. Both experimental and computational limitations of the permeability evaluation process are also discussed.
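As a hedged illustration of the theoretical background the review refers to: permeability is commonly reduced from a constant-flow perfusion test via Darcy's law. The sample dimensions and conditions below are hypothetical, not from any study in the review.

```python
def darcy_permeability(flow_rate, viscosity, length, area, delta_p):
    """Intrinsic permeability k [m^2] from a constant-flow test,
    via Darcy's law: Q = k * A * dP / (mu * L)  =>  k = Q*mu*L / (A*dP)."""
    return flow_rate * viscosity * length / (area * delta_p)

# Hypothetical scaffold disc: 5 mm thick, 1 cm^2 cross-section, perfused
# with water (mu ~ 1e-3 Pa.s) at 1 mL/min under a 100 Pa pressure drop.
Q = 1e-6 / 60  # m^3/s
k = darcy_permeability(Q, 1e-3, 5e-3, 1e-4, 100.0)
print(f"k = {k:.2e} m^2")
```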
Using a fuzzy comprehensive evaluation method to determine product usability: A test case
Zhou, Ronggang; Chan, Alan H. S.
2016-01-01
BACKGROUND: In order to take into account the inherent uncertainties during product usability evaluation, Zhou and Chan [1] proposed a comprehensive method of usability evaluation for products by combining the analytic hierarchy process (AHP) and fuzzy evaluation methods for synthesizing performance data and subjective response data. This method was designed to provide an integrated framework combining the inevitable vague judgments from the multiple stages of the product evaluation process. OBJECTIVE AND METHODS: In order to illustrate the effectiveness of the model, this study used a summative usability test case to assess the application and strength of the general fuzzy usability framework. To test the proposed fuzzy usability evaluation framework [1], a standard summative usability test was conducted to benchmark the overall usability of a specific network management software. Based on the test data, the fuzzy method was applied to incorporate both the usability scores and the uncertainties involved in the multiple components of the evaluation. Then, with Monte Carlo simulation procedures, confidence intervals were used to compare the reliabilities of the fuzzy approach and two typical conventional methods that combine metrics based on percentages. RESULTS AND CONCLUSIONS: This case study showed that the fuzzy evaluation technique can be applied successfully to combine summative usability testing data into an overall usability quality for the network software evaluated. The greater confidence-interval widths of the two conventional percentage-based methods (equally weighted and weighted percentage averages) relative to the fuzzy approach verified the strength of the fuzzy method. PMID:28035942
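As a generic sketch of the fuzzy comprehensive evaluation synthesis step, not the authors' implementation: AHP-derived criterion weights W are combined with a fuzzy membership matrix R to give an overall rating vector B = W · R. All weights and membership degrees below are hypothetical.

```python
def fuzzy_comprehensive(weights, membership):
    """Weighted fuzzy synthesis B = W . R: combine AHP criterion weights
    with a fuzzy membership matrix (criteria x rating levels)."""
    levels = len(membership[0])
    return [sum(w * row[j] for w, row in zip(weights, membership))
            for j in range(levels)]

# Hypothetical usability criteria: effectiveness, efficiency, satisfaction,
# with AHP weights and membership degrees in three rating levels.
weights = [0.5, 0.3, 0.2]
membership = [
    [0.1, 0.3, 0.6],   # effectiveness: poor / fair / good
    [0.2, 0.5, 0.3],   # efficiency
    [0.0, 0.4, 0.6],   # satisfaction
]
b = fuzzy_comprehensive(weights, membership)
print([round(x, 2) for x in b])  # overall membership in poor / fair / good
```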
Spreco, A; Eriksson, O; Dahlström, Ö; Timpka, T
2017-07-01
Methods for the detection of influenza epidemics and prediction of their progress have seldom been comparatively evaluated using prospective designs. This study aimed to perform a prospective comparative trial of algorithms for the detection and prediction of increased local influenza activity. Data on clinical influenza diagnoses recorded by physicians and syndromic data from a telenursing service were used. Five detection and three prediction algorithms previously evaluated in public health settings were calibrated and then evaluated over 3 years. When applied to diagnostic data, only detection using the Serfling regression method and prediction using the non-adaptive log-linear regression method showed acceptable performance during winter influenza seasons. For the syndromic data, none of the detection algorithms displayed satisfactory performance, while non-adaptive log-linear regression was the best-performing prediction method. We conclude that the available algorithms for influenza detection and prediction display satisfactory performance when applied to local diagnostic data during winter influenza seasons, but did not display consistent performance when applied to local syndromic data. Further evaluation and research on combining methods of these types in public health information infrastructures for 'nowcasting' (integrated detection and prediction) of influenza activity are warranted.
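As a hedged sketch of the Serfling-type idea mentioned above (the study's actual calibration and thresholds differ): fit a trend-plus-seasonal baseline to non-epidemic weeks and raise an alarm when observed counts exceed the predicted baseline by some margin. The weekly counts below are synthetic.

```python
import math

def fit_serfling_baseline(counts, period=52):
    """Least-squares fit of a Serfling-style baseline
    y_t = a + b*t + c*sin(2*pi*t/period) + d*cos(2*pi*t/period),
    solved via the normal equations with Gaussian elimination."""
    rows = [[1.0, float(t),
             math.sin(2 * math.pi * t / period),
             math.cos(2 * math.pi * t / period)] for t in range(len(counts))]
    p = 4
    xtx = [[sum(r[i] * r[j] for r in rows) for j in range(p)] for i in range(p)]
    xty = [sum(r[i] * y for r, y in zip(rows, counts)) for i in range(p)]
    for col in range(p):                      # forward elimination with pivoting
        piv = max(range(col, p), key=lambda r: abs(xtx[r][col]))
        xtx[col], xtx[piv] = xtx[piv], xtx[col]
        xty[col], xty[piv] = xty[piv], xty[col]
        for r in range(col + 1, p):
            f = xtx[r][col] / xtx[col][col]
            for c in range(col, p):
                xtx[r][c] -= f * xtx[col][c]
            xty[r] -= f * xty[col]
    beta = [0.0] * p
    for r in reversed(range(p)):              # back substitution
        beta[r] = (xty[r] - sum(xtx[r][c] * beta[c]
                                for c in range(r + 1, p))) / xtx[r][r]
    return lambda t: (beta[0] + beta[1] * t
                      + beta[2] * math.sin(2 * math.pi * t / period)
                      + beta[3] * math.cos(2 * math.pi * t / period))

# Synthetic weekly counts: seasonal baseline plus one epidemic week.
counts = [50 + 10 * math.cos(2 * math.pi * t / 52) for t in range(104)]
counts[80] += 60
predict = fit_serfling_baseline(counts[:78])  # fit on pre-epidemic weeks only
alarm = counts[80] > 1.3 * predict(80)        # crude alarm margin for the sketch
print(alarm)
```

Operational systems typically use an upper prediction bound (e.g. a multiple of the residual standard deviation) rather than the fixed 1.3 margin used here for brevity.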
NASA Technical Reports Server (NTRS)
Liu, A. F.
1974-01-01
A systematic approach for applying methods for fracture control in the structural components of space vehicles consists of four major steps. The first step is to define the primary load-carrying structural elements and the type of load, environment, and design stress levels acting upon them. The second step is to identify the potential fracture-critical parts by means of a selection logic flow diagram. The third step is to evaluate the safe-life and fail-safe capabilities of the specified part. The last step in the sequence is to apply the control procedures that will prevent damage to the fracture-critical parts. The fracture control methods discussed include fatigue design and analysis methods, methods for preventing crack-like defects, fracture mechanics analysis methods, and nondestructive evaluation methods. An example problem is presented for evaluation of the safe-crack-growth capability of the space shuttle crew compartment skin structure.
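The safe-life step of such a fracture-control plan is typically a crack-growth calculation. As a hedged sketch, not the report's actual procedure, a Paris-law integration from an initial flaw size to a critical crack size might look like this; the material constants and loading are illustrative only.

```python
import math

def cycles_to_failure(a0, a_crit, C, m, delta_sigma, Y=1.0, steps=20000):
    """Integrate the Paris crack-growth law da/dN = C * (dK)^m, with
    dK = Y * delta_sigma * sqrt(pi * a), from initial flaw a0 to a_crit."""
    a, cycles = a0, 0.0
    da = (a_crit - a0) / steps
    while a < a_crit:
        dK = Y * delta_sigma * math.sqrt(math.pi * a)
        cycles += da / (C * dK ** m)   # cycles spent growing the crack by da
        a += da
    return cycles

# Hypothetical aluminium skin panel: 1 mm initial flaw, 10 mm critical size,
# illustrative Paris constants (C in (m/cycle)/(MPa*sqrt(m))^m), 100 MPa range.
N = cycles_to_failure(a0=0.001, a_crit=0.010, C=1e-11, m=3.0, delta_sigma=100.0)
print(f"safe-life estimate: {N:.0f} cycles")
```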
Applying Propensity Score Methods in Medical Research: Pitfalls and Prospects
Luo, Zhehui; Gardiner, Joseph C.; Bradley, Cathy J.
2012-01-01
The authors review experimental and nonexperimental causal inference methods, focusing on assumptions for the validity of instrumental variables and propensity score (PS) methods. They provide guidance in four areas for the analysis and reporting of PS methods in medical research and selectively evaluate mainstream medical journal articles from 2000 to 2005 in the four areas, namely, examination of balance, overlapping support description, use of estimated PS for evaluation of treatment effect, and sensitivity analyses. In spite of the many pitfalls, when appropriately evaluated and applied, PS methods can be powerful tools in assessing average treatment effects in observational studies. Appropriate PS applications can create experimental conditions using observational data when randomized controlled trials are not feasible and, thus, lead researchers to an efficient estimator of the average treatment effect. PMID:20442340
[An Introduction to Methods for Evaluating Health Care Technology].
Lee, Ting-Ting
2015-06-01
The rapid and continual advance of healthcare technology makes it a critical issue to ensure that this technology is used effectively to achieve its original goals. This paper presents three methods that may be applied by healthcare professionals in the evaluation of healthcare technology: examining the perceptions/experiences of users, tracking changes in user work patterns, and chart review or data mining. The first method includes two categories: using interviews to explore the user experience and using theory-based questionnaire surveys. The second method applies work sampling to observe changes in users' work patterns. The last method conducts chart reviews or data mining to analyze the designated variables. In conclusion, while evaluative feedback may be used to improve the design and development of healthcare technology applications, the informatics competency and informatics literacy of users may be further explored in future research.
Morita, K; Uchiyama, Y; Tominaga, S
1987-06-01
In order to evaluate the treatment results of radiotherapy it is important to estimate the degree of complications of the surrounding normal tissues as well as the frequency of tumor control. In this report, the cumulative incidence rate of the late radiation injuries of the normal tissues was calculated using the modified actuarial method (Cutler-Ederer's method) or Kaplan-Meier's method, which is usually applied to the calculation of the survival rate. By the use of this method of calculation, an accurate cumulative incidence rate over time can be easily obtained and applied to the statistical evaluation of the late radiation injuries.
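The Kaplan-Meier calculation the abstract describes can be sketched compactly: cumulative incidence is 1 − S(t), with the survival function updated at each event time. The follow-up data below are hypothetical.

```python
def km_cumulative_incidence(times, events):
    """Kaplan-Meier estimate of cumulative incidence, 1 - S(t).
    times: follow-up time per patient; events: 1 = injury, 0 = censored.
    Returns (time, cumulative incidence) at each event time."""
    data = sorted(zip(times, events))
    surv, curve, i = 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        d = sum(e for tt, e in data if tt == t)       # events at time t
        n = sum(1 for tt, _ in data if tt >= t)       # number still at risk
        if d > 0:
            surv *= 1.0 - d / n
            curve.append((t, 1.0 - surv))
        i += sum(1 for tt, _ in data if tt == t)
    return curve

# Hypothetical follow-up of 8 irradiated patients (months, injury yes/no).
times = [6, 10, 10, 14, 18, 24, 30, 36]
events = [1, 0, 1, 1, 0, 1, 0, 0]
for t, ci in km_cumulative_incidence(times, events):
    print(t, round(ci, 3))
```

Censored patients (events = 0) leave the risk set without triggering a step, which is exactly why this estimator handles incomplete follow-up correctly.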
ERIC Educational Resources Information Center
Kimball, Steven M.; Milanowski, Anthony
2009-01-01
Purpose: The article reports on a study of school leader decision making that examined variation in the validity of teacher evaluation ratings in a school district that has implemented a standards-based teacher evaluation system. Research Methods: Applying mixed methods, the study used teacher evaluation ratings and value-added student achievement…
Equating Scores from Adaptive to Linear Tests
ERIC Educational Resources Information Center
van der Linden, Wim J.
2006-01-01
Two local methods for observed-score equating are applied to the problem of equating an adaptive test to a linear test. In an empirical study, the methods were evaluated against a method based on the test characteristic function (TCF) of the linear test and traditional equipercentile equating applied to the ability estimates on the adaptive test…
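As a hedged sketch of the traditional equipercentile idea the abstract compares against (not the local methods it proposes): a score on form X maps to the form-Y score with the same percentile rank. The score distributions below are hypothetical, and no presmoothing is applied.

```python
def percentile_rank(scores, x):
    """Percentile rank of score x (midpoint convention for ties)."""
    below = sum(1 for s in scores if s < x)
    at = sum(1 for s in scores if s == x)
    return (below + 0.5 * at) / len(scores)

def equipercentile_equate(x, scores_x, scores_y):
    """Map score x on form X to the form-Y score with the same percentile
    rank (discrete step-function inverse, no smoothing)."""
    pr = percentile_rank(scores_x, x)
    for y in sorted(set(scores_y)):
        if percentile_rank(scores_y, y) >= pr:
            return y
    return max(scores_y)

# Hypothetical observed-score distributions on forms X and Y.
scores_x = [10, 12, 12, 14, 15, 15, 16, 18, 19, 20]
scores_y = [20, 22, 24, 24, 26, 27, 28, 28, 29, 30]
print(equipercentile_equate(15, scores_x, scores_y))  # equated form-Y score
```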
Evaluating a physician leadership development program - a mixed methods approach.
Throgmorton, Cheryl; Mitchell, Trey; Morley, Tom; Snyder, Marijo
2016-05-16
Purpose - With the extent of change in healthcare today, organizations need strong physician leaders. To compensate for the lack of physician leadership education, many organizations are sending physicians to external leadership programs or developing in-house leadership programs targeted specifically to physicians. The purpose of this paper is to outline the evaluation strategy and outcomes of the inaugural year of a Physician Leadership Academy (PLA) developed and implemented at a Michigan-based regional healthcare system. Design/methodology/approach - The authors applied the theoretical framework of Kirkpatrick's four levels of evaluation and used surveys, observations, activity tracking, and interviews to evaluate the program outcomes, applying grounded theory techniques to the interview data. Findings - The program met targeted outcomes across all four levels of evaluation. Interview themes focused on the significance of increasing self-awareness, building relationships, applying new skills, and building confidence. Research limitations/implications - While only one example, this study illustrates the importance of developing the evaluation strategy as part of the program design. Qualitative research methods, often lacking from learning evaluation design, uncover rich themes of impact. The study shows how a PLA program can enhance physician learning, engagement, and relationship building throughout and after the program. Physician leaders' partnership with organization development and learning professionals yields results with impact to individuals, groups, and the organization. Originality/value - Few studies provide an in-depth review of evaluation methods and outcomes of physician leadership development programs. Healthcare organizations seeking to develop similar in-house programs may benefit from applying the evaluation strategy outlined in this study.
Liu, Peigui; Elshall, Ahmed S.; Ye, Ming; ...
2016-02-05
Evaluating marginal likelihood is the most critical and computationally expensive task when conducting Bayesian model averaging to quantify parametric and model uncertainties. The evaluation is commonly done by using Laplace approximations to evaluate semianalytical expressions of the marginal likelihood or by using Monte Carlo (MC) methods to evaluate the arithmetic or harmonic mean of a joint likelihood function. This study introduces a new MC method, thermodynamic integration, which has not previously been attempted in environmental modeling. Instead of using samples only from the prior parameter space (as in arithmetic mean evaluation) or the posterior parameter space (as in harmonic mean evaluation), the thermodynamic integration method uses samples generated gradually from the prior to the posterior parameter space. This is done through path sampling that conducts Markov chain Monte Carlo simulation with different power coefficient values applied to the joint likelihood function. The thermodynamic integration method is evaluated using three analytical functions by comparing the method with two variants of the Laplace approximation method and three MC methods, including the nested sampling method that was recently introduced into environmental modeling. The thermodynamic integration method outperforms the other methods in terms of accuracy, convergence, and consistency. The thermodynamic integration method is also applied to a synthetic case of groundwater modeling with four alternative models. The application shows that model probabilities obtained using the thermodynamic integration method improve the predictive performance of Bayesian model averaging. As a result, the thermodynamic integration method is mathematically rigorous, and its MC implementation is computationally general for a wide range of environmental problems.
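As a hedged illustration of the thermodynamic integration identity behind the method, log Z = ∫₀¹ E_β[log L] dβ, with the expectation taken under the power posterior p_β ∝ prior × likelihood^β: the toy below replaces MCMC path sampling with grid quadrature on a 1-D conjugate Gaussian model whose marginal likelihood is known in closed form. It is a sanity check of the identity, not the paper's implementation.

```python
import math

# Toy model: theta ~ N(0, 1) prior; a single observation y ~ N(theta, 1).
y = 1.2

def log_prior(th):
    return -0.5 * th * th - 0.5 * math.log(2 * math.pi)

def log_lik(th):
    return -0.5 * (y - th) ** 2 - 0.5 * math.log(2 * math.pi)

grid = [i * 0.001 - 10.0 for i in range(20001)]   # quadrature grid over theta

def expected_loglik(beta):
    """E[log L] under the power posterior p_beta ~ prior * likelihood^beta."""
    w = [math.exp(log_prior(th) + beta * log_lik(th)) for th in grid]
    z = sum(w)
    return sum(wi * log_lik(th) for wi, th in zip(w, grid)) / z

# log Z = integral over beta in [0, 1] of E_beta[log L]: trapezoid rule over
# a ladder of 21 power coefficients (MCMC path sampling replaced by quadrature).
betas = [k / 20 for k in range(21)]
vals = [expected_loglik(b) for b in betas]
log_z_ti = sum((vals[k] + vals[k + 1]) / 2 * (betas[k + 1] - betas[k])
               for k in range(20))

log_z_exact = -0.5 * y * y / 2 - 0.5 * math.log(2 * math.pi * 2)  # y ~ N(0, 2)
print(round(log_z_ti, 3), round(log_z_exact, 3))
```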
Babaloukas, Georgios; Tentolouris, Nicholas; Liatis, Stavros; Sklavounou, Alexandra; Perrea, Despoina
2011-12-01
Correction of vignetting on images obtained by a digital camera mounted on a microscope is essential before applying image analysis. The aim of this study is to evaluate three methods for retrospective correction of vignetting on medical microscopy images and compare them with a prospective correction method. One digital image from each of four different tissues was used, and a vignetting effect was applied to each of these images. The resulting vignetted image was replicated four times, and in each replica a different method for vignetting correction was applied with the Fiji and GIMP software tools. The highest peak signal-to-noise ratio from the comparison of each method to the original image was obtained from the prospective method in all tissues. The morphological filtering method provided the highest peak signal-to-noise ratio value amongst the retrospective methods. The prospective method is suggested as the method of choice for correction of vignetting; if it is not applicable, then morphological filtering may be suggested as the retrospective alternative. © 2011 The Authors. Journal of Microscopy © 2011 Royal Microscopical Society.
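As a rough sketch of the retrospective idea, with a box blur standing in for the morphological filtering used in the study: estimate the slowly varying vignetting field by heavy smoothing, then divide it out. The image below is synthetic, and this is not the study's implementation.

```python
import math

def box_blur(img, radius):
    """Separable box blur -- a crude stand-in for the heavy smoothing /
    morphological filtering used to estimate the vignetting field."""
    h, w = len(img), len(img[0])
    def blur_1d(line):
        out = []
        for i in range(len(line)):
            lo, hi = max(0, i - radius), min(len(line), i + radius + 1)
            out.append(sum(line[lo:hi]) / (hi - lo))
        return out
    rows = [blur_1d(r) for r in img]
    cols = [blur_1d([rows[i][j] for i in range(h)]) for j in range(w)]
    return [[cols[j][i] for j in range(w)] for i in range(h)]

def correct_vignetting(img, radius=8):
    """Retrospective flat-field correction: divide by the estimated
    background, rescaled by its mean to preserve overall brightness."""
    bg = box_blur(img, radius)
    mean_bg = sum(map(sum, bg)) / (len(img) * len(img[0]))
    return [[px * mean_bg / b for px, b in zip(ri, bi)]
            for ri, bi in zip(img, bg)]

# Synthetic 32x32 flat image (value 100) with a radial falloff applied.
n = 32
img = [[100.0 * math.exp(-((i - n/2) ** 2 + (j - n/2) ** 2) / (2 * 20 ** 2))
        for j in range(n)] for i in range(n)]
corrected = correct_vignetting(img)
spread_before = max(map(max, img)) - min(map(min, img))
spread_after = max(map(max, corrected)) - min(map(min, corrected))
print(spread_after < spread_before)
```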
Toma, Tudor; Bosman, Robert-Jan; Siebes, Arno; Peek, Niels; Abu-Hanna, Ameen
2010-08-01
An important problem in the Intensive Care is how to predict on a given day of stay the eventual hospital mortality for a specific patient. A recent approach to solve this problem suggested the use of frequent temporal sequences (FTSs) as predictors. Methods following this approach were evaluated in the past by inducing a model from a training set and validating the prognostic performance on an independent test set. Although this evaluative approach addresses the validity of the specific models induced in an experiment, it falls short of evaluating the inductive method itself. To achieve this, one must account for the inherent sources of variation in the experimental design. The main aim of this work is to demonstrate a procedure based on bootstrapping, specifically the .632 bootstrap procedure, for evaluating inductive methods that discover patterns, such as FTSs. A second aim is to apply this approach to find out whether a recently suggested inductive method that discovers FTSs of organ functioning status is superior over a traditional method that does not use temporal sequences when compared on each successive day of stay at the Intensive Care Unit. The use of bootstrapping with logistic regression using pre-specified covariates is known in the statistical literature. Using inductive methods of prognostic models based on temporal sequence discovery within the bootstrap procedure is however novel at least in predictive models in the Intensive Care. Our results of applying the bootstrap-based evaluative procedure demonstrate the superiority of the FTS-based inductive method over the traditional method in terms of discrimination as well as accuracy. In addition we illustrate the insights gained by the analyst into the discovered FTSs from the bootstrap samples. Copyright 2010 Elsevier Inc. All rights reserved.
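The .632 bootstrap the authors build on can be sketched generically: the estimate blends the apparent (resubstitution) error with the error of models trained on bootstrap resamples and tested on the held-out observations. The "inductive method" below is a toy threshold classifier on made-up data, not the paper's FTS-discovery method.

```python
import random

def bootstrap_632(xs, ys, fit, err, n_boot=200, seed=7):
    """.632 bootstrap estimate of prediction error:
    Err = 0.368 * apparent error + 0.632 * out-of-sample bootstrap error."""
    rng = random.Random(seed)
    n = len(xs)
    model = fit(xs, ys)
    apparent = sum(err(model, x, y) for x, y in zip(xs, ys)) / n
    oos = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]        # bootstrap resample
        held_out = [i for i in range(n) if i not in set(idx)]
        if not held_out:
            continue
        m = fit([xs[i] for i in idx], [ys[i] for i in idx])
        oos.append(sum(err(m, xs[i], ys[i]) for i in held_out) / len(held_out))
    return 0.368 * apparent + 0.632 * (sum(oos) / len(oos))

# Toy 'inductive method': threshold at the midpoint of the class means.
def fit(xs, ys):
    m1 = sum(x for x, y in zip(xs, ys) if y) / max(1, sum(ys))
    m0 = sum(x for x, y in zip(xs, ys) if not y) / max(1, len(ys) - sum(ys))
    return (m0 + m1) / 2

def err(thr, x, y):
    return int((x > thr) != bool(y))

xs = [0.1, 0.3, 0.2, 0.4, 1.1, 0.9, 1.3, 1.2]
ys = [0, 0, 0, 0, 1, 1, 1, 1]
print(round(bootstrap_632(xs, ys, fit, err), 3))
```

Because the whole fitting procedure is re-run inside each resample, the estimate accounts for the variability of the inductive method itself, which is the point the abstract makes.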
Decision curve analysis: a novel method for evaluating prediction models.
Vickers, Andrew J; Elkin, Elena B
2006-01-01
Diagnostic and prognostic models are typically evaluated with measures of accuracy that do not address clinical consequences. Decision-analytic techniques allow assessment of clinical outcomes but often require collection of additional information and may be cumbersome to apply to models that yield a continuous result. The authors sought a method for evaluating and comparing prediction models that incorporates clinical consequences, requires only the data set on which the models are tested, and can be applied to models that have either continuous or dichotomous results. The authors describe decision curve analysis, a simple, novel method of evaluating predictive models. They start by assuming that the threshold probability of a disease or event at which a patient would opt for treatment is informative of how the patient weighs the relative harms of a false-positive and a false-negative prediction. This theoretical relationship is then used to derive the net benefit of the model across different threshold probabilities. Plotting net benefit against threshold probability yields the "decision curve." The authors apply the method to models for the prediction of seminal vesicle invasion in prostate cancer patients. Decision curve analysis identified the range of threshold probabilities in which a model was of value, the magnitude of benefit, and which of several models was optimal. Decision curve analysis is a suitable method for evaluating alternative diagnostic and prognostic strategies that has advantages over other commonly used measures and techniques.
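The net-benefit calculation behind a decision curve follows directly from the description above: at threshold probability pt, NB = TP/n − (FP/n)·pt/(1−pt), compared against treat-all and treat-none (NB = 0). A small sketch with hypothetical predictions:

```python
def net_benefit(y_true, y_prob, pt):
    """Net benefit of a model at threshold probability pt:
    NB = TP/n - (FP/n) * pt / (1 - pt); 'treat' when predicted prob >= pt."""
    n = len(y_true)
    tp = sum(1 for y, p in zip(y_true, y_prob) if p >= pt and y == 1)
    fp = sum(1 for y, p in zip(y_true, y_prob) if p >= pt and y == 0)
    return tp / n - fp / n * pt / (1 - pt)

def net_benefit_treat_all(y_true, pt):
    """Reference strategy: treat every patient regardless of the model."""
    prev = sum(y_true) / len(y_true)
    return prev - (1 - prev) * pt / (1 - pt)

# Hypothetical predicted probabilities for 10 patients (1 = event occurred).
y_true = [1, 1, 1, 0, 0, 0, 0, 0, 0, 0]
y_prob = [0.9, 0.8, 0.4, 0.6, 0.3, 0.2, 0.2, 0.1, 0.1, 0.05]
for pt in (0.2, 0.4, 0.6):   # sweep pt to trace the decision curve
    print(pt, round(net_benefit(y_true, y_prob, pt), 3),
          round(net_benefit_treat_all(y_true, pt), 3))
```

A model is useful at a given pt when its net benefit exceeds both reference strategies; the pt range where that holds is the range of clinical value the abstract refers to.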
Burden, Anne; Roche, Nicolas; Miglio, Cristiana; Hillyer, Elizabeth V; Postma, Dirkje S; Herings, Ron MC; Overbeek, Jetty A; Khalid, Javaria Mona; van Eickels, Daniela; Price, David B
2017-01-01
Background Cohort matching and regression modeling are used in observational studies to control for confounding factors when estimating treatment effects. Our objective was to evaluate exact matching and propensity score methods by applying them in a 1-year pre–post historical database study to investigate asthma-related outcomes by treatment. Methods We drew on longitudinal medical record data in the PHARMO database for asthma patients prescribed the treatments to be compared (ciclesonide and fine-particle inhaled corticosteroid [ICS]). Propensity score methods that we evaluated were propensity score matching (PSM) using two different algorithms, the inverse probability of treatment weighting (IPTW), covariate adjustment using the propensity score, and propensity score stratification. We defined balance, using standardized differences, as differences of <10% between cohorts. Results Of 4064 eligible patients, 1382 (34%) were prescribed ciclesonide and 2682 (66%) fine-particle ICS. The IPTW and propensity score-based methods retained more patients (96%–100%) than exact matching (90%); exact matching selected less severe patients. Standardized differences were >10% for four variables in the exact-matched dataset and <10% for both PSM algorithms and the weighted pseudo-dataset used in the IPTW method. With all methods, ciclesonide was associated with better 1-year asthma-related outcomes, at one-third the prescribed dose, than fine-particle ICS; results varied slightly by method, but direction and statistical significance remained the same. Conclusion We found that each method has its particular strengths, and we recommend at least two methods be applied for each matched cohort study to evaluate the robustness of the findings. Balance diagnostics should be applied with all methods to check the balance of confounders between treatment cohorts. If exact matching is used, the calculation of a propensity score could be useful to identify variables that require balancing, thereby informing the choice of matching criteria together with clinical considerations. PMID:28356782
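As an illustrative sketch of one of the methods compared above: stabilized IPTW weights and a weighted outcome contrast can be computed as follows. All values are hypothetical, and the propensity scores are assumed to be already fitted.

```python
def iptw_weights(treated, propensity):
    """Stabilized inverse probability of treatment weights:
    p(T=1)/e(x) for treated patients, p(T=0)/(1 - e(x)) for controls."""
    p_treat = sum(treated) / len(treated)
    return [p_treat / e if t else (1 - p_treat) / (1 - e)
            for t, e in zip(treated, propensity)]

def weighted_mean(values, weights):
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

# Hypothetical cohort: treatment flag, fitted propensity score, outcome.
treated    = [1, 1, 1, 0, 0, 0, 0, 1]
propensity = [0.8, 0.7, 0.6, 0.3, 0.2, 0.4, 0.3, 0.5]
outcome    = [3.0, 2.5, 2.8, 2.0, 1.8, 2.2, 1.9, 2.6]

w = iptw_weights(treated, propensity)
ate = (weighted_mean([y for y, t in zip(outcome, treated) if t],
                     [wi for wi, t in zip(w, treated) if t])
       - weighted_mean([y for y, t in zip(outcome, treated) if not t],
                       [wi for wi, t in zip(w, treated) if not t]))
print(round(ate, 3))  # weighted treated-vs-control outcome difference
```

In line with the abstract's conclusion, standardized differences would then be recomputed on the weighted pseudo-dataset to verify that the weights actually balanced the confounders.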
A Critical Review of Methods to Evaluate the Impact of FDA Regulatory Actions
Briesacher, Becky A.; Soumerai, Stephen B.; Zhang, Fang; Toh, Sengwee; Andrade, Susan E.; Wagner, Joann L.; Shoaibi, Azadeh; Gurwitz, Jerry H.
2013-01-01
Purpose To conduct a synthesis of the literature on methods to evaluate the impacts of FDA regulatory actions, and identify best practices for future evaluations. Methods We searched MEDLINE for manuscripts published between January 1948 and August 2011 that included terms related to FDA, regulatory actions, and empirical evaluation; the review additionally included FDA-identified literature. We used a modified Delphi method to identify preferred methodologies. We included studies with explicit methods to address threats to validity, and identified designs and analytic methods with strong internal validity that have been applied to other policy evaluations. Results We included 18 studies out of 243 abstracts and papers screened. Overall, analytic rigor in prior evaluations of FDA regulatory actions varied considerably; less than a quarter of studies (22%) included control groups. Only 56% assessed changes in the use of substitute products/services, and 11% examined patient health outcomes. Among studies meeting minimal criteria of rigor, 50% found no impact or weak/modest impacts of FDA actions and 33% detected unintended consequences. Among those studies finding significant intended effects of FDA actions, all cited the importance of intensive communication efforts. There are preferred methods with strong internal validity that have yet to be applied to evaluations of FDA regulatory actions. Conclusions Rigorous evaluations of the impact of FDA regulatory actions have been limited and infrequent. Several methods with strong internal validity are available to improve trustworthiness of future evaluations of FDA policies. PMID:23847020
Efficacy Evaluation of Different Wavelet Feature Extraction Methods on Brain MRI Tumor Detection
NASA Astrophysics Data System (ADS)
Nabizadeh, Nooshin; John, Nigel; Kubat, Miroslav
2014-03-01
Automated Magnetic Resonance Imaging (MRI) brain tumor detection and segmentation is a challenging task. Among the available methods, feature-based methods are dominant, but while many feature extraction techniques have been employed, it is still not clear which should be preferred. To help improve the situation, we present the results of a study evaluating the efficiency of different wavelet-transform feature extraction methods for detecting abnormality in brain MRI. Using T1-weighted brain images, Discrete Wavelet Transform (DWT), Discrete Wavelet Packet Transform (DWPT), Dual-Tree Complex Wavelet Transform (DTCWT), and Complex Morlet Wavelet Transform (CMWT) methods are applied to construct the feature pool. Three classifiers, a Support Vector Machine (SVM), K-Nearest Neighbor, and a Sparse Representation-Based Classifier, are applied and compared for classifying the selected features. The results show that DTCWT and CMWT features classified with SVM yield the highest classification accuracy, demonstrating that wavelet-transform features are informative in this application.
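As a hedged sketch of this kind of pipeline (not the authors' exact feature set), subband statistics from a 2-D DWT can be fed to an SVM; the `db4` wavelet, the mean/std summary statistics, and the synthetic patches below are illustrative assumptions, using PyWavelets and scikit-learn:

```python
import numpy as np
import pywt
from sklearn.svm import SVC

def dwt_features(image, wavelet="db4", level=2):
    """Summary statistics (mean, std) of 2-D DWT subband coefficients."""
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    feats = [coeffs[0].mean(), coeffs[0].std()]      # approximation subband
    for (cH, cV, cD) in coeffs[1:]:                  # detail subbands per level
        for band in (cH, cV, cD):
            feats += [band.mean(), band.std()]
    return np.array(feats)

# Toy example: classify synthetic "normal" vs "abnormal" 64x64 patches
rng = np.random.default_rng(0)
X = np.array([dwt_features(rng.normal(loc=c, size=(64, 64)))
              for c in [0] * 20 + [3] * 20])
y = np.array([0] * 20 + [1] * 20)
clf = SVC(kernel="rbf").fit(X, y)
```

With `level=2` this yields 14 features per image (2 for the approximation band plus 2 for each of the 6 detail bands).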
Evaluation of Automated Model Calibration Techniques for Residential Building Energy Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robertson, J.; Polly, B.; Collis, J.
2013-09-01
This simulation study adapts and applies the general framework described in BESTEST-EX (Judkoff et al 2010) for self-testing residential building energy model calibration methods. BEopt/DOE-2.2 is used to evaluate four mathematical calibration methods in the context of monthly, daily, and hourly synthetic utility data for a 1960's-era existing home in a cooling-dominated climate. The home's model inputs are assigned probability distributions representing uncertainty ranges, random selections are made from the uncertainty ranges to define 'explicit' input values, and synthetic utility billing data are generated using the explicit input values. The four calibration methods evaluated in this study are: an ASHRAE 1051-RP-based approach (Reddy and Maor 2006), a simplified simulated annealing optimization approach, a regression metamodeling optimization approach, and a simple output ratio calibration approach. The calibration methods are evaluated for monthly, daily, and hourly cases; various retrofit measures are applied to the calibrated models and the methods are evaluated based on the accuracy of predicted savings, computational cost, repeatability, automation, and ease of implementation.
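The simplest of the four methods, output-ratio calibration, can be sketched as follows; the monthly kWh figures are hypothetical, not from the study:

```python
import numpy as np

def output_ratio_calibrate(modeled_monthly, measured_monthly):
    """Simple output-ratio calibration: scale modeled energy use so that
    the annual total matches the measured utility data."""
    ratio = np.sum(measured_monthly) / np.sum(modeled_monthly)
    return ratio, modeled_monthly * ratio

# Hypothetical monthly electricity use (kWh) for an existing home
modeled  = np.array([900, 850, 700, 600, 650, 800, 950, 980, 820, 640, 700, 880], float)
measured = np.array([980, 900, 760, 640, 700, 870, 1040, 1060, 900, 700, 760, 950], float)
ratio, calibrated = output_ratio_calibrate(modeled, measured)
```

The optimization-based methods instead search the input uncertainty ranges so that the simulated output matches the billing data, rather than rescaling the output directly.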
Burden, Anne; Roche, Nicolas; Miglio, Cristiana; Hillyer, Elizabeth V; Postma, Dirkje S; Herings, Ron Mc; Overbeek, Jetty A; Khalid, Javaria Mona; van Eickels, Daniela; Price, David B
2017-01-01
Cohort matching and regression modeling are used in observational studies to control for confounding factors when estimating treatment effects. Our objective was to evaluate exact matching and propensity score methods by applying them in a 1-year pre-post historical database study to investigate asthma-related outcomes by treatment. We drew on longitudinal medical record data in the PHARMO database for asthma patients prescribed the treatments to be compared (ciclesonide and fine-particle inhaled corticosteroid [ICS]). Propensity score methods that we evaluated were propensity score matching (PSM) using two different algorithms, the inverse probability of treatment weighting (IPTW), covariate adjustment using the propensity score, and propensity score stratification. We defined balance, using standardized differences, as differences of <10% between cohorts. Of 4064 eligible patients, 1382 (34%) were prescribed ciclesonide and 2682 (66%) fine-particle ICS. The IPTW and propensity score-based methods retained more patients (96%-100%) than exact matching (90%); exact matching selected less severe patients. Standardized differences were >10% for four variables in the exact-matched dataset and <10% for both PSM algorithms and the weighted pseudo-dataset used in the IPTW method. With all methods, ciclesonide was associated with better 1-year asthma-related outcomes, at one-third the prescribed dose, than fine-particle ICS; results varied slightly by method, but direction and statistical significance remained the same. We found that each method has its particular strengths, and we recommend at least two methods be applied for each matched cohort study to evaluate the robustness of the findings. Balance diagnostics should be applied with all methods to check the balance of confounders between treatment cohorts. 
If exact matching is used, the calculation of a propensity score could be useful to identify variables that require balancing, thereby informing the choice of matching criteria together with clinical considerations.
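The balance diagnostic the authors describe, standardized differences with a 10% threshold, can be sketched as follows (the cohort data are simulated; the formula is the usual pooled-SD definition for a continuous covariate):

```python
import numpy as np

def standardized_difference(x_treated, x_control):
    """Absolute standardized difference (%) for a continuous covariate:
    |mean difference| / pooled standard deviation, times 100."""
    m1, m0 = np.mean(x_treated), np.mean(x_control)
    v1, v0 = np.var(x_treated, ddof=1), np.var(x_control, ddof=1)
    pooled_sd = np.sqrt((v1 + v0) / 2.0)
    return 100.0 * abs(m1 - m0) / pooled_sd

# Hypothetical age distributions in the two treatment cohorts
rng = np.random.default_rng(1)
age_ciclesonide = rng.normal(45, 15, 500)
age_fp_ics      = rng.normal(47, 15, 1000)
d = standardized_difference(age_ciclesonide, age_fp_ics)  # flag imbalance if d > 10
```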
Electrostatics of crossed arrays of strips.
Danicki, Eugene
2010-07-01
The BIS-expansion method is widely applied in the analysis of SAW devices. Its generalization is presented for two planar periodic systems of perfectly conducting strips arranged perpendicularly on both sides of a dielectric layer. The generalized method can be applied to evaluate the capacitance of strips on printed circuit boards and in certain microwave devices, but primarily it may help in the evaluation of 2-D piezoelectric sensors and actuators with row-and-column addressing of their elements, and also of piezoelectric bulk-wave resonators.
Niaksu, Olegas; Zaptorius, Jonas
2014-01-01
This paper presents a methodology for creating a performance-related remuneration system in the healthcare sector that meets requirements for efficiency and sustainable quality of healthcare services. A methodology for performance indicator selection, ranking, and a posteriori evaluation is proposed and discussed. The Priority Distribution Method is applied for unbiased weighting of performance criteria, and data mining methods are proposed to monitor and evaluate the results of the motivation system. We developed an eight-step method for selecting healthcare-specific criteria, and proposed and demonstrated application of the Priority Distribution Method for weighting the selected criteria. Moreover, a set of data mining methods for evaluating the motivational system outcomes was proposed. The described methodology for calculating performance-related payment still needs practical validation. We plan to develop semi-automated tools for monitoring institutional and personal performance indicators; the final step would be validation of the methodology in a healthcare facility.
Methods for evaluating the biological impact of potentially toxic waste applied to soils
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neuhauser, E.F.; Loehr, R.C.; Malecki, M.R.
1985-12-01
The study was designed to evaluate two methods that can be used to estimate the biological impact of organics and inorganics that may be in wastes applied to land for treatment and disposal: the contact test and the artificial soil test. The contact test is a 48 hr test using an adult worm, a small glass vial, and filter paper to which the test chemical or waste is applied; it is designed to provide close contact between the worm and a chemical, similar to the situation in soils, and provides a rapid estimate of the relative toxicity of chemicals and industrial wastes. The artificial soil test uses a mixture of sand, kaolin, peat, and calcium carbonate as a representative soil; different concentrations of the test material are added to the artificial soil, adult worms are added, and worm survival is evaluated after two weeks. These studies have shown that earthworms can distinguish between a wide variety of chemicals with a high degree of accuracy.
Rajabioun, Mehdi; Nasrabadi, Ali Motie; Shamsollahi, Mohammad Bagher
2017-09-01
Effective connectivity is one of the most important considerations in brain functional mapping via EEG; it describes the influence of one active brain region on others. In this paper, a new method based on the dual Kalman filter is proposed. First, a source localization method (standardized low-resolution brain electromagnetic tomography) is applied to the EEG signal to extract the active regions, and an appropriate temporal model (a multivariate autoregressive model) is fitted to the extracted sources to capture their activity and the time dependence between them. Then, a dual Kalman filter is used to estimate the model parameters, i.e., the effective connectivity between active regions. The advantage of this method is that the activity of different brain regions is estimated simultaneously with the effective connectivity between them: by combining the dual Kalman filter with source localization, source activity is updated over time in addition to the connectivity estimates. The method's performance was evaluated first on simulated EEG signals with simulated interacting connectivity between active regions; noisy signals with different signal-to-noise ratios were used to evaluate sensitivity to noise and to compare the proposed method with others. The method was then applied to real signals, and the estimation error over a sliding window was calculated. Across both simulated and real signals, the proposed method gave acceptable results, with the least mean square error under noisy and real conditions.
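A minimal sketch of the parameter-estimation half of such a scheme, a Kalman filter tracking time-varying AR coefficients modeled as a random-walk state, is shown below (a simplification of the paper's dual Kalman filter; the model order, noise settings, and data are illustrative):

```python
import numpy as np

def kalman_ar_track(y, order=2, q=1e-5, r=0.01):
    """Track time-varying AR coefficients with a Kalman filter.
    State: AR coefficient vector (random walk). Observation:
    y[t] = phi . [y[t-1], ..., y[t-order]] + noise."""
    n = len(y)
    phi = np.zeros(order)        # coefficient estimate
    P = np.eye(order)            # estimate covariance
    Q = q * np.eye(order)        # random-walk process noise
    history = np.zeros((n, order))
    for t in range(order, n):
        H = y[t - order:t][::-1]         # last `order` samples, most recent first
        P = P + Q                        # predict step for random-walk state
        S = H @ P @ H + r                # innovation variance
        K = P @ H / S                    # Kalman gain
        phi = phi + K * (y[t] - H @ phi) # update coefficients
        P = P - np.outer(K, H) @ P
        history[t] = phi
    return history

# Synthetic AR(2) signal with known coefficients
rng = np.random.default_rng(2)
true_phi = np.array([0.5, -0.3])
y = np.zeros(500)
for t in range(2, 500):
    y[t] = true_phi @ y[t - 2:t][::-1] + 0.1 * rng.normal()
est = kalman_ar_track(y, order=2)
```

In the full dual scheme a second Kalman filter would simultaneously re-estimate the source signals themselves.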
Duan, Ran; Fu, Haoda
2015-08-30
Recurrent event data are an important data type for medical research. In particular, many safety endpoints are recurrent outcomes, such as hypoglycemic events. For such a situation, it is important to identify the factors causing these events and rank these factors by their importance. Traditional model selection methods are not able to provide variable importance in this context. Methods that are able to evaluate the variable importance, such as gradient boosting and random forest algorithms, cannot directly be applied to recurrent events data. In this paper, we propose a two-step method that enables us to evaluate the variable importance for recurrent events data. We evaluated the performance of our proposed method by simulations and applied it to a data set from a diabetes study. Copyright © 2015 John Wiley & Sons, Ltd.
Fuzzy rule-based image segmentation in dynamic MR images of the liver
NASA Astrophysics Data System (ADS)
Kobashi, Syoji; Hata, Yutaka; Tokimoto, Yasuhiro; Ishikawa, Makato
2000-06-01
This paper presents a fuzzy rule-based region growing method for segmenting two-dimensional (2-D) and three-dimensional (3-D) magnetic resonance (MR) images. The method extends conventional region growing by evaluating the growing criteria with fuzzy inference techniques; fuzzy if-then rules are well suited to describing knowledge of the lesions in MR images. To evaluate its performance, the proposed method was applied to artificially generated images; compared with the conventional method, it shows high robustness to noisy images. The method was then applied to segment dynamic MR images of the liver. Dynamic MR imaging is used for the diagnosis of hepatocellular carcinoma (HCC), portal hypertension, and other conditions. Segmenting the liver, portal vein (PV), and inferior vena cava (IVC) provides a useful description for diagnosis and is a basis for pre-surgery planning systems and virtual endoscopy. To apply the proposed method, fuzzy if-then rules are derived from the time-density curves of regions of interest. The experimental results include 2-D reconstructed and 3-D rendered images of the segmented liver, PV, and IVC. Evaluation by a physician shows that the generated images are consistent with hepatic anatomy and would be useful for understanding, diagnosis, and pre-surgery planning.
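A minimal sketch of fuzzy-criterion region growing, with a single Gaussian membership function standing in for the paper's if-then rule base, might look like this (all parameters and the toy image are illustrative):

```python
import numpy as np
from collections import deque

def fuzzy_region_grow(img, seed, mu, sigma, threshold=0.5):
    """Region growing where a neighboring pixel joins the region if its fuzzy
    membership to the expected intensity (a Gaussian membership function)
    exceeds `threshold`."""
    h, w = img.shape
    grown = np.zeros((h, w), bool)
    queue = deque([seed])
    grown[seed] = True
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < h and 0 <= cc < w and not grown[rr, cc]:
                membership = np.exp(-((img[rr, cc] - mu) ** 2) / (2 * sigma ** 2))
                if membership >= threshold:
                    grown[rr, cc] = True
                    queue.append((rr, cc))
    return grown

# Toy image: bright 16x16 square (intensity ~200) on a dark background (~50)
img = np.full((32, 32), 50.0)
img[8:24, 8:24] = 200.0
mask = fuzzy_region_grow(img, seed=(16, 16), mu=200.0, sigma=30.0)
```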
Subatmospheric vapor pressures evaluated from internal-energy measurements
NASA Astrophysics Data System (ADS)
Duarte-Garza, H. A.; Magee, J. W.
1997-01-01
Vapor pressures were evaluated from measured internal-energy changes in the vapor+liquid two-phase region, ΔU^(2). The method employed a thermodynamic relationship between the derivative quantity (∂U^(2)/∂V)_T and the vapor pressure p_σ and its temperature derivative (dp_σ/dT). This method was applied at temperatures between the triple point and the normal boiling point of three substances: 1,1,1,2-tetrafluoroethane (R134a), pentafluoroethane (R125), and difluoromethane (R32). Agreement with experimentally measured vapor pressures near the normal boiling point (101.325 kPa) was within the experimental uncertainty of approximately ±0.04 kPa (±0.04%). The method was applied to R134a to test the thermodynamic consistency of a published p-ρ-T equation of state with an equation for p_σ for this substance. It was also applied to evaluate published p_σ data which disagree by more than their claimed uncertainty.
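The thermodynamic relationship referred to above is presumably the standard identity (∂U/∂V)_T = T(∂p/∂T)_V − p, which in the two-phase region, where the pressure depends on temperature alone, becomes a first-order differential equation for the vapor pressure:

```latex
% General identity for any phase: (dU/dV)_T = T (dp/dT)_V - p.
% In the vapor+liquid two-phase region p = p_sigma(T) is independent of V, so
\left(\frac{\partial U^{(2)}}{\partial V}\right)_{T}
    = T\,\frac{\mathrm{d}p_{\sigma}}{\mathrm{d}T} \;-\; p_{\sigma}(T)
```

Integrating this ODE from a known anchor point (e.g., the normal boiling point) yields p_σ(T) from the measured internal-energy derivatives.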
Feng, Lan; Zhu, Xiaodong; Sun, Xiang
2014-12-15
Coastal reclamation suitability evaluation (CRSE) is a difficult, complex and protracted process requiring the evaluation of many different criteria. In this paper, an integrated framework employing a fuzzy comprehensive evaluation method and analytic hierarchy process (AHP) was applied to the suitability evaluation for coastal reclamation for future sustainable development in the coastal area of Lianyungang, China. The evaluation results classified 6.63%, 22.99%, 31.59% and 38.79% of the coastline as suitable, weakly suitable, unsuitable and forbidden, respectively. The evaluation results were verified by the marine pollution data and highly consistent with the water quality status. The fuzzy-AHP comprehensive evaluation method (FACEM) was found to be suitable for the CRSE. This CRSE can also be applied to other coastal areas in China and thereby be used for the better management of coastal reclamation and coastline protection projects. Copyright © 2014 Elsevier Ltd. All rights reserved.
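The AHP step, deriving criteria weights from a pairwise-comparison matrix via its principal eigenvector, can be sketched as follows (the three criteria and the judgment values are hypothetical, not the paper's):

```python
import numpy as np

def ahp_weights(pairwise):
    """Criteria weights from an AHP pairwise-comparison matrix via the
    principal eigenvector, plus the consistency index CI = (lmax - n)/(n - 1)."""
    A = np.asarray(pairwise, float)
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)              # principal eigenvalue
    w = np.abs(vecs[:, k].real)
    w = w / w.sum()                       # normalize weights to sum to 1
    n = A.shape[0]
    ci = (vals[k].real - n) / (n - 1)     # consistency index
    return w, ci

# Hypothetical 3-criteria comparison (e.g., ecology vs economy vs society)
A = [[1,     3,   5],
     [1 / 3, 1,   2],
     [1 / 5, 0.5, 1]]
w, ci = ahp_weights(A)
```

In a fuzzy-AHP scheme these crisp weights would then feed the fuzzy comprehensive evaluation of each coastline segment.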
Shi, Zhenghao; Ma, Jiejue; Feng, Yaning; He, Lifeng; Suzuki, Kenji
2015-11-01
MTANN (Massive Training Artificial Neural Network) is a promising tool that has been applied in recent years to eliminate false positives in thoracic CT. To evaluate whether this method is feasible for eliminating false positives from different CAD schemes, in particular commercial CAD software, this paper evaluates its performance in eliminating false positives produced by three different versions of commercial CAD software for lung nodule detection in chest radiographs. Experimental results demonstrate that the approach is useful in reducing false positives for different computer-aided lung nodule detection software in chest radiographs.
Sum of top-hat transform based algorithm for vessel enhancement in MRA images
NASA Astrophysics Data System (ADS)
Ouazaa, Hibet-Allah; Jlassi, Hajer; Hamrouni, Kamel
2018-04-01
Magnetic Resonance Angiography (MRA) images are rich in information but suffer from poor contrast, illumination, and noise; they therefore require enhancement, yet significant information can be lost if improper techniques are applied. In this paper, we propose a new enhancement method. We first apply the CLAHE method to increase the contrast of the image, then apply the sum of top-hat transforms, performed with a structuring element oriented at different angles, to increase the brightness of vessels. The methodology is tested and evaluated on the publicly available BRAINIX database, using MSE (mean square error), PSNR (peak signal-to-noise ratio), and SNR (signal-to-noise ratio) as evaluation measures. The results demonstrate that the proposed method efficiently enhances image details and is comparable with state-of-the-art algorithms; hence it could be broadly used in various applications.
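A sketch of the oriented top-hat stage, using the plain white top-hat from `scipy.ndimage` (the CLAHE contrast step is omitted here for brevity), is shown below; the structuring-element length, angles, and toy image are illustrative assumptions:

```python
import numpy as np
from scipy import ndimage

def line_footprint(length, angle_deg):
    """Binary line-shaped structuring element of given length and orientation."""
    fp = np.zeros((length, length), bool)
    c = length // 2
    t = np.deg2rad(angle_deg)
    for s in range(-(length // 2), length // 2 + 1):
        fp[int(round(c + s * np.sin(t))), int(round(c + s * np.cos(t)))] = True
    return fp

def sum_of_tophats(img, length=9, angles=(0, 45, 90, 135)):
    """Sum of white top-hat transforms with line structuring elements at several
    orientations; thin bright elongated structures (vessels) respond strongly."""
    return sum(ndimage.white_tophat(img, footprint=line_footprint(length, a))
               for a in angles)

# Toy image: a thin bright horizontal "vessel" on a smooth background
img = np.full((64, 64), 40.0)
img[30, 10:54] += 60.0
enhanced = sum_of_tophats(img)
```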
ERIC Educational Resources Information Center
Grammatikopoulos, Vasilis; Zachopoulou, Evridiki; Tsangaridou, Niki; Liukkonen, Jarmo; Pickup, Ian
2008-01-01
The body of research relating to assessment in education suggests that professional developers and seminar administrators have generally paid little attention to evaluation procedures. Scholars have also been critical of evaluations which use a single data source and have favoured the use of a multiple method design to generate a complete picture…
Interference detection and correction applied to incoherent-scatter radar power spectrum measurement
NASA Technical Reports Server (NTRS)
Ying, W. P.; Mathews, J. D.; Rastogi, P. K.
1986-01-01
A median-filter based interference detection and correction technique is evaluated, and its application to D-region ionospheric power spectra from the Arecibo incoherent scatter radar is discussed. The method can be extended to other kinds of data as long as the statistics underlying the process remain valid.
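A common form of median-filter interference rejection, flagging bins that deviate from a running median by several robust standard deviations, can be sketched as follows (the thresholds and synthetic spectrum are illustrative, not the paper's parameters):

```python
import numpy as np
from scipy.signal import medfilt

def despike_spectrum(power, kernel=9, n_sigma=4.0):
    """Flag spectral bins deviating from the running median by more than
    n_sigma robust standard deviations and replace them with the median."""
    baseline = medfilt(power, kernel_size=kernel)
    resid = power - baseline
    mad = np.median(np.abs(resid - np.median(resid)))
    sigma = 1.4826 * mad               # robust std-dev estimate from the MAD
    bad = np.abs(resid) > n_sigma * sigma
    cleaned = power.copy()
    cleaned[bad] = baseline[bad]
    return cleaned, bad

rng = np.random.default_rng(3)
spec = 10.0 + rng.normal(0, 0.5, 256)  # synthetic power spectrum
spec[100] += 30.0                      # narrowband interference spike
cleaned, flagged = despike_spectrum(spec)
```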
Nondestructive testing of a weld repair on the I-65 Bridge over the Ohio River at Louisville.
DOT National Transportation Integrated Search
2009-06-01
Nondestructive evaluation methods were applied to verify the structural integrity of a fracture critical structural member on the I-65 John F. Kennedy Memorial Bridge over the Ohio River at Louisville. Several nondestructive evaluation methods includ...
Code of Federal Regulations, 2010 CFR
2010-07-01
... applying existing technology to new products and processes in a general way. Advanced research is most... Category 6.3A) programs within Research, Development, Test and Evaluation (RDT&E). Applied research... technology such as new materials, devices, methods and processes. It typically is funded in Applied Research...
NASA Technical Reports Server (NTRS)
Miller, James G.
1993-01-01
In this Progress Report, we describe our current research activities concerning the development and implementation of advanced ultrasonic nondestructive evaluation methods applied to the characterization of stitched composite materials and bonded aluminum plate specimens. One purpose of this investigation is to identify and characterize specific features of polar backscatter interrogation which enhance the ability of ultrasound to detect flaws in a stitched composite laminate. Another focus is to explore the feasibility of implementing medical linear array imaging technology as a viable ultrasonic-based nondestructive evaluation method to inspect and characterize bonded aluminum lap joints. As an approach to implementing quantitative ultrasonic inspection methods to both of these materials, we focus on the physics that underlies the detection of flaws in such materials.
ERIC Educational Resources Information Center
Stuebing, Karla K.; Fletcher, Jack M.; Branum-Martin, Lee; Francis, David J.
2012-01-01
This study used simulation techniques to evaluate the technical adequacy of three methods for the identification of specific learning disabilities via patterns of strengths and weaknesses in cognitive processing. Latent and observed data were generated and the decision-making process of each method was applied to assess concordance in…
Critical path method applied to research project planning: Fire Economics Evaluation System (FEES)
Earl B. Anderson; R. Stanton Hales
1986-01-01
The critical path method (CPM) of network analysis (a) depicts precedence among the many activities in a project by a network diagram; (b) identifies critical activities by calculating their starting, finishing, and float times; and (c) displays possible schedules by constructing time charts. CPM was applied to the development of the Forest Service's Fire...
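The forward/backward pass that computes the starting, finishing, and float times can be sketched as follows (the mini-project is hypothetical, not the FEES network):

```python
def critical_path(activities):
    """CPM forward/backward pass. activities: {name: (duration, [predecessors])}.
    Returns earliest start, latest start, and total float per activity."""
    # Forward pass: earliest start/finish
    es, ef = {}, {}
    remaining = dict(activities)
    while remaining:
        for name, (dur, preds) in list(remaining.items()):
            if all(p in ef for p in preds):
                es[name] = max((ef[p] for p in preds), default=0)
                ef[name] = es[name] + dur
                del remaining[name]
    project_end = max(ef.values())
    # Backward pass: latest finish/start
    succs = {n: [m for m, (_, ps) in activities.items() if n in ps]
             for n in activities}
    lf, ls = {}, {}
    remaining = dict(activities)
    while remaining:
        for name, (dur, _) in list(remaining.items()):
            if all(s in ls for s in succs[name]):
                lf[name] = min((ls[s] for s in succs[name]), default=project_end)
                ls[name] = lf[name] - dur
                del remaining[name]
    floats = {n: ls[n] - es[n] for n in activities}  # float 0 => critical
    return es, ls, floats

# Hypothetical mini-project
acts = {"plan": (2, []), "survey": (4, ["plan"]), "model": (3, ["plan"]),
        "report": (2, ["survey", "model"])}
es, ls, fl = critical_path(acts)
```

Here the critical path is plan → survey → report (zero float), while "model" has one time unit of float.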
Evaluation Methods Sourcebook.
ERIC Educational Resources Information Center
Love, Arnold J., Ed.
The chapters commissioned for this book describe key aspects of evaluation methodology as they are practiced in a Canadian context, providing representative illustrations of recent developments in evaluation methodology as it is currently applied. The following chapters are included: (1) "Program Evaluation with Limited Fiscal and Human…
ERIC Educational Resources Information Center
Nielsen, Karina; Randall, Raymond; Christensen, Karl B.
2017-01-01
A mixed methods approach was applied to examine the effects of a naturally occurring teamwork intervention supported with training. The first objective was to integrate qualitative process evaluation and quantitative effect evaluation to examine "how" and "why" the training influence intervention outcomes. The intervention (N =…
Murphy, Thomas; Schwedock, Julie; Nguyen, Kham; Mills, Anna; Jones, David
2015-01-01
New recommendations for the validation of rapid microbiological methods have been included in the revised Technical Report 33 release from the PDA. The changes include a more comprehensive review of the statistical methods to be used to analyze data obtained during validation. This case study applies those statistical methods to accuracy, precision, ruggedness, and equivalence data obtained using a rapid microbiological methods system being evaluated for water bioburden testing. Results presented demonstrate that the statistical methods described in the PDA Technical Report 33 chapter can all be successfully applied to the rapid microbiological method data sets and gave the same interpretation for equivalence to the standard method. The rapid microbiological method was in general able to pass the requirements of PDA Technical Report 33, though the study shows that there can be occasional outlying results and that caution should be used when applying statistical methods to low average colony-forming unit values. Prior to use in a quality-controlled environment, any new method or technology has to be shown to work as designed by the manufacturer for the purpose required. For new rapid microbiological methods that detect and enumerate contaminating microorganisms, additional recommendations have been provided in the revised PDA Technical Report No. 33. The changes include a more comprehensive review of the statistical methods to be used to analyze data obtained during validation. This paper applies those statistical methods to analyze accuracy, precision, ruggedness, and equivalence data obtained using a rapid microbiological method system being validated for water bioburden testing. The case study demonstrates that the statistical methods described in the PDA Technical Report No. 33 chapter can be successfully applied to rapid microbiological method data sets and give the same comparability results for similarity or difference as the standard method. © PDA, Inc. 2015.
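One statistical procedure typical of such equivalence assessments, two one-sided t-tests (TOST), can be sketched as follows; the equivalence margin and log-CFU data are illustrative assumptions, not necessarily the exact procedure specified in Technical Report 33:

```python
import numpy as np
from scipy import stats

def tost_equivalence(x, y, margin):
    """Two one-sided t-tests (TOST): declare equivalence if the mean difference
    is significantly above -margin AND significantly below +margin.
    Returns the larger of the two one-sided p-values."""
    nx, ny = len(x), len(y)
    diff = np.mean(x) - np.mean(y)
    se = np.sqrt(np.var(x, ddof=1) / nx + np.var(y, ddof=1) / ny)
    df = nx + ny - 2                                 # simple pooled-df choice
    p_lower = stats.t.sf((diff + margin) / se, df)   # H0: diff <= -margin
    p_upper = stats.t.cdf((diff - margin) / se, df)  # H0: diff >= +margin
    return max(p_lower, p_upper)

# Hypothetical log10 CFU counts from the rapid vs the standard method
rng = np.random.default_rng(4)
rapid    = rng.normal(2.00, 0.15, 30)
standard = rng.normal(2.02, 0.15, 30)
p = tost_equivalence(rapid, standard, margin=0.3)  # equivalent if p < 0.05
```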
Köroğlu, Fahri; Çolak, Tuğba Kuru; Polat, M Gülden
2017-09-22
Low back pain is one of the most important causes of morbidity. This study was designed to evaluate the effect of Kinesio® taping on pain, functionality, mobility, and endurance in chronic low back pain treatment. Patients with chronic low back pain were randomly divided into three groups. Therapeutic ultrasound, hot packs, and transcutaneous electrical nerve stimulation were applied to each group for ten sessions over two weeks, and therapeutic exercises were applied in the clinic under physiotherapist supervision starting from the sixth session. Kinesio® tape was applied to the patients in the first group after each treatment session, and placebo tape was applied to the patients in the second group. No taping was applied to the third group, which constituted the control group. All patients were evaluated pre- and post-treatment in terms of pain, functional status (Oswestry scale), flexibility, and endurance. The study included 60 patients (32 females). When the initial demographic and clinical characteristics of the groups were evaluated, all assessment results, except the Oswestry scores, were similar (p = 0.000). When the mean changes in the clinical evaluations after treatment were compared, a statistically significant improvement demonstrating the superiority of the taping group was observed in pain, functionality, flexibility, and endurance values (p = 0.000, 0.000, 0.000, 0.000). Kinesio® taping in chronic low back pain is an easy and effective method which, when applied in addition to exercise and electrotherapy methods, significantly increases the effectiveness of treatment within a short period.
Link, Manuela; Schmid, Christina; Pleus, Stefan; Baumstark, Annette; Rittmeyer, Delia; Haug, Cornelia; Freckmann, Guido
2015-04-14
The standard ISO (International Organization for Standardization) 15197 is widely accepted for the accuracy evaluation of systems for self-monitoring of blood glucose (SMBG). Accuracy evaluation was performed for 4 SMBG systems (Accu-Chek Aviva, ContourXT, GlucoCheck XL, GlucoMen LX PLUS) with 3 test strip lots each. To investigate a possible impact of the comparison method on system accuracy data, 2 different established methods were used. The evaluation was performed in a standardized manner following test procedures described in ISO 15197:2003 (section 7.3). System accuracy was assessed by applying ISO 15197:2003 and in addition ISO 15197:2013 criteria (section 6.3.3). For each system, comparison measurements were performed with a glucose oxidase (YSI 2300 STAT Plus glucose analyzer) and a hexokinase (cobas c111) method. All 4 systems fulfilled the accuracy requirements of ISO 15197:2003 with the tested lots. More stringent accuracy criteria of ISO 15197:2013 were fulfilled by 3 systems (Accu-Chek Aviva, ContourXT, GlucoMen LX PLUS) when compared to the manufacturer's comparison method and by 2 systems (Accu-Chek Aviva, ContourXT) when compared to the alternative comparison method. All systems showed lot-to-lot variability to a certain degree; 2 systems (Accu-Chek Aviva, ContourXT), however, showed only minimal differences in relative bias between the 3 evaluated lots. In this study, all 4 systems complied with the evaluated test strip lots with accuracy criteria of ISO 15197:2003. Applying ISO 15197:2013 accuracy limits, differences in the accuracy of the tested systems were observed, also demonstrating that the applied comparison method/system and the lot-to-lot variability can have a decisive influence on accuracy data obtained for a SMBG system. © 2015 Diabetes Technology Society.
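The ISO 15197:2013 system-accuracy criterion mentioned above (at least 95% of results within ±15 mg/dL of the comparison method below 100 mg/dL, or within ±15% at or above 100 mg/dL) can be checked as sketched below, with hypothetical paired data:

```python
import numpy as np

def iso15197_2013_pass(meter, reference):
    """Check the ISO 15197:2013 system-accuracy criterion (section 6.3.3):
    >= 95% of meter results within +/-15 mg/dL of the comparison method at
    reference < 100 mg/dL, or within +/-15% at >= 100 mg/dL."""
    meter = np.asarray(meter, float)
    reference = np.asarray(reference, float)
    ok = np.where(reference < 100.0,
                  np.abs(meter - reference) <= 15.0,
                  np.abs(meter - reference) <= 0.15 * reference)
    frac = ok.mean()
    return frac >= 0.95, frac

# Hypothetical paired measurements (mg/dL) for a well-performing meter
rng = np.random.default_rng(5)
ref = rng.uniform(50, 350, 200)
dev = ref + rng.normal(0, 5, 200)
passed, frac = iso15197_2013_pass(dev, ref)
```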
Hu, Yannan; van Lenthe, Frank J; Hoffmann, Rasmus; van Hedel, Karen; Mackenbach, Johan P
2017-04-20
The scientific evidence-base for policies to tackle health inequalities is limited. Natural policy experiments (NPE) have drawn increasing attention as a means to evaluating the effects of policies on health. Several analytical methods can be used to evaluate the outcomes of NPEs in terms of average population health, but it is unclear whether they can also be used to assess the outcomes of NPEs in terms of health inequalities. The aim of this study therefore was to assess whether, and to demonstrate how, a number of commonly used analytical methods for the evaluation of NPEs can be applied to quantify the effect of policies on health inequalities. We identified seven quantitative analytical methods for the evaluation of NPEs: regression adjustment, propensity score matching, difference-in-differences analysis, fixed effects analysis, instrumental variable analysis, regression discontinuity and interrupted time-series. We assessed whether these methods can be used to quantify the effect of policies on the magnitude of health inequalities either by conducting a stratified analysis or by including an interaction term, and illustrated both approaches in a fictitious numerical example. All seven methods can be used to quantify the equity impact of policies on absolute and relative inequalities in health by conducting an analysis stratified by socioeconomic position, and all but one (propensity score matching) can be used to quantify equity impacts by inclusion of an interaction term between socioeconomic position and policy exposure. Methods commonly used in economics and econometrics for the evaluation of NPEs can also be applied to assess the equity impact of policies, and our illustrations provide guidance on how to do this appropriately. The low external validity of results from instrumental variable analysis and regression discontinuity makes these methods less desirable for assessing policy effects on population-level health inequalities. 
Increased use of the methods in social epidemiology will help to build an evidence base to support policy making in the area of health inequalities.
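The interaction-term approach can be sketched as a difference-in-differences regression with a policy-by-SEP interaction; the simulated data, effect sizes, and variable names below are illustrative, assuming `pandas` and `statsmodels`:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated DiD set-up: a policy lowers smoking more in the low-SEP group
# (all numbers are illustrative, not from the paper).
rng = np.random.default_rng(6)
n = 40000
df = pd.DataFrame({
    "exposed": rng.integers(0, 2, n),   # policy region vs control region
    "post":    rng.integers(0, 2, n),   # before vs after the policy
    "low_sep": rng.integers(0, 2, n),   # socioeconomic position stratum
})
effect = (-0.10 * df.exposed * df.post
          - 0.05 * df.exposed * df.post * df.low_sep)
df["smokes"] = (rng.random(n) < 0.30 + 0.05 * df.low_sep + effect).astype(int)

# The three-way interaction quantifies how the policy effect differs by SEP
m = smf.ols("smokes ~ exposed * post * low_sep", data=df).fit()
did_high_sep = m.params["exposed:post"]          # policy effect, high-SEP group
equity_term  = m.params["exposed:post:low_sep"]  # additional effect, low-SEP
```

A stratified analysis would instead fit the `exposed * post` model separately within each SEP stratum and compare the two DiD estimates.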
A note on evaluating VAN earthquake predictions
NASA Astrophysics Data System (ADS)
Tselentis, G.-Akis; Melis, Nicos S.
The evaluation of the success level of an earthquake prediction method should not be based on approaches that apply generalized strict statistical laws and avoid the specific nature of the earthquake phenomenon. Fault rupture processes cannot be compared to gambling processes. The outcome of the present note is that even an ideal earthquake prediction method is still shown to be a matter of a “chancy” association between precursors and earthquakes if we apply the same procedure proposed by Mulargia and Gasperini [1992] in evaluating VAN earthquake predictions. Each individual VAN prediction has to be evaluated separately, taking always into account the specific circumstances and information available. The success level of epicenter prediction should depend on the earthquake magnitude, and magnitude and time predictions may depend on earthquake clustering and the tectonic regime respectively.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jordan, G.
1995-10-30
The objective of the workshop was to promote discussions between experts and research managers on developing approaches for assessing the impact of DOE's basic energy research upon the energy mission, applied research, technology transfer, the economy, and society. The purpose of this impact assessment is to demonstrate results and improve ER research programs in this era when basic research is expected to meet changing national economic and social goals. The questions addressed were: (1) By what criteria and metrics does Energy Research measure performance and evaluate its impact on the DOE mission and society while maintaining an environment that fosters basic research? (2) What combination of evaluation methods best applies to assessing the performance and impact of OBES basic research? The focus will be upon the following methods: case studies, user surveys, citation analysis, the TRACES approach, return on DOE investment (ROI)/econometrics, and expert panels. (3) What combination of methods and specific rules of thumb can be applied to capture impacts along the spectrum from basic research to products and societal impacts?
Comparative study on the welded structure fatigue strength assessment method
NASA Astrophysics Data System (ADS)
Hu, Tao
2018-04-01
Because welded structures are widely used in many industries (pressure vessels, motorcycles, automobiles, aviation, and shipbuilding, as well as large crane steel structures), fatigue strength evaluation of welded structures is particularly important. Of the four main methods for evaluating the fatigue strength of welded structures, two are most widely used: the nominal stress method and the hot-spot stress method. This paper compares the two in terms of their principles and calculation procedures, analyzes their similarities as well as their advantages and disadvantages, provides a reference for practical engineering problems across industries, and offers an outlook on future methods for evaluating the fatigue strength and life of welded structures.
A strategy for evaluating pathway analysis methods.
Yu, Chenggang; Woo, Hyung Jun; Yu, Xueping; Oyama, Tatsuya; Wallqvist, Anders; Reifman, Jaques
2017-10-13
Researchers have previously developed a multitude of methods designed to identify biological pathways associated with specific clinical or experimental conditions of interest, with the aim of facilitating biological interpretation of high-throughput data. Before practically applying such pathway analysis (PA) methods, we must first evaluate their performance and reliability, using datasets where the pathways perturbed by the conditions of interest have been well characterized in advance. However, such 'ground truths' (or gold standards) are often unavailable. Furthermore, previous evaluation strategies that have focused on defining 'true answers' are unable to systematically and objectively assess PA methods under a wide range of conditions. In this work, we propose a novel strategy for evaluating PA methods independently of any gold standard, either established or assumed. The strategy involves the use of two mutually complementary metrics, recall and discrimination. Recall measures the consistency between the perturbed pathways identified by applying a particular analysis method to an original large dataset and those identified by the same method applied to a sub-dataset of the original dataset. In contrast, discrimination measures specificity: the degree to which the perturbed pathways identified by a particular method applied to a dataset from one experiment differ from those identified by the same method applied to a dataset from a different experiment. We used these metrics and 24 datasets to evaluate six widely used PA methods. The results highlighted the common challenge in reliably identifying significant pathways from small datasets. Importantly, we confirmed the effectiveness of our proposed dual-metric strategy by showing that previous comparative studies corroborate the performance evaluations of the six methods obtained by our strategy.
Unlike any previously proposed strategy for evaluating the performance of PA methods, our dual-metric strategy does not rely on any ground truth, either established or assumed, of the pathways perturbed by a specific clinical or experimental condition. As such, our strategy allows researchers to systematically and objectively evaluate pathway analysis methods by employing any number of datasets for a variety of conditions.
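The recall and discrimination metrics described above can be illustrated as set-overlap computations over lists of significant pathways. This is a minimal sketch assuming a Jaccard-style overlap measure; the paper's exact formulas are not given in this summary, and the pathway names are invented for illustration.

```python
def overlap(a, b):
    """Jaccard overlap between two sets of significant pathway IDs."""
    a, b = set(a), set(b)
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def recall_metric(full_data_hits, sub_data_hits):
    """Consistency of pathways identified on the full dataset vs. a
    sub-dataset of it (higher = more reliable under subsampling)."""
    return overlap(full_data_hits, sub_data_hits)

def discrimination_metric(hits_experiment_1, hits_experiment_2):
    """Specificity: degree to which pathways found in one experiment
    differ from those found in an unrelated experiment."""
    return 1.0 - overlap(hits_experiment_1, hits_experiment_2)

full = {"p53 signaling", "apoptosis", "wnt signaling"}
sub = {"p53 signaling", "apoptosis"}
unrelated = {"glycolysis", "wnt signaling"}
print(recall_metric(full, sub))                # 2/3: fairly consistent
print(discrimination_metric(full, unrelated))  # 0.75: mostly distinct
```

A real evaluation would average these quantities over many random sub-datasets and experiment pairs rather than a single comparison.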
Brown Connolly, Nancy E
2014-12-01
This foundational study applies the process of receiver operating characteristic (ROC) analysis to evaluate the utility and predictive value of a disease management (DM) model that uses RM devices for chronic obstructive pulmonary disease (COPD). The literature identifies a need for a more rigorous method to validate and quantify evidence-based value for remote monitoring (RM) systems being used to monitor persons with a chronic disease. ROC analysis is an engineering approach widely applied in medical testing, but one that has not been evaluated for its utility in RM. Classifiers (saturated peripheral oxygen [SPO2], blood pressure [BP], and pulse), optimum threshold, and predictive accuracy were evaluated based on patient outcomes. Parametric and nonparametric methods were used. Event-based patient outcomes included inpatient hospitalization, accident and emergency, and home health visits. Statistical analysis tools included Microsoft (Redmond, WA) Excel(®) and MedCalc(®) (MedCalc Software, Ostend, Belgium) version 12 © 1993-2013, used to generate ROC curves and statistics. Persons with COPD were monitored a minimum of 183 days, with at least one inpatient hospitalization within the 12 months prior to monitoring. Retrospective, de-identified patient data from a United Kingdom National Health System COPD program were used. Datasets included biometric readings, alerts, and resource utilization. SPO2 was identified as a predictive classifier, with an optimal average threshold setting of 85-86%. BP and pulse were failed classifiers, and areas of design were identified that may improve utility and predictive capacity. A cost avoidance methodology was developed. Results can be applied to health services planning decisions, and the methods can be applied to system design and evaluation based on patient outcomes. This study validated the use of ROC in RM program evaluation.
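The ROC workflow described above, sweeping a threshold over a biometric classifier and picking an optimal operating point, can be sketched without any statistics library. The SpO2 readings and event labels below are hypothetical, and Youden's J is one common (assumed, not stated in the abstract) criterion for the optimal threshold.

```python
def roc_points(scores, labels):
    """(FPR, TPR, threshold) triples from sweeping a threshold over the
    observed scores; label 1 = adverse event. Lower SpO2 is taken to
    predict an event, so "test positive" means score <= threshold."""
    pos = sum(labels)
    neg = len(labels) - pos
    points = []
    for t in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if s <= t and y == 1)
        fp = sum(1 for s, y in zip(scores, labels) if s <= t and y == 0)
        points.append((fp / neg, tp / pos, t))
    return points

def youden_optimum(points):
    """Operating point maximizing Youden's J = TPR - FPR."""
    return max(points, key=lambda p: p[1] - p[0])

# hypothetical SpO2 readings (%) paired with whether an event followed
spo2  = [97, 95, 92, 88, 86, 85, 84, 80]
event = [0,  0,  0,  0,  1,  0,  1,  1]
fpr, tpr, threshold = youden_optimum(roc_points(spo2, event))
print(threshold)  # 86: an alert threshold in line with the 85-86% range above
```

With real data one would also compute the area under the curve and confidence intervals, which is what tools such as MedCalc provide.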
Connecting clinical and actuarial prediction with rule-based methods.
Fokkema, Marjolein; Smits, Niels; Kelderman, Henk; Penninx, Brenda W J H
2015-06-01
Meta-analyses comparing the accuracy of clinical versus actuarial prediction have shown actuarial methods to outperform clinical methods, on average. However, actuarial methods are still not widely used in clinical practice, and there has been a call for the development of actuarial prediction methods for clinical practice. We argue that rule-based methods may be more useful than the linear main-effects models usually employed in prediction studies, from data-analytic, decision-analytic, and practical perspectives. In addition, decision rules derived with rule-based methods can be represented as fast-and-frugal trees, which, unlike main-effects models, can be used in a sequential fashion, reducing the number of cues that have to be evaluated before making a prediction. We illustrate the usability of rule-based methods by applying RuleFit, an algorithm for deriving decision rules for classification and regression problems, to a dataset on prediction of the course of depressive and anxiety disorders from Penninx et al. (2011). The RuleFit algorithm provided a model consisting of 2 simple decision rules, requiring evaluation of only 2 to 4 cues. Predictive accuracy of the 2-rule model was very similar to that of a logistic regression model incorporating 20 predictor variables, originally applied to the dataset. In addition, the 2-rule model required, on average, evaluation of only 3 cues. Therefore, the RuleFit algorithm appears to be a promising method for creating decision tools that are less time consuming and easier to apply in psychological practice, with accuracy comparable to traditional actuarial methods. (c) 2015 APA, all rights reserved.
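A two-rule model evaluated sequentially, as in a fast-and-frugal tree, can be sketched as follows. The cue names, cut-offs, and outcome labels here are entirely hypothetical illustrations of the structure, not the rules RuleFit actually derived from the Penninx et al. dataset.

```python
def predict_chronic_course(patient):
    """Sequential evaluation of two hypothetical decision rules in the
    spirit of a RuleFit-derived fast-and-frugal tree: the second rule
    is only consulted if the first does not fire."""
    # Rule 1: high baseline severity and long symptom duration -> chronic
    if patient["severity"] > 20 and patient["duration_months"] > 12:
        return "chronic"
    # Rule 2: early onset combined with comorbidity -> chronic
    if patient["onset_age"] < 18 and patient["comorbid"]:
        return "chronic"
    return "favourable"

print(predict_chronic_course(
    {"severity": 25, "duration_months": 24, "onset_age": 30, "comorbid": False}))
# only rule 1 is evaluated here; rule 2 is never reached
```

The practical appeal is exactly what the abstract describes: a clinician needs at most four cue values, and often fewer, before a prediction is made.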
2010-10-01
Studenski et al., "Acquisition and Processing Methods for a Bedside Cardiac SPECT Imaging System," IEEE Transactions on Nuclear Science, Vol. 57, No. 1, February 2010, pp. 208-209.
Hoseini, Bibi Leila; Mazloum, Seyed Reza; Jafarnejad, Farzaneh; Foroughipour, Mohsen
2013-03-01
The clinical evaluation, as one of the most important elements in medical education, must measure students' competencies and abilities. The implementation of any assessment tool is basically dependent on its acceptance by students. This study assessed midwifery students' satisfaction with Direct Observation of Procedural Skills (DOPS) and with the current clinical evaluation method. This quasi-experimental study was conducted in the university hospitals affiliated with Mashhad University of Medical Sciences. The subjects comprised 67 undergraduate midwifery students selected by convenience sampling and allocated to control and intervention groups according to the training transposition. The current method was used in the control group, and DOPS was conducted in the intervention group. The applied tools included DOPS rating scales, a logbook, and satisfaction questionnaires on the clinical evaluation methods; the validity and reliability of these tools were approved. At the end of training, students' satisfaction with the evaluation methods was assessed by the mentioned tools, and the data were analyzed by descriptive and analytical statistics. Mean satisfaction scores of the midwifery students with the DOPS and current methods were 76.7 ± 12.9 and 62.6 ± 14.7 (out of 100), respectively; the DOPS satisfaction score was significantly higher than that obtained with the current method (P < 0.001). The most satisfactory domains were "consistency with learning objectives" (71.2 ± 14.9) in the current method and "objectivity" (87.9 ± 15.0) in DOPS. In contrast, the least satisfactory domains were "interest in applying the method" (57.8 ± 26.5) in the current method and "number of assessments for each skill" (58.8 ± 25.9) in DOPS. This study showed that the DOPS method is associated with greater student satisfaction.
Since the students' satisfaction with the current method was also acceptable, we recommend combining this new clinical evaluation method with the current method, which covers its weaknesses, to promote the students' satisfaction with clinical evaluation methods in a perfect manner.
Sala, Emma; Bonfiglioli, Roberta; Fostinellil, Jacopo; Tomasi, Cesare; Graziosi, Francesca; Violante, Francesco S; Apostoli, Pietro
2014-01-01
Risk assessment for upper extremity work-related musculoskeletal disorders by applying six ergonomic methods: a ten-year experience. The objective of this research was to verify and validate the multiple-step method suggested by the SIMLII guidelines and to compare the results obtained with the following methods: Washington State Standard, OCRA, HAL, RULA, OREGE, and Strain Index (SI). A total of 598 workstations, for a total of 1800 analyses by the different methods, were considered, adopting the following multiple-step procedure: preliminary evaluation by the Washington State method and the OCRA checklist in all workstations, RULA or HAL as the first-level evaluation, and OREGE or SI as the second-level evaluation. The preliminary evaluation was negative (risk absent) in 75% of the examined workstations; an optimal-acceptable condition was found in 58% of analyses using the OCRA checklist, in 92% using HAL, in 100% using RULA, in 64% using OREGE, and in 70% using SI. We observed similar evaluations of strain among the methods, while the main differences were observed in posture and frequency assessment. The preliminary evaluation by the Washington State method appears to be an adequate instrument for identifying working conditions at risk. All the adopted methods were in good agreement in the two extreme situations, high risk and absent risk, especially in absent-risk conditions. The level of agreement varied on the basis of their rationale and of the role of their different components, so the SIMLII indications about the critical use of biomechanical methods, and about the possible use of more than one of them (considering working characteristics), were confirmed.
Evaluation of new techniques for the calculation of internal recirculating flows
NASA Technical Reports Server (NTRS)
Van Doormaal, J. P.; Turan, A.; Raithby, G. D.
1987-01-01
The performance of discrete methods for the prediction of fluid flows can be enhanced by improving the convergence rate of solvers and by increasing the accuracy of the discrete representation of the equations of motion. This paper evaluates the gains in solver performance that are available when various acceleration methods are applied. Various discretizations are also examined, and two are recommended for their accuracy and robustness. Insertion of the improved discretization and solver accelerator into a TEACH code, which has been widely applied to combustor flows, illustrates the substantial gains that can be achieved.
Evaluating Equating Results: Percent Relative Error for Chained Kernel Equating
ERIC Educational Resources Information Center
Jiang, Yanlin; von Davier, Alina A.; Chen, Haiwen
2012-01-01
This article presents a method for evaluating equating results. Within the kernel equating framework, the percent relative error (PRE) for chained equipercentile equating was computed under the nonequivalent groups with anchor test (NEAT) design. The method was applied to two data sets to obtain the PRE, which can be used to measure equating…
Varmazyar, Mohsen; Dehghanbaghi, Maryam; Afkhami, Mehdi
2016-10-01
Balanced Scorecard (BSC) is a strategic evaluation tool that uses both financial and non-financial indicators to determine the business performance of organizations or companies. In this paper, a new integrated approach based on the BSC and multi-criteria decision-making (MCDM) methods is proposed to evaluate the performance of the research centers of a research and technology organization (RTO) in Iran. The Decision-Making Trial and Evaluation Laboratory (DEMATEL) method is employed to reflect the interdependencies among the BSC perspectives. Then, the Analytic Network Process (ANP) is utilized to weight the indices influencing the considered problem. In the next step, we apply four MCDM methods, namely Additive Ratio Assessment (ARAS), Complex Proportional Assessment (COPRAS), Multi-Objective Optimization by Ratio Analysis (MOORA), and Technique for Order Preference by Similarity to Ideal Solution (TOPSIS), for ranking the alternatives. Finally, the utility interval technique is applied to combine the ranking results of the MCDM methods; weighted utility intervals are computed by constructing a correlation matrix between the ranking methods. A real case is presented to show the efficacy of the proposed approach. Copyright © 2016 Elsevier Ltd. All rights reserved.
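Of the four MCDM methods named above, TOPSIS is the most compact to sketch: alternatives are ranked by relative closeness to an ideal solution. The decision matrix, weights, and criterion directions below are hypothetical, not the paper's data.

```python
import math

def topsis(matrix, weights, benefit):
    """TOPSIS ranking: matrix[i][j] = score of alternative i on criterion j;
    benefit[j] = True if larger values are better for criterion j."""
    m, n = len(matrix), len(matrix[0])
    # vector-normalize each criterion column, then apply criterion weights
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(m))) for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)] for i in range(m)]
    ideal = [max(col) if benefit[j] else min(col) for j, col in enumerate(zip(*v))]
    anti = [min(col) if benefit[j] else max(col) for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_pos = math.dist(row, ideal)   # distance to the ideal solution
        d_neg = math.dist(row, anti)    # distance to the anti-ideal solution
        scores.append(d_neg / (d_pos + d_neg))
    return scores

# three hypothetical research centres scored on cost (lower is better)
# and research output (higher is better)
scores = topsis([[200, 8], [150, 6], [300, 9]],
                weights=[0.4, 0.6], benefit=[False, True])
print(scores)
```

Running each of the four MCDM methods yields one ranking per method; the paper's utility interval technique then combines those rankings into a single consensus ordering.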
Comparative analysis of techniques for evaluating the effectiveness of aircraft computing systems
NASA Technical Reports Server (NTRS)
Hitt, E. F.; Bridgman, M. S.; Robinson, A. C.
1981-01-01
Performability analysis is a technique developed for evaluating the effectiveness of fault-tolerant computing systems in multiphase missions. Performability was evaluated for its accuracy, practical usefulness, and relative cost. The evaluation was performed by applying performability and the fault tree method to a set of sample problems ranging from simple to moderately complex. The problems involved as many as five outcomes, two to five mission phases, permanent faults, and some functional dependencies. Transient faults and software errors were not considered. A different analyst was responsible for each technique. Significantly more time and effort were required to learn performability analysis than the fault tree method. Performability is inherently as accurate as fault tree analysis. For the sample problems, fault trees were more practical and less time consuming to apply, while performability required less ingenuity and was more checkable. Performability offers some advantages for evaluating very complex problems.
Hu, Wei; Liu, Guangbing; Tu, Yong
2016-01-01
This paper applied the fuzzy comprehensive evaluation (FCE) technique and the analytic hierarchy process (AHP) to evaluate wastewater treatment by enterprises. Based on the characteristics of wastewater treatment by enterprises in the Taihu basin, an evaluation index system was established, and the analytic hierarchy process was applied to determine the index weights. The AHP and FCE methods were then combined to evaluate the wastewater treatment level of 3 representative enterprises. The results show that the evaluation grades of enterprise 1, enterprise 2, and enterprise 3 were middle, good, and excellent, respectively. Finally, the scores of the 3 enterprises were calculated on a hundred-mark scale; enterprise 3 had the highest wastewater treatment level, followed by enterprise 2 and enterprise 1. The application of this work can make evaluation results more scientific and accurate, and it is expected to serve as an assistance tool for enterprise managers in improving wastewater treatment levels.
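The AHP-plus-FCE pipeline can be sketched in a few lines: AHP turns a pairwise judgment matrix into index weights, and FCE multiplies those weights by a fuzzy membership matrix to obtain grade memberships. The judgment matrix, memberships, and the row-geometric-mean approximation of the AHP priority vector are illustrative assumptions, not the paper's data or exact procedure.

```python
import math

def ahp_weights(pairwise):
    """Approximate AHP priority vector via normalized row geometric means."""
    n = len(pairwise)
    gm = [math.prod(row) ** (1.0 / n) for row in pairwise]
    s = sum(gm)
    return [g / s for g in gm]

def fuzzy_evaluate(weights, membership):
    """Fuzzy comprehensive evaluation B = W * R, where membership[i][k]
    is the degree to which index i belongs to evaluation grade k."""
    grades = len(membership[0])
    return [sum(w * row[k] for w, row in zip(weights, membership))
            for k in range(grades)]

# hypothetical 3-index judgment matrix and grade memberships
# (grades: poor, middle, good, excellent)
w = ahp_weights([[1.0, 3.0, 5.0],
                 [1/3, 1.0, 2.0],
                 [1/5, 1/2, 1.0]])
b = fuzzy_evaluate(w, [[0.1, 0.2, 0.4, 0.3],
                       [0.0, 0.3, 0.5, 0.2],
                       [0.2, 0.4, 0.3, 0.1]])
grade = max(range(len(b)), key=lambda k: b[k])
print(grade)  # index of the maximum-membership grade
```

In a full AHP application one would also check the consistency ratio of the judgment matrix before accepting the weights.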
Berns, U; Hemprich, L
2001-01-01
In issue 8 of volume 48 (August 1998) of the journal "Psychotherapie - Psychosomatik - Medizinische Psychologie", the tape-recorder transcription of the 290th session of a long-term analysis was studied by three methods (BIP, Frames, ZBKT). The paper presented here was stimulated by that publication. From the author's viewpoint, substantial clinical aspects of evaluation could be added by applying a clinical evaluation method developed by R. Langs and his corresponding concept of interpretation. Clinical vignettes exemplify the possibility of resolving pathological countertransference by using this evaluation method. With the help of this method, the presented transcription of the 290th session is partially evaluated.
Monkman, Helen; Kushniruk, Andre
2013-01-01
The prevalence of consumer health information systems is increasing. However, usability and health literacy impact both the value and adoption of these systems. Health literacy and usability are closely related in that systems may not be used accurately if users cannot understand the information therein. Thus, it is imperative to focus on mitigating the demands on health literacy in consumer health information systems. This study modified two usability evaluation methods (heuristic evaluation and usability testing) to incorporate the identification of potential health literacy issues in a Personal Health Record (PHR). Heuristic evaluation is an analysis of a system performed by a usability specialist who evaluates how well the system abides by usability principles. In contrast, a usability test involves a post hoc analysis of a representative user interacting with the system. These two methods revealed several health literacy issues and suggestions to ameliorate them were made. Thus, it was demonstrated that usability methods could be successfully augmented for the purpose of investigating health literacy issues. To improve users' health knowledge, the adoption of consumer health information systems, and the accuracy of the information contained therein, it is encouraged that usability methods be applied with an added focus on health literacy.
ERIC Educational Resources Information Center
Byrn, Darcie; And Others
The authors have written this manual to aid workers in the Cooperative Extension Service of the United States to be better able to understand and apply the principles and methods of evaluation. The manual contains three sections which cover the nature and place of evaluation in extension work, the evaluation process, and the uses of evaluation…
Developing Methodologies for Evaluating the Earthquake Safety of Existing Buildings.
ERIC Educational Resources Information Center
Bresler, B.; And Others
This report contains four papers written during an investigation of methods for evaluating the safety of existing school buildings under Research Applied to National Needs (RANN) grants. In "Evaluation of Earthquake Safety of Existing Buildings," by B. Bresler, preliminary ideas on the evaluation of the earthquake safety of existing…
Reliability analysis of composite structures
NASA Technical Reports Server (NTRS)
Kan, Han-Pin
1992-01-01
A probabilistic static stress analysis methodology has been developed to estimate the reliability of a composite structure. Closed form stress analysis methods are the primary analytical tools used in this methodology. These structural mechanics methods are used to identify independent variables whose variations significantly affect the performance of the structure. Once these variables are identified, scatter in their values is evaluated and statistically characterized. The scatter in applied loads and the structural parameters are then fitted to appropriate probabilistic distribution functions. Numerical integration techniques are applied to compute the structural reliability. The predicted reliability accounts for scatter due to variability in material strength, applied load, fabrication and assembly processes. The influence of structural geometry and mode of failure are also considerations in the evaluation. Example problems are given to illustrate various levels of analytical complexity.
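The core computation described above, numerically integrating over fitted distributions to obtain structural reliability, can be sketched for the classic stress-strength interference case. The normal distributions and their parameters below are hypothetical; the abstract does not state which distribution families were fitted.

```python
import math

def normal_cdf(x, mu, sigma):
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def reliability(mu_r, sd_r, mu_s, sd_s, steps=20000):
    """P(strength R > applied load S): integrate the load density times the
    probability that strength exceeds that load (trapezoidal rule)."""
    lo, hi = mu_s - 8.0 * sd_s, mu_s + 8.0 * sd_s
    h = (hi - lo) / steps
    total = 0.0
    for i in range(steps + 1):
        s = lo + i * h
        wt = 0.5 if i in (0, steps) else 1.0
        total += wt * normal_pdf(s, mu_s, sd_s) * (1.0 - normal_cdf(s, mu_r, sd_r))
    return total * h

# hypothetical laminate: strength ~ N(500, 40) MPa, applied load ~ N(350, 30) MPa
r = reliability(500.0, 40.0, 350.0, 30.0)
# closed form for two normals: Phi((mu_R - mu_S) / sqrt(sd_R^2 + sd_S^2))
closed = normal_cdf((500.0 - 350.0) / math.sqrt(40.0**2 + 30.0**2), 0.0, 1.0)
print(r, closed)
```

The numerical route generalizes to the non-normal scatter distributions (e.g. Weibull strength) that composite data often require, which is why the methodology relies on numerical integration rather than the closed form.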
Krutulyte, Grazina; Kimtys, Algimantas; Krisciūnas, Aleksandras
2003-01-01
The purpose of this study was to examine whether two different physiotherapy regimes caused any differences in outcome in rehabilitation after stroke. We examined 240 patients with stroke at the Rehabilitation Center of Kaunas Second Clinical Hospital. Patients were divided into 2 groups: the Bobath method was applied to the first (I) group (n=147), and the motor relearning program (MRP) method was applied to the second (II) group (n=93). Within each group, samples were established according to sex, age, time from occurrence of the CVA to admission to the rehabilitation unit, and degree of disorder (hemiplegia, hemiparesis). The mobility of patients was evaluated according to the European Federation for Research in Rehabilitation (EFRR) scale, and activities of daily living were evaluated by the Barthel index. Both groups were evaluated before physical therapy; this preliminary analysis showed no statistically significant differences between the groups (95% confidence level). The same statistical analysis was carried out after physical therapy, and differences between the patient groups were compared using the chi-square test. The Bobath method was applied in the first group of patients. The aim of the method is to improve the quality of the affected body side's movements in order to keep both sides working as harmoniously as possible. While applying this method, the physical therapist guides the patient's body at key points, stimulating normal postural reactions and training normal movement patterns. The MRP method was used with the second group of patients. This method is based on movement science, biomechanics, and training of functional movement; it rests on the idea that movement patterns should not be trained but relearned. CONCLUSION.
This study indicates that physiotherapy with task-oriented strategies, represented by MRP, is preferable to physiotherapy with facilitation/inhibition strategies, such as the Bobath programme, in the rehabilitation of stroke patients (p < 0.05).
Quantitative Evaluation of Management Courses: Part 1
ERIC Educational Resources Information Center
Cunningham, Cyril
1973-01-01
The author describes how he developed a method of evaluating and comparing management courses of different types and lengths by applying an ordinal system of relative values using a process of transmutation. (MS)
Absolute order-of-magnitude reasoning applied to a social multi-criteria evaluation framework
NASA Astrophysics Data System (ADS)
Afsordegan, A.; Sánchez, M.; Agell, N.; Aguado, J. C.; Gamboa, G.
2016-03-01
A social multi-criteria evaluation framework for solving a real-case problem of selecting a wind farm location in the regions of Urgell and Conca de Barberá in Catalonia (northeast Spain) is studied. This paper applies a qualitative multi-criteria decision analysis approach based on linguistic-label assessment, able to address uncertainty and deal with different levels of precision. The method is based on qualitative reasoning, an artificial intelligence technique for assessing and ranking multi-attribute alternatives with linguistic labels in order to handle uncertainty. It is suitable for problems in the social domain, such as energy planning, which require the construction of a dialogue process among many social actors under high levels of complexity and uncertainty. The method is compared with an existing approach, an outranking method based on Condorcet's original method, which has been applied previously to the wind farm location problem. The results obtained by both approaches and their aggregation procedures are analysed, and their performance in the selection of the wind farm location is compared. Although both methods lead to similar rankings of the alternatives, the study highlights their respective advantages and drawbacks.
Tugwell, Peter; Pottie, Kevin; Welch, Vivian; Ueffing, Erin; Chambers, Andrea; Feightner, John
2011-01-01
Background: This article describes the evidence review and guideline development method developed for the Clinical Preventive Guidelines for Immigrants and Refugees in Canada by the Canadian Collaboration for Immigrant and Refugee Health Guideline Committee. Methods: The Appraisal of Guidelines for Research and Evaluation (AGREE) best-practice framework was combined with the recently developed Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach to produce evidence-based clinical guidelines for immigrants and refugees in Canada. Results: A systematic approach was designed to produce the evidence reviews and apply the GRADE approach, including building on evidence from previous systematic reviews, searching for and comparing evidence between general and specific immigrant populations, and applying the GRADE criteria for making recommendations. This method was used for priority health conditions that had been selected by practitioners caring for immigrants and refugees in Canada. Interpretation: This article outlines the 14-step method that was defined to standardize the guideline development process for each priority health condition. PMID:20573711
Assessing and evaluating multidisciplinary translational teams: a mixed methods approach.
Wooten, Kevin C; Rose, Robert M; Ostir, Glenn V; Calhoun, William J; Ameredes, Bill T; Brasier, Allan R
2014-03-01
A case report illustrates how multidisciplinary translational teams can be assessed using outcome, process, and developmental types of evaluation using a mixed-methods approach. Types of evaluation appropriate for teams are considered in relation to relevant research questions and assessment methods. Logic models are applied to scientific projects and team development to inform choices between methods within a mixed-methods design. Use of an expert panel is reviewed, culminating in consensus ratings of 11 multidisciplinary teams and a final evaluation within a team-type taxonomy. Based on team maturation and scientific progress, teams were designated as (a) early in development, (b) traditional, (c) process focused, or (d) exemplary. Lessons learned from data reduction, use of mixed methods, and use of expert panels are explored.
NASA Astrophysics Data System (ADS)
Ariyarit, Atthaphon; Sugiura, Masahiko; Tanabe, Yasutada; Kanazaki, Masahiro
2018-06-01
A multi-fidelity optimization technique using an efficient global optimization process with a hybrid surrogate model is investigated for solving real-world design problems. The model constructs the local deviation using the kriging method and the global model using a radial basis function. The expected improvement is computed to decide on additional samples that can improve the model. The approach was first investigated by solving mathematical test problems. The results were compared with optimization results from an ordinary kriging method and a co-kriging method, and the proposed method produced the best solution. The proposed method was also applied to aerodynamic design optimization of helicopter blades to maximize blade efficiency. The optimal shape obtained by the proposed method achieved performance almost equivalent to that obtained by single-fidelity optimization based on high-fidelity evaluations. Comparing all three methods, the proposed method required the lowest total number of high-fidelity evaluation runs to obtain a converged solution.
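The expected improvement criterion mentioned above has a standard closed form for a Gaussian surrogate prediction. This sketch uses the usual minimization form with hypothetical candidate points; it illustrates how efficient global optimization trades off a promising mean against high predictive uncertainty.

```python
import math

def expected_improvement(mu, sigma, f_best):
    """EI for minimization at a candidate point, given the surrogate's
    predicted mean mu, predictive standard deviation sigma, and the
    best (lowest) objective value observed so far."""
    if sigma <= 0.0:
        return max(f_best - mu, 0.0)
    z = (f_best - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)   # standard normal pdf
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))          # standard normal cdf
    return (f_best - mu) * cdf + sigma * pdf

# hypothetical candidate points: (predicted mean, predicted std. dev.)
candidates = [(0.9, 0.05), (1.1, 0.40), (1.0, 0.20)]
best = max(candidates, key=lambda c: expected_improvement(c[0], c[1], f_best=1.0))
print(best)  # (1.1, 0.4): large uncertainty can outweigh a worse predicted mean
```

The next (expensive, high-fidelity) evaluation is run at the EI-maximizing point, the surrogate is refitted, and the loop repeats until EI becomes negligible.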
Szerkus, Oliwia; Struck-Lewicka, Wiktoria; Kordalewska, Marta; Bartosińska, Ewa; Bujak, Renata; Borsuk, Agnieszka; Bienert, Agnieszka; Bartkowska-Śniatkowska, Alicja; Warzybok, Justyna; Wiczling, Paweł; Nasal, Antoni; Kaliszan, Roman; Markuszewski, Michał Jan; Siluk, Danuta
2017-02-01
The purpose of this work was to develop and validate a rapid and robust LC-MS/MS method for the determination of dexmedetomidine (DEX) in plasma, suitable for analysis of a large number of samples. A systematic Design of Experiments approach was applied to optimize the ESI source parameters and to evaluate method robustness; as a result, a rapid, stable, and cost-effective assay was developed. The method was validated according to US FDA guidelines. The LLOQ was determined at 5 pg/ml, and the assay was linear over the examined concentration range (5-2500 pg/ml, R² > 0.98). The accuracies and the intra- and interday precisions were within 15%. The stability data confirmed reliable behavior of DEX under the tested conditions. Application of the Design of Experiments approach allowed for fast and efficient analytical method development and validation, as well as reduced usage of the chemicals necessary for routine method optimization. The proposed technique was applied to the determination of DEX pharmacokinetics in pediatric patients undergoing long-term sedation in the intensive care unit.
Quantitative methods for analysing cumulative effects on fish migration success: a review.
Johnson, J E; Patterson, D A; Martins, E G; Cooke, S J; Hinch, S G
2012-07-01
It is often recognized, but seldom addressed, that a quantitative assessment of the cumulative effects, both additive and non-additive, of multiple stressors on fish survival would provide a more realistic representation of the factors that influence fish migration. This review presents a compilation of analytical methods applied to a well-studied fish migration, a more general review of quantitative multivariable methods, and a synthesis on how to apply new analytical techniques in fish migration studies. A compilation of adult migration papers from Fraser River sockeye salmon Oncorhynchus nerka revealed a limited number of multivariable methods being applied and the sub-optimal reliance on univariable methods for multivariable problems. The literature review of fisheries science, general biology and medicine identified a large number of alternative methods for dealing with cumulative effects, with a limited number of techniques being used in fish migration studies. An evaluation of the different methods revealed that certain classes of multivariable analyses will probably prove useful in future assessments of cumulative effects on fish migration. This overview and evaluation of quantitative methods gathered from the disparate fields should serve as a primer for anyone seeking to quantify cumulative effects on fish migration survival. © 2012 The Authors. Journal of Fish Biology © 2012 The Fisheries Society of the British Isles.
Applied Cognitive Task Analysis (ACTA) Methodology
1997-11-01
experience-based cognitive skills. The primary goal of this project was to develop streamlined methods of Cognitive Task Analysis that would fill this need... We have made important progress in this direction. We have developed streamlined methods of Cognitive Task Analysis. Our evaluation study indicates... developed a CD-based stand-alone instructional package, which will make the Applied Cognitive Task Analysis (ACTA) tools widely accessible. A survey of the
A Multivariate Randomization Test of Association Applied to Cognitive Test Results
NASA Technical Reports Server (NTRS)
Ahumada, Albert; Beard, Bettina
2009-01-01
Randomization tests provide a conceptually simple, distribution-free way to implement significance testing. We have applied this method to the problem of evaluating the significance of the association among a number (k) of variables. The randomization method was the random re-ordering of k-1 of the variables. The criterion variable was the value of the largest eigenvalue of the correlation matrix.
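A minimal sketch of the scheme described above, with synthetic data standing in for cognitive test scores (the data and permutation count are illustrative): the largest eigenvalue of the correlation matrix serves as the test statistic, and k-1 of the k variables are randomly re-ordered to generate the null distribution.

```python
import numpy as np

rng = np.random.default_rng(0)

def largest_corr_eigenvalue(data):
    """Largest eigenvalue of the correlation matrix of the columns."""
    corr = np.corrcoef(data, rowvar=False)
    return np.linalg.eigvalsh(corr)[-1]

def randomization_test(data, n_perm=500, rng=rng):
    """P-value for association among k variables: permute k-1 columns
    independently to destroy any association, keeping column 0 fixed."""
    observed = largest_corr_eigenvalue(data)
    count = 0
    for _ in range(n_perm):
        shuffled = data.copy()
        for j in range(1, data.shape[1]):   # re-order k-1 of the k variables
            shuffled[:, j] = rng.permutation(shuffled[:, j])
        if largest_corr_eigenvalue(shuffled) >= observed:
            count += 1
    return observed, (count + 1) / (n_perm + 1)

# Correlated toy data: three noisy copies of a common factor.
n = 100
factor = rng.standard_normal(n)
data = np.column_stack([factor + 0.5 * rng.standard_normal(n) for _ in range(3)])
stat, p = randomization_test(data)
```

With genuinely associated variables the observed eigenvalue sits far above the permuted ones, so the p-value is small; with independent columns it would be roughly uniform.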
An Exploratory Study of the Role of Human Resource Management in Models of Employee Turnover
ERIC Educational Resources Information Center
Ozolina-Ozola, Iveta
2016-01-01
The purpose of this paper is to present the study results of the human resource management role in the voluntary employee turnover models. The mixed methods design was applied. On the basis of the results of the search and evaluation of publications, the 16 models of employee turnover were selected. Applying the method of content analysis, the…
Flexible methods for segmentation evaluation: results from CT-based luggage screening.
Karimi, Seemeen; Jiang, Xiaoqian; Cosman, Pamela; Martz, Harry
2014-01-01
Imaging systems used in aviation security include segmentation algorithms in an automatic threat recognition pipeline. The segmentation algorithms evolve in response to emerging threats and changing performance requirements. Analysis of segmentation algorithms' behavior, including the nature of errors and feature recovery, facilitates their development. However, evaluation methods from the literature provide limited characterization of the segmentation algorithms. Objective: To develop segmentation evaluation methods that measure systematic errors such as oversegmentation and undersegmentation, outliers, and overall errors. The methods must measure feature recovery and allow us to prioritize segments. Methods: We developed two complementary evaluation methods using statistical techniques and information theory. We also created a semi-automatic method to define ground truth from 3D images. We applied our methods to evaluate five segmentation algorithms developed for CT luggage screening. We validated our methods with synthetic problems and an observer evaluation. Results: Both methods selected the same best segmentation algorithm. Human evaluation confirmed the findings. The measurement of systematic errors and prioritization helped in understanding the behavior of each segmentation algorithm. Conclusions: Our evaluation methods allow us to measure and explain the accuracy of segmentation algorithms.
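The paper's exact statistical and information-theoretic measures are not given in the abstract; as a hedged illustration of an information-theoretic segmentation comparison, the sketch below computes the variation of information between two labelings of the same voxels (0 for identical partitions, larger values for more disagreement, oversegmentation included).

```python
import numpy as np
from collections import Counter

def variation_of_information(seg_a, seg_b):
    """VI(A, B) = H(A) + H(B) - 2 I(A; B), in bits.
    0 means the two partitions are identical."""
    seg_a, seg_b = np.asarray(seg_a), np.asarray(seg_b)
    n = seg_a.size
    joint = Counter(zip(seg_a.ravel(), seg_b.ravel()))
    pa = Counter(seg_a.ravel())
    pb = Counter(seg_b.ravel())
    vi = 0.0
    for (a, b), c in joint.items():
        p_ab = c / n
        # each joint cell contributes -p(a,b)*[log p(a,b)/p(a) + log p(a,b)/p(b)]
        vi -= p_ab * (np.log2(p_ab / (pa[a] / n)) + np.log2(p_ab / (pb[b] / n)))
    return vi

truth = [0, 0, 1, 1, 2, 2]
perfect = [5, 5, 7, 7, 9, 9]        # same partition, different label values
oversegmented = [0, 1, 2, 3, 4, 5]  # every voxel its own segment
```

Relabeling costs nothing (VI = 0 for `perfect`), while oversegmentation is penalized in proportion to the extra entropy it introduces.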
Chen, Shuonan; Mar, Jessica C
2018-06-19
A fundamental fact in biology states that genes do not operate in isolation, and yet, methods that infer regulatory networks for single cell gene expression data have been slow to emerge. With single cell sequencing methods now becoming accessible, general network inference algorithms that were initially developed for data collected from bulk samples may not be suitable for single cells. Meanwhile, although methods that are specific for single cell data are now emerging, whether they have improved performance over general methods is unknown. In this study, we evaluate the applicability of five general methods and three single cell methods for inferring gene regulatory networks from both experimental single cell gene expression data and in silico simulated data. Standard evaluation metrics using ROC curves and Precision-Recall curves against reference sets sourced from the literature demonstrated that most of the methods performed poorly when they were applied to either experimental single cell data, or simulated single cell data, which demonstrates their lack of performance for this task. Using default settings, network methods were applied to the same datasets. Comparisons of the learned networks highlighted the uniqueness of some predicted edges for each method. The fact that different methods infer networks that vary substantially reflects the underlying mathematical rationale and assumptions that distinguish network methods from each other. This study provides a comprehensive evaluation of network modeling algorithms applied to experimental single cell gene expression data and in silico simulated datasets where the network structure is known. Comparisons demonstrate that most of these assessed network methods are not able to predict network structures from single cell expression data accurately, even if they are specifically developed for single cell methods. 
Also, single cell methods, which usually depend on more elaborative algorithms, in general have less similarity to each other in the sets of edges detected. The results from this study emphasize the importance for developing more accurate optimized network modeling methods that are compatible for single cell data. Newly-developed single cell methods may uniquely capture particular features of potential gene-gene relationships, and caution should be taken when we interpret these results.
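A minimal sketch of the evaluation protocol described above, assuming hypothetical gene names, edge scores, and a literature reference set: predicted edges are ranked by confidence and compared against the reference with a rank-based AUROC and precision at k.

```python
import numpy as np

def auroc(labels, scores):
    """Probability that a random true edge outranks a random false edge
    (rank-based AUROC; assumes untied scores for simplicity)."""
    labels = np.asarray(labels, dtype=bool)
    order = np.argsort(scores)
    ranks = np.empty(len(scores), dtype=float)
    ranks[order] = np.arange(1, len(scores) + 1)
    n_pos, n_neg = labels.sum(), (~labels).sum()
    return (ranks[labels].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

def precision_at_k(labels, scores, k):
    """Fraction of the k top-scoring predicted edges found in the reference."""
    top = np.argsort(scores)[::-1][:k]
    return np.asarray(labels)[top].mean()

# Hypothetical predicted confidences versus a literature reference set.
edges     = [("g1", "g2"), ("g1", "g3"), ("g2", "g3"), ("g2", "g4"), ("g3", "g4")]
scores    = np.array([0.9, 0.8, 0.3, 0.6, 0.1])
reference = {("g1", "g2"), ("g1", "g3"), ("g2", "g4")}
labels    = np.array([e in reference for e in edges])
```

Here every reference edge is scored above every non-reference edge, so both metrics reach 1.0; real inference methods applied to single cell data, as the study reports, fall far short of this.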
Afzali, Maryam; Fatemizadeh, Emad; Soltanian-Zadeh, Hamid
2015-09-30
Diffusion weighted imaging (DWI) is a non-invasive method for investigating the brain white matter structure and can be used to evaluate fiber bundles. However, due to practical constraints, DWI data acquired in clinics are low resolution. This paper proposes a method for interpolation of orientation distribution functions (ODFs). To this end, fuzzy clustering is applied to segment ODFs based on the principal diffusion directions (PDDs). Next, each cluster is modeled by a tensor, so that an ODF is represented by a mixture of tensors. For interpolation, each tensor is rotated separately. The method is applied to synthetic and real DWI data of control and epileptic subjects. Both experiments illustrate the capability of the method to properly increase the spatial resolution of the data in the ODF field. The real dataset shows that the method is capable of reliable identification of differences between temporal lobe epilepsy (TLE) patients and normal subjects. The method is compared to existing methods. Comparison studies show that the proposed method generates smaller angular errors than existing methods. Another advantage of the method is that it does not require an iterative algorithm to find the tensors. The proposed method is appropriate for increasing resolution in the ODF field and can be applied to clinical data to improve evaluation of white matter fibers in the brain. Copyright © 2015 Elsevier B.V. All rights reserved.
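The abstract does not give the interpolation formulas; as a hedged two-dimensional sketch of the idea of rotating each tensor separately rather than averaging tensor components, one can interpolate a symmetric tensor by interpolating its eigenvalues and its principal-direction angle independently:

```python
import numpy as np

def tensor_from(angle, eigvals):
    """Symmetric 2x2 tensor with the given principal direction and eigenvalues."""
    c, s = np.cos(angle), np.sin(angle)
    R = np.array([[c, -s], [s, c]])
    return R @ np.diag(eigvals) @ R.T

def interpolate_tensor(t, angle0, eig0, angle1, eig1):
    """Interpolate rotation and shape separately (no direct averaging of
    tensor components), mimicking rotating each tensor in the mixture."""
    angle = (1 - t) * angle0 + t * angle1
    eigs = (1 - t) * np.asarray(eig0) + t * np.asarray(eig1)
    return tensor_from(angle, eigs)
```

Unlike component-wise averaging, which shrinks anisotropy between differently oriented tensors, this keeps the eigenvalues (and hence the anisotropy) intact along the interpolation path.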
Pérez-Olmos, R; Rios, A; Fernández, J R; Lapa, R A; Lima, J L
2001-01-05
In this paper, the construction and evaluation of a nitrate-selective electrode with improved sensitivity is described; it is constructed like a conventional ion-selective electrode (ISE) but uses an operational amplifier to sum the potentials supplied by four membranes (ESOA). The two types of electrodes, without an inner reference solution, were constructed using tetraoctylammonium bromide as sensor, dibutylphthalate as solvent mediator, and PVC as plastic matrix, with the membranes applied directly onto a conductive epoxy resin support. After a comparative evaluation of their working characteristics, they were used in the determination of nitrate in different types of tobacco. The limit of detection of the direct potentiometric method developed was found to be 0.18 g kg(-1), and the precision and accuracy of the method, when applied to eight different samples of tobacco, expressed in terms of mean R.S.D. and average percentage of spike recovery, were 0.6% and 100.3%, respectively. The comparison of variances showed, on all occasions, that the results obtained by the ESOA were similar to those obtained by the conventional ISE, but with higher precision. Linear regression analysis showed good agreement (r=0.9994) between the results obtained by the developed potentiometric method and those of a spectrophotometric method based on brucine, adopted as the reference method, when the two were applied simultaneously to 32 samples of different types of tobacco.
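The reported figures of merit are standard analytical quantities; a small sketch of how mean R.S.D. and spike recovery are computed (the replicate values and spike level below are hypothetical, not the paper's data):

```python
import statistics

def rsd_percent(values):
    """Relative standard deviation: 100 * sample std dev / mean."""
    return 100 * statistics.stdev(values) / statistics.mean(values)

def recovery_percent(measured_spiked, measured_unspiked, added):
    """Percentage of the added analyte actually recovered."""
    return 100 * (measured_spiked - measured_unspiked) / added

# Hypothetical replicate nitrate determinations on one tobacco sample (g/kg):
replicates = [2.51, 2.49, 2.50, 2.52, 2.48]
rsd = rsd_percent(replicates)

# Hypothetical spike experiment: 1.00 g/kg added to the same sample.
rec = recovery_percent(3.51, 2.50, 1.00)
```

A recovery near 100% indicates negligible matrix bias, and a sub-percent R.S.D. matches the precision level the authors report.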
NASA Astrophysics Data System (ADS)
Mao, Hanling; Zhang, Yuhua; Mao, Hanying; Li, Xinxin; Huang, Zhenfeng
2018-06-01
This paper presents a study applying nonlinear ultrasonic waves to evaluate the stress state of metallic materials under steady state. A pre-stress loading method is applied to guarantee components with steady stress. Three kinds of nonlinear ultrasonic experiments based on the critically refracted longitudinal (LCR) wave are conducted on components in which the LCR wave propagates along the x, x1 and x2 directions. Experimental results indicate that the second and third order relative nonlinear coefficients monotonically increase with stress, and the normalized relationship is consistent with simplified dislocation models, which indicates the experimental results are sound. A combined ultrasonic nonlinear parameter is proposed, and three stress evaluation models in the x direction are established based on the three ultrasonic nonlinear parameters, for which the estimation error is below 5%. Two stress detection models in the x1 and x2 directions are then built based on the combined ultrasonic nonlinear parameter, and the stress synthesis method is applied to calculate the magnitude and direction of the principal stress. The results show the prediction error is within 5% and the angle deviation is within 1.5°. Therefore, the nonlinear ultrasonic technique based on the LCR wave can be applied to nondestructively evaluate the stress, both magnitude and direction, of metallic materials under steady state.
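The paper's stress synthesis formulas are not given in the abstract; assuming the three propagation directions play the role of a 0°/45°/90° rosette, the standard plane-stress relations below recover the principal stress magnitudes and direction (an assumption for illustration, not the authors' exact method):

```python
import math

def principal_stress(s0, s45, s90):
    """Plane-stress principal stresses and direction from normal stresses
    measured along the 0-, 45- and 90-degree directions (rosette relations)."""
    tau = s45 - (s0 + s90) / 2.0          # shear stress on the x-y axes
    mean = (s0 + s90) / 2.0
    radius = math.hypot((s0 - s90) / 2.0, tau)
    theta = 0.5 * math.degrees(math.atan2(2.0 * tau, s0 - s90))
    return mean + radius, mean - radius, theta

# Illustrative values (MPa): uniaxial-like state aligned with the x axis.
s1, s2, theta = principal_stress(100.0, 75.0, 50.0)
```

With the shear term vanishing, the principal directions coincide with the measurement axes, as expected for this input.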
On the effect of boundary layer growth on the stability of compressible flows
NASA Technical Reports Server (NTRS)
El-Hady, N. M.
1981-01-01
The method of multiple scales is used to describe a formally correct method, based on nonparallel linear stability theory, that examines the two- and three-dimensional stability of compressible boundary layer flows. The method is applied to the supersonic flat-plate boundary layer at Mach number 4.5. The theoretical growth rates are in good agreement with experimental results. The method is also applied to the infinite-span swept wing transonic boundary layer with suction to evaluate the effect of the nonparallel flow on the development of crossflow disturbances.
A New Approach to Aircraft Robust Performance Analysis
NASA Technical Reports Server (NTRS)
Gregory, Irene M.; Tierno, Jorge E.
2004-01-01
A recently developed algorithm for nonlinear system performance analysis has been applied to an F-16 aircraft to begin evaluating the suitability of the method for aerospace problems. The algorithm has the potential to be much more efficient than current methods of aircraft performance analysis. This paper is the initial step in evaluating this potential.
Applied Pluralism in the Evaluation of Employee Counselling.
ERIC Educational Resources Information Center
Goss, Stephen; Mearns, Dave
1997-01-01
Outlines the method, findings, and philosophical approach taken in a 22-month evaluation of an Employee Assistance Program. The program offered free counseling sessions, telephone support, in-service training, and conciliation work. Using an integrated pluralist evaluation, found that clients reported high satisfaction. Reduced absenteeism…
Evaluation on Cost Overrun Risks of Long-distance Water Diversion Project Based on SPA-IAHP Method
NASA Astrophysics Data System (ADS)
Yuanyue, Yang; Huimin, Li
2018-02-01
Large investment, long routes, and many change orders are the main causes of cost overrun in long-distance water diversion projects. This paper, based on existing research, builds a full-process cost overrun risk evaluation index system for water diversion projects, applies the SPA-IAHP method to set up a cost overrun risk evaluation model, and calculates and ranks the weight of every risk evaluation index. Finally, the cost overrun risks are comprehensively evaluated by calculating the linkage measure, and a comprehensive risk level is acquired. The SPA-IAHP method can evaluate risks accurately and with high reliability. As verified by case calculation, it can provide valid cost overrun decision-making information to construction companies.
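The interval AHP variant used in the paper is not detailed in the abstract; the sketch below shows the classical AHP step of deriving index weights and a consistency ratio from a pairwise comparison matrix (the matrix entries and factor names are hypothetical):

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from a pairwise comparison matrix via the
    principal eigenvector, plus Saaty's consistency ratio."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)           # Perron (principal) eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()
    ci = (eigvals[k].real - n) / (n - 1)  # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]   # Saaty's random index
    return w, ci / ri

# Hypothetical comparison of three cost-overrun risk factors:
# change orders vs. route length vs. investment scale.
A = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
w, cr = ahp_weights(A)
```

A consistency ratio below 0.1 is the conventional acceptance threshold for the judgments; the matrix above is nearly consistent, so its ratio is well under that.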
NASA Astrophysics Data System (ADS)
Ebrahimi, Mehdi; Jahangirian, Alireza
2017-12-01
An efficient strategy is presented for global shape optimization of wing sections with a parallel genetic algorithm. Several computational techniques are applied to increase the convergence rate and the efficiency of the method. A variable fidelity computational evaluation method is applied in which the expensive Navier-Stokes flow solver is complemented by an inexpensive multi-layer perceptron neural network for the objective function evaluations. A population dispersion method consisting of two phases, exploration and refinement, is developed to improve the convergence rate and the robustness of the genetic algorithm. Owing to the nature of the optimization problem, a parallel framework based on the master/slave approach is used. The outcomes indicate that the method is able to find the global optimum with significantly lower computational time in comparison to the conventional genetic algorithm.
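The variable-fidelity idea can be sketched in a toy one-dimensional form; here a crude nearest-neighbour predictor stands in for the neural network surrogate and a cheap quadratic stands in for the Navier-Stokes solver, and all settings are illustrative, not the authors' configuration.

```python
import random

random.seed(42)

def expensive_fitness(x):
    """Stand-in for the costly Navier-Stokes objective evaluation."""
    return (x - 2.0) ** 2

def surrogate(x, archive):
    """Crude stand-in for the neural-network surrogate: predict the
    fitness of x from the nearest already-evaluated design."""
    return min(archive, key=lambda xf: abs(xf[0] - x))[1]

# Initial population, all evaluated at full fidelity.
pop = [random.uniform(-10.0, 10.0) for _ in range(20)]
archive = [(x, expensive_fitness(x)) for x in pop]

for gen in range(30):
    parents = sorted(archive, key=lambda xf: xf[1])[:5]
    # exploration: many cheap candidate offspring around the best designs
    candidates = [p[0] + random.gauss(0.0, 1.0) for p in parents for _ in range(8)]
    # variable fidelity: the surrogate screens candidates, and only the
    # most promising few receive the expensive evaluation
    screened = sorted(candidates, key=lambda c: surrogate(c, archive))[:5]
    archive += [(c, expensive_fitness(c)) for c in screened]

best_x, best_f = min(archive, key=lambda xf: xf[1])
```

The point of the design is the budget split: 40 surrogate calls per generation buy only 5 expensive evaluations, which is where the reported computational savings come from.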
Usability evaluation techniques in mobile commerce applications: A systematic review
NASA Astrophysics Data System (ADS)
Hussain, Azham; Mkpojiogu, Emmanuel O. C.
2016-08-01
There are a number of publications concerning the usability of mobile commerce (m-commerce) applications and related areas, but they do not adequately describe the usability techniques used in most empirical usability evaluations of m-commerce applications. Therefore, this paper aims to identify the usability techniques most frequently used in the usability evaluation of m-commerce applications. To achieve this objective, a systematic literature review was employed. Sixty-seven papers on usability evaluation in m-commerce and related areas were retrieved; the twenty-one most relevant studies were selected for review in order to extract the appropriate information. The results of the review show that heuristic evaluation, formal testing and think-aloud methods are the most commonly used methods for m-commerce applications, compared with cognitive walkthrough and informal testing. Moreover, most of the studies applied controlled experiments (33.3% of the total); 14.28% applied case studies for usability evaluation. The results of this paper provide additional knowledge to usability practitioners and the research community on the current state and use of usability techniques in m-commerce applications.
Detecting long-term growth trends using tree rings: a critical evaluation of methods.
Peters, Richard L; Groenendijk, Peter; Vlam, Mart; Zuidema, Pieter A
2015-05-01
Tree-ring analysis is often used to assess long-term trends in tree growth. A variety of growth-trend detection methods (GDMs) exist to disentangle age/size trends in growth from long-term growth changes. However, these detrending methods strongly differ in approach, with possible implications for their output. Here, we critically evaluate the consistency, sensitivity, reliability and accuracy of the four most widely used GDMs: conservative detrending (CD) applies mathematical functions to correct for decreasing ring widths with age; basal area correction (BAC) transforms diameter into basal area growth; regional curve standardization (RCS) detrends individual tree-ring series using average age/size trends; and size class isolation (SCI) calculates growth trends within separate size classes. First, we evaluated whether these GDMs produce consistent results applied to an empirical tree-ring data set of Melia azedarach, a tropical tree species from Thailand. Three GDMs yielded similar results - a growth decline over time - but the widely used CD method did not detect any change. Second, we assessed the sensitivity (probability of correct growth-trend detection), reliability (100% minus probability of detecting false trends) and accuracy (whether the strength of imposed trends is correctly detected) of these GDMs, by applying them to simulated growth trajectories with different imposed trends: no trend, strong trends (-6% and +6% change per decade) and weak trends (-2%, +2%). All methods except CD showed high sensitivity, reliability and accuracy to detect strong imposed trends. However, these were considerably lower in the weak or no-trend scenarios. BAC showed good sensitivity and accuracy, but low reliability, indicating uncertainty of trend detection using this method. Our study reveals that the choice of GDM influences results of growth-trend studies. 
We recommend applying multiple methods when analysing trends and encourage performing sensitivity and reliability analysis. Finally, we recommend SCI and RCS, as these methods showed highest reliability to detect long-term growth trends. © 2014 John Wiley & Sons Ltd.
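A minimal sketch of conservative detrending on synthetic data (a log-linear fit is used here for simplicity; dendrochronological practice fits negative exponentials or splines): the age trend is estimated and divided out to give a ring-width index.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic ring widths: a negative-exponential age trend times noise.
age = np.arange(1, 101)
true_curve = 3.0 * np.exp(-0.03 * age) + 0.5
rw = true_curve * rng.lognormal(0.0, 0.1, age.size)

# Conservative detrending sketch: fit the age trend, divide it out.
coeffs = np.polyfit(age, np.log(rw), 1)    # log-linear fit of the decline
fitted = np.exp(np.polyval(coeffs, age))
rwi = rw / fitted                          # ring-width index, fluctuates around 1
```

Because any long-term signal sharing the shape of the fitted curve is removed along with the age trend, CD can miss genuine growth changes, which is consistent with the study's finding that CD detected no change where other methods did.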
Mannila, H.; Koivisto, M.; Perola, M.; Varilo, T.; Hennah, W.; Ekelund, J.; Lukk, M.; Peltonen, L.; Ukkonen, E.
2003-01-01
We describe a new probabilistic method for finding haplotype blocks that is based on the use of the minimum description length (MDL) principle. We give a rigorous definition of the quality of a segmentation of a genomic region into blocks and describe a dynamic programming algorithm for finding the optimal segmentation with respect to this measure. We also describe a method for finding the probability of a block boundary for each pair of adjacent markers: this gives a tool for evaluating the significance of each block boundary. We have applied the method to the published data of Daly and colleagues. The results expose some problems that exist in the current methods for the evaluation of the significance of predicted block boundaries. Our method, MDL block finder, can be used to compare block borders in different sample sets, and we demonstrate this by applying the MDL-based method to define the block structure in chromosomes from population isolates. PMID:12761696
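The authors' exact MDL cost is not reproduced in the abstract; the dynamic program itself is standard, and the sketch below finds the optimal segmentation of a numeric sequence under an MDL-flavoured cost (a per-block penalty plus within-block squared error, standing in for the real description length).

```python
def optimal_segmentation(xs, penalty):
    """Dynamic program: minimise the sum over blocks of
    (penalty + within-block squared error), an MDL-flavoured cost."""
    n = len(xs)

    def block_cost(i, j):              # cost of the block xs[i:j]
        seg = xs[i:j]
        mean = sum(seg) / len(seg)
        return penalty + sum((x - mean) ** 2 for x in seg)

    best = [0.0] + [float("inf")] * n  # best[j]: optimal cost of xs[:j]
    cut = [0] * (n + 1)
    for j in range(1, n + 1):
        for i in range(j):
            c = best[i] + block_cost(i, j)
            if c < best[j]:
                best[j], cut[j] = c, i
    # backtrack the block boundaries
    bounds, j = [], n
    while j > 0:
        bounds.append((cut[j], j))
        j = cut[j]
    return bounds[::-1]

blocks = optimal_segmentation([0, 0, 0, 5, 5, 5, 9, 9], penalty=1.0)
```

The per-block penalty plays the MDL role of charging for each extra boundary, so homogeneous runs are merged and only genuine changes create block borders.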
Detection of fatigue cracks by nondestructive testing methods
NASA Technical Reports Server (NTRS)
Anderson, R. T.; Delacy, T. J.; Stewart, R. C.
1973-01-01
The effectiveness of various NDT methods in detecting small tight cracks was assessed by randomly introducing fatigue cracks into aluminum sheets. The study included optimizing the NDT methods, calibrating NDT equipment with fatigue-cracked standards, and evaluating a number of cracked specimens by the optimized NDT methods. The evaluations were conducted by highly trained personnel, provided with detailed procedures, in order to minimize the effects of human variability. These personnel performed the NDT on the test specimens without knowledge of the flaw locations and reported on the flaws detected. The performance of these tests was measured by comparing the flaws detected against the flaws present. The principal NDT methods utilized were radiographic, ultrasonic, penetrant, and eddy current. Holographic interferometry, acoustic emission monitoring, and replication methods were also applied on a reduced number of specimens. Generally, the best performance was shown by the eddy current, ultrasonic, penetrant and holographic tests. Etching provided no measurable improvement, while proof loading improved flaw detectability. Data are shown that quantify the performance of the NDT methods applied.
Common cause evaluations in applied risk analysis of nuclear power plants. [PWR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taniguchi, T.; Ligon, D.; Stamatelatos, M.
1983-04-01
Qualitative and quantitative approaches were developed for the evaluation of common cause failures (CCFs) in nuclear power plants and were applied to the analysis of the auxiliary feedwater systems of several pressurized water reactors (PWRs). Key CCF variables were identified through a survey of experts in the field and a review of failure experience in operating PWRs. These variables were classified into categories of high, medium, and low defense against a CCF. Based on the results, a checklist was developed for analyzing CCFs of systems. Several known techniques for quantifying CCFs were also reviewed. The information provided valuable insights in the development of a new model for estimating CCF probabilities, which is an extension of and improvement over the Beta Factor method. As applied to the analysis of the PWR auxiliary feedwater systems, the method yielded much more realistic values than the original Beta Factor method for a one-out-of-three system.
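For orientation, a sketch of the original Beta Factor model that the new method extends (the rate, beta, and mission time below are illustrative, not values from the study): a fraction beta of each component's failure rate is treated as common cause, failing all redundant trains at once.

```python
import math

def beta_factor_unavailability(lam, beta, mission_time, n=3):
    """Beta-factor sketch: each train fails with total rate lam; a fraction
    beta of that rate is a common cause failing all n redundant trains."""
    q_ind = 1.0 - math.exp(-(1.0 - beta) * lam * mission_time)
    q_ccf = 1.0 - math.exp(-beta * lam * mission_time)
    # a one-out-of-n system fails only if all n trains fail
    # independently, or a common cause event occurs
    return q_ind ** n + q_ccf - q_ind ** n * q_ccf

q = beta_factor_unavailability(lam=1e-4, beta=0.1, mission_time=24.0)
```

For highly redundant systems the common cause term dominates, since the independent term is raised to the n-th power; this is why the choice of beta, and the refinements the report proposes, matter so much in the final numbers.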
A Teacher's Guide to Memory Techniques.
ERIC Educational Resources Information Center
Hodges, Daniel L.
1982-01-01
To aid instructors in teaching their students to use effective methods of memorization, this article outlines major memory methods, provides examples of their use, evaluates the methods, and discusses ways students can be taught to apply them. First, common, but less effective, memory methods are presented, including reading and re-reading…
2011-01-01
Background Fluorescence in situ hybridization (FISH) is a very accurate method for measuring HER2 gene copies as a sign of potential breast cancer. This method requires small tissue samples and has a high sensitivity to detect abnormalities from a histological section. By using multiple colors, this method allows the detection of multiple targets simultaneously. The target parts in the cells become visible as colored dots. The HER-2 probes are visible as orange stained spots under a fluorescent microscope, while probes for centromere 17 (CEP-17), the chromosome on which the HER-2/neu gene is located, are visible as green spots. Methods The conventional analysis involves scoring the ratio of HER-2/neu over CEP-17 dots within each cell nucleus and then averaging the scores over 60 cells. A ratio of 2.0 of HER-2/neu to CEP-17 copy number denotes amplification. Several methods have been proposed for the detection and automated evaluation (dot counting) of FISH signals. In this paper a combined method based on mathematical morphology (MM) and inverse multifractal (IMF) analysis is suggested. A similar method was applied recently to the detection of microcalcifications in digital mammograms, and was very successful. Results The combined MM method, using top-hat and bottom-hat filters, and the IMF method were applied to FISH images from the Molecular Biology Lab, Department of Pathology, Wielkopolska Cancer Center, Poznan. Initial results indicate that this method can be applied to FISH images for the evaluation of HER2/neu status. Conclusions Mathematical morphology and a multifractal approach are used for colored dot detection and counting in FISH images. Initial results derived on clinical cases are promising. Note that the overlapping of colored dots, particularly red/orange dots, needs additional improvement in post-processing. PMID:21489192
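A minimal sketch of the top-hat step on a synthetic image (scipy stands in for whatever toolchain the authors used; the dot positions and sizes are fabricated): a white top-hat suppresses the slowly varying background so small bright dots can be thresholded and counted.

```python
import numpy as np
from scipy import ndimage

# Synthetic "FISH channel": small bright dots on a smooth illumination gradient.
img = np.tile(np.linspace(10.0, 60.0, 64), (64, 1))   # uneven background
for r, c in [(10, 12), (30, 40), (50, 20)]:
    img[r - 1:r + 2, c - 1:c + 2] += 120.0             # 3x3 bright dots

# The white top-hat (image minus its grey opening) removes structures
# larger than the structuring element, leaving only the small dots.
tophat = ndimage.white_tophat(img, size=7)
mask = tophat > 60.0
labels, n_dots = ndimage.label(mask)
```

The structuring element (here 7x7) must be larger than a dot but smaller than the background variation scale; overlapping dots would merge into one label, which mirrors the post-processing limitation the authors note.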
Assessing and Evaluating Multidisciplinary Translational Teams: A Mixed Methods Approach
Wooten, Kevin C.; Rose, Robert M.; Ostir, Glenn V.; Calhoun, William J.; Ameredes, Bill T.; Brasier, Allan R.
2014-01-01
A case report illustrates how multidisciplinary translational teams can be assessed using outcome, process, and developmental types of evaluation using a mixed methods approach. Types of evaluation appropriate for teams are considered in relation to relevant research questions and assessment methods. Logic models are applied to scientific projects and team development to inform choices between methods within a mixed methods design. Use of an expert panel is reviewed, culminating in consensus ratings of 11 multidisciplinary teams and a final evaluation within a team type taxonomy. Based on team maturation and scientific progress, teams were designated as: a) early in development, b) traditional, c) process focused, or d) exemplary. Lessons learned from data reduction, use of mixed methods, and use of expert panels are explored. PMID:24064432
NASA Astrophysics Data System (ADS)
Qiu, Feng; Dai, Guang; Zhang, Ying
According to the acoustic emission information and the appearance inspection information from online testing of tank bottoms, the external factors associated with tank bottom corrosion status are identified. Applying an artificial neural network intelligent evaluation method, three tank bottom corrosion status evaluation models are established, based on appearance inspection information, acoustic emission information, and combined online testing information. Compared with the results of acoustic emission online testing on a test sample, the accuracy of the evaluation model based on online testing information is 94%. The evaluation model can evaluate tank bottom corrosion accurately and realize intelligent evaluation of acoustic emission online testing of tank bottoms.
USDA-ARS?s Scientific Manuscript database
Aims: Determine if contemporary, seed-applied fungicidal formulations inhibit colonization of plant roots by arbuscular mycorrhizal (AM) fungi, plant development, or plant nutrient content during early vegetative stages of several commodity crops. Methods: We evaluated seed-applied commercial fungic...
ERIC Educational Resources Information Center
Baker, William Henry
The purpose of this study was to determine whether a letter-evaluation method would be as effective as the traditional letter-writing method when applied in a college level business correspondence class. One hundred twenty-nine Brigham Young University students were divided into two experimental and two control groups, and categorized according to…
Conceptual Design Study on Bolts for Self-Loosing Preventable Threaded Fasteners
NASA Astrophysics Data System (ADS)
Noma, Atsushi; He, Jianmei
2017-11-01
Threaded fasteners using bolts are widely applied in industrial and other fields. However, bolted fasteners are prone to self-loosening, which causes many accidents. The purpose of this study is to obtain self-loosening-preventable threaded fasteners by applying spring-characteristic effects to bolt structures. Helical-cut bolt structures are introduced through three-dimensional (3D) CAD modeling tools. Analytical evaluations of the spring-characteristic effects of the helical-cut bolt structures and of the self-loosening prevention performance of the threaded fasteners were performed using the finite element method, and the results are reported. Comparison of slackness test results with the analytical results, and more detailed evaluation of the mechanical properties, will be carried out in a future study.
Luhnen, Miriam; Prediger, Barbara; Neugebauer, Edmund A M; Mathes, Tim
2017-12-02
The number of systematic reviews of economic evaluations is steadily increasing. This is probably related to the continuing pressure on health budgets worldwide, which makes efficient resource allocation increasingly crucial. In recent years in particular, the introduction of several high-cost interventions has presented enormous challenges regarding universal accessibility and sustainability of health care systems. An increasing number of health authorities, among others, feel the need to analyze economic evidence. Economic evidence can effectively be generated by means of systematic reviews. Nevertheless, no standard methods seem to exist for their preparation so far. The objective of this study was to analyze the methods applied in systematic reviews of health economic evaluations (SR-HE), with a focus on the identification of common challenges. The planned study is a systematic review of the characteristics and methods actually applied in SR-HE. We will combine validated search filters developed for the retrieval of economic evaluations and systematic reviews to identify relevant studies in MEDLINE (via Ovid, 2015-present). To be eligible for inclusion, studies have to conduct a systematic review of full economic evaluations. Articles focusing exclusively on methodological aspects and secondary publications of health technology assessment (HTA) reports will be excluded. Two reviewers will independently assess titles and abstracts and then the full texts of studies for eligibility. Methodological features will be extracted in a standardized data extraction form, piloted beforehand. Data will be summarized with descriptive statistical measures and systematically analyzed, focusing on differences/similarities and methodological weaknesses. The systematic review will provide a detailed overview of the characteristics of SR-HE and the applied methods. Differences and methodological shortcomings will be detected and their implications will be discussed. 
The findings of our study can improve the recommendations on the preparation of SR-HE. This can increase the acceptance and usefulness of systematic reviews in health economics for researchers and medical decision makers. The review will not be registered with PROSPERO as it does not meet the eligibility criterion of dealing with clinical outcomes.
Meacock, Rachel
2018-04-20
There is a requirement for economic evaluation of health technologies seeking public funding across Europe. Changes to the organisation and delivery of health services, including changes to health policy, are not covered by such appraisals. These changes also have consequences for National Health Service (NHS) funds, yet undergo no mandatory cost-effectiveness assessment. The focus on health technologies may have occurred because larger-scale service changes pose more complex challenges to evaluators. This paper discusses the principal challenges faced when performing economic evaluations of changes to the organisation and delivery of health services and provides recommendations for overcoming them. The five principal challenges identified are as follows: undertaking ex-ante evaluation; evaluating impacts in terms of quality-adjusted life years; assessing costs and opportunity costs; accounting for spillover effects; and generalisability. Of these challenges, methods for estimating the impact on costs and quality-adjusted life years are those most in need of development. Methods are available for ex-ante evaluation, assessing opportunity costs and examining generalisability. However, these are rarely applied in practice. The general principles of assessing the cost-effectiveness of interventions should be applied to all NHS spending, not just that involving health technologies. Advancements in this area have the potential to improve the allocation of scarce NHS resources.
Participatory Design in Gerontechnology: A Systematic Literature Review.
Merkel, Sebastian; Kucharski, Alexander
2018-05-19
Participatory design (PD) is widely used within gerontechnology, but there is no common understanding of which methods are used for what purposes. This review aims to examine what different forms of PD exist in the field of gerontechnology and how these can be categorized. We conducted a systematic literature review covering several databases. The search strategy was based on 3 elements: (1) participatory methods and approaches with (2) older persons aiming at developing (3) technology for older people. Our final review included 26 studies representing a variety of technologies designed/developed and methods/instruments applied. According to the technologies, the publications reviewed can be categorized into 3 groups: studies that (1) use already existing technology with the aim of finding new ways of use; (2) aim at creating new devices; (3) test and/or modify prototypes. The implementation of PD depends on four questions: why a participatory approach is applied, who is involved as future user(s), when those future users are involved, and how they are incorporated into the innovation process. There are multiple ways, methods, and instruments to integrate users into the innovation process; which methods should be applied depends on the context. However, most studies do not evaluate whether participatory approaches lead to better acceptance and/or use of the co-developed products. Therefore, participatory design should follow a comprehensive strategy, starting with the users' needs and ending with an evaluation of whether the applied methods have led to better results.
[Application of Delphi method in traditional Chinese medicine clinical research].
Bi, Ying-fei; Mao, Jing-yuan
2012-03-01
In recent years, the Delphi method has been widely applied in traditional Chinese medicine (TCM) clinical research. This article analyzes the current application of the Delphi method in TCM clinical research and discusses problems in the choice of evaluation method, the classification of observation indexes, and the selection of survey items. On this basis, the authors analyze the method with respect to questionnaire design, selection of experts, evaluation of observation indexes, and selection of survey items, and summarize the steps for applying the Delphi method in TCM clinical research.
NASA Technical Reports Server (NTRS)
Tsang, L.; Brown, R.; Kong, J. A.; Simmons, G.
1974-01-01
Two numerical methods are used to evaluate the integrals that express the electromagnetic fields due to dipole antennas radiating in the presence of a stratified medium. The first method is direct integration by means of Simpson's rule. The second method is indirect and approximates the kernel of the integral by means of the fast Fourier transform. In contrast to previous analytical methods, which applied only to two-layer cases, the numerical methods can be used for any arbitrary number of layers with general properties.
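The first method named above, direct integration by Simpson's rule, can be sketched as a composite rule over even subintervals. The Sommerfeld-type field integrands of the paper are not reproduced here, so a simple smooth integrand serves as a sanity check:

```python
import math

def simpson(f, a, b, n=200):
    """Composite Simpson's rule on [a, b] with n (even) subintervals."""
    if n % 2:
        n += 1
    h = (b - a) / n
    total = f(a) + f(b)
    for i in range(1, n):
        total += (4.0 if i % 2 else 2.0) * f(a + i * h)
    return total * h / 3.0

# Sanity check: the integral of sin over [0, pi] is exactly 2.
area = simpson(math.sin, 0.0, math.pi)
```

For the oscillatory, slowly decaying integrands of layered-medium problems, the integration interval would additionally have to be truncated or transformed, which is part of what makes the paper's comparison with an FFT-based kernel approximation interesting.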
UNDERSTANDING AND APPLYING ENVIRONMENTAL RELATIVE MOLDINESS INDEX - ERMI
This study compared two binary classification methods to evaluate the mold condition in 271 homes of infants, 144 of which later developed symptoms of respiratory illness. A method using on-site visual mold inspection was compared to another method using a quantitative index of ...
78 FR 23961 - Request for Steering Committee Nominations
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-23
... development of a methods research agenda and coordination of methods research in support of using electronic... surveillance, and methods research and application for scientific professionals. 3. IMEDS-Evaluation: Applies... transparent way to create exciting new research projects to advance regulatory science. The Foundation acts as...
ERIC Educational Resources Information Center
Seely, Sara Robertson; Fry, Sara Winstead; Ruppel, Margie
2011-01-01
An investigation into preservice teachers' information evaluation skills at a large university suggests that formative assessment can improve student achievement. Preservice teachers were asked to apply information evaluation skills in the areas of currency, relevancy, authority, accuracy, and purpose. The study used quantitative methods to assess…
Stochastic Methods for Aircraft Design
NASA Technical Reports Server (NTRS)
Pelz, Richard B.; Ogot, Madara
1998-01-01
The global stochastic optimization method, simulated annealing (SA), was adapted and applied to various problems in aircraft design. The research was aimed at overcoming the problem of finding an optimal design in a space with multiple minima and roughness ubiquitous to numerically generated nonlinear objective functions. SA was modified to reduce the number of objective function evaluations for an optimal design, historically the main criticism of stochastic methods. SA was applied to many CFD/MDO problems including: low sonic-boom bodies, minimum drag on supersonic fore-bodies, minimum drag on supersonic aeroelastic fore-bodies, minimum drag on HSCT aeroelastic wings, FLOPS preliminary design code, another preliminary aircraft design study with vortex lattice aerodynamics, HSR complete aircraft aerodynamics. In every case, SA provided a simple, robust, and reliable optimization method which found optimal designs in on the order of 100 objective function evaluations. Perhaps most importantly, from this academic/industrial project, technology has been successfully transferred; this method is the method of choice for optimization problems at Northrop Grumman.
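A minimal sketch of simulated annealing on a rough 1-D objective, assuming geometric cooling and a Boltzmann acceptance rule; the objective `rough` and all parameter values are illustrative stand-ins, not the CFD/MDO objectives of the project:

```python
import math
import random

def rough(x):
    """Illustrative multimodal objective: global minimum 0 at x = 0,
    with many local minima created by the cosine term."""
    return x * x + 10.0 * (1.0 - math.cos(x))

def simulated_annealing(f, x0, step=1.0, t0=25.0, cooling=0.999,
                        iters=8000, seed=42):
    """Minimize f starting from x0: random perturbations, Boltzmann
    acceptance of uphill moves, geometric cooling, best-so-far tracking."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best_x, best_f = x, fx
    t = t0
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)
        fc = f(cand)
        # always accept downhill moves; accept uphill moves with
        # probability exp(-delta/T), which shrinks as T cools
        if fc < fx or rng.random() < math.exp(-(fc - fx) / max(t, 1e-12)):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = x, fx
        t *= cooling
    return best_x, best_f

best_x, best_f = simulated_annealing(rough, x0=4.0)
```

The uphill-acceptance rule is what lets SA escape the local minima that defeat gradient methods on rough objectives; reducing the number of calls to `f`, as the report describes, matters because each call may be a full CFD evaluation.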
NASA Astrophysics Data System (ADS)
Zhao, Hui; Qu, Weilu; Qiu, Weiting
2018-03-01
In order to evaluate the sustainable development level of resource-based cities, an evaluation method combining Shapley entropy and the Choquet integral is proposed. First, a systematic index system is constructed and the importance of each attribute is calculated based on the maximum Shapley entropy principle; the Choquet integral is then introduced to calculate the comprehensive evaluation value of each city from the bottom up; finally, the method is applied to 10 typical resource-based cities in China. The empirical results show that the evaluation method is scientific and reasonable, which provides theoretical support for the sustainable development path and reform direction of resource-based cities.
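The aggregation step described above can be illustrated with the standard discrete Choquet integral with respect to a fuzzy measure; the two criteria, their names, and the measure values below are invented for the sketch, not taken from the paper:

```python
def choquet_integral(scores, mu):
    """Discrete Choquet integral of `scores` (criterion -> value) with
    respect to a fuzzy measure `mu` (frozenset of criteria -> weight)."""
    order = sorted(scores, key=scores.get)       # criteria, ascending by score
    total, prev = 0.0, 0.0
    for i, c in enumerate(order):
        coalition = frozenset(order[i:])         # criteria scoring >= scores[c]
        total += (scores[c] - prev) * mu[coalition]
        prev = scores[c]
    return total

# Hypothetical two-criterion example (names and weights invented):
mu = {frozenset({"economy"}): 0.4,
      frozenset({"environment"}): 0.5,
      frozenset({"economy", "environment"}): 1.0}
city_score = choquet_integral({"economy": 0.2, "environment": 0.8}, mu)
```

Unlike a plain weighted sum, the fuzzy measure can express interaction between criteria (here mu({economy}) + mu({environment}) < mu of both together), which is the point of pairing it with Shapley-based importance weights.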
NASA Astrophysics Data System (ADS)
Zhu, Lianqing; Chen, Yunfang; Chen, Qingshan; Meng, Hao
2011-05-01
According to the minimum zone condition, a method for evaluating the profile error of an Archimedes helicoid surface based on a Genetic Algorithm (GA) is proposed. The mathematical model of the surface is provided and the unknown parameters in the equation of the surface are acquired through the least squares method. The principle of the GA is explained. Then, the profile error of the Archimedes helicoid surface is obtained through the GA optimization method. To validate the proposed method, the profile error of an Archimedes helicoid surface, an Archimedes cylindrical worm (ZA worm) surface, is evaluated. The results show that the proposed method correctly evaluates the profile error of Archimedes helicoid surfaces and satisfies the evaluation standard of the Minimum Zone Method. It can be applied to the measured profile-error data of complex surfaces obtained by coordinate measuring machines (CMMs).
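The GA formulation for a helicoid is not reproduced in the abstract, but the minimum zone condition itself can be shown on a much simpler 2-D case: minimum-zone straightness, i.e. the narrowest band of parallel lines containing all measured points, found here by brute-force search over the band direction rather than a GA (a sketch, with toy point sets):

```python
import math

def min_zone_straightness(points, steps=20000):
    """Minimum-zone straightness of 2-D points: the smallest width of a band
    of parallel lines containing all points, by brute-force direction search."""
    best = float("inf")
    for k in range(steps):
        theta = math.pi * k / steps
        nx, ny = -math.sin(theta), math.cos(theta)   # unit normal to the band
        proj = [nx * x + ny * y for x, y in points]
        best = min(best, max(proj) - min(proj))
    return best

collinear = [(0.0, 0.0), (1.0, 2.0), (2.0, 4.0)]                # width ~ 0
unit_square = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]  # width 1
```

For a helicoid the search space has several coupled parameters instead of one angle, which is why the paper turns to a GA instead of exhaustive search.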
Toward a Web Based Environment for Evaluation and Design of Pedagogical Hypermedia
ERIC Educational Resources Information Center
Trigano, Philippe C.; Pacurar-Giacomini, Ecaterina
2004-01-01
We are working on a method called CEPIAH. We propose a web-based system to help teachers design multimedia documents and evaluate their prototypes. Our current research objective is to create a methodology to sustain educational hypermedia design and evaluation. A module is used to evaluate multimedia software applied in…
NASA Astrophysics Data System (ADS)
Toutin, Thierry; Wang, Huili; Charbonneau, Francois; Schmitt, Carla
2013-08-01
This paper presents two methods for the orthorectification of full/compact polarimetric SAR data: the polarimetric processing is performed either in the image space (scientist's idealism), before the geometric processing, or in the ground space (user's realism), after it. Radarsat-2 (R2) fine-quad and simulated very high-resolution RCM data acquired with different look angles over a hilly study site were processed using an accurate lidar digital surface model. Quantitative evaluations between the two methods as a function of different geometric and radiometric parameters were performed to evaluate the impact of the orthorectification. The results demonstrated that the ground-space method can be safely applied to polarimetric R2 SAR data, except for steep look angles combined with steep terrain slopes. On the other hand, the ground-space method cannot be applied to simulated compact RCM data because of the 17 dB noise floor and oversampling.
Defining and Applying a Functionality Approach to Intellectual Disability
ERIC Educational Resources Information Center
Luckasson, R.; Schalock, R. L.
2013-01-01
Background: The current functional models of disability do not adequately incorporate significant changes of the last three decades in our understanding of human functioning, and how the human functioning construct can be applied to clinical functions, professional practices and outcomes evaluation. Methods: The authors synthesise current…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Song, Jong-Won; Hirao, Kimihiko
Long-range corrected density functional theory (LC-DFT) has attracted much attention from chemists as a quantum chemical method applicable to large molecular systems and their property calculations. However, the high computational cost of evaluating the long-range HF exchange is a major obstacle to applying it to large molecular systems and solid-state materials. To address this problem, we propose a linear-scaling method for the HF exchange integration, in particular for the LC-DFT hybrid functional.
On Applying the Prognostic Performance Metrics
NASA Technical Reports Server (NTRS)
Saxena, Abhinav; Celaya, Jose; Saha, Bhaskar; Saha, Sankalita; Goebel, Kai
2009-01-01
Prognostics performance evaluation has gained significant attention in the past few years. As prognostics technology matures and more sophisticated methods for prognostic uncertainty management are developed, a standardized methodology for performance evaluation becomes extremely important to guide improvement efforts in a constructive manner. This paper continues previous efforts in which several new evaluation metrics tailored to prognostics were introduced and shown to evaluate various algorithms more effectively than conventional metrics. Specifically, this paper presents a detailed discussion on how these metrics should be interpreted and used. Several shortcomings identified while applying these metrics to a variety of real applications are also summarized, along with discussions that attempt to alleviate these problems. Further, these metrics have been enhanced to incorporate probability distribution information from prognostic algorithms, as opposed to evaluation based on point estimates only. Several methods have been suggested, and guidelines have been provided to help choose one method over another based on probability distribution characteristics. These approaches also offer a convenient and intuitive visualization of algorithm performance with respect to some of these new metrics, like the prognostic horizon and alpha-lambda performance, and also quantify the corresponding performance while incorporating the uncertainty information.
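The alpha-lambda metric mentioned above is commonly formulated as a check that the remaining-useful-life (RUL) prediction made at a fraction lambda of the unit's life lies within +/- alpha (relative) of the true RUL; the sketch below assumes that common formulation, which may differ in detail from the paper's:

```python
def alpha_lambda_pass(predictions, t_eol, alpha=0.2, lam=0.5):
    """True if the RUL prediction made at t = lam * t_eol lies within
    +/- alpha (relative) of the true remaining useful life.
    `predictions` maps prediction time -> predicted RUL."""
    t_lam = lam * t_eol
    t = min(predictions, key=lambda u: abs(u - t_lam))  # nearest prediction time
    true_rul = t_eol - t
    return (1.0 - alpha) * true_rul <= predictions[t] <= (1.0 + alpha) * true_rul
```

Because the accuracy band is relative to the true RUL, it tightens as end of life approaches, rewarding algorithms that converge rather than ones that are merely accurate on average.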
A Novel Degradation Identification Method for Wind Turbine Pitch System
NASA Astrophysics Data System (ADS)
Guo, Hui-Dong
2018-04-01
It is difficult for the traditional threshold method to identify degradation of operating equipment accurately. A novel degradation evaluation method suitable for implementing a wind turbine condition-based maintenance strategy is proposed in this paper. Based on an analysis of the typical variable-speed pitch-to-feather control principle and the monitored parameters of the pitch system, a multi-input multi-output (MIMO) regression model was applied to the pitch system, with wind speed and generated power as input parameters, and wheel rotation speed, pitch angle, and motor drive current for the three blades as output parameters. Then, the difference between the on-line measurement and the value calculated by the MIMO regression model, fitted with the least squares support vector machines (LSSVM) method, was defined as the Observed Vector of the system. A Gaussian mixture model (GMM) was applied to fit the distribution of the multi-dimensional Observed Vectors. Applying the established model, the Degradation Index was calculated using SCADA data from a wind turbine whose pitch bearing retainer and rolling elements had been damaged, which illustrates the feasibility of the proposed method.
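The core idea of a residual-based degradation index can be sketched with a single Gaussian in one dimension; the paper fits a GMM to multi-dimensional residual vectors, of which this is a deliberately simplified stand-in (residual values invented):

```python
import math

def fit_gaussian(residuals):
    """Fit a single Gaussian to healthy-period residuals (the paper fits a
    GMM, which generalizes this to several weighted components)."""
    n = len(residuals)
    mean = sum(residuals) / n
    var = sum((r - mean) ** 2 for r in residuals) / n
    return mean, math.sqrt(var)

def degradation_index(residual, mean, sigma):
    """Negative log-likelihood of a new residual under the healthy model;
    larger values indicate stronger deviation from healthy behaviour."""
    z = (residual - mean) / sigma
    return 0.5 * z * z + math.log(sigma * math.sqrt(2.0 * math.pi))

mean, sigma = fit_gaussian([0.9, 1.0, 1.1, 1.0])
```

Unlike a fixed threshold on a raw signal, the index grows continuously as the machine drifts away from the healthy residual distribution, which is what makes it usable for condition-based maintenance decisions.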
Comparing Methods for UAV-Based Autonomous Surveillance
NASA Technical Reports Server (NTRS)
Freed, Michael; Harris, Robert; Shafto, Michael
2004-01-01
We describe an approach to evaluating algorithmic and human performance in directing UAV-based surveillance. Its key elements are a decision-theoretic framework for measuring the utility of a surveillance schedule and an evaluation testbed consisting of 243 scenarios covering a well-defined space of possible missions. We apply this approach to two example UAV-based surveillance methods, a TSP-based algorithm and a human-directed approach, then compare them to identify general strengths and weaknesses of each method.
Appraising the reliability of visual impact assessment methods
Nickolaus R. Feimer; Kenneth H. Craik; Richard C. Smardon; Stephen R.J. Sheppard
1979-01-01
This paper presents the research approach and selected results of an empirical investigation aimed at the evaluation of selected observer-based visual impact assessment (VIA) methods. The VIA methods under examination were chosen to cover a range of VIA methods currently in use in both applied and research settings. Variation in three facets of VIA methods were...
2018-01-01
Exhaust gas recirculation (EGR) is one of the main methods of reducing NOx emissions and has been widely used in marine diesel engines. This paper proposes an optimized comprehensive assessment method based on multi-objective grey situation decision theory, grey relation theory, and grey entropy analysis to evaluate the performance of EGR and determine its optimized rate, for which clear theoretical guidance is currently lacking. First, multi-objective grey situation decision theory is used to establish the initial decision-making model according to the main EGR parameters. The optimal compromise between diesel engine combustion and emission performance is transformed into a decision-making target weight problem. After establishing the initial model and considering the characteristics of EGR under different conditions, an optimized target weight algorithm based on grey relation theory and grey entropy analysis is applied to generate the comprehensive evaluation and decision-making model. Finally, the proposed method is successfully applied to a TBD234V12 turbocharged diesel engine, and the results clearly illustrate the feasibility of the proposed method for providing theoretical support and a reference for further EGR optimization. PMID:29377956
Reporting Qualitative Research: Standards, Challenges, and Implications for Health Design.
Peditto, Kathryn
2018-04-01
This Methods column describes the existing reporting standards for qualitative research, their application to health design research, and the challenges to implementation. Intended for both researchers and practitioners, this article provides multiple perspectives on both reporting and evaluating high-quality qualitative research. Two popular reporting standards exist for qualitative research: the Consolidated Criteria for Reporting Qualitative Research (COREQ) and the Standards for Reporting Qualitative Research (SRQR). Though compiled using similar procedures, they differ in their criteria and the methods to which they apply. Creating and applying reporting criteria is inherently difficult due to the undefined and fluctuating nature of qualitative research when compared to quantitative studies. Qualitative research is expansive and occasionally controversial, spanning many different methods of inquiry and epistemological approaches. A "one-size-fits-all" standard for reporting qualitative research can be restrictive, but COREQ and SRQR both serve as valuable tools for developing responsible qualitative research proposals, effectively communicating research decisions, and evaluating submissions. Ultimately, tailoring a set of standards specific to health design research and its frequently used methods would ensure quality research and aid reviewers in their evaluations.
Zu, Xianghuan; Yang, Chuanlei; Wang, Hechun; Wang, Yinyan
2018-01-01
A quantitative method for evaluating alternatives. [aid to decision making
NASA Technical Reports Server (NTRS)
Forthofer, M. J.
1981-01-01
When faced with choosing between alternatives, people tend to use a number of criteria (often subjective, rather than objective) to decide which is the best alternative for them given their unique situation. The subjectivity inherent in the decision-making process can be reduced by the definition and use of a quantitative method for evaluating alternatives. This type of method can help decision makers achieve a degree of uniformity and completeness in the evaluation process, as well as an increased sensitivity to the factors involved. Additional side benefits are better documentation and visibility of the rationale behind the resulting decisions. General guidelines for defining a quantitative method are presented, and a particular method (called 'hierarchical weighted average') is defined and applied to the evaluation of design alternatives for a hypothetical computer system capability.
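The abstract names the 'hierarchical weighted average' but does not define it; a plausible minimal reading is a criteria tree whose leaf scores are combined by weighted averages level by level. The sketch below follows that reading, with invented criteria and scores:

```python
def hierarchical_weighted_average(node):
    """Score a criteria tree: a node is either a leaf score (number) or a
    list of (weight, child) pairs whose weights sum to 1 at each level."""
    if isinstance(node, (int, float)):
        return float(node)
    return sum(w * hierarchical_weighted_average(child) for w, child in node)

# Hypothetical design alternative: two performance sub-criteria plus cost.
design_a = [
    (0.6, [(0.5, 8.0), (0.5, 6.0)]),   # performance sub-tree -> 7.0
    (0.4, 9.0),                        # cost criterion
]
score = hierarchical_weighted_average(design_a)   # 0.6*7.0 + 0.4*9.0 = 7.8
```

Making the weights explicit at every level is what produces the documentation and visibility benefits the abstract describes: each number in the final score can be traced back to a stated judgment.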
Improved numerical methods for turbulent viscous recirculating flows
NASA Technical Reports Server (NTRS)
Turan, A.; Vandoormaal, J. P.
1988-01-01
The performance of discrete methods for the prediction of fluid flows can be enhanced by improving the convergence rate of solvers and by increasing the accuracy of the discrete representation of the equations of motion. This report evaluates the gains in solver performance that are available when various acceleration methods are applied. Various discretizations are also examined, and two are recommended because of their accuracy and robustness. Insertion of the improved discretization and solver accelerator into a TEACH code, which has been widely applied to combustor flows, illustrates the substantial gains to be achieved.
NASA Astrophysics Data System (ADS)
Ningrum, R. W.; Surarso, B.; Farikhin; Safarudin, Y. M.
2018-03-01
This paper proposes the combination of the Firefly Algorithm (FA) and Chen fuzzy time series forecasting. Most existing fuzzy forecasting methods based on fuzzy time series use static interval lengths. Therefore, we apply an artificial-intelligence technique, the Firefly Algorithm (FA), to set non-stationary interval lengths for each cluster in Chen's method. The method is evaluated by applying it to the Jakarta Composite Index (IHSG) and comparing it with classical Chen fuzzy time series forecasting. Its performance is verified through simulation in Matlab.
Empirical evaluation of the market price of risk using the CIR model
NASA Astrophysics Data System (ADS)
Bernaschi, M.; Torosantucci, L.; Uboldi, A.
2007-03-01
We describe a simple but effective method for the estimation of the market price of risk. The basic idea is to compare the results obtained by following two different approaches in the application of the Cox-Ingersoll-Ross (CIR) model. In the first case, we apply the non-linear least squares method to cross-sectional data (i.e., all rates of a single day). In the second case, we consider the short rate obtained by means of the first procedure as a proxy of the real market short rate. Starting from this new proxy, we evaluate the parameters of the CIR model by means of martingale estimation techniques. The estimate of the market price of risk is provided by comparing the results obtained with these two techniques, since this approach makes it possible to isolate the market price of risk and evaluate, under the Local Expectations Hypothesis, the risk premium given by the market for different maturities. As a test case, we apply the method to data from the European fixed income market.
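For readers unfamiliar with the model, the CIR short-rate dynamics dr = kappa(theta - r)dt + sigma*sqrt(r)dW can be simulated with a simple Euler scheme; this sketch only illustrates the dynamics and does not reproduce the paper's least-squares or martingale estimation procedures (all parameter values invented):

```python
import math
import random

def simulate_cir(r0, kappa, theta, sigma, dt=1.0 / 252.0, n=252, seed=7):
    """Euler scheme for the CIR short rate
    dr = kappa*(theta - r) dt + sigma*sqrt(r) dW, using sqrt(max(r, 0))
    so the diffusion term stays well defined near zero."""
    rng = random.Random(seed)
    r, path = r0, [r0]
    for _ in range(n):
        dw = rng.gauss(0.0, math.sqrt(dt))
        r = r + kappa * (theta - r) * dt + sigma * math.sqrt(max(r, 0.0)) * dw
        path.append(r)
    return path

path = simulate_cir(r0=0.03, kappa=0.5, theta=0.04, sigma=0.05)
```

The square-root diffusion term is what keeps rates from wandering far below zero and is also why CIR parameter estimation (the subject of the paper) is less straightforward than for a Gaussian model.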
Neural networks and fault probability evaluation for diagnosis issues.
Kourd, Yahia; Lefebvre, Dimitri; Guersi, Noureddine
2014-01-01
This paper presents a new FDI technique for fault detection and isolation in unknown nonlinear systems. The objective of the research is to construct and analyze residuals by means of artificial intelligence and probabilistic methods. Artificial neural networks are first used for modeling issues: neural network models are designed to learn the fault-free and faulty behaviors of the considered systems. Once the residuals are generated, they are evaluated using probabilistic criteria to determine the most likely fault among a set of candidate faults. The study also includes a comparison between the contributions of these tools and their limitations, particularly through the establishment of quantitative indicators to assess their performance. Based on the computation of a confidence factor, the proposed method is suitable for evaluating the reliability of the FDI decision. The approach is applied to detect and isolate 19 fault candidates in the DAMADICS benchmark. The results obtained with the proposed scheme are compared with those obtained using a usual thresholding method.
Future animal improvement programs applied to global populations
USDA-ARS?s Scientific Manuscript database
Breeding programs evolved gradually from within-herd phenotypic selection to local and regional cooperatives to national evaluations and now international evaluations. In the future, breeders may adapt reproductive, computational, and genomic methods to global populations as easily as with national ...
A rule-based automatic sleep staging method.
Liang, Sheng-Fu; Kuo, Chin-En; Hu, Yu-Han; Cheng, Yu-Shian
2012-03-30
In this paper, a rule-based automatic sleep staging method was proposed. Twelve features, including temporal and spectral analyses of the EEG, EOG, and EMG signals, were utilized. Normalization was applied to each feature to eliminate individual differences. A hierarchical decision tree with fourteen rules was constructed for sleep stage classification. Finally, a smoothing process considering temporal contextual information was applied for continuity. The overall agreement and kappa coefficient of the proposed method, applied to all-night polysomnography (PSG) recordings of seventeen healthy subjects and compared with manual scoring by the R&K rules, reached 86.68% and 0.79, respectively. This method can be integrated with a portable PSG system for at-home sleep evaluation in the near future. Copyright © 2012 Elsevier B.V. All rights reserved.
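The two-stage structure described above, hierarchical rules followed by temporal smoothing, can be sketched as follows; the feature names, thresholds, and rules are invented toy examples, not the paper's twelve features and fourteen rules:

```python
def classify_epoch(f):
    """Toy hierarchical rules on per-epoch features (thresholds invented;
    the paper's twelve features and fourteen rules are not reproduced)."""
    if f["emg"] > 0.7 and f["alpha_power"] > 0.5:
        return "Wake"
    if f["rem_eog"] > 0.5 and f["emg"] < 0.2:
        return "REM"
    if f["delta_power"] > 0.5:
        return "SWS"
    return "Light"

def smooth(stages):
    """Temporal-context smoothing: an isolated single-epoch stage flanked by
    two identical stages is replaced by its neighbours' stage."""
    out = list(stages)
    for i in range(1, len(stages) - 1):
        if stages[i - 1] == stages[i + 1] != stages[i]:
            out[i] = stages[i - 1]
    return out
```

The smoothing pass encodes the physiological prior that sleep stages persist across consecutive 30-second epochs, so a single-epoch excursion is more likely a classification error than a real transition.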
A method for evaluating discoverability and navigability of recommendation algorithms.
Lamprecht, Daniel; Strohmaier, Markus; Helic, Denis
2017-01-01
Recommendations are increasingly used to support and enable discovery, browsing, and exploration of items. This is especially true for entertainment platforms such as Netflix or YouTube, where frequently no clear categorization of items exists. Yet the suitability of a recommendation algorithm to support these use cases cannot be comprehensively evaluated by any recommendation evaluation measure proposed so far. In this paper, we propose a method to expand the repertoire of existing recommendation evaluation techniques with an evaluation of the discoverability and navigability of recommendation algorithms. The proposed method first evaluates discoverability by investigating structural properties of the resulting recommender systems in terms of bow-tie structure and path lengths. Second, it evaluates navigability by simulating three different models of information-seeking scenarios and measuring the success rates. We show the feasibility of our method by applying it to four non-personalized recommendation algorithms on three data sets and also illustrate its applicability to personalized algorithms. Our work expands the arsenal of evaluation techniques for recommendation algorithms, extends from one-click-based evaluation towards multi-click analysis, and presents a general, comprehensive method for evaluating the navigability of arbitrary recommendation algorithms.
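One simple reading of the navigability simulation above is: treat recommendation lists as edges of a click graph and measure what fraction of target items a user could reach within a click budget. The sketch below follows that reading with an invented item graph; the paper's three information-seeking models are not reproduced:

```python
def navigation_success_rate(recs, targets, start, max_clicks=3):
    """Fraction of `targets` reachable from `start` within `max_clicks` hops
    along recommendation links (breadth-first expansion of the click graph)."""
    seen = {start}
    frontier = [start]
    for _ in range(max_clicks):
        nxt = []
        for item in frontier:
            for rec in recs.get(item, []):
                if rec not in seen:
                    seen.add(rec)
                    nxt.append(rec)
        frontier = nxt
    return sum(1 for t in targets if t in seen) / len(targets)

# Hypothetical item graph: each key lists the items recommended from it.
recs = {"A": ["B", "C"], "B": ["D"], "D": ["E"]}
rate = navigation_success_rate(recs, targets=["D", "E"], start="A", max_clicks=2)
```

A recommender can score well on one-click accuracy yet form a fragmented click graph with low reachability, which is exactly the gap between ranking metrics and the navigability evaluation proposed here.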
USDA-ARS?s Scientific Manuscript database
A wide range of analytical techniques are available for the detection, quantitation, and evaluation of vitamin K in foods. The methods vary from simple to complex depending on extraction, separation, identification and detection of the analyte. Among the extraction methods applied for vitamin K anal...
Applying Behavior Analytic Procedures to Effectively Teach Literacy Skills in the Classroom
ERIC Educational Resources Information Center
Joseph, Laurice M.; Alber-Morgan, Sheila; Neef, Nancy
2016-01-01
The purpose of this article is to discuss the application of behavior analytic procedures for advancing and evaluating methods for teaching literacy skills in the classroom. Particularly, applied behavior analysis has contributed substantially to examining the relationship between teacher behavior and student literacy performance. Teacher…
ERIC Educational Resources Information Center
Kapes, Jerome T.; And Others
Three models of multiple regression analysis (MRA): single equation, commonality analysis, and path analysis, were applied to longitudinal data from the Pennsylvania Vocational Development Study. Variables influencing weekly income of vocational education students one year after high school graduation were examined: grade point averages (grades…
Comparative evaluation of ultrasound scanner accuracy in distance measurement
NASA Astrophysics Data System (ADS)
Branca, F. P.; Sciuto, S. A.; Scorza, A.
2012-10-01
The aim of the present study is to develop and compare two different automatic methods for accuracy evaluation of ultrasound phantom measurements on B-mode images. Both return the relative error e between distances measured by 14 brand-new ultrasound medical scanners and the nominal distances among nylon wires embedded in a reference test object. The first method is based on least squares estimation, while the second applies the mean value of the same distance evaluated at different locations in the ultrasound image (same-distance method). Results for both are presented and explained.
Evaluating hospital design from an operations management perspective.
Vos, Leti; Groothuis, Siebren; van Merode, Godefridus G
2007-12-01
This paper describes an evaluation method for the assessment of hospital building design from the viewpoint of operations management to assure that the building design supports the efficient and effective operating of care processes now and in the future. The different steps of the method are illustrated by a case study. In the case study an experimental design is applied to assess the effect of used logistical concepts, patient mix and technologies. The study shows that the evaluation method provides a valuable tool for the assessment of both functionality and the ability to meet future developments in operational control of a building design.
Experimental design methodologies in the optimization of chiral CE or CEC separations: an overview.
Dejaegher, Bieke; Mangelings, Debby; Vander Heyden, Yvan
2013-01-01
In this chapter, an overview of experimental designs to develop chiral capillary electrophoresis (CE) and capillary electrochromatographic (CEC) methods is presented. Method development is generally divided into technique selection, method optimization, and method validation. In the method optimization part, often two phases can be distinguished, i.e., a screening and an optimization phase. In method validation, the method is evaluated on its fit for purpose. A validation item, also applying experimental designs, is robustness testing. In the screening phase and in robustness testing, screening designs are applied. During the optimization phase, response surface designs are used. The different design types and their application steps are discussed in this chapter and illustrated by examples of chiral CE and CEC methods.
Evaluation Methodology. The Evaluation Exchange. Volume 11, Number 2, Summer 2005
ERIC Educational Resources Information Center
Coffman, Julia, Ed.
2005-01-01
This is the third issue of "The Evaluation Exchange" devoted entirely to the theme of methodology, though every issue tries to identify new methodological choices, the instructive ways in which people have applied or combined different methods, and emerging methodological trends. For example, lately "theories of change" have gained almost…
Takabatake, Reona; Masubuchi, Tomoko; Futo, Satoshi; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Teshima, Reiko; Kurashima, Takeyo; Mano, Junichi; Kitta, Kazumi
2014-01-01
A novel real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) maize event, MIR162. We first prepared a standard plasmid for MIR162 quantification. The conversion factor (Cf) required to calculate the genetically modified organism (GMO) amount was empirically determined for two real-time PCR instruments, the Applied Biosystems 7900HT (ABI7900) and the Applied Biosystems 7500 (ABI7500), for which the determined Cf values were 0.697 and 0.635, respectively. To validate the developed method, a blind test was carried out in an interlaboratory study. The trueness and precision were evaluated as the bias and the reproducibility relative standard deviation (RSDr), respectively. The determined biases were less than 25% and the RSDr values were less than 20% at all evaluated concentrations. These results suggest that the limit of quantitation of the method is 0.5%, and that the developed method is thus suitable for practical analyses for the detection and quantification of MIR162.
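In event-specific quantification of this kind, the GMO amount is commonly computed as the event-to-endogenous copy-number ratio divided by the conversion factor Cf; the sketch below assumes that standard relationship (the abstract states the Cf values but not the formula), and the copy numbers are invented:

```python
def gmo_percent(event_copies, endogenous_copies, cf):
    """GMO amount (%) from event-specific and endogenous-gene copy numbers,
    assuming the standard relationship: copy-number ratio divided by the
    instrument-specific conversion factor Cf, expressed as a percentage."""
    return (event_copies / endogenous_copies) / cf * 100.0

# With the Cf reported for the ABI7900 (0.697); copy numbers are invented.
amount = gmo_percent(event_copies=697.0, endogenous_copies=100000.0, cf=0.697)
```

Determining Cf per instrument, as the study does for the ABI7900 and ABI7500, corrects for the systematic difference between the measured copy-number ratio and the true mass fraction of GM material.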
Wagener, Marc L; Driesprong, Marco; Heesterbeek, Petra J C; Verdonschot, Nico; Eygendaal, Denise
2013-08-01
In this study, three different methods of fixing a Chevron osteotomy of the olecranon were evaluated. Transcortically fixed Kirschner wires with a tension band, a large cancellous screw with a tension band, and a large cancellous screw alone were compared using Roentgen stereophotogrammetric analysis (RSA). The fixation methods were tested in 17 cadaver specimens by applying increasing repetitive force to the triceps tendon; the forces applied were 200 N, 350 N, and 500 N. Translation and rotation of the osteotomy were recorded using RSA. Both fixation with a cancellous screw and tension band and fixation with bicortically placed Kirschner wires and a tension band provide enough stability to withstand the forces of normal daily use. Since fixation with a cancellous screw and tension band is a fast and easy method associated with minimal soft-tissue damage, it is the preferred method for fixation of a Chevron osteotomy of the olecranon. © 2013.
A forward model-based validation of cardiovascular system identification
NASA Technical Reports Server (NTRS)
Mukkamala, R.; Cohen, R. J.
2001-01-01
We present a theoretical evaluation of a cardiovascular system identification method that we previously developed for the analysis of beat-to-beat fluctuations in noninvasively measured heart rate, arterial blood pressure, and instantaneous lung volume. The method provides a dynamical characterization of the important autonomic and mechanical mechanisms responsible for coupling the fluctuations (inverse modeling). To carry out the evaluation, we developed a computational model of the cardiovascular system capable of generating realistic beat-to-beat variability (forward modeling). We applied the method to data generated from the forward model and compared the resulting estimated dynamics with the actual dynamics of the forward model, which were either precisely known or easily determined. We found that the estimated dynamics corresponded to the actual dynamics and that this correspondence was robust to forward model uncertainty. We also demonstrated the sensitivity of the method in detecting small changes in parameters characterizing autonomic function in the forward model. These results provide confidence in the performance of the cardiovascular system identification method when applied to experimental data.
NASA Astrophysics Data System (ADS)
Klomp, Sander; van der Sommen, Fons; Swager, Anne-Fré; Zinger, Svitlana; Schoon, Erik J.; Curvers, Wouter L.; Bergman, Jacques J.; de With, Peter H. N.
2017-03-01
Volumetric Laser Endomicroscopy (VLE) is a promising technique for the detection of early neoplasia in Barrett's Esophagus (BE). VLE generates hundreds of high-resolution, grayscale, cross-sectional images of the esophagus. At present, however, classifying these images is a time-consuming and cumbersome effort performed by an expert using a clinical prediction model. This paper explores the feasibility of using computer vision techniques to accurately predict the presence of dysplastic tissue in VLE BE images. Our contribution is threefold. First, a benchmark of widely applied machine learning techniques and feature extraction methods is performed. Second, three new features based on the clinical detection model are proposed, offering superior classification accuracy and speed compared to earlier work. Third, we evaluate automated parameter tuning by applying simple grid search and feature selection methods. The results are evaluated on a clinically validated dataset of 30 dysplastic and 30 non-dysplastic VLE images. Optimal classification accuracy is obtained by applying a support vector machine with our modified Haralick features and optimal image cropping, yielding an area under the receiver operating characteristic curve of 0.95, compared to 0.81 for the clinical prediction model. Optimal execution time is achieved using proposed mean and median features, which are extracted at least a factor of 2.5 faster than alternative features with comparable performance.
Sayago, Ana; Asuero, Agustin G
2006-09-14
A bilogarithmic hyperbolic cosine method for the spectrophotometric evaluation of stability constants of 1:1 weak complexes from continuous variation data has been devised and applied to literature data. A weighting scheme, however, is necessary to take into account the transformation used for linearization. The method may be considered a useful alternative to methods in which one variable appears on both sides of the basic equation (i.e., those of Heller and Schwarzenbach, Likussar, and Adsul and Ramanathan). Classical least squares leads in those instances to biased and approximate stability constants and limiting absorbance values. The advantages of the proposed method are: it gives a clear indication of the existence of only one complex in solution, it is flexible enough to allow for weighting of measurements, and the computation procedure yields the best value of log β11 and its limit of error. The agreement between the values obtained by the weighted hyperbolic cosine method and by non-linear regression (NLR) is good, the mean quadratic error being at a minimum in both cases.
Hegde, Rahul J; Khare, Sumedh Suhas; Saraf, Tanvi A; Trivedi, Sonal; Naidu, Sonal
2015-01-01
Dental formation is superior to eruption as a method of dental age (DA) assessment. Eruption is only a brief occurrence, whereas formation can be assessed at different chronologic age levels, thereby providing a precise index for determining DA. The study was designed to determine the nature of the inter-relationship between chronologic age and DA. Age estimation based upon tooth formation was done by the Demirjian method, and the accuracy of the Demirjian method was also evaluated. The sample for the study consisted of 197 children of Navi Mumbai. A significant positive correlation was found between chronologic age and DA, that is, (r = 0.995), (P < 0.0001) for boys and (r = 0.995), (P < 0.0001) for girls. When age estimation was done by the Demirjian method, the mean difference between true (chronologic) age and assessed DA was 2 days for boys and 37 days for girls. The Demirjian method showed high accuracy when applied to the Navi Mumbai (Maharashtra, India) population.
Fertigation uniformity under sprinkler irrigation: evaluation and analysis
USDA-ARS?s Scientific Manuscript database
In modern farming systems, fertigation is widely practiced as a cost effective and convenient method for applying soluble fertilizers to crops. Along with efficiency and adequacy, uniformity is an important fertigation performance evaluation criterion. Fertigation uniformity is defined here as a comp...
An Approach to the Evaluation of Hypermedia.
ERIC Educational Resources Information Center
Knussen, Christina; And Others
1991-01-01
Discusses methods that may be applied to the evaluation of hypermedia, based on six models described by Lawton. Techniques described include observation, self-report measures, interviews, automated measures, psychometric tests, checklists and criterion-based techniques, process models, Experimentally Measuring Usability (EMU), and a naturalistic…
NASA Astrophysics Data System (ADS)
Yehia, Ali M.; Arafa, Reham M.; Abbas, Samah S.; Amer, Sawsan M.
2016-01-01
Spectral resolution of cefquinome sulfate (CFQ) in the presence of its degradation products was studied. Three selective, accurate and rapid spectrophotometric methods were performed for the determination of CFQ in the presence of either its hydrolytic, oxidative or photo-degradation products. The proposed ratio difference, derivative ratio and mean centering are ratio-manipulating spectrophotometric methods that were satisfactorily applied for selective determination of CFQ within a linear range of 5.0-40.0 μg mL-1. Concentration Residuals Augmented Classical Least Squares was applied and evaluated for the determination of the cited drug in the presence of all its degradation products. Traditional Partial Least Squares regression was also applied and benchmarked against the proposed advanced multivariate calibration. Twenty-five experimentally designed synthetic mixtures of three factors at five levels were used to calibrate and validate the multivariate models. Advanced chemometrics succeeded in quantitative and qualitative analyses of CFQ along with its hydrolytic, oxidative and photo-degradation products. The proposed methods were applied successfully to the analysis of different pharmaceutical formulations. These developed methods were simple and cost-effective compared with the manufacturer's RP-HPLC method.
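As a rough illustration of the least-squares family of calibration methods mentioned above, the following sketch implements plain classical least squares (CLS) on synthetic Beer-Lambert mixtures. This is simpler than the CRACLS and PLS methods the paper actually uses, and all data are simulated, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic "spectra": linear mixing of 4 pure-component spectra plus noise
n_mix, n_wl = 25, 100
pure = rng.random((4, n_wl))              # assumed pure-component spectra (K)
conc = rng.random((n_mix, 4))             # known calibration concentrations (C)
spectra = conc @ pure + rng.normal(0.0, 1e-3, (n_mix, n_wl))  # A = C K + noise

# Classical least squares: estimate K from the calibration set, then
# predict concentrations of an unknown spectrum by solving a = c K.
K_hat, *_ = np.linalg.lstsq(conc, spectra, rcond=None)
unknown = np.array([0.2, 0.5, 0.1, 0.7]) @ pure
c_hat, *_ = np.linalg.lstsq(K_hat.T, unknown, rcond=None)
print(np.round(c_hat, 2))  # ≈ [0.2, 0.5, 0.1, 0.7]
```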
NASA Astrophysics Data System (ADS)
Liang, Li; Takaaki, Ohkubo; Guang-hui, Li
2018-03-01
In recent years, earthquakes have occurred frequently, and the seismic performance of existing school buildings has become particularly important. The main method for improving the seismic resistance of existing buildings is reinforcement; however, there are few effective methods to evaluate the effect of reinforcement. Ambient vibration measurement experiments were conducted before and after seismic retrofitting using a wireless measurement system, and the changes in vibration characteristics were compared. The changes in acceleration response spectrum, natural periods and vibration modes indicate that the wireless vibration measurement system can be effectively applied to evaluate the effect of seismic retrofitting. The method can evaluate the effect of seismic retrofitting qualitatively, but quantitative evaluation remains difficult at this stage.
Liu, Dinglin; Zhao, Xianglian
2013-01-01
In an effort to deal with more complicated evaluation situations, scientists have focused their efforts on dynamic comprehensive evaluation research. How to make full use of subjective and objective information has become a noteworthy issue. In this paper, a dynamic comprehensive evaluation method using subjective and objective information is proposed. We use the combination weighting method to determine the index weights: the analytic hierarchy process (AHP) method is applied to process the subjective information, and the criteria importance through intercriteria correlation (CRITIC) method is used to handle the objective information. For the time weights, we consider both time distance and information size to embody the principle of esteeming the present over the past. A linear weighted average model is then constructed to make the evaluation process more practicable. Finally, an example is presented to illustrate the effectiveness of this method. Overall, the results suggest that the proposed method is reasonable and effective. PMID:24386176
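The time-weighting idea, esteeming the present over the past, can be sketched as a linear weighted average over periods with geometrically decaying weights. The decay rate and scores below are illustrative assumptions, not the paper's data or its exact weight-determination procedure:

```python
import numpy as np

# Scores of 3 alternatives over 4 time periods (rows: alternatives)
scores = np.array([[0.6, 0.7, 0.8, 0.9],
                   [0.9, 0.8, 0.6, 0.5],
                   [0.7, 0.7, 0.7, 0.7]])

# Time weights that favor the present: geometric decay toward earlier
# periods (decay rate 0.5 is an arbitrary illustration), normalized to 1.
raw = 0.5 ** np.arange(scores.shape[1])[::-1]
time_w = raw / raw.sum()

overall = scores @ time_w   # linear weighted average over time
print(overall.argmax())     # alternative 0 wins: its improving trend is rewarded
```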
Study on the Application of TOPSIS Method to the Introduction of Foreign Players in CBA Games
NASA Astrophysics Data System (ADS)
Zhongyou, Xing
The TOPSIS method is a multiple attribute decision-making method. This paper introduces the current situation of the introduction of foreign players in CBA games, presents the principles and calculation steps of the TOPSIS method in detail, and applies it to the quantitative evaluation of the comprehensive competitive ability of introduced foreign players. The analysis of practical application shows that the TOPSIS method has relatively high rationality and applicability when used to evaluate the comprehensive competitive ability of introduced foreign players.
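The standard TOPSIS calculation steps (vector normalization, weighting, distances to the ideal and anti-ideal solutions, closeness coefficient) can be sketched as follows; the player scores and weights are invented for illustration, not taken from the paper:

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives with TOPSIS.

    matrix : alternatives x criteria scores
    weights: criteria weights summing to 1
    benefit: True for benefit criteria, False for cost criteria
    """
    m = np.asarray(matrix, dtype=float)
    norm = m / np.linalg.norm(m, axis=0)      # vector-normalize each criterion
    v = norm * weights                        # weighted normalized matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_best = np.linalg.norm(v - ideal, axis=1)
    d_worst = np.linalg.norm(v - anti, axis=1)
    return d_worst / (d_best + d_worst)       # closeness: higher is better

# Illustrative: 3 players scored on points, rebounds, turnovers (a cost)
scores = [[25, 8, 4], [18, 11, 2], [30, 5, 6]]
cc = topsis(scores, np.array([0.5, 0.3, 0.2]), np.array([True, True, False]))
print(cc.round(3))
```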
Towards evidence-based practice in medical training: making evaluations more meaningful.
Drescher, Uta; Warren, Fiona; Norton, Kingsley
2004-12-01
The evaluation of training is problematic and the evidence base inconclusive. This situation may arise for 2 main reasons: training is not understood as a complex intervention and, related to this, the evaluation methods applied are often overly simplistic. This paper makes the case for construing training, especially in the field of specialist medical education, as a complex intervention. It also selectively reviews the available literature in order to match evaluative techniques with the demonstrated complexity. Construing training as a complex intervention can provide a framework for selecting the most appropriate methodology to evaluate a given training intervention and to appraise the evidence base for training fairly, choosing from among both quantitative and qualitative approaches and applying measurement at multiple levels of training impact.
Evaluation of several methods of applying sewage effluent to forested soils in the winter.
Alfred Ray Harris
1978-01-01
Surface application methods result in heat loss, deep soil frost, and surface ice accumulations; subsurface methods decrease heat loss and produce shallower frost. Distribution of effluent within the frozen soil is a function of surface application methods, piping due to macropores and biopores, and water movement due to temperature gradients. Nitrate is not...
NASA Astrophysics Data System (ADS)
Rasia, Rodolfo J.; Rasia-Valverde, Juana R.; Stoltz, Jean F.
1996-01-01
Laser backscattering is an excellent tool to investigate the size and concentration of suspended particles, and it has been successfully applied to the analysis of erythrocyte aggregation. A method is proposed that applies laser backscattering to the evaluation of the strength of immunologic erythrocyte agglutination by estimating the energy required for the mechanical dissociation of agglutinates. Mills and Snabre have proposed a theory of laser backscattering for erythrocyte aggregation analysis. It is applied here to analyze the dissociation of erythrocyte agglutinates, performed by imposing a constant shear rate on the agglutinate suspension in a Couette viscometer until a dispersion of isolated red cells is attained. Experimental verifications of the method were performed on erythrocytes of the ABO group reacting against an anti-A test serum in twofold serial dilutions. The spent energy is estimated by a numerical process carried out on the backscattered intensity data recorded during mechanical dissociation. The velocities of agglutination and dissociation lead to the calculation of dissociation parameters. These values are used to evaluate the strength of the immunological reaction and to discriminate weak subgroups of the ABO system.
NASA Astrophysics Data System (ADS)
Yang, Yongying; Chai, Huiting; Li, Chen; Zhang, Yihui; Wu, Fan; Bai, Jian; Shen, Yibing
2017-05-01
Digitized evaluation of micro sparse defects on large fine optical surfaces is one of the challenges in the field of optical manufacturing and inspection. The surface defects evaluation system (SDES) for large fine optical surfaces is developed based on our previously reported work. In this paper, an electromagnetic simulation model based on the Finite-Difference Time-Domain (FDTD) method for vector diffraction theory is first established to study the law of microscopic scattering dark-field imaging. Given the aberration in actual optical systems, a point spread function (PSF) approximated by a Gaussian function is introduced in the extrapolation from the near field to the far field, and the scattered intensity distribution in the image plane is deduced. Analysis shows that both diffraction-broadened imaging and geometrical imaging should be considered in precise size evaluation of defects. Thus, a novel inverse-recognition calibration method is put forward to avoid confusion caused by the diffraction-broadening effect. The method is applied to the quantitative evaluation of defect information. The evaluation results for samples of many materials obtained by SDES are compared with those from an OLYMPUS microscope to verify the micron-scale resolution and precision. The established system has been applied to inspect defects on large fine optical surfaces and can achieve defect inspection of surfaces as large as 850 mm × 500 mm with a resolution of 0.5 μm.
Recurrence of attic cholesteatoma: different methods of estimating recurrence rates.
Stangerup, S E; Drozdziewicz, D; Tos, M; Hougaard-Jensen, A
2000-09-01
One problem in cholesteatoma surgery is recurrence of cholesteatoma, which is reported to vary from 5% to 71%. This great variability can be explained by issues such as the type of cholesteatoma, surgical technique, follow-up rate, length of the postoperative observation period, and statistical method applied. The aim of this study was to illustrate the impact of applying different statistical methods to the same material. Thirty-three children underwent single-stage surgery for attic cholesteatoma during a 15-year period. Thirty patients (94%) attended a re-evaluation. During the observation period of 15 years, recurrence of cholesteatoma occurred in 10 ears. The cumulative total recurrence rate varied from 30% to 67%, depending on the statistical method applied. In conclusion, the choice of statistical method should depend on the number of patients, follow-up rates, length of the postoperative observation period and presence of censored data.
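The gap between a crude recurrence proportion and an actuarial (Kaplan-Meier) estimate is one reason reported recurrence rates vary so widely; it can be illustrated on made-up follow-up data (not the study's 33 patients):

```python
# Crude vs. Kaplan-Meier recurrence estimates on invented follow-up data.
# times: years to recurrence or censoring; event: True if recurrence observed.
times = [1, 2, 3, 4, 5, 6, 8, 10, 12, 15]
event = [True, False, True, False, True, False, False, True, False, False]

crude = sum(event) / len(times)  # ignores censoring: 4/10 = 0.40

# Kaplan-Meier: recurrence-free probability steps down at each event
at_risk, surv = len(times), 1.0
for t, e in sorted(zip(times, event)):
    if e:
        surv *= (at_risk - 1) / at_risk
    at_risk -= 1
km_recurrence = 1.0 - surv       # accounts for patients censored early

print(crude, round(km_recurrence, 4))  # the actuarial estimate is higher
```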
Evaluating co-creation of knowledge: from quality criteria and indicators to methods
NASA Astrophysics Data System (ADS)
Schuck-Zöller, Susanne; Cortekar, Jörg; Jacob, Daniela
2017-11-01
Basic research in the natural sciences rests on a long tradition of evaluation. However, since the San Francisco Declaration on Research Assessment (DORA) came out in 2012, there has been intense discussion in the natural sciences, above all amongst researchers and funding agencies in the different fields of applied research and scientific service. This discussion intensified when climate services and other fields that involve users in research and development activities (co-creation) demanded new evaluation methods appropriate to this research mode. This paper starts by describing a comprehensive and interdisciplinary literature overview of indicators to evaluate co-creation of knowledge, including the different fields of integrated knowledge production. The authors then harmonize the different elements of evaluation from the literature in an evaluation cascade that scales down from very general evaluation dimensions to tangible assessment methods. They describe evaluation indicators already documented and include a mixture of different assessment methods for two exemplary criteria. It is shown what can be deduced from existing methodology for climate services, and envisaged how climate services can further develop their specific evaluation methods.
A Pragmatic Smoothing Method for Improving the Quality of the Results in Atomic Spectroscopy
NASA Astrophysics Data System (ADS)
Bennun, Leonardo
2017-07-01
A new smoothing method is presented for improving the identification and quantification of spectral functions, based on prior knowledge of the signals that are expected to be quantified. These signals are used as weighting coefficients in the smoothing algorithm. The method was conceived to be applied in atomic and nuclear spectroscopies, preferably in techniques where net counts are proportional to acquisition time, such as particle induced X-ray emission (PIXE) and other X-ray fluorescence spectroscopic methods. The algorithm, when properly applied, distorts neither the form nor the intensity of the signal, so it is well suited for all kinds of spectroscopic techniques. It is extremely effective at reducing high-frequency noise in the signal, far more efficiently than a single rectangular smooth of the same width. As with all smoothing techniques, the proposed method improves the precision of the results, but in this case we also found a systematic improvement in their accuracy. We still have to evaluate the improvement in the quality of the results when the method is applied to real experimental data; we expect better characterization of the net area quantification of the peaks, and smaller detection and quantification limits. We have applied this method to signals that obey Poisson statistics, but with the same ideas and criteria it could be applied to time series. In general, when this algorithm is applied to experimental results, the characteristic functions required for this weighted smoothing method should be obtained from a system with strong stability. If the sought signals are not perfectly clean, this method should be applied carefully.
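One simple reading of such a weighted smooth is a convolution whose kernel is taken from the expected line shape. The sketch below illustrates that idea under assumed Gaussian peaks and Poisson-distributed counts; it is not the author's exact algorithm:

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.arange(200)
peak = np.exp(-0.5 * ((x - 100) / 5.0) ** 2)         # expected line shape
noisy = 50 * peak + rng.poisson(5, x.size)           # peak + Poisson background

# Weighted smoothing: kernel taken from the expected signal shape,
# normalized so the smooth (approximately) preserves total intensity.
kernel = np.exp(-0.5 * (np.arange(-10, 11) / 5.0) ** 2)
kernel /= kernel.sum()
smoothed = np.convolve(noisy, kernel, mode="same")

# High-frequency noise shrinks; total counts change only at the edges
print(noisy.sum(), round(smoothed.sum(), 1))
```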
Evaluation of an automatic brain segmentation method developed for neonates on adult MR brain images
NASA Astrophysics Data System (ADS)
Moeskops, Pim; Viergever, Max A.; Benders, Manon J. N. L.; Išgum, Ivana
2015-03-01
Automatic brain tissue segmentation is of clinical relevance in images acquired at all ages. The literature presents a clear distinction between methods developed for MR images of infants, and methods developed for images of adults. The aim of this work is to evaluate a method developed for neonatal images in the segmentation of adult images. The evaluated method employs supervised voxel classification in subsequent stages, exploiting spatial and intensity information. Evaluation was performed using images available within the MRBrainS13 challenge. The obtained average Dice coefficients were 85.77% for grey matter, 88.66% for white matter, 81.08% for cerebrospinal fluid, 95.65% for cerebrum, and 96.92% for intracranial cavity, currently resulting in the best overall ranking. The possibility of applying the same method to neonatal as well as adult images can be of great value in cross-sectional studies that include a wide age range.
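The Dice coefficients reported above compare a segmentation against a reference as twice the overlap divided by the total size of the two masks; a minimal sketch on toy binary masks:

```python
import numpy as np

def dice(a, b):
    """Dice coefficient (%) between two binary segmentation masks."""
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    denom = a.sum() + b.sum()
    return 100.0 * 2.0 * np.logical_and(a, b).sum() / denom if denom else 100.0

seg = np.zeros((10, 10), bool); seg[2:8, 2:8] = True   # 36 voxels
ref = np.zeros((10, 10), bool); ref[3:9, 3:9] = True   # 36 voxels, shifted
print(round(dice(seg, ref), 2))  # overlap of 25 voxels: 2*25/72 ≈ 69.44
```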
Persons Camp Using Interpolation Method
NASA Astrophysics Data System (ADS)
Tawfiq, Luma Naji Mohammed; Najm Abood, Israa
2018-05-01
The aim of this paper is to estimate the contamination rate of soils by using a suitable interpolation method as an accurate alternative tool to evaluate the concentrations of heavy metals in soil, which are then compared with standard universal values to determine the rate of contamination. Interpolation methods are extensively applied in models of different phenomena where experimental data must be used in computer studies and analytic expressions of those data are required. In this paper, the extended divided difference method in two dimensions is used to solve the suggested problem. The modified method is then applied to estimate the contamination rate of soils at a displaced-persons camp in Diyala Governorate, Iraq.
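As a simplified one-dimensional illustration of divided-difference interpolation (the paper's extended two-dimensional variant is not reproduced here, and the concentration values are invented):

```python
import numpy as np

def newton_coeffs(x, y):
    """Newton divided-difference coefficients for nodes (x, y)."""
    c = np.array(y, dtype=float)
    for j in range(1, len(x)):
        c[j:] = (c[j:] - c[j-1:-1]) / (x[j:] - x[:-j])
    return c

def newton_eval(x, c, t):
    """Evaluate the Newton interpolating polynomial at t (Horner's scheme)."""
    result = c[-1]
    for k in range(len(c) - 2, -1, -1):
        result = result * (t - x[k]) + c[k]
    return result

# Interpolate a metal concentration profile at an unsampled location
x = np.array([0.0, 1.0, 2.0, 3.0])           # sample positions
y = np.array([120.0, 95.0, 80.0, 70.0])      # e.g. mg/kg, illustrative only
print(newton_eval(x, newton_coeffs(x, y), 1.5))  # lies between 95 and 80
```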
Evaluation of methods for managing censored results when calculating the geometric mean.
Mikkonen, Hannah G; Clarke, Bradley O; Dasika, Raghava; Wallis, Christian J; Reichman, Suzie M
2018-01-01
Currently, there are conflicting views on the best statistical methods for managing censored environmental data. The method commonly applied by environmental science researchers and professionals is to substitute half the limit of reporting when deriving summary statistics. This approach has been criticised by some researchers, raising questions around the interpretation of historical scientific data. This study evaluated four complete soil datasets, at three levels of simulated censorship, to test the accuracy of a range of censored data management methods for calculating the geometric mean. The methods assessed included removal of censored results, substitution of a fixed value (near zero, half the limit of reporting, and the limit of reporting), substitution by nearest neighbour imputation, maximum likelihood estimation, regression on order statistics (ROS) substitution and Kaplan-Meier/survival analysis. This is the first time such a comprehensive range of censored data management methods has been applied to assess the accuracy of calculating the geometric mean. The results of this study show that, for describing the geometric mean, the simple method of substituting half the limit of reporting is comparable to or more accurate than alternative censored data management methods, including nearest neighbour imputation. Copyright © 2017 Elsevier Ltd. All rights reserved.
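The study's central comparison can be mimicked on simulated lognormal data: censor the low values, then compare the geometric mean obtained by half-LOR substitution against simply dropping the censored results. This is a sketch on synthetic data, not the study's soil datasets:

```python
import numpy as np

rng = np.random.default_rng(0)
true_vals = rng.lognormal(mean=1.0, sigma=0.8, size=500)  # synthetic "soil" data
lor = np.quantile(true_vals, 0.3)                         # censor lowest ~30%
censored = true_vals < lor

def geomean(x):
    return float(np.exp(np.mean(np.log(x))))

full = geomean(true_vals)                              # the unattainable "truth"
half_lor = geomean(np.where(censored, lor / 2, true_vals))  # half-LOR substitution
drop = geomean(true_vals[~censored])                   # removal of censored results

# Dropping censored values biases the geometric mean upward;
# half-LOR substitution stays much closer to the true value.
print(round(full, 2), round(half_lor, 2), round(drop, 2))
```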
NASA Astrophysics Data System (ADS)
Moustafa, Azza A.; Hegazy, Maha A.; Mohamed, Dalia; Ali, Omnia
2016-02-01
A novel approach for the resolution and quantitation of a severely overlapped quaternary mixture of carbinoxamine maleate (CAR), pholcodine (PHL), ephedrine hydrochloride (EPH) and sunset yellow (SUN) in syrup was demonstrated utilizing different spectrophotometrically assisted multivariate calibration methods. The applied methods used different processing and pre-processing algorithms. The proposed methods were partial least squares (PLS), concentration residuals augmented classical least squares (CRACLS), and a novel method: continuous wavelet transforms coupled with partial least squares (CWT-PLS). These methods were applied to a training set in the concentration ranges of 40-100 μg/mL, 40-160 μg/mL, 100-500 μg/mL and 8-24 μg/mL for the four components, respectively. The methods did not require any preliminary separation step or chemical pretreatment. The validity of the methods was evaluated by an external validation set. The selectivity of the developed methods was demonstrated by analyzing the drugs in their combined pharmaceutical formulation without any interference from additives. The obtained results were statistically compared with the official and reported methods, and no significant difference was observed regarding either accuracy or precision.
In-Service Training Argumentation Application for Elementary School Teachers: Pilot Study
ERIC Educational Resources Information Center
Alkis-Küçükaydin, Mensure; Uluçinar Sagir, Safak; Kösterelioglu, Ilker
2016-01-01
The Science Course Curriculum was revised in Turkey in 2013, and some methods and strategies, such as argumentation, were suggested for inclusion. This study includes the evaluation of in-service training applied as a pilot study for introducing argumentation to elementary school teachers. The study consists of applying needs analysis, preparing and…
ERIC Educational Resources Information Center
Morris, Michael Lane; Storberg-Walker, Julia; McMillan, Heather S.
2009-01-01
This article presents a new model, generated through applied theory-building research methods, that helps human resource development (HRD) practitioners evaluate the return on investment (ROI) of organization development (OD) interventions. This model, called organization development human-capital accounting system (ODHCAS), identifies…
Areal Feature Matching Based on Similarity Using Critic Method
NASA Astrophysics Data System (ADS)
Kim, J.; Yu, K.
2015-10-01
In this paper, we propose an areal feature matching method that can be applied to many-to-many matching, which involves matching a simple entity with an aggregate of several polygons, or two aggregates of several polygons, with less user intervention. To this end, an affine transformation is applied to two datasets by using polygon pairs for which the building name is the same. The two datasets are then overlaid, and intersecting polygon pairs are selected as candidate matching pairs. If many polygons intersect, we calculate the inclusion function between such polygons; when its value is more than 0.4, the polygons are aggregated into a single polygon using a convex hull. Finally, the shape similarity between the candidate pairs is calculated as the linear sum of the position similarity, shape-ratio similarity, and overlap similarity, weighted by the weights computed with the CRITIC method. Candidate pairs for which the shape similarity is more than 0.7 are determined to be matching pairs. We applied the method to two geospatial datasets: the digital topographic map and the KAIS map in South Korea. The visual evaluation showed that polygon pairs were well detected by the proposed method, and the statistical evaluation indicates that the proposed method is accurate on our test dataset, with a high F-measure of 0.91.
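The CRITIC weights used in the similarity combination are objective weights derived from each criterion's contrast (standard deviation) and conflict (correlation with the other criteria); a sketch with invented similarity scores:

```python
import numpy as np

def critic_weights(matrix):
    """CRITIC objective weights: contrast (std) times conflict (1 - r)."""
    m = np.asarray(matrix, dtype=float)
    # Min-max normalize each criterion
    norm = (m - m.min(axis=0)) / (m.max(axis=0) - m.min(axis=0))
    std = norm.std(axis=0, ddof=1)
    corr = np.corrcoef(norm, rowvar=False)
    info = std * (1.0 - corr).sum(axis=0)   # information carried by criterion
    return info / info.sum()

# Candidate pairs scored on position, shape-ratio and overlap similarity
sims = [[0.9, 0.8, 0.7],
        [0.6, 0.9, 0.5],
        [0.4, 0.3, 0.9],
        [0.8, 0.7, 0.6]]
w = critic_weights(sims)
print(w.round(3))   # the three weights sum to 1
```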
Fage-Butler, Antoinette
2013-01-01
The purpose of this paper is to present an evaluative model of patient-centredness for text and to illustrate how this can be applied to patient information leaflets (PILs) that accompany medication in the European Union. Patients have criticized PILs for sidelining their experiences, knowledge and affective needs, and denying their individuality. The health communication paradigm of patient-centredness provides valuable purchase on these issues, taking its starting point in the dignity and integrity of the patient as a person. Employing this evaluative model involves two stages. First, a Foucauldian Discourse Analysis is performed of sender and receiver and of the main discourses in PILs. These aspects are then evaluated using the perspectives of patient-centredness theory relating to the medical practitioner, patient and content. The evaluative model is illustrated via a PIL for medication for depression and panic attacks. Evaluation reveals a preponderance of biomedical statements, with a cluster of patient-centred statements primarily relating to the construction of the patient. The paper contributes a new method and evaluative approach to PIL and qualitative health research, as well as outlining a method that facilitates the investigation of interdiscursivity, a recent focus of critical genre analysis.
Xiao, Zhiyan; Zou, Wei J; Chen, Ting; Yue, Ning J; Jabbour, Salma K; Parikh, Rahul; Zhang, Miao
2018-03-01
The goal of this study was to examine, for proton therapy, the efficacy of current DVH-based clinical guidelines drawn from photon experience in lung cancer radiation therapy. Comparison proton plans and IMRT plans were generated for 10 lung patients treated in our proton facility. A gEUD-based method was developed for plan evaluation. This method used normal-lung gEUD(a) curves in which the model parameter "a" was sampled from literature-reported values. For all patients, the proton plans delivered lower normal-lung V 5 Gy with similar V 20 Gy and similar target coverage. Based on current clinical guidelines, proton plans were ranked superior to IMRT plans for all 10 patients. However, the proton and IMRT normal-lung gEUD(a) curves crossed for 8 patients within the tested range of "a", which means there was a possibility that the proton plan would be worse than the IMRT plan for lung sparing. A concept of deficiency index (DI) was introduced to quantify the probability of a proton plan doing worse than an IMRT plan. By applying a threshold on DI, four patients' proton plans were ranked inferior to their IMRT plans; when a threshold was instead applied to the location of the curve crossing, six patients' proton plans were ranked inferior. The contradictory ranking results between the current clinical guidelines and the gEUD(a) curve analysis demonstrate that there are potential pitfalls in applying photon experience directly to the proton world. A comprehensive plan evaluation based on radiobiological models should be carried out to decide whether a lung patient would really benefit from proton therapy. © 2018 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
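The generalized equivalent uniform dose underlying the gEUD(a) curves is commonly defined as gEUD = (Σ vᵢ dᵢᵃ)^(1/a) over the differential DVH; a minimal sketch with an invented DVH (not a plan from the study):

```python
import numpy as np

def geud(doses, volumes, a):
    """Generalized equivalent uniform dose from a differential DVH.

    doses   : dose per bin (Gy)
    volumes : fractional volume per bin (normalized to sum to 1)
    a       : tissue-specific model parameter (a = 1 gives the mean dose)
    """
    v = np.asarray(volumes, float) / np.sum(volumes)
    return float(np.sum(v * np.asarray(doses, float) ** a) ** (1.0 / a))

doses = np.array([2.0, 10.0, 25.0, 50.0])
vols = np.array([0.4, 0.3, 0.2, 0.1])
# Sweeping "a" traces out a gEUD(a) curve like those compared above;
# larger "a" weights the high-dose bins more heavily
for a in (1.0, 2.0, 4.0):
    print(a, round(geud(doses, vols, a), 2))
```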
STRUCTURE-ACTIVITY RELATIONSHIPS (SARS) AMONG MUTAGENS AND CARCINOGENS: A REVIEW
The review is an introduction to methods for evaluating structure-activity relationships (SARs), and, in particular, to those methods that have been applied to study mutagenicity and carcinogenicity. A brief history and some background material on the earliest attempts to correla...
EVALUATION OF METHODS FOR SAMPLING, RECOVERY, AND ENUMERATION OF BACTERIA APPLIED TO THE PHYLLOPLANE
Determining the fate and survival of genetically engineered microorganisms released into the environment requires the development and application of accurate and practical methods of detection and enumeration. Several experiments were performed to examine quantitative recovery met...
NASA Astrophysics Data System (ADS)
Hanan, Lu; Qiushi, Li; Shaobin, Li
2016-12-01
This paper presents an integrated optimization design method in which uniform design, response surface methodology, and a genetic algorithm are used in combination. In detail, uniform design is used to select the experimental sampling points in the experimental domain, and the system performance at those points is evaluated by means of computational fluid dynamics to construct a database. After that, response surface methodology is employed to generate a surrogate mathematical model relating the optimization objective to the design variables. Subsequently, a genetic algorithm is applied to the surrogate model to acquire the optimal solution subject to some constraints. The method has been applied to the optimization design of an axisymmetric diverging duct, dealing with three design variables including one qualitative variable and two quantitative variables. The modeling and optimization method performs well in improving the duct's aerodynamic performance; by reducing design time and computational cost, it can also be applied in wider fields of mechanical design and serve as a useful tool for engineering designers.
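The three-stage pipeline (sampling, surrogate, genetic algorithm) can be sketched as follows; the quadratic objective standing in for the CFD evaluation, the design domain, and all GA settings are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical objective standing in for the expensive CFD evaluation.
def cfd_performance(x):
    return (x[0] - 0.3) ** 2 + 2.0 * (x[1] + 0.5) ** 2

# 1) Uniform-design-like sampling of the 2-D design domain [-1, 1]^2.
X = rng.uniform(-1, 1, size=(40, 2))
y = np.array([cfd_performance(x) for x in X])

# 2) Response surface: full quadratic surrogate fitted by least squares.
def features(X):
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])

coef, *_ = np.linalg.lstsq(features(X), y, rcond=None)
surrogate = lambda x: features(np.atleast_2d(x)) @ coef

# 3) Simple GA run on the cheap surrogate, not the expensive model.
def ga_minimize(f, n_gen=60, pop=50):
    P = rng.uniform(-1, 1, size=(pop, 2))
    for _ in range(n_gen):
        fit = np.array([float(f(x)) for x in P])
        elite = P[np.argsort(fit)[: pop // 2]]          # selection (elitism)
        n_child = pop - len(elite)
        p1 = elite[rng.integers(0, len(elite), n_child)]
        p2 = elite[rng.integers(0, len(elite), n_child)]
        alpha = rng.uniform(0, 1, (n_child, 1))
        children = alpha * p1 + (1 - alpha) * p2        # blend crossover
        children += rng.normal(0, 0.05, children.shape) # mutation
        P = np.clip(np.vstack([elite, children]), -1, 1)
    fit = np.array([float(f(x)) for x in P])
    return P[np.argmin(fit)]

best = ga_minimize(surrogate)
```

Because the surrogate is cheap to evaluate, the GA can afford thousands of evaluations that would be prohibitive against the CFD model directly.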
New approach in the evaluation of a fitness program at a worksite.
Shirasaya, K; Miyakawa, M; Yoshida, K; Tanaka, C; Shimada, N; Kondo, T
1999-03-01
The most common methods for the economic evaluation of a fitness program at a worksite are cost-effectiveness, cost-benefit, and cost-utility analyses. In this study, we applied a basic microeconomic theory, the "neoclassical firm's problem," as a new approach. The optimal number of the physical-exercise classes that constitute the core of the fitness program is determined using a cubic health production function, where the optimal number is defined as the number that maximizes the profit of the program. The optimal number corresponding to any willingness-to-pay amount of the participants for the effectiveness of the program is presented in a graph. For example, if the willingness-to-pay is $800, the optimal number of classes is 23. Our method can be applied to the evaluation of any health care program for which the health production function can be estimated.
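The "neoclassical firm's problem" here amounts to maximizing profit(n) = WTP x effectiveness(n) - cost x n over the class count n. A sketch with a hypothetical cubic production function and hypothetical prices (not the paper's estimated coefficients):

```python
# Hypothetical cubic health production function: effectiveness rises with
# the number of classes n, with eventually diminishing (then negative) returns.
def health_output(n):
    return -0.001 * n ** 3 + 0.06 * n ** 2 + 0.2 * n

def optimal_classes(wtp, cost_per_class, n_max=60):
    """Class count that maximizes profit = WTP * effectiveness - cost * n."""
    profits = {n: wtp * health_output(n) - cost_per_class * n
               for n in range(n_max + 1)}
    return max(profits, key=profits.get)
```

A higher willingness-to-pay raises the marginal benefit of each class, so the optimum should never decrease as WTP grows.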
Kigozi, Jesse; Jowett, Sue; Lewis, Martyn; Barton, Pelham; Coast, Joanna
2017-03-01
Given the significant costs of reduced productivity at work (presenteeism) in comparison to absenteeism, and the overall societal costs, presenteeism has a potentially important role to play in economic evaluations. However, these costs are often excluded. The objective of this study was to review applied cost-of-illness studies and economic evaluations to identify the valuation methods used for presenteeism costs and the impact of including them in practice. A structured systematic review was carried out to explore (i) the extent to which presenteeism has been applied in cost-of-illness studies and economic evaluations and (ii) the overall impact of including presenteeism on overall costs and outcomes. Potential articles were identified by searching the Medline, PsycINFO and NHS EED databases. A standard template was developed and used to extract information from economic evaluations and cost-of-illness studies incorporating presenteeism costs. A total of 28 studies were included in the systematic review, which also demonstrated that presenteeism costs are rarely included in full economic evaluations. Estimation and monetisation methods differed between instruments. The impact of disease on presenteeism whilst in paid work is high. The potential impact of presenteeism costs needs to be highlighted, and greater consideration should be given to including them in economic evaluations and cost-of-illness studies. The importance of including presenteeism costs when conducting economic evaluation from a societal perspective should be emphasised in national economic guidelines, and more methodological work is required to improve the practical application of presenteeism instruments for generating productivity cost estimates. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Evaluation of multilayered pavement structures from measurements of surface waves
Ryden, N.; Lowe, M.J.S.; Cawley, P.; Park, C.B.
2006-01-01
A method is presented for evaluating the thickness and stiffness of multilayered pavement structures from guided waves measured at the surface. Data is collected with a light hammer as the source and an accelerometer as receiver, generating a synthetic receiver array. The top layer properties are evaluated with a Lamb wave analysis. Multiple layers are evaluated by matching a theoretical phase velocity spectrum to the measured spectrum. So far the method has been applied to the testing of pavements, but it may also be applicable in other fields such as ultrasonic testing of coated materials. © 2006 American Institute of Physics.
Aghajani Mir, M; Taherei Ghazvinei, P; Sulaiman, N M N; Basri, N E A; Saheri, S; Mahmood, N Z; Jahan, A; Begum, R A; Aghamohammadi, N
2016-01-15
Selecting a suitable Multiple Criteria Decision Making (MCDM) method is a crucial stage in establishing a Solid Waste Management (SWM) system. The main objective of the current study is to demonstrate and evaluate a proposed method using MCDM methods. An improved version of the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS) was applied to obtain the best municipal solid waste management method by comparing and ranking the scenarios; applying this method to rank treatment methods is introduced as one contribution of the study. In addition, the Viekriterijumsko Kompromisno Rangiranje (VIKOR) compromise-solution method was applied for sensitivity analyses. The proposed method can assist urban decision makers in prioritizing and selecting an optimized Municipal Solid Waste (MSW) treatment system, and a logical, systematic scientific method was proposed to guide appropriate decision-making. A modified TOPSIS methodology was applied to MSW problems for the first time. Next, 11 scenarios of MSW treatment methods were defined and compared environmentally and economically based on the waste management conditions. Results show that integrating a sanitary landfill (18.1%), RDF (3.1%), composting (2%), anaerobic digestion (40.4%), and recycling (36.4%) was an optimized model of integrated waste management. The applied decision-making structure provides the opportunity for optimum decision-making; the mix of recycling, anaerobic digestion, and a sanitary landfill with Electricity Production (EP) is therefore the preferred option for MSW management. Copyright © 2015 Elsevier Ltd. All rights reserved.
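A standard (unmodified) TOPSIS ranking of scenario alternatives can be sketched as below; the decision matrix, weights, and benefit/cost labels are illustrative, not the study's data:

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives (rows) on criteria (columns) with classic TOPSIS.

    benefit[j] is True when larger values of criterion j are better.
    Returns closeness coefficients in [0, 1]; higher is better.
    """
    M = np.asarray(matrix, float)
    w = np.asarray(weights, float) / np.sum(weights)
    R = M / np.linalg.norm(M, axis=0)            # vector normalisation
    V = R * w                                    # weighted normalised matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)    # distance to ideal solution
    d_neg = np.linalg.norm(V - anti, axis=1)     # distance to anti-ideal
    return d_neg / (d_pos + d_neg)
```

A scenario that dominates on every criterion receives a closeness coefficient of exactly 1, which is a useful sanity check.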
Fuzzy AHP Analysis on Enterprises’ Independent Innovation Capability Evaluation
NASA Astrophysics Data System (ADS)
Zhu, Yu; Lei, Huai-ying
Independent innovation has become a key factor in the rapid and healthy development of enterprises. An effective and reasonable comprehensive evaluation of enterprises' independent innovation capability is therefore especially important. This paper applies fuzzy AHP to the evaluation of enterprises' independent innovation capability and validates the rationality and feasibility of the evaluation method and its indicators.
Emadi, Mostafa; Baghernejad, Majid; Pakparvar, Mojtaba; Kowsar, Sayyed Ahang
2010-05-01
This study was undertaken to incorporate geostatistics, remote sensing, and geographic information system (GIS) technologies to improve the qualitative land suitability assessment in arid and semiarid ecosystems of the Arsanjan plain, southern Iran. The primary data were obtained from 85 soil samples collected from three depths (0-30, 30-60, and 60-90 cm); the secondary information was acquired from the remotely sensed data from the linear imaging self-scanner (LISS-III) receiver of the IRS-P6 satellite. Ordinary kriging and simple kriging with varying local means (SKVLM) were used to identify the spatial dependency of important soil parameters. Using the spectral values of band 1 of the LISS-III receiver as the secondary variable in the SKVLM method resulted in the lowest mean square error for mapping pH and electrical conductivity (ECe) in the 0-30-cm depth. On the other hand, the ordinary kriging method gave reliable accuracy for the other soil properties with moderate to strong spatial dependency in the study area for interpolation at the unsampled points. The parametric land suitability evaluation method was applied to a dense grid of points (150 x 150 m2) obtained by the kriging or SKVLM methods, instead of the limited representative profiles used conventionally. Overlaying the information layers of the data in the GIS was used to prepare the final land suitability evaluation. Therefore, changes in land characteristics could be identified within the same uniform soil mapping units over a very short distance. In general, this new method can readily present the areas and limiting factors of the different land suitability classes with considerable accuracy in arbitrary land indices.
Simon, S; Higginson, I J
2009-01-01
Hospital palliative care teams (HPCTs) are well established as multi-professional services providing palliative care in the acute hospital setting, and they are increasing in number. However, there is still limited evaluation of them in terms of efficacy and effectiveness. The gold-standard method of evaluation is a randomised controlled trial, but because of methodological (e.g., randomisation), ethical, and practical difficulties, such trials are often not possible. An HPCT is a complex intervention, and the specific situation in palliative care makes it challenging to evaluate (e.g., distress and cognitive impairment of patients). The quasi-experimental before-after study design has the advantage of enabling an experimental character without randomisation, but it has other weaknesses and is prone to bias, for example, temporal trends and selection bias. As for every study design, avoiding and minimising bias is important to improve validity. Strategies such as selecting an appropriate control group or time series and applying valid outcomes and measurement tools therefore help reduce bias and strengthen the methods. Special attention is needed in planning and defining the design and the applied methods.
Comparison of RCS prediction techniques, computations and measurements
NASA Astrophysics Data System (ADS)
Brand, M. G. E.; Vanewijk, L. J.; Klinker, F.; Schippers, H.
1992-07-01
Three calculation methods to predict radar cross sections (RCS) of three dimensional objects are evaluated by computing the radar cross sections of a generic wing inlet configuration. The following methods are applied: a three dimensional high frequency method, a three dimensional boundary element method, and a two dimensional finite difference time domain method. The results of the computations are compared with the data of measurements.
A new method for calculating ecological flow: Distribution flow method
NASA Astrophysics Data System (ADS)
Tan, Guangming; Yi, Ran; Chang, Jianbo; Shu, Caiwen; Yin, Zhi; Han, Shasha; Feng, Zhiyong; Lyu, Yiwei
2018-04-01
A distribution flow method (DFM), together with an ecological flow index and an evaluation grade standard, is proposed to study the ecological flow of rivers based on broadened kernel density estimation. The proposed DFM and its index and grade standard are applied to the calculation of ecological flow in the middle reaches of the Yangtze River and compared with traditional hydrological methods of ecological-flow calculation, a flow evaluation method, and calculated fish ecological-flow results. Results show that the DFM considers the intra- and inter-annual variations in natural runoff, thereby reducing the influence of extreme flows and uneven flow distribution during the year. This method also satisfies the actual runoff demand of river ecosystems, demonstrates superiority over the traditional hydrological methods, and shows high spatio-temporal applicability and application value.
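One plausible reading of the DFM's core step is a kernel density estimate of historical flows with a deliberately broadened bandwidth, with the modal flow taken as an ecological-flow index. The broadening factor, the use of Silverman's rule, and the modal-flow index are all assumptions of this sketch, not the paper's published algorithm:

```python
import numpy as np

def gaussian_kde_pdf(samples, grid, broaden=1.5):
    """Gaussian KDE with a deliberately broadened Silverman bandwidth
    (our illustrative reading of the 'broadening' in the DFM)."""
    x = np.asarray(samples, float)
    h = broaden * 1.06 * x.std() * len(x) ** (-1 / 5)   # Silverman rule x factor
    u = (grid[:, None] - x[None, :]) / h
    return np.exp(-0.5 * u ** 2).sum(axis=1) / (len(x) * h * np.sqrt(2 * np.pi))

def distribution_flow(samples):
    """Most probable (modal) flow under the broadened density,
    one plausible ecological-flow index."""
    grid = np.linspace(min(samples), max(samples), 500)
    pdf = gaussian_kde_pdf(samples, grid)
    return grid[np.argmax(pdf)]
```

Broadening the bandwidth smooths out extreme flows, which is consistent with the abstract's claim that the method reduces the influence of extreme and unevenly distributed flows.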
Methodology for Evaluating Cost-effectiveness of Commercial Energy Code Changes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hart, Philip R.; Liu, Bing
This document lays out the U.S. Department of Energy's (DOE's) method for evaluating the cost-effectiveness of energy code proposals and editions. The evaluation is applied to provisions or editions of the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) Standard 90.1 and the International Energy Conservation Code (IECC). The method follows standard life-cycle cost (LCC) economic analysis procedures. Cost-effectiveness evaluation requires three steps: 1) evaluating the energy and energy cost savings of code changes, 2) evaluating the incremental and replacement costs related to the changes, and 3) determining the cost-effectiveness of energy code changes based on those costs and savings over time.
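Step 3 reduces to a net-present-value test over the analysis period; a simplified sketch, with illustrative parameter values rather than DOE's actual LCC inputs:

```python
def net_present_value(annual_savings, incremental_cost, years=30,
                      discount_rate=0.03, replacement_cost=0.0,
                      replacement_year=None):
    """Simplified LCC test of a code change: positive NPV suggests the
    change is cost-effective. All defaults here are illustrative."""
    npv = -incremental_cost                      # first cost paid up front
    for t in range(1, years + 1):
        npv += annual_savings / (1 + discount_rate) ** t
        if replacement_year and t % replacement_year == 0:
            npv -= replacement_cost / (1 + discount_rate) ** t
    return npv
```

The sign of the NPV, not its magnitude, drives the cost-effectiveness determination: savings that fail to repay the incremental cost over the study period yield a negative NPV.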
36 CFR 1237.28 - What special concerns apply to digital photographs?
Code of Federal Regulations, 2012 CFR
2012-07-01
... defects, evaluate the accuracy of finding aids, and verify file header information and file name integrity... sampling methods or more comprehensive verification systems (e.g., checksum programs), to evaluate image.... For permanent or unscheduled images descriptive elements must include: (1) An identification number...
An evaluation of the use of liquid calcium chloride to improve deicing and snow removal.
DOT National Transportation Integrated Search
1978-01-01
The Iowa method of spraying liquid calcium chloride onto sodium chloride applied in snow and ice removal operations was evaluated on four sections of highway in the Staunton District. From the relatively sparse data accumulated over three winters, it...
NASA Technical Reports Server (NTRS)
Gundersen, R. T.; Bond, R. L.
1976-01-01
Zero-g workstations have been designed throughout manned spaceflight, based on different criteria and requirements for different programs. The history of the design of these workstations is presented, along with a thorough evaluation of selected Skylab workstations (the best zero-g experience available on the subject). The results were applied to ongoing and future programs, with special emphasis on correlating the neutral body posture in zero-g with workstation design. Where selected samples of shuttle orbiter workstations are shown as currently designed and compared with experience gained during prior programs in terms of man-machine interface design, the evaluations were done in a generic sense to show the methods of applying evaluative techniques.
On the Adequacy of Bayesian Evaluations of Categorization Models: Reply to Vanpaemel and Lee (2012)
ERIC Educational Resources Information Center
Wills, Andy J.; Pothos, Emmanuel M.
2012-01-01
Vanpaemel and Lee (2012) argued, and we agree, that the comparison of formal models can be facilitated by Bayesian methods. However, Bayesian methods neither precede nor supplant our proposals (Wills & Pothos, 2012), as Bayesian methods can be applied both to our proposals and to their polar opposites. Furthermore, the use of Bayesian methods to…
Evaluation of Historical and Projected Agricultural Climate Risk Over the Continental US
NASA Astrophysics Data System (ADS)
Zhu, X.; Troy, T. J.; Devineni, N.
2016-12-01
Food demands are rising due to an increasing population with changing food preferences, which places pressure on agricultural systems. In addition, in the past decade climate extremes have highlighted the vulnerability of our agricultural production to climate variability. Quantitative analyses in the climate-agriculture research field have been performed in many studies. However, climate risk remains difficult to evaluate at large scales, even though doing so shows great potential to help us better understand historical climate change impacts and evaluate future risk given climate projections. In this study, we developed a framework to evaluate climate risk quantitatively by applying statistical methods such as Bayesian regression, distribution fitting, and Monte Carlo simulation. We applied the framework over different climate regions in the continental US, both historically and for modeled climate projections. The relative importance of each major growing-season climate index, such as maximum dry period or heavy precipitation, was evaluated to determine which climate indices play a role in affecting crop yields. The statistical modeling framework was applied using county yields, with irrigated and rainfed yields separated to evaluate their differing risks. This framework provides estimates of the near-term climate risk facing agricultural production that account for the full uncertainty of climate occurrences, the range of crop response, and spatial correlation in climate. In particular, the method provides robust estimates of the importance of irrigation in mitigating agricultural climate risk. The results of this study can contribute to decision making about crop choice and water use in an uncertain climate.
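A stripped-down version of such a framework, using ordinary least squares in place of Bayesian regression and entirely synthetic yield data, can be sketched as:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic history: county yield declines with the max-dry-period index.
dry = rng.normal(20, 5, 40)                         # growing-season climate index
yield_obs = 10.0 - 0.15 * dry + rng.normal(0, 0.5, 40)

# Fit a linear yield-climate response (OLS stand-in for Bayesian regression).
A = np.column_stack([np.ones_like(dry), dry])
beta, *_ = np.linalg.lstsq(A, yield_obs, rcond=None)
resid_sd = np.std(yield_obs - A @ beta)

# Monte Carlo: draw future climate from a fitted distribution, propagating
# both climate variability and crop-response noise.
future_dry = rng.normal(dry.mean(), dry.std(), 10000)
sim_yield = beta[0] + beta[1] * future_dry + rng.normal(0, resid_sd, 10000)

# Risk metric: probability that yield falls below a low historical threshold.
risk = np.mean(sim_yield < np.percentile(yield_obs, 10))
```

A full framework would add distribution fitting for each climate index, spatial correlation across counties, and separate irrigated/rainfed responses; the skeleton above only shows how regression and simulation combine into a risk probability.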
Development of a REBCO HTS magnet for Maglev - repeated bending tests of HTS pancake coils -
NASA Astrophysics Data System (ADS)
Sugino, Motohikoa; Mizuno, Katsutoshi; Tanaka, Minoru; Ogata, Masafumi
2018-01-01
In a past study, two manufacturing methods were developed for manufacturing pancake coils from REBCO coated conductors. It was confirmed that the conductors suffer no electrical degradation caused by the manufacturing methods. Durability evaluation tests of the pancake coils were conducted in this study as the final evaluation of the coil manufacturing methods. Repeated bending deformation was applied to the manufactured pancake coils. These tests confirmed that the pancake coils manufactured by the two methods withstood the repeated bending deformation and maintained appropriate mechanical and electrical performance. We adopted the fusion bonding method as the coil manufacturing method for the HTS magnet. Furthermore, using a prototype pancake coil manufactured by the fusion bonding method as a test sample, a repeated bending test under the excited condition was conducted. It confirmed that a coil manufactured by the fusion bonding method shows no degradation of electrical performance or mechanical properties even when repeated bending deformation is applied under the excited condition.
A systematic review of gait analysis methods based on inertial sensors and adaptive algorithms.
Caldas, Rafael; Mundt, Marion; Potthast, Wolfgang; Buarque de Lima Neto, Fernando; Markert, Bernd
2017-09-01
The conventional methods to assess human gait are either expensive or too complex to be applied regularly in clinical practice. To reduce the cost and simplify the evaluation, inertial sensors and adaptive algorithms have been utilized, respectively. This paper aims to summarize studies that applied adaptive, also called artificial intelligence (AI), algorithms to gait analysis based on inertial sensor data, verifying whether they can support the clinical evaluation. Articles were identified through searches of the main databases, covering 1968 to October 2016. We identified 22 studies that met the inclusion criteria. The included papers were analyzed with respect to their data acquisition and processing methods using specific questionnaires. Concerning data acquisition, the mean score is 6.1±1.62, which implies that 13 of the 22 papers failed to report relevant outcomes. The quality assessment of the AI algorithms presents an above-average rating (8.2±1.84). Therefore, AI algorithms seem able to support gait analysis based on inertial sensor data. Further research, however, is necessary to enhance and standardize the application in patients, since most of the studies used distinct methods to evaluate healthy subjects. Copyright © 2017 Elsevier B.V. All rights reserved.
Kandadai, Venk; Yang, Haodong; Jiang, Ling; Yang, Christopher C; Fleisher, Linda; Winston, Flaura Koplin
2016-05-05
Little is known about the ability of individual stakeholder groups to achieve health information dissemination goals through Twitter. This study aimed to develop and apply methods for the systematic evaluation and optimization of health information dissemination by stakeholders through Twitter. Tweet content from 1790 followers of @SafetyMD (July-November 2012) was examined. User emphasis, a new indicator of Twitter information dissemination, was defined and applied to retweets across two levels of retweeters originating from @SafetyMD. User interest clusters were identified based on principal component analysis (PCA) and hierarchical cluster analysis (HCA) of a random sample of 170 followers. User emphasis of keywords remained across levels but decreased by 9.5 percentage points. PCA and HCA identified 12 statistically unique clusters of followers within the @SafetyMD Twitter network. This study is one of the first to develop methods for use by stakeholders to evaluate and optimize their use of Twitter to disseminate health information. Our new methods provide preliminary evidence that individual stakeholders can evaluate the effectiveness of health information dissemination and create content-specific clusters for more specific targeted messaging.
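The cluster-identification step can be sketched with PCA via SVD on a hypothetical follower-by-keyword frequency matrix; a minimal 2-means in component space is used here as a stand-in for the paper's hierarchical cluster analysis:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical follower-by-keyword frequency matrix with two latent interests:
# the first 30 followers tweet keywords 0-4, the last 30 tweet keywords 5-9.
F = np.zeros((60, 10))
F[:30, :5] = rng.poisson(4, size=(30, 5))
F[30:, 5:] = rng.poisson(4, size=(30, 5))

# PCA via SVD on the centred matrix.
Z = F - F.mean(axis=0)
U, S, Vt = np.linalg.svd(Z, full_matrices=False)
scores = Z @ Vt[:2].T          # project followers onto the first 2 components

# Minimal 2-means in PCA space (stand-in for hierarchical clustering).
centers = scores[[0, -1]].copy()
for _ in range(20):
    dists = ((scores[:, None] - centers[None]) ** 2).sum(-1)
    labels = np.argmin(dists, axis=1)
    centers = np.array([scores[labels == k].mean(axis=0) for k in range(2)])
```

With clusters in hand, each group's dominant keywords can then drive the content-specific targeted messaging the abstract describes.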
Applying a Mixed-Methods Evaluation to Healthy Kids, Healthy Communities
Brownson, Ross C.; Kemner, Allison L.; Brennan, Laura K.
2016-01-01
From 2008 to 2014, the Healthy Kids, Healthy Communities (HKHC) national program funded 49 communities across the United States and Puerto Rico to implement healthy eating and active living policy, system, and environmental changes to support healthier communities for children and families, with special emphasis on reaching children at highest risk for obesity on the basis of race, ethnicity, income, or geographic location. Evaluators designed a mixed-methods evaluation to capture the complexity of the HKHC projects, understand implementation, and document perceived and actual impacts of these efforts. PMID:25828217
NASA Technical Reports Server (NTRS)
Atkins, H. L.; Helenbrook, B. T.
2005-01-01
This paper describes numerical experiments with P-multigrid to corroborate analysis, validate the present implementation, and examine issues that arise in implementations of the various combinations of relaxation schemes, discretizations and P-multigrid methods. The two approaches to implementing P-multigrid presented here are equivalent for most high-order discretization methods, such as spectral element, SUPG, and discontinuous Galerkin applied to advection; however, it is discovered that the approach that mimics the common geometric multigrid implementation is less robust, and frequently unstable, when applied to discontinuous Galerkin discretizations of diffusion. Gauss-Seidel relaxation converges 40% faster than block Jacobi, as predicted by analysis; however, the implementation of Gauss-Seidel is considerably more expensive than one would expect because gradients in most neighboring elements must be updated. A compromise quasi-Gauss-Seidel relaxation method that evaluates the gradient in each element twice per iteration converges at rates similar to those predicted for true Gauss-Seidel.
NASA Astrophysics Data System (ADS)
Ferus, Martin; Koukal, Jakub; Lenža, Libor; Srba, Jiří; Kubelík, Petr; Laitl, Vojtěch; Zanozina, Ekaterina M.; Váňa, Pavel; Kaiserová, Tereza; Knížek, Antonín; Rimmer, Paul; Chatzitheodoridis, Elias; Civiš, Svatopluk
2018-03-01
Aims: We aim to analyse real-time Perseid and Leonid meteor spectra using a novel calibration-free (CF) method, which is usually applied in the laboratory for laser-induced breakdown spectroscopic (LIBS) chemical analysis. Methods: Reference laser ablation spectra of specimens of chondritic meteorites were measured in situ simultaneously with a high-resolution laboratory echelle spectrograph and a spectral camera for meteor observation. Laboratory data were subsequently evaluated via the CF method and compared with real meteor emission spectra. Additionally, spectral features related to airglow plasma were compared with the spectra of laser-induced breakdown and electric discharge in the air. Results: We show that this method can be applied in the evaluation of meteor spectral data observed in real time. Specifically, CF analysis can be used to determine the chemical composition of meteor plasma, which, in the case of the Perseid and Leonid meteors analysed in this study, corresponds to that of the C-group of chondrites.
Würtzen, G
1993-01-01
The principles of 'data-derived safety factors' are applied to toxicological and biochemical information on butylated hydroxyanisole (BHA). The calculated safety factor for an ADI is, by this method, comparable to the existing internationally recognized safety evaluations. Relevance for humans of forestomach tumours in rodents is discussed. The method provides a basis for organizing data in a way that permits an explicit assessment of its relevance.
Evaluating the evaluation of cancer driver genes
Tokheim, Collin J.; Papadopoulos, Nickolas; Kinzler, Kenneth W.; Vogelstein, Bert; Karchin, Rachel
2016-01-01
Sequencing has identified millions of somatic mutations in human cancers, but distinguishing cancer driver genes remains a major challenge. Numerous methods have been developed to identify driver genes, but evaluation of the performance of these methods is hindered by the lack of a gold standard, that is, bona fide driver gene mutations. Here, we establish an evaluation framework that can be applied to driver gene prediction methods. We used this framework to compare the performance of eight such methods. One of these methods, described here, incorporated a machine-learning–based ratiometric approach. We show that the driver genes predicted by each of the eight methods vary widely. Moreover, the P values reported by several of the methods were inconsistent with the uniform values expected, thus calling into question the assumptions that were used to generate them. Finally, we evaluated the potential effects of unexplained variability in mutation rates on false-positive driver gene predictions. Our analysis points to the strengths and weaknesses of each of the currently available methods and offers guidance for improving them in the future. PMID:27911828
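The p-value consistency check described above can be made concrete with a Kolmogorov-Smirnov distance against the Uniform(0,1) distribution expected for null (non-driver) genes; a sketch with synthetic test data:

```python
import numpy as np

def ks_uniform_stat(pvalues):
    """Kolmogorov-Smirnov distance between a set of reported p-values and
    the Uniform(0,1) distribution expected under the null. Large values
    indicate mis-calibrated p-values (and hence suspect driver calls)."""
    p = np.sort(np.asarray(pvalues, float))
    n = len(p)
    ecdf_hi = np.arange(1, n + 1) / n
    ecdf_lo = np.arange(0, n) / n
    return max(np.max(ecdf_hi - p), np.max(p - ecdf_lo))
```

Well-calibrated methods produce a small distance on genuinely null genes, while anti-conservative methods (p-values piling up near zero) produce a large one.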
Terrill Vosbein, Heidi A; Boatz, Jerry A; Kenney, John W
2005-12-22
The moment analysis method (MA) has been tested for the case of 2S --> 2P ([core]ns1 --> [core]np1) transitions of alkali metal atoms (M) doped into cryogenic rare gas (Rg) matrices using theoretically validated simulations. Theoretical/computational M/Rg system models are constructed with precisely defined parameters that closely mimic known M/Rg systems. Monte Carlo (MC) techniques are then employed to generate simulated absorption and magnetic circular dichroism (MCD) spectra of the 2S --> 2P M/Rg transition to which the MA method can be applied with the goal of seeing how effective the MA method is in re-extracting the M/Rg system parameters from these known simulated systems. The MA method is summarized in general, and an assessment is made of the use of the MA method in the rigid shift approximation typically used to evaluate M/Rg systems. The MC-MCD simulation technique is summarized, and validating evidence is presented. The simulation results and the assumptions used in applying MA to M/Rg systems are evaluated. The simulation results on Na/Ar demonstrate that the MA method does successfully re-extract the 2P spin-orbit coupling constant and Landé g-factor values initially used to build the simulations. However, assigning physical significance to the cubic and noncubic Jahn-Teller (JT) vibrational mode parameters in cryogenic M/Rg systems is not supported.
NASA Technical Reports Server (NTRS)
Hyder, Imran; Schaefer, Joseph; Justusson, Brian; Wanthal, Steve; Leone, Frank; Rose, Cheryl
2017-01-01
Reducing the timeline for development and certification of composite structures has been a long-standing objective of the aerospace industry. This timeline can be further exacerbated when attempting to integrate new fiber-reinforced composite materials, owing to the large amount of testing required at every level of design. Computational progressive damage and failure analysis (PDFA) attempts to mitigate this effect; however, new PDFA methods have been slow to be adopted in industry since material model evaluation techniques have not been fully defined. This study presents an efficient evaluation framework that uses a piecewise verification and validation (V&V) approach for PDFA methods. Specifically, the framework is applied to evaluate PDFA research codes within the context of intralaminar damage. Methods are incrementally taken through various V&V exercises specifically tailored to study PDFA intralaminar damage modeling capability. Finally, methods are evaluated against a defined set of success criteria to highlight successes and limitations.
Scientific use of the finite element method in Orthodontics
Knop, Luegya; Gandini, Luiz Gonzaga; Shintcovsk, Ricardo Lima; Gandini, Marcia Regina Elisa Aparecida Schiavon
2015-01-01
INTRODUCTION: The finite element method (FEM) is an engineering resource applied to calculate the stress and deformation of complex structures, and it has been widely used in orthodontic research. With the advantage of being a non-invasive and accurate method that provides quantitative and detailed data on the physiological reactions that may occur in tissues, applying the FEM makes it possible to anticipate and visualize these tissue responses by observing the areas of stress created by the applied orthodontic mechanics. OBJECTIVE: This article aims to review and discuss the stages of finite element method application and its applicability in Orthodontics. RESULTS: FEM is able to evaluate the stress distribution at the interface between the periodontal ligament and alveolar bone, and the displacement trend in various types of tooth movement when using different types of orthodontic devices. Specific software is required for this purpose. CONCLUSIONS: FEM is an important experimental method for answering questions about tooth movement, overcoming the disadvantages of other experimental methods. PMID:25992996
Takabatake, Reona; Masubuchi, Tomoko; Futo, Satoshi; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Teshima, Reiko; Kurashima, Takeyo; Mano, Junichi; Kitta, Kazumi
2016-01-01
A novel real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) maize, 3272. We first attempted to obtain genomic DNA from this maize using a DNeasy Plant Maxi kit and a DNeasy Plant Mini kit, which have been widely utilized in our previous studies, but DNA extraction yields from 3272 were markedly lower than those from non-GM maize seeds. This lowering of DNA extraction yields was not observed with GM quicker or Genomic-tip 20/G, and we chose GM quicker for evaluation of the quantitative method. We prepared a standard plasmid for 3272 quantification. The conversion factor (Cf), which is required to calculate the amount of a genetically modified organism (GMO), was experimentally determined for two real-time PCR instruments, the Applied Biosystems 7900HT (ABI 7900) and the Applied Biosystems 7500 (ABI 7500). The determined Cf values were 0.60 and 0.59 for the ABI 7900 and the ABI 7500, respectively. To evaluate the developed method, a blind test was conducted as part of an interlaboratory study. Trueness and precision were evaluated as the bias and the reproducibility relative standard deviation (RSDr); the determined values were similar to those in our previous validation studies. The limit of quantitation for the method was estimated to be 0.5% or less, and we concluded that the developed method would be suitable and practical for the detection and quantification of 3272.
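A conversion factor of this kind is typically used to turn the measured ratio of event-specific to endogenous-gene copy numbers into a GMO percentage. The sketch below illustrates that arithmetic with hypothetical copy numbers; the formula shape is the common scheme for such assays, not quoted from the paper, and the function name is ours:

```python
def gmo_amount_percent(event_copies, endogenous_copies, cf):
    """Estimate GMO content (%) from real-time PCR copy numbers.

    cf is the conversion factor relating the copy-number ratio of the
    event-specific sequence to that of an endogenous reference gene in
    pure GM material; it is instrument-specific (e.g. ~0.60 for 3272
    on the ABI 7900, per the study above).
    """
    ratio = event_copies / endogenous_copies
    return ratio / cf * 100.0

# Hypothetical measurement: 3 event copies per 100 endogenous copies,
# Cf = 0.60 -> estimated GMO content of 5%.
estimate = gmo_amount_percent(3.0, 100.0, 0.60)
```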
The Role of Applied Epidemiology Methods in the Disaster Management Cycle
Heumann, Michael; Perrotta, Dennis; Wolkin, Amy F.; Schnall, Amy H.; Podgornik, Michelle N.; Cruz, Miguel A.; Horney, Jennifer A.; Zane, David; Roisman, Rachel; Greenspan, Joel R.; Thoroughman, Doug; Anderson, Henry A.; Wells, Eden V.; Simms, Erin F.
2014-01-01
Disaster epidemiology (i.e., applied epidemiology in disaster settings) presents a source of reliable and actionable information for decision-makers and stakeholders in the disaster management cycle. However, epidemiological methods have yet to be routinely integrated into disaster response and fully communicated to response leaders. We present a framework consisting of rapid needs assessments, health surveillance, tracking and registries, and epidemiological investigations, including risk factor and health outcome studies and evaluation of interventions, which can be practiced throughout the cycle. Applying each method can result in actionable information for planners and decision-makers responsible for preparedness, response, and recovery. Disaster epidemiology, once integrated into the disaster management cycle, can provide the evidence base to inform and enhance response capability within the public health infrastructure. PMID:25211748
Zhang, Yan-zhen; Zhou, Yan-chun; Liu, Li; Zhu, Yan
2007-01-01
Simple, reliable and sensitive analytical methods to determine the anticariogenic agents, preservatives, and artificial sweeteners contained in commercial gargles are necessary for evaluating their effectiveness, safety, and quality. An ion chromatography (IC) method is described that simultaneously analyzes eight anions: fluoride, chloride, sulfate, phosphate, monofluorophosphate, glycerophosphate (anticariogenic agents), sorbate (a preservative), and saccharin (an artificial sweetener) in gargles. In this IC system, we applied mobile-phase gradient elution with KOH, separation on IonPac AS18 columns, and suppressed conductivity detection. The optimized analytical conditions were further evaluated for accuracy. The relative standard deviations (RSDs) of the intra-day retention time and peak area of all species were less than 0.938% and 8.731%, respectively, while the RSDs of the 5-day retention time and peak area were less than 1.265% and 8.934%, respectively. The correlation coefficients for the targeted analytes ranged from 0.9997 to 1.0000, and the spiked recoveries for the anions were 90% to 102.5%. We concluded that the method can be applied for comprehensive evaluation of commercial gargles. PMID:17610331
NASA Technical Reports Server (NTRS)
Bennett, Floyd V.; Yntema, Robert T.
1959-01-01
Several approximate procedures for calculating the bending-moment response of flexible airplanes to continuous isotropic turbulence are presented and evaluated. The modal methods (the mode-displacement and force-summation methods) and a matrix method (segmented-wing method) are considered. These approximate procedures are applied to a simplified airplane for which an exact solution to the equation of motion can be obtained. The simplified airplane consists of a uniform beam with a concentrated fuselage mass at the center. Airplane motions are limited to vertical rigid-body translation and symmetrical wing bending deflections. Output power spectra of wing bending moments based on the exact transfer-function solutions are used as a basis for the evaluation of the approximate methods. It is shown that the force-summation and the matrix methods give satisfactory accuracy and that the mode-displacement method gives unsatisfactory accuracy.
Properties of a Formal Method for Prediction of Emergent Behaviors in Swarm-based Systems
NASA Technical Reports Server (NTRS)
Rouff, Christopher; Vanderbilt, Amy; Hinchey, Mike; Truszkowski, Walt; Rash, James
2004-01-01
Autonomous intelligent swarms of satellites are being proposed for NASA missions that have complex behaviors and interactions. The emergent properties of swarms make these missions powerful but at the same time more difficult to design and to assure that proper behaviors will emerge. This paper gives the results of research into formal methods techniques for verification and validation of NASA swarm-based missions. Multiple formal methods were evaluated to determine their effectiveness in modeling and assuring the behavior of swarms of spacecraft, using the NASA ANTS mission as an example of swarm intelligence to which the methods were applied. The paper presents partial specifications of the ANTS mission written in four selected methods, an evaluation of those methods, and the properties a formal method needs for effective specification and prediction of emergent behavior in swarm-based systems.
48 CFR 2415.304 - Evaluation factors.
Code of Federal Regulations, 2010 CFR
2010-10-01
... DEVELOPMENT CONTRACTING METHODS AND CONTRACTING TYPES CONTRACTING BY NEGOTIATION Source Selection 2415.304... assigned a numerical weight (except for pass-fail factors) which shall appear in the RFP. When using LPTA, each evaluation factor is applied on a “pass-fail” basis; numerical scores are not assigned. “Pass-fail...
A Performance-Based Method of Student Evaluation
ERIC Educational Resources Information Center
Nelson, G. E.; And Others
1976-01-01
The Problem-Oriented Medical Record, which allows practical definition of the behavioral terms thoroughness, reliability, sound analytical sense, and efficiency as they apply to the identification and management of patient problems, provides a vehicle for performance-based evaluation. A test run of the record's use is reported. (JT)
Charting the Impact of Federal Spending for Education Research: A Bibliometric Approach
ERIC Educational Resources Information Center
Milesi, Carolina; Brown, Kevin L.; Hawkley, Louise; Dropkin, Eric; Schneider, Barbara L.
2014-01-01
Impact evaluation plays a critical role in determining whether federally funded research programs in science, technology, engineering, and mathematics are wise investments. This paper develops quantitative methods for program evaluation and applies this approach to a flagship National Science Foundation-funded education research program, Research…
Teaching, Learning and Evaluation Techniques in the Engineering Courses.
ERIC Educational Resources Information Center
Vermaas, Luiz Lenarth G.; Crepaldi, Paulo Cesar; Fowler, Fabio Roberto
This article presents techniques of professional formation from the Petra Model that can be applied in engineering programs. It describes the model's philosophy and its teaching methods for listening, making abstracts, studying, researching, team working, and problem solving. Questions regarding planning and evaluation, based on the model, are also…
Evaluation of the Laplace Integral. Classroom Notes
ERIC Educational Resources Information Center
Chen, Hongwei
2004-01-01
Based on the dominated convergence theorem and parametric differentiation, two different evaluations of the Laplace integral are displayed. This article presents two different proofs of (1) which may be of interest since they are based on principles within the realm of real analysis. The first method applies the dominated convergence theorem to…
A method is presented and applied for evaluating an air quality model’s changes in pollutant concentrations stemming from changes in emissions while explicitly accounting for the uncertainties in the base emission inventory. Specifically, the Community Multiscale Air Quality (CMA...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-09
... cognitive interviews, focus groups, Pilot household interviews, and experimental research in laboratory and field settings, both for applied questionnaire evaluation and more basic research on response errors in surveys. The most common evaluation method is the cognitive interview, in which a questionnaire design...
NASA Astrophysics Data System (ADS)
Federico, Alejandro; Kaufmann, Guillermo H.
2004-08-01
We evaluate the application of the Wigner-Ville distribution (WVD) to measure phase gradient maps in digital speckle pattern interferometry (DSPI), when the generated correlation fringes present phase discontinuities. The performance of the WVD method is evaluated using computer-simulated fringes. The influence of the filtering process to smooth DSPI fringes and additional drawbacks that emerge when this method is applied are discussed. A comparison with the conventional method based on the continuous wavelet transform in the stationary phase approximation is also presented.
Identification of stochastic interactions in nonlinear models of structural mechanics
NASA Astrophysics Data System (ADS)
Kala, Zdeněk
2017-07-01
In this paper, a polynomial approximation is presented by which Sobol sensitivity analysis can be evaluated for all sensitivity indices. The nonlinear FEM model is approximated, and the input space is mapped using simulation runs of the Latin Hypercube Sampling method. The domain of the approximating polynomial is chosen so that a large number of Latin Hypercube Sampling simulation runs can be applied. The presented method also makes it possible to evaluate higher-order sensitivity indices, which could not be identified on the nonlinear FEM model directly.
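As a minimal illustration of the sampling side of this approach — not the authors' polynomial approximation of a nonlinear FEM model — the sketch below draws Latin Hypercube samples and estimates first-order Sobol indices with a standard pick-freeze estimator on a cheap analytic test function. All function names and the test model are our own assumptions:

```python
import random

def lhs(n, dim, rng):
    """Latin Hypercube Sample: n points in [0,1]^dim, one per stratum per axis."""
    cols = []
    for _ in range(dim):
        strata = [(k + rng.random()) / n for k in range(n)]
        rng.shuffle(strata)
        cols.append(strata)
    return [[cols[d][i] for d in range(dim)] for i in range(n)]

def first_order_sobol(model, dim, n=4096, seed=0):
    """Estimate first-order Sobol indices S_i = Var(E[Y|X_i]) / Var(Y)
    via the pick-freeze (Saltelli-type) estimator on two LHS designs."""
    rng = random.Random(seed)
    A, B = lhs(n, dim, rng), lhs(n, dim, rng)
    fA = [model(x) for x in A]
    fB = [model(x) for x in B]
    mean = sum(fA) / n
    var = sum((y - mean) ** 2 for y in fA) / (n - 1)
    indices = []
    for i in range(dim):
        # B with column i replaced by A's column i ("freeze" X_i)
        ABi = [B[j][:i] + [A[j][i]] + B[j][i + 1:] for j in range(n)]
        fABi = [model(x) for x in ABi]
        s = sum(fA[j] * (fABi[j] - fB[j]) for j in range(n)) / n
        indices.append(s / var)
    return indices

# Additive test model y = x1 + 2*x2 with x ~ U(0,1): exact S1 = 0.2, S2 = 0.8.
s1, s2 = first_order_sobol(lambda x: x[0] + 2.0 * x[1], dim=2)
```

For the real application one would replace the lambda with the fitted approximation polynomial of the FEM response, which is what makes the large number of sampling runs affordable.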
Reviews of Single Subject Research Designs: Applications to Special Education and School Psychology
ERIC Educational Resources Information Center
Nevin, Ann I., Ed.
2004-01-01
The authors of this collection of research reviews studied how single subject research designs might be a useful method to apply as part of being accountable to clients. The single subject research studies were evaluated in accordance with the following criteria: Was the study applied, behavioral, reliable, analytic, effective, and generalizable?…
The Formative Evaluation of a Web-based Course-Management System within a University Setting.
ERIC Educational Resources Information Center
Maslowski, Ralf; Visscher, Adrie J.; Collis, Betty; Bloemen, Paul P. M.
2000-01-01
Discussion of Web-based course management systems (W-CMSs) in higher education focuses on formative evaluation and its contribution in the design and development of high-quality W-CMSs. Reviews methods and techniques that can be applied in formative evaluation and examines TeLeTOP, a W-CMS produced at the University of Twente (Netherlands). (LRW)
Guermazi, Ali; Hunter, David J; Roemer, Frank W
2009-02-01
Osteoarthritis is the most common joint disorder worldwide, and it has an enormous socioeconomic impact both in the United States and throughout the world. Conventional radiography is the simplest and least expensive imaging method for assessing osteoarthritis of the knee. Radiography is able to directly visualize osseous features of osteoarthritis, including marginal osteophytes, subchondral sclerosis, and subchondral cysts, and it is used in clinical practice to confirm the diagnosis of osteoarthritis and to monitor progression of the disease. However, the assessment of joint-space width provides only an indirect estimate of cartilage thickness and meniscal integrity. Magnetic resonance imaging, with its unique ability to examine the joint as a whole organ, holds great promise with regard to the rapid advancement of knowledge about the disease and the evaluation of novel treatment approaches. Magnetic resonance imaging has been applied widely in quantitative morphometric cartilage assessment, and compositional measures have been introduced that evaluate chondral integrity. In addition, magnetic resonance imaging-based validated semiquantitative whole-organ scoring methods have been applied for cross-sectional and longitudinal joint evaluation. This review describes currently applied radiographic and magnetic resonance imaging staging and scoring methods for the assessment of osteoarthritis of the knee and focuses on the strengths and weaknesses of the two modalities with regard to their use in clinical trials and epidemiologic studies.
ERIC Educational Resources Information Center
General Accounting Office, Washington, DC. Program Evaluation and Methodology Div.
This general program evaluation framework provides a wide range of criteria that can be applied in the evaluation of diverse federal progams. The framework was developed from a literature search on program evaluation methods and their use, the experiences of the United States Government Accounting Office (GAO), and consideration of the types of…
Li, Zhidong; Marinova, Dora; Guo, Xiumei; Gao, Yuan
2015-01-01
Many steel-based cities in China were established between the 1950s and 1960s. After more than half a century of development and boom, these cities are starting to decline and industrial transformation is urgently needed. This paper focuses on evaluating the transformation capability of resource-based cities by building an evaluation model. Using Text Mining and the Document Explorer technique as a way of extracting text features, the 200 most frequently used words are derived from 100 publications related to steel- and other resource-based cities. The Expert Evaluation Method (EEM) and Analytic Hierarchy Process (AHP) techniques are then applied to select 53 indicators, determine their weights and establish an index system for evaluating the transformation capability of the pillar industry of China's steel-based cities. Using real data and expert reviews, the improved Fuzzy Relation Matrix (FRM) method is applied to two case studies in China, namely Panzhihua and Daye, and the evaluation model is developed using Fuzzy Comprehensive Evaluation (FCE). The cities' abilities to carry out industrial transformation are evaluated, with concerns expressed for the case of Daye. The findings have policy implications for the potential and required industrial transformation in the two selected cities and other resource-based towns. PMID:26422266
Recommendations for evaluation of computational methods
NASA Astrophysics Data System (ADS)
Jain, Ajay N.; Nicholls, Anthony
2008-03-01
The field of computational chemistry, particularly as applied to drug design, has become increasingly important in terms of the practical application of predictive modeling to pharmaceutical research and development. Tools for exploiting protein structures or sets of ligands known to bind particular targets can be used for binding-mode prediction, virtual screening, and prediction of activity. A serious weakness within the field is a lack of standards with respect to quantitative evaluation of methods, data set preparation, and data set sharing. Our goal should be to report new methods or comparative evaluations of methods in a manner that supports decision making for practical applications. Here we propose a modest beginning, with recommendations for requirements on statistical reporting, requirements for data sharing, and best practices for benchmark preparation and usage.
Currently there are no EPA reference sampling methods that have been promulgated for measuring stack emissions of Hg from coal combustion sources; however, EPA Method 29 is most commonly applied. The draft ASTM Ontario Hydro Method for measuring oxidized, elemental, particulate-b...
The overall goal of this task is to help reduce the uncertainties in the assessment of environmental health and human exposure by better characterizing hazardous wastes through cost-effective analytical methods. Research projects are directed towards the applied development and ...
Teacher Portfolios: An Effective Way to Assess Teacher Performance and Enhance Learning
ERIC Educational Resources Information Center
Gelfer, Jeff; 'O' Hara, Katie; Krasch, Delilah; Nguyen, Neal
2015-01-01
Often administrators seek alternative methods of evaluating staff while staff are frequently searching for methods to represent the breadth and quality of their efforts. One method proving to be effective for gathering and organising products of teacher activity is the portfolio. This article will discuss the procedures that teachers can apply in…
Validation of a pulsed electric field process to pasteurize strawberry puree
USDA-ARS?s Scientific Manuscript database
An inexpensive data acquisition method was developed to validate the exact number and shape of the pulses applied during pulsed electric fields (PEF) processing. The novel validation method was evaluated in conjunction with developing a pasteurization PEF process for strawberry puree. Both buffered...
Objective evaluation of antitussive agents under clinical conditions.
Beumer, H M; Hardonk, H J; Boter, J; van Eijnsbergen, B
1976-01-01
A new method for objective assessment of cough under normal or pathological conditions is described. Thoracic coughing can be discriminated from any other pressure wave because of its relatively high frequency. This method was applied in a double blind crossover trial in 18 patients with respiratory disease.
Yehia, Ali M; Arafa, Reham M; Abbas, Samah S; Amer, Sawsan M
2016-01-15
Spectral resolution of cefquinome sulfate (CFQ) in the presence of its degradation products was studied. Three selective, accurate and rapid spectrophotometric methods were developed for the determination of CFQ in the presence of either its hydrolytic, oxidative or photo-degradation products. The proposed ratio difference, derivative ratio and mean centering methods are ratio-manipulating spectrophotometric methods that were satisfactorily applied for the selective determination of CFQ within a linear range of 5.0-40.0 μg/mL. Concentration Residuals Augmented Classical Least Squares was applied and evaluated for the determination of the cited drug in the presence of all of its degradation products. Traditional Partial Least Squares regression was also applied and benchmarked against the proposed advanced multivariate calibration. Twenty-five experimentally designed synthetic mixtures of three factors at five levels were used to calibrate and validate the multivariate models. Advanced chemometrics succeeded in the quantitative and qualitative analyses of CFQ along with its hydrolytic, oxidative and photo-degradation products. The proposed methods were applied successfully to the analysis of different pharmaceutical formulations and were simple and cost-effective compared with the manufacturer's RP-HPLC method.
Alternative methods to evaluate trial level surrogacy.
Abrahantes, Josè Cortiñas; Shkedy, Ziv; Molenberghs, Geert
2008-01-01
The evaluation and validation of surrogate endpoints have been extensively studied in the last decade. Prentice [1] and Freedman, Graubard and Schatzkin [2] laid the foundations for the evaluation of surrogate endpoints in randomized clinical trials. Later, Buyse et al. [5] proposed a meta-analytic methodology, producing different methods for different settings, which was further studied by Alonso and Molenberghs [9] in their unifying approach based on information theory. In this article, we focus on trial-level surrogacy and propose alternative procedures for evaluating this surrogacy measure that do not pre-specify the type of association. A promising cross-validation-based correction is investigated, as is the construction of confidence intervals for the measure. To avoid assumptions about the type of relationship between the treatment effects and its distribution, a collection of alternative methods based on regression trees, bagging, random forests, and support vector machines, combined with bootstrap-based confidence intervals and, should one wish, a cross-validation-based correction, is proposed and applied. We apply the various strategies to data from three clinical studies: in ophthalmology, in advanced colorectal cancer, and in schizophrenia. The results for the three case studies indicate that random forest and bagging models produce larger estimated values of the surrogacy measure, which are in general more stable, with narrower confidence intervals, than those from linear regression and support vector regression. For the advanced colorectal cancer studies, the trial-level surrogacy even differs considerably from what has been reported. In general, the alternative methods are more computationally demanding; the calculation of the confidence intervals in particular requires more computing time than the delta-method counterpart.
Several conclusions follow. First, more flexible modeling techniques can be used, allowing for other types of association. Second, when no cross-validation-based correction is applied, overly optimistic trial-level surrogacy estimates will be found; cross-validation is therefore highly recommended. Third, the delta method for calculating confidence intervals is not recommended, since it rests on assumptions valid only in very large samples and may produce range-violating limits. We therefore recommend bootstrap methods as the general alternative. The information-theoretic approach produces results comparable to the bagging and random forest approaches when the cross-validation correction is applied. Finally, even in cases where a linear model might be a good option, bagging methods perform well and yield narrower confidence intervals.
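The bootstrap-based intervals discussed in this abstract can be sketched with a simple percentile bootstrap over trials. Here the trial-level surrogacy measure is taken to be the squared Pearson correlation between trial-specific treatment effects on the surrogate and the true endpoint; the data, sample sizes, and function names below are hypothetical, and the published work pairs such intervals with flexible regression models rather than this bare correlation:

```python
import random

def r_squared(x, y):
    """Squared Pearson correlation, used here as a stand-in for trial-level surrogacy."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy * sxy / (sxx * syy)

def bootstrap_ci(x, y, stat=r_squared, n_boot=2000, alpha=0.05, seed=1):
    """Percentile bootstrap CI: resample whole trials (pairs) with replacement."""
    rng = random.Random(seed)
    n = len(x)
    stats = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        stats.append(stat([x[i] for i in idx], [y[i] for i in idx]))
    stats.sort()
    lo = stats[int(alpha / 2 * n_boot)]
    hi = stats[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

# Hypothetical trial-specific treatment effects (surrogate, true) for 12 trials.
alpha_eff = [0.1, 0.4, 0.2, 0.8, 0.5, 0.3, 0.9, 0.6, 0.7, 0.2, 0.5, 0.8]
beta_eff  = [0.2, 0.5, 0.1, 0.9, 0.4, 0.4, 1.0, 0.5, 0.8, 0.3, 0.6, 0.7]
lo, hi = bootstrap_ci(alpha_eff, beta_eff)
```

Unlike delta-method intervals, the percentile interval cannot leave the [0, 1] range of the measure, which is the range-violation point raised in the conclusions.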
Evaluating candidate reactions to selection practices using organisational justice theory.
Patterson, Fiona; Zibarras, Lara; Carr, Victoria; Irish, Bill; Gregory, Simon
2011-03-01
This study aimed to examine candidate reactions to selection practices in postgraduate medical training using organisational justice theory. We carried out three independent cross-sectional studies using samples from three consecutive annual recruitment rounds. Data were gathered from candidates applying for entry into UK general practice (GP) training during 2007, 2008 and 2009. Participants completed an evaluation questionnaire immediately after the short-listing stage and after the selection centre (interview) stage. Participants were doctors applying for GP training in the UK. Main outcome measures were participants' evaluations of the selection methods and perceptions of the overall fairness of each selection stage (short-listing and selection centre). A total of 23,855 evaluation questionnaires were completed (6893 in 2007, 10,497 in 2008 and 6465 in 2009). Absolute levels of perceived fairness of all the selection methods at both the short-listing and selection centre stages were consistently high over the three years. Similarly, all selection methods were considered to be job-related by candidates. However, in general, candidates considered the selection centre stage to be significantly fairer than the short-listing stage. Of all the selection methods, the simulated patient consultation completed at the selection centre stage was rated as the most job-relevant. This is the first study to use a model of organisational justice theory to evaluate candidate reactions during selection into postgraduate specialty training. The high-fidelity selection methods are consistently viewed as more job-relevant and fairer by candidates. This has important implications for the design of recruitment systems for all specialties and, potentially, for medical school admissions. Using this approach, recruiters can systematically compare perceptions of the fairness and job relevance of various selection methods.
Iterative integral parameter identification of a respiratory mechanics model.
Schranz, Christoph; Docherty, Paul D; Chiew, Yeong Shiong; Möller, Knut; Chase, J Geoffrey
2012-07-18
Patient-specific respiratory mechanics models can support the evaluation of optimal lung protective ventilator settings during ventilation therapy. Clinical application requires that the individual's model parameter values must be identified with information available at the bedside. Multiple linear regression or gradient-based parameter identification methods are highly sensitive to noise and initial parameter estimates. Thus, they are difficult to apply at the bedside to support therapeutic decisions. An iterative integral parameter identification method is applied to a second order respiratory mechanics model. The method is compared to the commonly used regression methods and error-mapping approaches using simulated and clinical data. The clinical potential of the method was evaluated on data from 13 Acute Respiratory Distress Syndrome (ARDS) patients. The iterative integral method converged to error minima 350 times faster than the Simplex Search Method using simulation data sets and 50 times faster using clinical data sets. Established regression methods reported erroneous results due to sensitivity to noise. In contrast, the iterative integral method was effective independent of initial parameter estimations, and converged successfully in each case tested. These investigations reveal that the iterative integral method is beneficial with respect to computing time, operator independence and robustness, and thus applicable at the bedside for this clinical application.
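The integral formulation can be illustrated on a first-order analogue (the paper's respiratory model is second order; this reduced sketch is our own). Integrating dx/dt = -a*x + b*u once removes the derivative, so x(t) - x(0) = -a*∫x dτ + b*∫u dτ, and a, b follow from linear least squares on the integrated signals, which is what makes the approach robust to measurement noise compared with gradient-based fitting:

```python
import math

def cumtrapz(v, dt):
    """Cumulative trapezoidal integral of uniformly sampled values v."""
    out = [0.0]
    for k in range(1, len(v)):
        out.append(out[-1] + 0.5 * dt * (v[k - 1] + v[k]))
    return out

def identify_first_order(x, u, dt):
    """Least-squares fit of a, b in dx/dt = -a*x + b*u via the
    integrated equation x(t) - x(0) = -a*Ix(t) + b*Iu(t)."""
    Ix, Iu = cumtrapz(x, dt), cumtrapz(u, dt)
    y = [xi - x[0] for xi in x]
    # Normal equations for theta = (a, b) with regressors (-Ix, Iu),
    # solved directly by Cramer's rule for the 2x2 system.
    s11 = sum(v * v for v in Ix)
    s12 = -sum(p * q for p, q in zip(Ix, Iu))
    s22 = sum(v * v for v in Iu)
    r1 = -sum(p * q for p, q in zip(Ix, y))
    r2 = sum(p * q for p, q in zip(Iu, y))
    det = s11 * s22 - s12 * s12
    return (r1 * s22 - s12 * r2) / det, (s11 * r2 - s12 * r1) / det

# Synthetic step response of dx/dt = -2x + 3u, u = 1, x(0) = 0.
dt, a_true, b_true = 0.01, 2.0, 3.0
t = [k * dt for k in range(201)]
u = [1.0] * len(t)
x = [b_true / a_true * (1.0 - math.exp(-a_true * ti)) for ti in t]
a_hat, b_hat = identify_first_order(x, u, dt)
```

The second-order model in the paper leads to the same structure with more integrated regressors, and the "iterative" part refits after re-simulating with the current estimates.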
Credit Risk Evaluation of Power Market Players with Random Forest
NASA Astrophysics Data System (ADS)
Umezawa, Yasushi; Mori, Hiroyuki
A new method is proposed for credit risk evaluation in a power market. Credit risk evaluation measures the bankruptcy risk of a company. Power system liberalization has created an environment that puts emphasis on profit maximization and risk minimization. There is a high probability that electricity transactions create risk between companies, so power market players are concerned with minimizing it; as a management strategy, a risk index is needed to evaluate the worth of a business partner. This paper proposes a new method for evaluating credit risk with Random Forest (RF), which performs ensemble learning over decision trees. RF is an efficient data-mining technique for clustering data and extracting relationships between input and output data. In addition, a method of generating pseudo-measurements is proposed to improve the performance of RF. The proposed method is successfully applied to real financial data of energy utilities in the power market, and a comparison is made between the proposed and conventional methods.
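As a rough sketch of the ensemble idea behind Random Forest — bootstrap resampling plus majority voting over randomized trees — the following builds a "forest" of one-split decision stumps in plain Python. The credit features and labels are invented, each stump uses a single random feature in place of full trees, and a real system would use a complete decision-tree learner:

```python
import random

def majority(labels, fallback):
    """Majority label; fall back to the overall majority for an empty side."""
    pool = labels if labels else fallback
    return max(set(pool), key=pool.count)

def stump_fit(X, y, rng):
    """Best single-threshold split on one randomly chosen feature."""
    f = rng.randrange(len(X[0]))
    best = None
    for thr in sorted({xi[f] for xi in X}):
        lvote = majority([yi for xi, yi in zip(X, y) if xi[f] <= thr], y)
        rvote = majority([yi for xi, yi in zip(X, y) if xi[f] > thr], y)
        err = sum(1 for xi, yi in zip(X, y)
                  if (lvote if xi[f] <= thr else rvote) != yi)
        if best is None or err < best[0]:
            best = (err, f, thr, lvote, rvote)
    return best[1:]

def forest_fit(X, y, n_trees=25, seed=0):
    """Bag stumps: each one is trained on a bootstrap resample."""
    rng = random.Random(seed)
    forest = []
    for _ in range(n_trees):
        idx = [rng.randrange(len(X)) for _ in range(len(X))]
        forest.append(stump_fit([X[i] for i in idx], [y[i] for i in idx], rng))
    return forest

def forest_predict(forest, x):
    """Majority vote over all stumps."""
    votes = [(l if x[f] <= thr else r) for f, thr, l, r in forest]
    return max(set(votes), key=votes.count)

# Hypothetical credit data: [debt ratio, years in business]; 1 = high risk.
X = [[0.90, 1.0], [0.80, 2.0], [0.85, 1.0],
     [0.20, 10.0], [0.30, 8.0], [0.10, 12.0]]
y = [1, 1, 1, 0, 0, 0]
forest = forest_fit(X, y)
```

The pseudo-measurement idea in the abstract corresponds to augmenting X and y with generated samples before the bootstrap step, which stabilizes the vote when real financial records are scarce.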
NASA Astrophysics Data System (ADS)
Moon, Byung-Young
2005-12-01
A hybrid neural-genetic multi-model parameter estimation algorithm is demonstrated. The method can be applied to structured system identification of an electro-hydraulic servo system. The algorithm consists of a recurrent incremental credit assignment (ICRA) neural network and a genetic algorithm: the ICRA neural network evaluates each member of a generation of models, and the genetic algorithm produces the next generation. To evaluate the proposed method, an electro-hydraulic servo system was designed and manufactured, and an experiment was carried out to exercise the hybrid neural-genetic multi-model parameter estimation algorithm. As a result, the dynamic characteristics were obtained, namely the parameters (mass, damping coefficient, bulk modulus, spring coefficient) that minimize the total squared error. The results of this study can be applied to hydraulic systems in industrial fields.
Coates, Jennifer; Colaiezzi, Brooke; Fiedler, John L; Wirth, James; Lividini, Keith; Rogers, Beatrice
2012-09-01
Dietary assessment data are essential for designing, monitoring, and evaluating food fortification and other food-based nutrition programs. Planners and managers must understand the validity, usefulness, and cost tradeoffs of employing alternative dietary assessment methods, but little guidance exists. To identify and apply criteria to assess the tradeoffs of using alternative dietary methods for meeting fortification programming needs. Twenty-five semistructured expert interviews were conducted and literature was reviewed for information on the validity, usefulness, and cost of using 24-hour recalls, Food Frequency Questionnaires/Fortification Rapid Assessment Tool (FFQ/FRAT), Food Balance Sheets (FBS), and Household Consumption and Expenditures Surveys (HCES) for program stage-specific information needs. Criteria were developed and applied to construct relative rankings of the four methods. Needs assessment: HCES offers the greatest suitability at the lowest cost for estimating the risk of inadequate intakes, but relative to 24-hour recall compromises validity. HCES should be used to identify vehicles and to estimate coverage and likely impact due to its low cost and moderate-to-high validity. Baseline assessment: 24-hour recall should be applied using a representative sample. Monitoring: A simple, low-cost FFQ can be used to monitor coverage. Impact evaluation: 24-hour recall should be used to assess changes in nutrient intakes. FBS have low validity relative to other methods for all programmatic purposes. Each dietary assessment method has strengths and weaknesses that vary by context and purpose. Method selection must be driven by the program's data needs, the suitability of the methods for the purpose, and a clear understanding of the tradeoffs involved.
Balbale, Salva N.; Locatelli, Sara M.; LaVela, Sherri L.
2016-01-01
In this methodological article, we examine participatory methods in-depth to demonstrate how these methods can be adopted for quality improvement (QI) projects in health care. We draw on existing literature and our QI initiatives in the Department of Veterans Affairs to discuss the application of photovoice and guided tours in QI efforts. We highlight lessons learned and several benefits of using participatory methods in this area. Using participatory methods, evaluators can engage patients, providers and other stakeholders as partners to enhance care. Participant involvement helps yield actionable data that can be translated into improved care practices. Use of these methods also helps generate key insights to inform improvements that truly resonate with stakeholders. Using participatory methods is a valuable strategy to harness participant engagement and drive improvements that address individual needs. In applying these innovative methodologies, evaluators can transcend traditional approaches to uniquely support evaluations and improvements in health care. PMID:26667882
Thébault, Caroline J; Ramniceanu, Grégory; Michel, Aude; Beauvineau, Claire; Girard, Christian; Seguin, Johanne; Mignet, Nathalie; Ménager, Christine; Doan, Bich-Thuy
2018-06-25
The development of theranostic nanocarriers as an innovative therapy against cancer has been improved by targeting properties in order to optimize drug delivery and safely achieve the desired therapeutic effect. The aim of this paper is to evaluate the magnetic targeting (MT) efficiency of ultra-magnetic liposomes (UML) in CT26 murine colon tumors by magnetic resonance imaging (MRI). Dynamic susceptibility contrast MRI was applied to assess the bloodstream circulation time. A novel semi-quantitative method called %I0.25, based on the intensity distribution in T2*-weighted MRI images, was developed to compare the accumulation of T2 contrast agent in tumors with or without MT. To evaluate the efficiency of magnetic targeting, the percentage of pixels under the intensity value I0.25, where I0.25 = 0.25(Imax − Imin), was calculated on the intensity distribution histogram. This innovative method of processing MRI images showed the MT efficiency by a %I0.25 that was significantly higher in tumors using MT than with passive accumulation, from 15.3% to 28.6%. This methodology was validated by ex vivo methods, with an iron concentration 3-fold higher in tumors using MT. We have developed a method that allows a semi-quantitative evaluation of targeting efficiency in tumors, which could be applied to different T2 contrast agents.
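As a sketch, the %I0.25 metric described above can be computed directly from a T2*-weighted image as the percentage of pixels below the threshold I0.25. This is a minimal illustration: the abstract writes I0.25 = 0.25(Imax − Imin), and taking the threshold relative to the image minimum (which coincides when Imin = 0) is an assumption, as is the function name.

```python
import numpy as np

def percent_i025(image):
    """%I0.25: percentage of pixels whose intensity falls below
    I0.25 = 0.25 * (Imax - Imin).  The offset from Imin is an
    assumption; the abstract's formula coincides when Imin = 0."""
    image = np.asarray(image, dtype=float)
    i_min, i_max = image.min(), image.max()
    threshold = i_min + 0.25 * (i_max - i_min)
    return 100.0 * np.mean(image < threshold)
```

A higher %I0.25 indicates a larger fraction of dark (signal-voided) pixels, consistent with greater accumulation of the T2 contrast agent.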
Liu, Zhe; Geng, Yong; Zhang, Pan; Dong, Huijuan; Liu, Zuoxi
2014-09-01
In China, local governments in many areas prefer to give priority to the development of heavy industrial clusters in pursuit of high gross domestic product (GDP) growth to claim political achievements, which usually results in high costs from ecological degradation and environmental pollution. Therefore, effective methods and a reasonable evaluation system are urgently needed to evaluate the overall efficiency of industrial clusters. The emergy method links economic and ecological systems together and can evaluate the contribution of ecological products and services as well as the load placed on environmental systems. This method has been successfully applied in many ecosystem case studies but seldom in industrial clusters. This study applied emergy analysis to evaluate the efficiency of industrial clusters through a series of emergy-based indices as well as proposed indicators. A case study of the Shenyang Economic Technological Development Area (SETDA) was investigated to show the emergy method's practical potential to evaluate industrial clusters and inform environmental policy making. The results of our study showed that the industrial cluster of electric equipment and electronic manufacturing produced the most economic value and had the highest efficiency of energy utilization among the four industrial clusters. However, the sustainability index of the industrial cluster of food and beverage processing was better than those of the other industrial clusters.
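The emergy-based indices mentioned above are typically ratios of renewable, nonrenewable, and purchased emergy inputs. The sketch below computes three standard indices (EYR, ELR, ESI) under the usual textbook definitions; the specific indicator set used in the SETDA study is not given in the abstract, so treat these as illustrative.

```python
def emergy_indices(R, N, F):
    """Standard emergy-based sustainability indices.
    R: renewable emergy input, N: local nonrenewable input,
    F: purchased (feedback) input, all in solar emjoules (sej).
    Textbook definitions; the study's own indicator set may differ."""
    Y = R + N + F                # total emergy yield
    EYR = Y / F                  # emergy yield ratio
    ELR = (N + F) / R            # environmental loading ratio
    ESI = EYR / ELR              # emergy sustainability index
    return {"EYR": EYR, "ELR": ELR, "ESI": ESI}
```

A higher ESI indicates more yield per unit of environmental load, which is the sense in which the food and beverage cluster outperforms the others above.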
Research on Novel Algorithms for Smart Grid Reliability Assessment and Economic Dispatch
NASA Astrophysics Data System (ADS)
Luo, Wenjin
In this dissertation, several studies of electric power system reliability and economy assessment methods are presented. More precisely, several algorithms for evaluating power system reliability and economy are studied. Furthermore, two novel algorithms are applied to this field and their simulation results are compared with conventional results. As the electrical power system develops towards extra-high voltage, long-distance transmission, large capacity, and regional networking, a number of new technical equipment types and the electricity market system have been gradually established, and the consequences of power outages have become more and more serious. The electrical power system requires the highest possible reliability due to its complexity and security demands. In this dissertation the Boolean logic Driven Markov Process (BDMP) method is studied and applied to evaluate power system reliability. This approach has several benefits: it allows complex dynamic models to be defined while maintaining the easy readability of conventional methods. The method has been applied to evaluate the IEEE reliability test system. The simulation results obtained are close to the IEEE experimental data, which means the method could be used for future study of system reliability. Besides reliability, the modern power system is expected to be more economic. This dissertation presents a novel evolutionary algorithm, the quantum evolutionary membrane algorithm (QEPS), which combines the concepts of quantum-inspired evolutionary algorithms and membrane computing, to solve the economic dispatch problem in a renewable power system with onshore and offshore wind farms. A case derived from real data is used for simulation tests. Another conventional evolutionary algorithm is also used to solve the same problem for comparison.
The experimental results show that the proposed method is quick and accurate to obtain the optimal solution which is the minimum cost for electricity supplied by wind farm system.
Anharmonic effects in the quantum cluster equilibrium method
NASA Astrophysics Data System (ADS)
von Domaros, Michael; Perlt, Eva
2017-03-01
The well-established quantum cluster equilibrium (QCE) model provides a statistical thermodynamic framework to apply high-level ab initio calculations of finite cluster structures to macroscopic liquid phases using the partition function. So far, the harmonic approximation has been applied throughout the calculations. In this article, we apply an important correction in the evaluation of the one-particle partition function and account for anharmonicity. Therefore, we implemented an analytical approximation to the Morse partition function and the derivatives of its logarithm with respect to temperature, which are required for the evaluation of thermodynamic quantities. This anharmonic QCE approach has been applied to liquid hydrogen chloride and cluster distributions, and the molar volume, the volumetric thermal expansion coefficient, and the isobaric heat capacity have been calculated. An improved description for all properties is observed if anharmonic effects are considered.
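For illustration, the anharmonic one-particle vibrational partition function can be evaluated by direct summation over the finite number of bound Morse levels, with term values E_v = ω(v + 1/2) − ωx_e(v + 1/2)² in wavenumber units. The paper uses an analytical approximation to this partition function, which is not reproduced here; the direct sum below is a hedged stand-in, and the HCl-like constants in the test are illustrative.

```python
import math

def morse_vibrational_q(omega_cm, x_e, T):
    """Vibrational partition function of a Morse oscillator by direct
    summation over its finite set of bound levels (energies measured
    from the potential minimum).  The QCE paper uses an analytical
    approximation instead; this sum is an illustrative stand-in.
    omega_cm: harmonic wavenumber in cm^-1, x_e: anharmonicity
    constant (dimensionless), T: temperature in K."""
    kB_cm = 0.695034800                     # Boltzmann constant, cm^-1 / K
    n_max = int(1.0 / (2.0 * x_e) - 0.5)    # highest bound level
    q = 0.0
    for n in range(n_max + 1):
        v = n + 0.5
        E = omega_cm * v - omega_cm * x_e * v * v   # term value, cm^-1
        q += math.exp(-E / (kB_cm * T))
    return q
```

Unlike the harmonic oscillator, the Morse sum terminates at the dissociation limit, which is what introduces the anharmonic corrections to the thermodynamic quantities.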
Automatic evaluation of skin histopathological images for melanocytic features
NASA Astrophysics Data System (ADS)
Koosha, Mohaddeseh; Hoseini Alinodehi, S. Pourya; Nicolescu, Mircea; Safaei Naraghi, Zahra
2017-03-01
Successfully detecting melanocyte cells in the skin epidermis has great significance in skin histopathology. Because of the existence of cells with an appearance similar to melanocytes in hematoxylin and eosin (HE) images of the epidermis, detecting melanocytes is a challenging task. This paper proposes a novel technique for the detection of melanocytes in HE images of the epidermis, based on melanocyte color features in the HSI color domain. Initially, an effective soft morphological filter is applied to the HE images in the HSI color domain to remove noise. Then a novel threshold-based technique is applied to distinguish the candidate melanocyte nuclei. Similarly, the method is applied to find the candidate surrounding halos of the melanocytes. The candidate nuclei are associated with their surrounding halos using the suggested logical and statistical inferences. Finally, a fuzzy inference system is proposed, based on the HSI color information of a typical melanocyte in the epidermis, to calculate the similarity ratio of each candidate cell to a melanocyte. As our review of the literature shows, this is the first method that evaluates epidermis cells for a melanocyte similarity ratio. Experimental results on various images with different zoom factors show that the proposed method improves on the results of previous works.
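Since the pipeline above operates in the HSI color domain, a conversion step from RGB is needed. The sketch below uses the generic textbook RGB-to-HSI formulas; the paper's exact preprocessing and threshold values are not reproduced, and the function name is hypothetical.

```python
import math

def rgb_to_hsi(r, g, b):
    """Convert normalized RGB in [0, 1] to the HSI color model
    (hue in degrees, saturation and intensity in [0, 1]) using the
    standard textbook formulas; not the paper's exact preprocessing."""
    i = (r + g + b) / 3.0
    s = 0.0 if i == 0 else 1.0 - min(r, g, b) / i
    num = 0.5 * ((r - g) + (r - b))
    den = math.sqrt((r - g) ** 2 + (r - b) * (g - b))
    if den == 0:
        h = 0.0                              # achromatic: hue undefined
    else:
        h = math.degrees(math.acos(max(-1.0, min(1.0, num / den))))
        if b > g:
            h = 360.0 - h
    return h, s, i
```

For example, pure red maps to hue 0° and pure blue to hue 240°, which is the kind of separation the threshold-based nucleus/halo detection relies on.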
Yabalak, Erdal
2018-05-18
This study was performed to investigate the mineralization of ticarcillin in an artificially prepared aqueous solution representing ticarcillin-contaminated waters, which constitute a serious problem for human health. Removals of 81.99% of total organic carbon, 79.65% of chemical oxygen demand, and 94.35% of ticarcillin were achieved by using the eco-friendly, time-saving, powerful, and easy-to-apply subcritical water oxidation method in the presence of a safe-to-use oxidizing agent, hydrogen peroxide. Central composite design, which belongs to the response surface methodology, was applied to design the degradation experiments, optimize the method, and evaluate the effects of the system variables, namely temperature, hydrogen peroxide concentration, and treatment time, on the responses. In addition, theoretical equations were proposed for each removal process. ANOVA tests were utilized to evaluate the reliability of the fitted models. F values of 245.79, 88.74, and 48.22 were found for total organic carbon removal, chemical oxygen demand removal, and ticarcillin removal, respectively. Moreover, artificial neural network modeling was applied to estimate the response in each case, and its predictive and optimization performance was statistically examined and compared to that of the central composite design.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhatt, Uma S.; Wackerbauer, Renate; Polyakov, Igor V.
The goal of this research was to apply fractional and non-linear analysis techniques in order to develop a more complete characterization of climate change and variability for the oceanic, sea ice, and atmospheric components of the Earth System. This research applied two measures of the dynamical characteristics of time series, the R/S method of calculating the Hurst exponent and the Renyi entropy, to observational and modeled climate data in order to evaluate how well climate models capture the long-term dynamics evident in observations. Fractional diffusion analysis was applied to ARGO ocean buoy data to quantify ocean transport. Self-organized maps were applied to North Pacific sea level pressure and analyzed in ways to improve seasonal predictability for Alaska fire weather. This body of research shows that these methods can be used to evaluate climate models and shed light on climate mechanisms (i.e., understanding why something happens). With further research, these methods show promise for improving seasonal to longer time scale forecasts of climate.
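The R/S (rescaled-range) estimate of the Hurst exponent mentioned above can be sketched as follows: for a range of window sizes n, average R/S over non-overlapping windows and fit the slope of log(R/S) versus log(n). This is a minimal illustration, not the exact procedure of the study; the function name and window scheme are assumptions.

```python
import numpy as np

def hurst_rs(x, min_window=8):
    """Estimate the Hurst exponent of a 1-D series by rescaled-range
    (R/S) analysis: average R/S over non-overlapping windows of size n,
    doubling n, then fit log(R/S) ~ H * log(n).  A minimal sketch."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    sizes, rs_vals = [], []
    n = min_window
    while n <= N // 2:
        rs_list = []
        for start in range(0, N - n + 1, n):
            w = x[start:start + n]
            dev = np.cumsum(w - w.mean())     # cumulative deviation
            R = dev.max() - dev.min()         # range
            S = w.std()                       # standard deviation
            if S > 0:
                rs_list.append(R / S)
        if rs_list:
            sizes.append(n)
            rs_vals.append(np.mean(rs_list))
        n *= 2
    H, _ = np.polyfit(np.log(sizes), np.log(rs_vals), 1)
    return H
```

White noise yields H near 0.5 while a persistent (trending) series yields H closer to 1, which is the property used to compare modeled and observed climate dynamics.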
Statistical methods used in articles published by the Journal of Periodontal and Implant Science.
Choi, Eunsil; Lyu, Jiyoung; Park, Jinyoung; Kim, Hae-Young
2014-12-01
The purposes of this study were to assess the trend of use of statistical methods including parametric and nonparametric methods and to evaluate the use of complex statistical methodology in recent periodontal studies. This study analyzed 123 articles published in the Journal of Periodontal & Implant Science (JPIS) between 2010 and 2014. Frequencies and percentages were calculated according to the number of statistical methods used, the type of statistical method applied, and the type of statistical software used. Most of the published articles considered (64.4%) used statistical methods. Since 2011, the percentage of JPIS articles using statistics has increased. On the basis of multiple counting, we found that the percentage of studies in JPIS using parametric methods was 61.1%. Further, complex statistical methods were applied in only 6 of the published studies (5.0%), and nonparametric statistical methods were applied in 77 of the published studies (38.9% of a total of 198 studies considered). We found an increasing trend towards the application of statistical methods and nonparametric methods in recent periodontal studies and thus, concluded that increased use of complex statistical methodology might be preferred by the researchers in the fields of study covered by JPIS.
Interactive and Hands-on Methods for Professional Development of Undergraduate Researchers
NASA Astrophysics Data System (ADS)
Pressley, S. N.; LeBeau, J. E.
2016-12-01
Professional development workshops for undergraduate research programs can range from communicating science (i.e. oral, technical writing, poster presentations), applying for fellowships and scholarships, applying to graduate school, and learning about careers, among others. Novel methods of presenting the information on the above topics can result in positive outcomes beyond the obvious of transferring knowledge. Examples of innovative methods to present professional development information include 1) An interactive session on how to write an abstract where students are given an opportunity to draft an abstract from a short technical article, followed by discussion amongst a group of peers, and comparison with the "published" abstract. 2) Using the Process Oriented Guided Inquiry Learning (POGIL) method to evaluate and critique a research poster. 3) Inviting "experts" such as a Fulbright scholar graduate student to present on applying for fellowships and scholarships. These innovative methods of delivery provide more hands-on activities that engage the students, and in some cases (abstract writing) provide practice for the student. The methods also require that students develop team work skills, communicate amongst their peers, and develop networks with their cohort. All of these are essential non-technical skills needed for success in any career. Feedback from students on these sessions are positive and most importantly, the students walk out of the session with a smile on their face saying how much fun it was. Evaluating the impact of these sessions is more challenging and under investigation currently.
ON-SITE MERCURY ANALYSIS OF SOIL AT HAZARDOUS WASTE SITES BY IMMUNOASSAY AND ASV
Two field methods for Hg, immunoassay and anodic stripping voltammetry (ASV), that can provide onsite results for quick decisions at hazardous waste sites were evaluated. Each method was applied to samples from two Superfund sites that contain high levels of Hg; Sulphur Bank Me...
A number of investigators have recently examined the utility of applying probabilistic techniques in the derivation of toxic equivalency factors (TEFs) for polychlorinated dibenzo-p-dioxins (PCDDs), polychlorinated dibenzofurans (PCDFs) and dioxin-like polychlorinated biphenyls (...
Chaurasia, Ashok; Harel, Ofer
2015-02-10
Tests for regression coefficients such as global, local, and partial F-tests are common in applied research. In the framework of multiple imputation, there are several papers addressing tests for regression coefficients. However, for simultaneous hypothesis testing, the existing methods are computationally intensive because they involve calculation with vectors and (inversion of) matrices. In this paper, we propose a simple method based on the scalar entity, coefficient of determination, to perform (global, local, and partial) F-tests with multiply imputed data. The proposed method is evaluated using simulated data and applied to suicide prevention data. Copyright © 2014 John Wiley & Sons, Ltd.
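In the complete-data case, the scalar identity underlying this approach links the global F-statistic directly to the coefficient of determination. The sketch below shows that standard identity only; the paper's pooling rules for multiply imputed data are not reproduced here, and the function name is illustrative.

```python
def global_f_from_r2(r2, n, k):
    """Global F-statistic for testing all k slope coefficients jointly,
    recovered from the coefficient of determination R^2 of a regression
    with n observations and k predictors (standard identity):
        F = (R^2 / k) / ((1 - R^2) / (n - k - 1))
    with (k, n - k - 1) degrees of freedom."""
    return (r2 / k) / ((1.0 - r2) / (n - k - 1))
```

Because the statistic depends on the data only through the scalar R², no vector or matrix computations are needed, which is the efficiency argument the abstract makes.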
Strip Yield Model Numerical Application to Different Geometries and Loading Conditions
NASA Technical Reports Server (NTRS)
Hatamleh, Omar; Forman, Royce; Shivakumar, Venkataraman; Lyons, Jed
2006-01-01
A new numerical method based on the strip-yield analysis approach was developed for calculating the Crack Tip Opening Displacement (CTOD). This approach can be applied for different crack configurations having infinite and finite geometries, and arbitrary applied loading conditions. The new technique adapts the boundary element / dislocation density method to obtain crack-face opening displacements at any point on a crack, and succeeds by obtaining requisite values as a series of definite integrals, the functional parts of each being evaluated exactly in a closed form.
NASA Technical Reports Server (NTRS)
Groves, Curtis E.; Ilie, Marcel; Shallhorn, Paul A.
2012-01-01
There are inherent uncertainties and errors associated with using Computational Fluid Dynamics (CFD) to predict a flow field, and there is no standard method for evaluating uncertainty in the CFD community. This paper describes an approach to validate the uncertainty in using CFD. The method uses state-of-the-art uncertainty analysis, applying different turbulence models, and draws conclusions on which models provide the least uncertainty and which models most accurately predict the flow over a backward-facing step.
New knowledge network evaluation method for design rationale management
NASA Astrophysics Data System (ADS)
Jing, Shikai; Zhan, Hongfei; Liu, Jihong; Wang, Kuan; Jiang, Hao; Zhou, Jingtao
2015-01-01
Current design rationale (DR) systems have not demonstrated the value of the approach in practice, since little attention has been paid to methods for evaluating DR knowledge. To systematize the knowledge management process for future computer-aided DR applications, a prerequisite is to provide a measure for DR knowledge. In this paper, a new knowledge network evaluation method for DR management is presented. The method characterizes the value of DR knowledge from four perspectives: the design rationale structure scale, association knowledge and reasoning ability, degree of design justification support, and degree of knowledge representation conciseness. The comprehensive value of DR knowledge is also measured by the proposed method. To validate the proposed method, different styles of DR knowledge networks and the performance of the proposed measure are discussed. The evaluation method has been applied in two realistic design cases and compared with structural measures. The research proposes a DR knowledge evaluation method that can provide objective metrics and a selection basis for DR knowledge reuse during the product design process. In addition, the method is shown to provide more effective guidance and support for the application and management of DR knowledge.
Watbled, Ludivine; Marcilly, Romaric; Guerlinger, Sandra; Bastien, J-M Christian; Beuscart-Zéphir, Marie-Catherine; Beuscart, Régis
2018-02-01
Poor usability of health technology is thought to diminish work system performance, increase error rates and, potentially, harm patients. The present study (i) used a combination of usability evaluation methods to highlight the chain that leads from usability flaws to usage problems experienced by users and, ultimately, to negative patient outcomes, and (ii) validated this approach by studying two different discharge summary production systems. To comply with quality guidelines, the process of drafting and sending discharge summaries is increasingly being automated. However, the usability of these systems may modify their impact (or the absence thereof) in terms of production times and quality, and must therefore be evaluated. Here, we applied three successive techniques for usability evaluation (heuristic evaluation, user testing and field observation) to two discharge summary production systems (underpinned by different technologies). The systems' main usability flaws led respectively to an increase in the time needed to produce a discharge summary and a risk of patient misidentification. Our results are discussed with regard to the possibility of linking usability flaws, usage problems and negative outcomes by successively applying three methods for evaluating usability (heuristic evaluation, user testing and in situ observations) throughout the system development life cycle. Copyright © 2018 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Voigt, Kristina; Benz, Joachim; Bruggemann, Rainer
An evaluation approach using the mathematical method of the Hasse diagram technique is applied on 20 environmental and chemical Internet resources. The data for this evaluation procedure are taken out of a metadatabase called DAIN (Metadatabase of Internet Resources for Environmental Chemicals) which is set up by the GSF Research Centre for…
A Qualitative Evaluation of an Online Expert-Facilitated Course on Tobacco Dependence Treatment.
Ebn Ahmady, Arezoo; Barker, Megan; Dragonetti, Rosa; Fahim, Myra; Selby, Peter
2017-01-01
Qualitative evaluations of courses prove difficult due to low response rates. Online courses may permit the analysis of qualitative feedback provided by health care providers (HCPs) during and after the course is completed. This study describes the use of qualitative methods for an online continuing medical education (CME) course through the analysis of HCP feedback for the purpose of quality improvement. We used formative and summative feedback from HCPs about their self-reported experiences of completing an online expert-facilitated course on tobacco dependence treatment (the Training Enhancement in Applied Cessation Counselling and Health [TEACH] Project). Phenomenological, inductive, and deductive approaches were applied to develop themes. QSR NVivo 11 was used to analyze the themes derived from free-text comments and responses to open-ended questions. A total of 277 out of 287 participants (96.5%) completed the course evaluations and provided 690 comments focused on how to improve the program. Five themes emerged from the formative evaluations: overall quality, content, delivery method, support, and time. The majority of comments (22.6%) in the formative evaluation expressed satisfaction with overall course quality. Suggestions for improvement were mostly for course content and delivery method (20.4% and 17.8%, respectively). Five themes emerged from the summative evaluation: feedback related to learning objectives, interprofessional collaboration, future topics of relevance, overall modifications, and overall satisfaction. Comments on course content, website function, timing, and support were the identified areas for improvement. This study provides a model to evaluate the effectiveness of online educational interventions. Significantly, this constructive approach to evaluation allows CME providers to take rapid corrective action.
A psychometric evaluation of the digital logic concept inventory
NASA Astrophysics Data System (ADS)
Herman, Geoffrey L.; Zilles, Craig; Loui, Michael C.
2014-10-01
Concept inventories hold tremendous promise for promoting the rigorous evaluation of teaching methods that might remedy common student misconceptions and promote deep learning. The measurements from concept inventories can be trusted only if the concept inventories are evaluated both by expert feedback and statistical scrutiny (psychometric evaluation). Classical Test Theory and Item Response Theory provide two psychometric frameworks for evaluating the quality of assessment tools. We discuss how these theories can be applied to assessment tools generally and then apply them to the Digital Logic Concept Inventory (DLCI). We demonstrate that the DLCI is sufficiently reliable for research purposes when used in its entirety and as a post-course assessment of students' conceptual understanding of digital logic. The DLCI can also discriminate between students across a wide range of ability levels, providing the most information about weaker students' ability levels.
Song, Bongyong; Park, Justin C; Song, William Y
2014-11-07
The Barzilai-Borwein (BB) 2-point step size gradient method is receiving attention for accelerating Total Variation (TV) based CBCT reconstructions. In order to become truly viable for clinical applications, however, its convergence property needs to be properly addressed. We propose a novel fast-converging gradient projection BB method that requires 'at most one function evaluation' in each iterative step. This Selective Function Evaluation method, referred to as GPBB-SFE in this paper, exhibits the desired convergence property when it is combined with a 'smoothed TV' or any other differentiable prior. This way, the proposed GPBB-SFE algorithm offers fast and guaranteed convergence to the desired 3D CBCT image with minimal computational complexity. We first applied this algorithm to a Shepp-Logan numerical phantom. We then applied it to a CatPhan 600 physical phantom (The Phantom Laboratory, Salem, NY) and a clinically-treated head-and-neck patient, both acquired from the TrueBeam™ system (Varian Medical Systems, Palo Alto, CA). Furthermore, we accelerated the reconstruction by implementing the algorithm on an NVIDIA GTX 480 GPU card. We first compared GPBB-SFE with three recently proposed BB-based CBCT reconstruction methods available in the literature using the Shepp-Logan numerical phantom with 40 projections. It is found that GPBB-SFE shows either faster convergence speed/time or superior convergence properties compared to existing BB-based algorithms. With the CatPhan 600 physical phantom, the GPBB-SFE algorithm requires only 3 function evaluations in 30 iterations and reconstructs the standard, 364-projection FDK reconstruction quality image using only 60 projections. We then applied the algorithm to a clinically-treated head-and-neck patient. It was observed that the GPBB-SFE algorithm requires only 18 function evaluations in 30 iterations.
Compared with the FDK algorithm with 364 projections, the GPBB-SFE algorithm produces visibly equivalent quality CBCT image for the head-and-neck patient with only 180 projections, in 131.7 s, further supporting its clinical applicability.
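The BB 2-point step size that the reconstruction builds on chooses alpha_k = (s_k' s_k) / (s_k' y_k), where s_k = x_k − x_{k−1} and y_k = g_k − g_{k−1}. The sketch below applies it to a small quadratic problem; the projection and selective function-evaluation steps of GPBB-SFE itself are not reproduced, and all names and the initial step choice are illustrative.

```python
import numpy as np

def bb_gradient(A, b, x0, iters=50):
    """Barzilai-Borwein 2-point step-size gradient method on the
    quadratic f(x) = 0.5 x'Ax - b'x (gradient g = Ax - b), A SPD.
    Illustrates the BB1 step alpha = s's / s'y only; GPBB-SFE's
    projection and selective function evaluation are not sketched."""
    x = np.asarray(x0, dtype=float)
    g = A @ x - b
    alpha = 1.0 / np.linalg.norm(A)   # conservative first step size
    for _ in range(iters):
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 0:                    # keep previous alpha otherwise
            alpha = (s @ s) / sy      # BB1 step size
        x, g = x_new, g_new
    return x
```

The appeal of the BB step is that it approximates second-order curvature information from the last two gradients alone, so each iteration costs little more than one gradient evaluation.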
Application of Leadership Strategies Secondary to a Book Review
ERIC Educational Resources Information Center
Brahm, Nancy C.; Kissack, Julie C.; Grace, Susan M.; Lundquist, Lisa M.
2013-01-01
Purpose: Two-fold: [1] To evaluate available literature describing leadership management techniques and how these techniques can be used for pharmacy faculty transitioning into new academic roles and [2] To evaluate a leadership program and how it can be applied to the individual to enhance leadership skills. Methods: Literature related to…
ERIC Educational Resources Information Center
van Urk, Felix; Grant, Sean; Bonell, Chris
2016-01-01
The use of explicit programme theory to guide evaluation is widely recommended. However, practitioners and other partnering stakeholders often initiate programmes based on implicit theories, leaving researchers to explicate them before commencing evaluation. The current study aimed to apply a systematic method to undertake this process. We…
Evaluation Criteria for Competency-Based Syllabi: A Chilean Case Study Applying Mixed Methods
ERIC Educational Resources Information Center
Jerez, Oscar; Valenzuela, Leslier; Pizarro, Veronica; Hasbun, Beatriz; Valenzuela, Gabriela; Orsini, Cesar
2016-01-01
In recent decades, higher education institutions worldwide have been moving from knowledge-based to competence-based curricula. One of the greatest challenges in this transition is the difficulty in changing the knowledge-oriented practices of teachers. This study evaluates the consistency between syllabus design and the requirements imposed by a…
Designing Evaluations. 2012 Revision. Applied Research and Methods. GAO-12-208G
ERIC Educational Resources Information Center
US Government Accountability Office, 2012
2012-01-01
GAO assists congressional decision makers in their deliberations by furnishing them with analytical information on issues and options. Many diverse methodologies are needed to develop sound and timely answers to the questions the Congress asks. To provide GAO evaluators with basic information about the more commonly used methodologies, GAO's…
ERIC Educational Resources Information Center
Faw, Leyla; Hogue, Aaron; Liddle, Howard A.
2005-01-01
The authors applied contemporary methods from the evaluation literature to measure implementation in a residential treatment program for adolescent substance abuse. A logic model containing two main components was measured. Program structure (adherence to the intended framework of service delivery) was measured using data from daily activity logs…
Improving Survey Methods with Cognitive Interviews in Small- and Medium-Scale Evaluations
ERIC Educational Resources Information Center
Ryan, Katherine; Gannon-Slater, Nora; Culbertson, Michael J.
2012-01-01
Findings derived from self-reported, structured survey questionnaires are commonly used in evaluation and applied research to inform policy-making and program decisions. Although there are a variety of issues related to the quality of survey evidence (e.g., sampling precision), the validity of response processes--how respondents process thoughts…
From Speed Dating to Intimacy: Methodological Change in the Evaluation of a Writing Group
ERIC Educational Resources Information Center
Bosanquet, Agnes; Cahir, Jayde; Jacenyik-Trawoger, Christa; McNeill, Margot
2014-01-01
This paper explores an innovative approach to evaluating the effectiveness of a writing group in an Australian research-intensive university. Traditional qualitative and quantitative methods typically applied in higher-education research may be effective in analysing the output of writing groups; however, they do not always address the affective…
Evaluation of research in biomedical ontologies
Dumontier, Michel; Gkoutos, Georgios V.
2013-01-01
Ontologies are now pervasive in biomedicine, where they serve as a means to standardize terminology, to enable access to domain knowledge, to verify data consistency and to facilitate integrative analyses over heterogeneous biomedical data. For this purpose, research on biomedical ontologies applies theories and methods from diverse disciplines such as information management, knowledge representation, cognitive science, linguistics and philosophy. Depending on the application in which an ontology is used, the evaluation of research in biomedical ontologies must follow different strategies. Here, we provide a classification of research problems in which ontologies are being applied, focusing on the use of ontologies in basic and translational research, and we demonstrate how research results in biomedical ontologies can be evaluated. The evaluation strategies depend on the desired application and measure the success of using an ontology for a particular biomedical problem. For many applications, the success can be quantified, thereby facilitating the objective evaluation and comparison of research in biomedical ontology. The objective, quantifiable comparison of research results based on scientific applications opens up the possibility for systematically improving the utility of ontologies in biomedical research. PMID:22962340
Method for Operating a Sensor to Differentiate Between Analytes in a Sample
Kunt, Tekin; Cavicchi, Richard E; Semancik, Stephen; McAvoy, Thomas J
1998-07-28
Disclosed is a method for operating a sensor to differentiate between first and second analytes in a sample. The method comprises the steps of: determining an input (temperature) profile for the sensor that will enhance the difference between the output profiles of the sensor for the first analyte and the second analyte; determining a first analyte output profile as observed when the input profile is applied to the sensor; determining a second analyte output profile as observed when the input profile is applied to the sensor; introducing the sensor to the sample while applying the input profile, thereby obtaining a sample output profile; and evaluating the sample output profile against the first and second analyte output profiles to determine which of the analytes is present in the sample.
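The final evaluation step described above (matching the sample output profile against the stored analyte output profiles) can be sketched as a simple nearest-profile comparison; the profile values and analyte names below are hypothetical:

```python
import math

def closest_analyte(sample, references):
    """Return the reference analyte whose output profile is nearest to the
    sample output profile (Euclidean distance)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(references, key=lambda name: dist(sample, references[name]))

# Hypothetical output profiles recorded under the same input (temperature) profile
refs = {"analyte_1": [0.1, 0.5, 0.9, 0.4],
        "analyte_2": [0.3, 0.2, 0.1, 0.8]}
print(closest_analyte([0.12, 0.48, 0.85, 0.42], refs))  # analyte_1
```

In practice the distance measure and the number of sampled points in each profile would be tuned to the sensor; the nearest-profile rule is only the simplest possible decision step.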
Braioni, M G; Salmoiraghi, G; Bracco, F; Villani, M; Braioni, A; Girelli, L
2002-03-12
A model of analysis and environmental evaluation was applied to 11 stretches of the Adige River, where an innovative procedure was carried out to interpret ecological results. Within each stretch, the most suitable methods were used to assess the quality and processes of flood plains, banks, water column, bed, and interstitial environment. Indices were applied to evaluate the wild state and ecological quality of the banks (wild state index, buffer strip index) and the landscape quality of wide areas of the fluvial corridor (environmental landscape index). The biotic components (i.e., macrozoobenthos, phytoplankton and zooplankton, interstitial hyporheic fauna, vegetation in the riparian areas) were analysed by both quantitative and functional methods (e.g., productivity, litter processing and colonisation). The results achieved were then translated into five classes of functional evaluation. These qualitative assessments have thus preserved a high level of precision and sensitivity in quantifying both the quality of the environmental conditions and the integrity of the ecosystem processes. Read together with urban planning data, they indicate what actions are needed to restore and rehabilitate the Adige River corridor.
Evaluation of degree of blending colored diluents using color difference signal method.
Miyazaki, Yasunori; Uchino, Tomonobu; Kagawa, Yoshiyuki
2014-01-01
We developed a color difference signal method to evaluate the degree of blending of powdered medicines in pharmacies. In the method, the degree of blending is expressed as the relative standard deviation of the color difference signal value (Cb or Cr) of the YCbCr color space after digital photos of the blended medicines are analyzed by image processing. While the method is effective for determining the degree of blending of colored medicines, it remained unknown whether it can be applied to uncolored or white-colored medicines. To investigate this, we examined colored diluents to identify an indicator of the degree of blending. In this study, we applied this method to Pontal® and Prednisolone® powders, which were used as uncolored and white-colored medicines, respectively. Each of these medicines was blended with colored lactose using a pestle and mortar, and the uniformity of blending was then evaluated. The degree of blending was well monitored in both mixtures at various blending ratios (1 : 9-9 : 1), showing sufficient uniformity at 60 rotations of the pestle. Moreover, the Cr values of the mixtures at various blending ratios were correlated with the concentration of active pharmaceutical ingredients in these medicines, as determined using HPLC. This indicated the usefulness of the color difference signal method for the quantitative determination of medicines. Thus, we demonstrated the applicability and effectiveness of this method for checking dispensed powders.
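The core computation, deriving Cr from RGB and taking the relative standard deviation over sampled image regions, can be sketched as follows. The JPEG full-range YCbCr conversion is assumed here, and the sampled Cr values are hypothetical:

```python
import statistics

def rgb_to_cr(r, g, b):
    """Cr component of the YCbCr colour space (JPEG full-range conversion)."""
    return 128 + 0.5 * r - 0.418688 * g - 0.081312 * b

def blending_rsd(cr_values):
    """Relative standard deviation (%) of Cr over sampled regions of a
    blended-powder image; a lower RSD indicates a more uniform blend."""
    return 100 * statistics.stdev(cr_values) / statistics.mean(cr_values)

# Hypothetical Cr values sampled from a well-blended and a poorly blended mixture
well_blended = [140.1, 139.8, 140.3, 140.0]
poorly_blended = [120.0, 155.0, 131.0, 149.0]
print(blending_rsd(well_blended) < blending_rsd(poorly_blended))  # True
```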
Lanying Lin; Sheng He; Feng Fu; Xiping Wang
2015-01-01
Wood failure percentage (WFP) is an important index for evaluating the bond strength of plywood. Currently, the method used for detecting WFP is visual inspection, which lacks efficiency. In order to improve it, image processing methods are applied to wood failure detection. The present study used thresholding and K-means clustering algorithms in wood failure detection...
ERIC Educational Resources Information Center
Weil, Joyce
2015-01-01
As Baby Boomers reach 65 years of age and methods of studying older populations are becoming increasingly varied (e.g., including mixed methods designs, on-line surveys, and video-based environments), there is renewed interest in evaluating methodologies used to collect data with older persons. The goal of this article is to examine…
An Ensemble Successive Project Algorithm for Liquor Detection Using Near Infrared Sensor.
Qu, Fangfang; Ren, Dong; Wang, Jihua; Zhang, Zhong; Lu, Na; Meng, Lei
2016-01-11
Spectral analysis based on near infrared (NIR) sensors is a powerful tool for complex information processing and high-precision recognition, and it has been widely applied to quality analysis and online inspection of agricultural products. This paper proposes a new method to address the instability of the successive projections algorithm (SPA) with small sample sizes, as well as the lack of association between the selected variables and the analyte. The proposed method is an evaluated bootstrap ensemble SPA method (EBSPA) based on a variable evaluation index (EI) for variable selection, and it is applied to the quantitative prediction of alcohol concentration in liquor using an NIR sensor. In the experiment, the proposed EBSPA is combined with three kinds of modeling methods to test its performance. In addition, EBSPA combined with partial least squares is compared with other state-of-the-art variable selection methods. The results show that the proposed method remedies the defects of SPA and has the best generalization performance and stability. Furthermore, the physical meaning of the variables selected from the near infrared sensor data is clear, which can effectively reduce the number of variables and improve prediction accuracy.
Improving Terminology Mapping in Clinical Text with Context-Sensitive Spelling Correction.
Dziadek, Juliusz; Henriksson, Aron; Duneld, Martin
2017-01-01
The mapping of unstructured clinical text to an ontology facilitates meaningful secondary use of health records but is non-trivial due to lexical variation and the abundance of misspellings in hurriedly produced notes. Here, we apply several spelling correction methods to Swedish medical text and evaluate their impact on SNOMED CT mapping; first in a controlled evaluation using medical literature text with induced errors, followed by a partial evaluation on clinical notes. It is shown that the best-performing method is context-sensitive, taking into account trigram frequencies and utilizing a corpus-based dictionary.
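A toy illustration of context-sensitive correction driven by word-trigram frequencies (not the authors' Swedish pipeline; the corpus, vocabulary, and misspelling are invented) looks like this: candidates within one edit of the misspelled word are ranked by how often they occur in the surrounding trigram context.

```python
from collections import Counter

corpus = ("patient denies chest pain . patient reports chest pain . "
          "patient denies chest pain").split()
trigrams = Counter(zip(corpus, corpus[1:], corpus[2:]))

def within_one_edit(a, b):
    """True if a and b differ by at most one substitution, insertion or deletion."""
    if a == b:
        return True
    if len(a) == len(b):
        return sum(x != y for x, y in zip(a, b)) == 1
    if abs(len(a) - len(b)) != 1:
        return False
    short, long_ = sorted((a, b), key=len)
    return any(long_[:i] + long_[i + 1:] == short for i in range(len(long_)))

def correct(prev, word, nxt, vocab):
    """Rank candidate corrections by corpus frequency of the word trigram
    (prev, candidate, next); falls back to the original word."""
    cands = [v for v in vocab if within_one_edit(word, v)] or [word]
    return max(cands, key=lambda c: trigrams[(prev, c, nxt)])

vocab = {"denies", "defies", "reports", "chest", "pain", "patient"}
print(correct("patient", "debies", "chest", vocab))  # denies
```

Both "denies" and "defies" are one edit away from "debies"; the trigram context "patient _ chest" is what selects the clinically plausible candidate.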
Signal analysis techniques for incipient failure detection in turbomachinery
NASA Technical Reports Server (NTRS)
Coffin, T.
1985-01-01
Signal analysis techniques for the detection and classification of incipient mechanical failures in turbomachinery were developed, implemented and evaluated. Signal analysis techniques available to describe dynamic measurement characteristics are reviewed. Time domain and spectral methods are described, and statistical classification in terms of moments is discussed. Several of these waveform analysis techniques were implemented on a computer and applied to dynamic signals. A laboratory evaluation of the methods with respect to signal detection capability is described. Plans for further technique evaluation and data base development to characterize turbopump incipient failure modes from Space Shuttle main engine (SSME) hot firing measurements are outlined.
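The statistical classification in terms of moments mentioned above can be illustrated by computing the first four standardized moments of a waveform; a rise in kurtosis is a classic indicator of impulsive content such as incipient bearing damage. The signals below are synthetic:

```python
import math

def moments(signal):
    """Mean, standard deviation, skewness and kurtosis of a waveform."""
    n = len(signal)
    mean = sum(signal) / n
    var = sum((x - mean) ** 2 for x in signal) / n
    std = math.sqrt(var)
    skew = sum(((x - mean) / std) ** 3 for x in signal) / n
    kurt = sum(((x - mean) / std) ** 4 for x in signal) / n
    return mean, std, skew, kurt

# A smooth signal vs. the same signal with impulsive spikes added
smooth = [math.sin(0.1 * i) for i in range(200)]
spiky = [x + (5.0 if i % 50 == 25 else 0.0) for i, x in enumerate(smooth)]
print(moments(smooth)[3] < moments(spiky)[3])  # True
```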
Signal evaluations using singular value decomposition for Thomson scattering diagnostics.
Tojo, H; Yamada, I; Yasuhara, R; Yatsuka, E; Funaba, H; Hatae, T; Hayashi, H; Itami, K
2014-11-01
This paper provides a novel method for evaluating signal intensities in incoherent Thomson scattering diagnostics. A double-pass Thomson scattering system, where a laser passes through the plasma twice, generates two scattering pulses from the plasma. Evaluations of the signal intensities in the spectrometer are sometimes difficult due to noise and stray light. We apply the singular value decomposition method to Thomson scattering data with strong noise components. Results show that the average accuracy of the measured electron temperature (Te) is superior to that of temperature obtained using a low-pass filter (<20 MHz) or without any filters.
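As a rough illustration of SVD-based noise rejection (a generic low-rank filter, not the authors' exact procedure), repeated noisy copies of a pulse can be stacked into a matrix and truncated to the leading singular component; uncorrelated noise spreads over the discarded components:

```python
import numpy as np

def svd_denoise(segments, rank=1):
    """Keep only the leading singular components of a matrix whose rows are
    repeated signal segments."""
    u, s, vt = np.linalg.svd(segments, full_matrices=False)
    s[rank:] = 0.0
    return u @ np.diag(s) @ vt

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 100)
pulse = np.exp(-((t - 0.5) ** 2) / 0.005)           # common underlying pulse
noisy = np.array([pulse + 0.3 * rng.standard_normal(100) for _ in range(20)])
clean = svd_denoise(noisy, rank=1)
err_noisy = np.abs(noisy - pulse).mean()
err_clean = np.abs(clean - pulse).mean()
print(err_clean < err_noisy)  # True
```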
A Non-parametric Cutout Index for Robust Evaluation of Identified Proteins*
Serang, Oliver; Paulo, Joao; Steen, Hanno; Steen, Judith A.
2013-01-01
This paper proposes a novel, automated method for evaluating sets of proteins identified using mass spectrometry. The remaining peptide-spectrum match score distributions of protein sets are compared to an empirical absent peptide-spectrum match score distribution, and a Bayesian non-parametric method reminiscent of the Dirichlet process is presented to accurately perform this comparison. Thus, for a given protein set, the process computes the likelihood that the proteins identified are correctly identified. First, the method is used to evaluate protein sets chosen using different protein-level false discovery rate (FDR) thresholds, assigning each protein set a likelihood. The protein set assigned the highest likelihood is used to choose a non-arbitrary protein-level FDR threshold. Because the method can be used to evaluate any protein identification strategy (and is not limited to mere comparisons of different FDR thresholds), we subsequently use the method to compare and evaluate multiple simple methods for merging peptide evidence over replicate experiments. The general statistical approach can be applied to other types of data (e.g. RNA sequencing) and generalizes to multivariate problems. PMID:23292186
Triangulation and the importance of establishing valid methods for food safety culture evaluation.
Jespersen, Lone; Wallace, Carol A
2017-10-01
The research evaluates maturity of food safety culture in five multi-national food companies using method triangulation, specifically self-assessment scale, performance documents, and semi-structured interviews. Weaknesses associated with each individual method are known but there are few studies in food safety where a method triangulation approach is used for both data collection and data analysis. Significantly, this research shows that individual results taken in isolation can lead to wrong conclusions, resulting in potentially failing tactics and wasted investments. However, by applying method triangulation and reviewing results from a range of culture measurement tools it is possible to better direct investments and interventions. The findings add to the food safety culture paradigm beyond a single evaluation of food safety culture using generic culture surveys. Copyright © 2017. Published by Elsevier Ltd.
Studying distributed cognition of simulation-based team training with DiCoT.
Rybing, Jonas; Nilsson, Heléne; Jonson, Carl-Oscar; Bang, Magnus
2016-03-01
Health care organizations employ simulation-based team training (SBTT) to improve skill, communication and coordination in a broad range of critical care contexts. Quantitative approaches, such as team performance measurements, are predominantly used to measure SBTTs effectiveness. However, a practical evaluation method that examines how this approach supports cognition and teamwork is missing. We have applied Distributed Cognition for Teamwork (DiCoT), a method for analysing cognition and collaboration aspects of work settings, with the purpose of assessing the methodology's usefulness for evaluating SBTTs. In a case study, we observed and analysed four Emergo Train System® simulation exercises where medical professionals trained emergency response routines. The study suggests that DiCoT is an applicable and learnable tool for determining key distributed cognition attributes of SBTTs that are of importance for the simulation validity of training environments. Moreover, we discuss and exemplify how DiCoT supports design of SBTTs with a focus on transfer and validity characteristics. Practitioner Summary: In this study, we have evaluated a method to assess simulation-based team training environments from a cognitive ergonomics perspective. Using a case study, we analysed Distributed Cognition for Teamwork (DiCoT) by applying it to the Emergo Train System®. We conclude that DiCoT is useful for SBTT evaluation and simulator (re)design.
Multiwavelet grading of prostate pathological images
NASA Astrophysics Data System (ADS)
Soltanian-Zadeh, Hamid; Jafari-Khouzani, Kourosh
2002-05-01
We have developed image analysis methods to automatically grade pathological images of the prostate. The proposed method assigns each image a Gleason grade between 1 and 5, using features extracted from multiwavelet transformations. We extract energy and entropy features from the submatrices obtained in the decomposition, then apply a k-NN classifier to grade the image. To find the optimal multiwavelet basis, preprocessing, and classifier, we compare features extracted by different multiwavelets with either critically sampled or repeated-row preprocessing and different k-NN classifiers, evaluating performance by the total misclassification rate (TMR). To evaluate sensitivity to noise, we add white Gaussian noise to the images and compare the resulting TMRs. We applied the proposed methods to 100 images, evaluating the first and second levels of decomposition using Geronimo, Hardin, and Massopust (GHM), Chui and Lian (CL), and Shen (SA4) multiwavelets, and k-NN classifiers for k=1,2,3,4,5. Experimental results illustrate that the first level of decomposition is quite noisy. They also show that critically sampled preprocessing outperforms repeated-row preprocessing and is less sensitive to noise. Finally, comparison studies indicate that the SA4 multiwavelet and the k-NN classifier with k=1 generate optimal results, with the smallest TMR of 3%.
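The feature extraction and classification steps can be sketched as follows; the multiwavelet transform itself is omitted, and the training feature vectors and grades are hypothetical:

```python
import math

def energy_entropy(submatrix):
    """Energy and Shannon entropy of a decomposition submatrix, the two
    feature types used for grading."""
    flat = [abs(v) for row in submatrix for v in row]
    energy = sum(v * v for v in flat)
    total = sum(flat) or 1.0
    probs = [v / total for v in flat if v > 0]
    entropy = -sum(p * math.log(p) for p in probs)
    return energy, entropy

def knn_grade(features, training, k=1):
    """k-NN classification of a feature vector (k=1 was optimal in the study)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    ranked = sorted(training, key=lambda item: dist(features, item[0]))
    votes = [grade for _, grade in ranked[:k]]
    return max(set(votes), key=votes.count)

# Hypothetical training features (energy, entropy) labelled with Gleason grades
training = [((10.0, 1.2), 2), ((55.0, 3.0), 4), ((12.0, 1.4), 2)]
sample = energy_entropy([[1.0, 2.0], [2.0, 1.0]])
print(knn_grade(sample, training))  # 2
```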
The balanced incomplete block design is not suitable for the evaluation of complex interventions.
Trietsch, Jasper; Leffers, Pieter; van Steenkiste, Ben; Grol, Richard; van der Weijden, Trudy
2014-12-01
In quality of care research, the balanced incomplete block (BIB) design is regularly claimed to have been used when evaluating complex interventions. In this article, we reflect on the appropriateness of using this design for evaluating complex interventions. Literature study using PubMed and handbooks. After studying various articles on health services research that claim to have applied the BIB and the original methodological literature on this design, it became clear that the applied method is in fact not a BIB design. We conclude that the use of this design is not suited for evaluating complex interventions. We stress that, to prevent improper use of terms, more attention should be paid to proper referencing of the original methodological literature. Copyright © 2014 Elsevier Inc. All rights reserved.
2011-01-01
Background: Monitoring the time course of mortality by cause is a key public health issue. However, several mortality data production changes may affect cause-specific time trends, thus altering the interpretation. This paper proposes a statistical method that detects abrupt changes ("jumps") and estimates correction factors that may be used for further analysis. Methods: The method was applied to a subset of the AMIEHS (Avoidable Mortality in the European Union, toward better Indicators for the Effectiveness of Health Systems) project mortality database and considered for six European countries and 13 selected causes of death. For each country and cause of death, an automated jump detection method called Polydect was applied to the log mortality rate time series. The plausibility of a data production change associated with each detected jump was evaluated through literature search or feedback obtained from the national data producers. For each plausible jump position, the statistical significance of the between-age and between-gender jump amplitude heterogeneity was evaluated by means of a generalized additive regression model, and correction factors were deduced from the results. Results: Forty-nine jumps were detected by the Polydect method from 1970 to 2005. Most of the detected jumps were found to be plausible. The age- and gender-specific amplitudes of the jumps were estimated when they were statistically heterogeneous, and they showed greater by-age heterogeneity than by-gender heterogeneity. Conclusion: The method presented in this paper was successfully applied to a large set of causes of death and countries. The method appears to be an alternative to bridge coding methods when the latter are not systematically implemented, because they are time- and resource-consuming. PMID:21929756
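A crude stand-in for the jump-detection idea (not the Polydect method itself) flags abrupt steps in a log mortality rate series and derives a multiplicative correction factor from each step; the rates below are invented:

```python
import math

def detect_jumps(log_rates, threshold=0.3):
    """Flag positions where the log mortality rate shifts abruptly and
    return (position, multiplicative correction factor) pairs."""
    jumps = []
    for i in range(1, len(log_rates)):
        step = log_rates[i] - log_rates[i - 1]
        if abs(step) > threshold:
            jumps.append((i, math.exp(step)))
    return jumps

# Hypothetical log rates with a coding-change jump between positions 3 and 4
log_rates = [2.00, 1.98, 1.97, 1.95, 1.45, 1.43, 1.42]
print([i for i, _ in detect_jumps(log_rates)])  # [4]
```

A real implementation would fit a smooth trend (as Polydect does with polynomial segments) rather than differencing raw values, so that gradual change is not mistaken for a jump.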
NASA Astrophysics Data System (ADS)
Kaftan, Ilknur; Sindirgi, Petek
2013-04-01
Self-potential (SP) is one of the oldest geophysical methods and provides important information about near-surface structures. Several methods have been developed to interpret SP data using simple geometries. This study investigated the inverse solution of a buried, polarized sphere-shaped SP anomaly via Multilayer Perceptron Neural Networks (MLPNN). The polarization angle (α) and depth to the centre of the sphere (h) were estimated. The MLPNN was applied to synthetic and field SP data. In order to assess the capability of the method in detecting the number of sources, MLPNN was applied to different spherical models at different depths and locations. Additionally, the performance of MLPNN was tested by adding random noise to the same synthetic test data; the sphere model parameters were successfully recovered under different S/N ratios. The MLPNN method was then applied to two field examples. The first is a cross section taken from the SP anomaly map of the Ergani-Süleymanköy (Turkey) copper mine; MLPNN was also applied to SP data from the Seferihisar, Izmir (Western Turkey) geothermal field. The MLPNN results showed good agreement with the original synthetic data set, and the technique gave satisfactory results following the addition of 5% and 10% Gaussian noise. The MLPNN results were compared to other SP interpretation techniques, such as Normalized Full Gradient (NFG), inverse solution and nomogram methods, and all of the techniques showed strong similarity. Consequently, the synthetic and field applications of this study show that MLPNN provides reliable evaluation of self-potential data modelled by the sphere model.
Proposed test method for and evaluation of wheelchair seating system (WCSS) crashworthiness.
van Roosmalen, L; Bertocci, G; Ha, D R; Karg, P; Szobota, S
2000-01-01
Safety of motor vehicle seats is of great importance in providing crash protection to the occupant. An increasing number of wheelchair users use their wheelchairs as motor vehicle seats when traveling. A voluntary standard requires that compliant wheelchairs be dynamically sled impact tested. However, testing to evaluate the crashworthiness of add-on wheelchair seating systems (WCSS) independent of their wheelchair frame is not addressed by this standard. To address this need, this study developed a method to evaluate the crashworthiness of WCSS with independent frames. Federal Motor Vehicle Safety Standards (FMVSS) 207 test protocols, used to test the strength of motor vehicle seats, were modified and used to test the strength of three WCSS. Forward and rearward loads were applied at the WCSS center of gravity (CGSS), and a moment was applied at the uppermost point of the seat back. Each of the three tested WCSS met the strength requirements of FMVSS 207. Wheelchair seat-back stiffness was also investigated and compared to motor vehicle seat-back stiffness.
NASA Technical Reports Server (NTRS)
Chesler, L.; Pierce, S.
1971-01-01
Generalized, cyclic, and modified multistep numerical integration methods are developed and evaluated for application to problems of satellite orbit computation. Generalized methods are compared with the presently utilized Cowell methods; new cyclic methods are developed for special second-order differential equations; and several modified methods are developed and applied to orbit computation problems. Special computer programs were written to generate coefficients for these methods, and subroutines were written which allow use of these methods with NASA's GEOSTAR computer program.
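For context, the simplest special second-order integrator of the kind discussed (the two-step Stoermer formula that underlies Cowell-type methods for x'' = f(x)) can be sketched and checked against a harmonic oscillator; this is a generic textbook scheme, not the report's generalized or cyclic variants:

```python
import math

def stormer(f, x0, v0, h, steps):
    """Two-step Stoermer method for x'' = f(x): the new position is obtained
    from the two previous ones, x_{n+1} = 2 x_n - x_{n-1} + h^2 f(x_n)."""
    x_prev = x0
    x = x0 + h * v0 + 0.5 * h * h * f(x0)   # Taylor start-up step
    out = [x_prev, x]
    for _ in range(steps - 1):
        x_next = 2 * x - x_prev + h * h * f(x)
        x_prev, x = x, x_next
        out.append(x)
    return out

# Harmonic oscillator x'' = -x with x(0)=1, x'(0)=0; exact solution cos(t)
h, n = 0.01, 1000
xs = stormer(lambda x: -x, 1.0, 0.0, h, n)
print(abs(xs[-1] - math.cos(n * h)) < 1e-3)  # True
```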
Dick, Virginia R; Masters, Amanda E; McConnon, Patrick J; Engel, Jeffrey P; Underwood, Valerie N; Harrison, Robert J
2014-11-01
The Council of State and Territorial Epidemiologists (CSTE) implemented the Applied Epidemiology Fellowship (AEF) in 2003 to train public health professionals in applied epidemiology and strengthen applied epidemiology capacity within public health institutions to address the identified challenges. The CSTE recently evaluated the outcomes of the fellowship across the last 9 years. To review the findings from the outcome evaluation of the first nine classes of AEF alumni with particular attention to how the fellowship affected alumni careers, mentors' careers, host site agency capacity, and competencies of the applied epidemiology workforce. The mixed-methods evaluation used surveys and administrative data. Administrative data were gathered over the past 9 years and the surveys were collected in late 2013 and early 2014. Descriptive statistics and qualitative thematic analysis were conducted in early 2014 to examine the data from more than 130 alumni and 150 mentors. More than half the alumni (67%) indicated the fellowship was essential to their long-term career. In addition, 79% of the mentors indicated that participating in the fellowship had a positive impact on their career. Mentors also indicated significant impacts on host site capacity. A majority (88%) of alumni had worked for at least 1 year or more in government public health environments after the fellowship. Evaluation findings support previous research indicating need for competency-based field-based training programs that include a strong mentoring component. These characteristics in a field-based training program can increase applied epidemiology capacity in various ways. Copyright © 2014 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.
Neville, Timothy J; Salmon, Paul M
2016-07-01
As sport becomes more complex, there is potential for ergonomics concepts to help enhance the performance of sports officials. The concept of Situation Awareness (SA) appears pertinent given the requirement for officials to understand what is going on in order to make decisions. Although numerous models exist, none have been applied to examine officials, and only a few recent examples have been applied to sport. This paper examines SA models and methods to identify whether any have applicability to officials in sport (OiS). Evaluation of the models and methods identified potential applications of individual, team and systems models of SA. The paper further demonstrates that the Distributed Situation Awareness model is suitable for studying officials in fastball sports. It is concluded that the study of SA represents a key area of multidisciplinary research for both ergonomics and sports science in the context of OiS. Practitioner Summary: Despite obvious synergies, applications of cognitive ergonomics concepts in sport are sparse. This is especially so for Officials in Sport (OiS). This article presents an evaluation of Situation Awareness models and methods, providing practitioners with guidance on which are the most suitable for OiS system design and evaluation.
Parametric system identification of catamaran for improving controller design
NASA Astrophysics Data System (ADS)
Timpitak, Surasak; Prempraneerach, Pradya; Pengwang, Eakkachai
2018-01-01
This paper presents an estimation of simplified dynamic model for only surge- and yaw- motions of catamaran by using system identification (SI) techniques to determine associated unknown parameters. These methods will enhance the performance of designing processes for the motion control system of Unmanned Surface Vehicle (USV). The simulation results demonstrate an effective way to solve for damping forces and to determine added masses by applying least-square and AutoRegressive Exogenous (ARX) methods. Both methods are then evaluated according to estimated parametric errors from the vehicle’s dynamic model. The ARX method, which yields better estimated accuracy, can then be applied to identify unknown parameters as well as to help improving a controller design of a real unmanned catamaran.
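The least-squares/ARX parameter estimation described above can be sketched for a first-order single-input model; the system coefficients below (a = 0.8, b = 0.5) are hypothetical, and the simulation is noise-free:

```python
import numpy as np

def fit_arx(y, u):
    """Least-squares fit of a first-order ARX model
       y[k] = a*y[k-1] + b*u[k-1]."""
    phi = np.column_stack([y[:-1], u[:-1]])       # regressor matrix
    theta, *_ = np.linalg.lstsq(phi, y[1:], rcond=None)
    return theta                                  # [a, b]

# Simulate a known system and recover its parameters
rng = np.random.default_rng(1)
u = rng.standard_normal(200)
y = np.zeros(200)
for k in range(1, 200):
    y[k] = 0.8 * y[k - 1] + 0.5 * u[k - 1]
a, b = fit_arx(y, u)
print(round(a, 3), round(b, 3))  # 0.8 0.5
```

With measurement noise or higher model orders, the regressor matrix simply gains more lagged columns; the least-squares step is unchanged.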
Quality Evaluation of Raw Moutan Cortex Using the AHP and Gray Correlation-TOPSIS Method
Zhou, Sujuan; Liu, Bo; Meng, Jiang
2017-01-01
Background: Raw Moutan cortex (RMC) is an important Chinese herbal medicine. Comprehensive and objective quality evaluation of Chinese herbal medicine has been one of the most important issues in modern herbs development. Objective: To evaluate and compare the quality of RMC using the weighted gray correlation-Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS) method. Materials and Methods: The percentage composition of gallic acid, catechin, oxypaeoniflorin, paeoniflorin, quercetin, benzoylpaeoniflorin, and paeonol in different batches of RMC was determined, and MATLAB programming was then used to construct the gray correlation-TOPSIS assessment model for quality evaluation of RMC. Results: The quality evaluation results of the model and of objective evaluation were consistent, reliable, and stable. Conclusion: The gray correlation-TOPSIS model can be well applied to the quality evaluation of traditional Chinese medicine with multiple components and has broad prospects in application. SUMMARY: The experiment constructs a model to evaluate the quality of RMC using the weighted gray correlation-TOPSIS method. Results show the model is reliable and provides a feasible way of evaluating the quality of traditional Chinese medicine with multiple components. PMID:28839384
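A minimal sketch of the weighted TOPSIS step (vector normalisation, weighting, and closeness to the ideal solution); the batch compositions and weights are hypothetical, and all criteria are treated as benefit-type:

```python
import numpy as np

def topsis(matrix, weights):
    """Weighted TOPSIS: closeness coefficient of each alternative (row)
    to the ideal solution, assuming all criteria are benefit-type."""
    m = np.asarray(matrix, float)
    norm = m / np.sqrt((m ** 2).sum(axis=0))          # vector normalisation
    v = norm * np.asarray(weights, float)
    ideal, anti = v.max(axis=0), v.min(axis=0)
    d_pos = np.sqrt(((v - ideal) ** 2).sum(axis=1))   # distance to ideal
    d_neg = np.sqrt(((v - anti) ** 2).sum(axis=1))    # distance to anti-ideal
    return d_neg / (d_pos + d_neg)                    # closeness coefficient

# Hypothetical contents of three marker compounds in three RMC batches
batches = [[1.2, 0.8, 2.0],
           [1.0, 0.9, 1.5],
           [0.7, 0.5, 1.1]]
scores = topsis(batches, [0.4, 0.3, 0.3])
print(scores.argmax())  # 0 (the first batch ranks highest)
```

The gray-correlation weighting used in the paper would replace the fixed weight vector here; the ranking machinery is the same.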
Economic evaluation of diagnostic methods used in dentistry. A systematic review.
Christell, Helena; Birch, Stephen; Horner, Keith; Lindh, Christina; Rohlin, Madeleine
2014-11-01
To review the literature on economic evaluations of diagnostic methods used in dentistry. Four databases (MEDLINE, Web of Science, The Cochrane Library, the NHS Economic Evaluation Database) were searched for studies, complemented by hand search, until February 2013. Two authors independently screened all titles or abstracts and then applied inclusion and exclusion criteria to select full-text publications published in English, which reported an economic evaluation comparing at least two alternative methods. Studies of diagnostic methods were assessed by four reviewers using a protocol based on the QUADAS tool regarding diagnostic methods and a check-list for economic evaluations. The results of the data extraction were summarized in a structured table and as a narrative description. From 476 identified full-text publications, 160 were considered to be economic evaluations. Only 12 studies (7%) were on diagnostic methods, whilst 78 studies (49%) were on prevention and 70 (40%) on treatment. Among studies on diagnostic methods, there was methodological between-study heterogeneity regarding the diagnostic method analysed and the type of economic evaluation addressed. Generally, the choice of economic evaluation method was not justified and the perspective of the study was not stated. Costing of diagnostic methods varied. A small body of literature addresses economic evaluation of diagnostic methods in dentistry. Thus, there is a need for studies from various perspectives with well-defined research questions and measures of cost and effectiveness. Economic resources in healthcare are finite. For diagnostic methods, an understanding of efficacy provides only part of the information needed for evidence-based practice. This study highlighted a paucity of economic evaluations of diagnostic methods used in dentistry, indicating that much of what we practise lacks sufficient evidence.
Automatic Keyword Extraction from Individual Documents
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rose, Stuart J.; Engel, David W.; Cramer, Nicholas O.
2010-05-03
This paper introduces a novel and domain-independent method for automatically extracting keywords, as sequences of one or more words, from individual documents. We describe the method’s configuration parameters and algorithm, and present an evaluation on a benchmark corpus of technical abstracts. We also present a method for generating lists of stop words for specific corpora and domains, and evaluate its ability to improve keyword extraction on the benchmark corpus. Finally, we apply our method of automatic keyword extraction to a corpus of news articles and define metrics for characterizing the exclusivity, essentiality, and generality of extracted keywords within a corpus.
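In the spirit of the method described (candidate phrases delimited by stop words and punctuation, word scores from degree/frequency, phrase scores as sums of word scores), a toy sketch with a small invented stop-word list:

```python
import re
from collections import defaultdict

STOP = {"a", "an", "and", "are", "for", "from", "is", "of", "over", "the", "this"}

def candidate_phrases(text):
    """Split text at stop words and punctuation; the remaining runs of
    content words are the candidate keywords."""
    words = re.findall(r"[a-z]+", text.lower())
    phrases, current = [], []
    for w in words:
        if w in STOP:
            if current:
                phrases.append(tuple(current))
            current = []
        else:
            current.append(w)
    if current:
        phrases.append(tuple(current))
    return phrases

def keyword_scores(text):
    """Score each word by degree/frequency and each phrase by the sum of its
    word scores, following the general RAKE scheme."""
    phrases = candidate_phrases(text)
    freq, degree = defaultdict(int), defaultdict(int)
    for p in phrases:
        for w in p:
            freq[w] += 1
            degree[w] += len(p)  # co-occurrence degree within the phrase
    return {p: sum(degree[w] / freq[w] for w in p) for p in phrases}

scores = keyword_scores("Compatibility of systems of linear constraints "
                        "over the set of natural numbers")
print(max(scores, key=scores.get))  # ('linear', 'constraints')
```

Multi-word phrases score higher than single words because each member word picks up degree from its neighbours, which is what biases the method toward informative compound keywords.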
[Studies on HPLC fingerprint chromatogram of Folium Fici Microcarpa].
Fang, Zhi-Jian; Dai, Zhen; Li, Shu-Yuan
2008-10-01
To establish a sensitive and specific method for quality control of Folium Fici Microcarpa, HPLC was applied to study its fingerprint chromatogram. Isovitexin was used as the reference substance to evaluate the chromatograms of 10 samples from different regions and 12 samples collected in different months. The results revealed that all the chromatographic peaks were separated efficiently. Seventeen common peaks were shown in the fingerprint chromatogram. This characteristic and specific fingerprint chromatogram method can be used to assess the quality and to evaluate the different origins and collection periods of Folium Fici Microcarpa.
Evaluation of Piloted Inputs for Onboard Frequency Response Estimation
NASA Technical Reports Server (NTRS)
Grauer, Jared A.; Martos, Borja
2013-01-01
Frequency response estimation results are presented using piloted inputs and a real-time estimation method recently developed for multisine inputs. A nonlinear simulation of the F-16 and a Piper Saratoga research aircraft were subjected to different piloted test inputs while the short-period stabilator/elevator-to-pitch-rate frequency response was estimated. Results show that the method can produce accurate results using wide-band piloted inputs instead of multisines. A new metric is introduced for evaluating which data points to include in the analysis, and recommendations are provided for applying this method with piloted inputs.
Donovan, Sarah-Louise; Salmon, Paul M; Lenné, Michael G; Horberry, Tim
2017-10-01
Safety leadership is an important factor in supporting safety in high-risk industries. This article contends that applying systems-thinking methods to examine safety leadership can support improved learning from incidents. A case study analysis was undertaken of a large-scale mining landslide incident in which no injuries or fatalities were incurred. A multi-method approach was adopted, in which the Critical Decision Method, Rasmussen's Risk Management Framework and Accimap method were applied to examine the safety leadership decisions and actions which enabled the safe outcome. The approach enabled Rasmussen's predictions regarding safety and performance to be examined in the safety leadership context, with findings demonstrating the distribution of safety leadership across leader and system levels, and the presence of vertical integration as key to supporting the successful safety outcome. In doing so, the findings also demonstrate the usefulness of applying systems-thinking methods to examine and learn from incidents in terms of what 'went right'. The implications, including future research directions, are discussed. Practitioner Summary: This paper presents a case study analysis, in which systems-thinking methods are applied to the examination of safety leadership decisions and actions during a large-scale mining landslide incident. The findings establish safety leadership as a systems phenomenon, and furthermore, demonstrate the usefulness of applying systems-thinking methods to learn from incidents in terms of what 'went right'. Implications, including future research directions, are discussed.
A guideline for the validation of likelihood ratio methods used for forensic evidence evaluation.
Meuwly, Didier; Ramos, Daniel; Haraksim, Rudolf
2017-07-01
This Guideline proposes a protocol for the validation of forensic evaluation methods at the source level, using the Likelihood Ratio framework as defined within the Bayes' inference model. In the context of the inference of identity of source, the Likelihood Ratio is used to evaluate the strength of the evidence for a trace specimen, e.g. a fingermark, and a reference specimen, e.g. a fingerprint, to originate from common or different sources. Some theoretical aspects of probabilities necessary for this Guideline were discussed prior to its elaboration, which started after a workshop of forensic researchers and practitioners involved in this topic. In the workshop, the following questions were addressed: "which aspects of a forensic evaluation scenario need to be validated?", "what is the role of the LR as part of a decision process?" and "how to deal with uncertainty in the LR calculation?". The question "what to validate?" focuses on the validation methods and criteria, while "how to validate?" deals with the implementation of the validation protocol. Answers to these questions were deemed necessary for several objectives. First, concepts typical of validation standards [1], such as performance characteristics, performance metrics and validation criteria, will be adapted or applied by analogy to the LR framework. Second, a validation strategy will be defined. Third, validation methods will be described. Finally, a validation protocol and an example validation report will be proposed, which can be applied to the forensic fields developing and validating LR methods for the evaluation of the strength of evidence at source level under the following propositions. Copyright © 2016. Published by Elsevier B.V.
Evaluating Payments for Environmental Services: Methodological Challenges
2016-01-01
Over the last fifteen years, Payments for Environmental Services (PES) schemes have become very popular environmental policy instruments, but the academic literature has begun to question their additionality. The literature attempts to estimate the causal effect of these programs by applying impact evaluation (IE) techniques. However, PES programs are complex instruments, and IE methods cannot be directly applied without adjustments. Based on a systematic review of the literature, this article proposes a framework for the methodological process of designing an IE for PES schemes. It reviews and discusses the methodological choices at each step of the process and proposes guidelines for practitioners. PMID:26910850
A robustness test of the braided device foreshortening algorithm
NASA Astrophysics Data System (ADS)
Moyano, Raquel Kale; Fernandez, Hector; Macho, Juan M.; Blasco, Jordi; San Roman, Luis; Narata, Ana Paula; Larrabide, Ignacio
2017-11-01
Different computational methods have recently been proposed to simulate the virtual deployment of a braided stent inside a patient's vasculature. Those methods are primarily based on segmentation of the region of interest to obtain the local vessel morphology descriptors. The goal of this work is to evaluate the influence of segmentation quality on the method named "Braided Device Foreshortening" (BDF). METHODS: We used the 3DRA images of 10 aneurysmatic patients (cases). The cases were segmented by applying a marching cubes algorithm with a broad range of thresholds in order to generate 10 surface models each. We selected a braided device and applied the BDF algorithm to each surface model. The range of computed flow diverter lengths for each case was obtained to calculate the variability of the method against the threshold segmentation values. RESULTS: An evaluation study over 10 clinical cases indicates that the final length of the deployed flow diverter in each vessel model is stable, yielding a maximum difference of 11.19% in vessel diameter and a maximum of 9.14% in the simulated stent length across the threshold values. The average coefficient of variation was found to be 4.08%. CONCLUSION: A study evaluating how the threshold segmentation affects the simulated length of the deployed FD was presented. The segmentation algorithm used to segment intracranial aneurysm 3D angiography images produces only small variation in the resulting stent simulation.
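The variability metric quoted above, the coefficient of variation, is simply the standard deviation of the simulated lengths across thresholds divided by their mean. A minimal sketch, with hypothetical deployed-stent lengths:

```python
def coefficient_of_variation(values):
    """CV = population standard deviation / mean, returned as a percentage."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / n  # population variance
    return 100.0 * var ** 0.5 / mean

# Hypothetical deployed-stent lengths (mm) from 10 segmentation thresholds.
lengths = [18.2, 18.5, 18.1, 18.9, 18.4, 18.6, 18.3, 18.8, 18.2, 18.5]
cv = coefficient_of_variation(lengths)
```

A CV of a few percent, as reported in the abstract, indicates that the simulated length is insensitive to the segmentation threshold.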
Validation of the ULCEAT methodology by applying it in retrospect to the Roboticbed.
Nakamura, Mio; Suzurikawa, Jun; Tsukada, Shohei; Kume, Yohei; Kawakami, Hideo; Inoue, Kaoru; Inoue, Takenobu
2015-01-01
In answer to the increasing demand for care among the oldest portion of the Japanese population, an extensive programme of life support robots is under development, advocated by the Japanese government. Roboticbed® (RB) was developed to help patients make independent transfers to and from the bed in daily life. The bed is intended both for elderly persons and for persons with a disability. The purpose of this study is to examine the validity of the user and user's life centred clinical evaluation of assistive technology (ULCEAT) methodology. The ULCEAT method was developed to support user-centred development of life support robots. By means of the ULCEAT method, the target users and the use environment were re-established in an earlier study. The validity of the method is tested by re-evaluating the development of RB in retrospect. Six participants used the first prototype of RB (RB1) and eight participants used the second prototype of RB (RB2). The results indicated that the functionality was improved owing to the end-user evaluations. Therefore, we confirmed the content validity of the proposed ULCEAT method. In this study we confirmed the validity of the ULCEAT methodology by applying it in retrospect to the RB development process. This method will be used for the development of life-support robots and prototype assistive technologies.
Jha, Abhinav K.; Mena, Esther; Caffo, Brian; Ashrafinia, Saeed; Rahmim, Arman; Frey, Eric; Subramaniam, Rathan M.
2017-01-01
Recently, a class of no-gold-standard (NGS) techniques has been proposed to evaluate quantitative imaging methods using patient data. These techniques provide figures of merit (FoMs) quantifying the precision of the estimated quantitative value without requiring repeated measurements and without requiring a gold standard. However, applying these techniques to patient data presents several practical difficulties, including assessing the underlying assumptions, accounting for patient-sampling-related uncertainty, and assessing the reliability of the estimated FoMs. To address these issues, we propose statistical tests that provide confidence in the underlying assumptions and in the reliability of the estimated FoMs. Furthermore, the NGS technique is integrated within a bootstrap-based methodology to account for patient-sampling-related uncertainty. The developed NGS framework was applied to evaluate four methods for segmenting lesions from 18F-fluoro-2-deoxyglucose positron emission tomography images of patients with head-and-neck cancer on the task of precisely measuring the metabolic tumor volume. The NGS technique consistently predicted the same segmentation method as the most precise method. The proposed framework provided confidence in these results, even when gold-standard data were not available. The bootstrap-based methodology indicated improved performance of the NGS technique with larger numbers of patient studies, as was expected, and yielded consistent results as long as data from more than 80 lesions were available for the analysis. PMID:28331883
Recent studies have shown the detection of pharmaceuticals in surface waters across the United States. The objective of this study was to develop methods, and apply them, to evaluate the potential for food chain transfer when pharmaceutical containing wastewaters are used for cr...
ERIC Educational Resources Information Center
Wu, YuLung
2010-01-01
In Taiwan, when students take experiment-related courses, they are often grouped into several teams. A familiar method of group learning is "Cooperative Learning". A well-organized grouping strategy improves cooperative learning and increases the number of activities. This study proposes a novel pedagogical method by adopting…
DOT National Transportation Integrated Search
1977-07-01
The workshop focused on current methods of assessing the effectiveness of crime and vandalism reduction methods that are used in conventional urban mass transit systems, and on how they might be applied to new AGT systems. Conventional as well as nov...
Fecal bacteria, including those originating from concentrated animal feeding operations, are a leading contributor to water quality impairments in agricultural areas. Rapid and reliable methods are needed that can accurately characterize fecal pollution in agricultural settings....
Fecal bacteria, including those originating from concentrated animal feeding operations, are a leading contributor to water quality impairments in agricultural areas. Rapid and reliable methods are needed that can accurately characterize fecal pollution in agricultural settings....
Opening "The Door": An Evaluation of the Efficacy of a Problem-Based Learning Game
ERIC Educational Resources Information Center
Warren, Scott J.; Dondlinger, Mary Jo; McLeod, Julie; Bigenho, Chris
2012-01-01
As higher education institutions seek to improve undergraduate education, initiatives are underway to target instructional methods, re-examine curricula, and apply innovative technologies to better engage students with content. This article discusses the findings of an exploratory study focused on a course redesign that incorporated game elements, PBL methods,…
Does Choice of Multicriteria Method Matter? An Experiment in Water Resources Planning
NASA Astrophysics Data System (ADS)
Hobbs, Benjamin F.; Chankong, Vira; Hamadeh, Wael; Stakhiv, Eugene Z.
1992-07-01
Many multiple criteria decision making methods have been proposed and applied to water planning. Their purpose is to provide information on tradeoffs among objectives and to help users articulate value judgments in a systematic, coherent, and documentable manner. The wide variety of available techniques confuses potential users, causing inappropriate matching of methods with problems. Experiments in which water planners apply more than one multicriteria procedure to realistic problems can help dispel this confusion by testing method appropriateness, ease of use, and validity. We summarize one such experiment where U.S. Army Corps of Engineers personnel used several methods to screen urban water supply plans. The methods evaluated include goal programming, ELECTRE I, additive value functions, multiplicative utility functions, and three techniques for choosing weights (direct rating, indifference tradeoff, and the analytic hierarchy process). Among the conclusions we reach are the following. First, experienced planners generally prefer simpler, more transparent methods. Additive value functions are favored. Yet none of the methods are endorsed by a majority of the participants; many preferred to use no formal method at all. Second, there is strong evidence that rating, the most commonly applied weight selection method, is likely to lead to weights that fail to represent the trade-offs that users are willing to make among criteria. Finally, we show that decisions can be as sensitive, or more sensitive, to the method used as to which person applies it. Therefore, if who chooses is important, then so too is how a choice is made.
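The additive value function the participants favored scores each alternative as a weighted sum of single-criterion value scores. The sketch below is a generic illustration; the criteria, weights, and 0-1 value scores for the two water-supply plans are hypothetical, not taken from the experiment.

```python
def additive_value(weights, values):
    """Score an alternative as the weighted sum of its single-criterion values."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights should sum to 1"
    return sum(weights[c] * values[c] for c in weights)

# Hypothetical criteria, weights, and 0-1 value scores for two plans.
weights = {"cost": 0.5, "reliability": 0.3, "environment": 0.2}
plan_a = {"cost": 0.8, "reliability": 0.6, "environment": 0.4}
plan_b = {"cost": 0.5, "reliability": 0.9, "environment": 0.9}

score_a = additive_value(weights, plan_a)  # 0.5*0.8 + 0.3*0.6 + 0.2*0.4 = 0.66
score_b = additive_value(weights, plan_b)  # 0.5*0.5 + 0.3*0.9 + 0.2*0.9 = 0.70
```

The transparency the planners valued is visible here: each criterion's contribution to the final score can be read off term by term, which is exactly what the weight-elicitation step (rating, tradeoff, or AHP) feeds into.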
Time-Domain Evaluation of Fractional Order Controllers’ Direct Discretization Methods
NASA Astrophysics Data System (ADS)
Ma, Chengbin; Hori, Yoichi
Fractional Order Control (FOC), in which the controlled systems and/or controllers are described by fractional order differential equations, has been applied to various control problems. Though it is not difficult to understand FOC's theoretical superiority, realization remains somewhat problematic. Since fractional order systems have an infinite dimension, a proper approximation by a finite difference equation is needed to realize the designed fractional order controllers. In this paper, the existing direct discretization methods are evaluated by their convergence and by time-domain comparison with a baseline case. The proposed sampling-time scaling property is used to calculate the baseline case with full memory length. This novel discretization method is based on the classical trapezoidal rule but with scaled sampling time. Comparative studies show that good performance and a simple algorithm make the Short Memory Principle method the most practical. FOC research is still at a primary stage, but its applications in modeling and its robustness against non-linearities reveal promising aspects. Parallel to the development of FOC theories, applying FOC to various control problems is also crucially important and a top-priority issue.
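The Short Memory Principle can be illustrated with a Grünwald-Letnikov-type discretization whose (in principle infinite) history is truncated to a fixed window. This is a generic sketch of the idea under that assumption, not the paper's exact algorithm or its trapezoidal baseline.

```python
def gl_coefficients(alpha, n):
    """Grunwald-Letnikov binomial coefficients via the recursion
    c_0 = 1, c_k = c_{k-1} * (1 - (alpha + 1) / k)."""
    c = [1.0]
    for k in range(1, n):
        c.append(c[-1] * (1.0 - (alpha + 1.0) / k))
    return c

def fractional_derivative(signal, alpha, dt, memory=None):
    """Approximate the alpha-order derivative of a sampled signal.
    `memory` truncates the history window (the Short Memory Principle);
    memory=None keeps the full history."""
    n = len(signal)
    window = n if memory is None else min(memory, n)
    c = gl_coefficients(alpha, window)
    out = []
    for i in range(n):
        k_max = min(i + 1, window)
        acc = sum(c[k] * signal[i - k] for k in range(k_max))
        out.append(acc / dt ** alpha)
    return out
```

With alpha = 1 the coefficients collapse to (1, -1, 0, ...), so the scheme reduces to the ordinary first difference, which makes the sketch easy to sanity-check; the truncation trades a small tail error for a fixed per-step cost.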
NASA Astrophysics Data System (ADS)
Asgari, Ali; Dehestani, Pouya; Poruraminaie, Iman
2018-02-01
Shot peening is a well-known process for applying residual stress to the surface of industrial parts. The induced residual stress improves fatigue life. In this study, the effects of shot peening parameters such as shot diameter, shot speed, friction coefficient, and the number of impacts on the applied residual stress are evaluated. To assess the effects of these parameters, the shot peening process was first simulated by the finite element method. Then, the effects of the process parameters on the residual stress were evaluated by the response surface method as a statistical approach. Finally, a strong model is presented to predict the maximum residual stress induced by the shot peening process in AISI 4340 steel, and the optimum parameters for the maximum residual stress are obtained. The results indicate that the effect of shot diameter on the induced residual stress increases with shot speed. Also, increasing the friction coefficient does not always increase the residual stress.
NASA Astrophysics Data System (ADS)
Song, Jun Hee; Kim, Hak Kun; Kim, Sam Yeon
2014-07-01
Laminated fiber-reinforced composites can be applied to an insulating structure of a nuclear fusion device. It is necessary to investigate the interlaminar fracture characteristics of the laminated composites for the assurance of design and structural integrity. The three methods used to prepare the glass fiber reinforced plastic composites tested in this study were vacuum pressure impregnation, high pressure laminate (HPL), and prepreg laminate. We discuss the design criteria for safe application of composites and the shear-compressive test methods for evaluating mechanical properties of the material. Shear-compressive tests could be performed successfully using series-type test jigs that were inclined 0°, 30°, 45°, 60°, and 75° to the normal axis. Shear strength depends strongly on the applied compressive stress. The design range of allowable shear stress was extended by use of the appropriate composite fabrication method. HPL had the largest design range, and the allowable interlaminar shear stress was 0.254 times the compressive stress.
ERIC Educational Resources Information Center
Singh-Ackbarali, Dimple; Maharaj, Rohanie
2014-01-01
This paper discusses the comprehensive and practical training that was delivered to students in a university classroom on how sensory evaluation can be used to determine acceptability of food products. The report presents how students used their training on sensory evaluation methods and analysis and applied it to improving and predicting…
Westbrook, J. I.
2015-01-01
Objectives: To examine whether human factors methods were applied in the design, development, and evaluation of mobile applications developed to facilitate aspects of patient-centered care coordination. Methods: We searched MEDLINE and EMBASE (2013-2014) for studies describing the design or the evaluation of a mobile health application that aimed to support patients' active involvement in the coordination of their care. Results: 34 papers met the inclusion criteria. Applications ranged from tools that supported self-management of specific conditions (e.g. asthma) to tools that provided coaching or education. Twelve of the 15 papers describing the design or development of an app reported the use of a human factors approach. The most frequently used methods were interviews and surveys, which often included an exploration of participants' current use of information technology. Sixteen papers described the evaluation of a patient application in practice. All of them adopted a human factors approach, typically an examination of the use of app features and/or surveys or interviews which enquired about patients' views of the effects of using the app on their behaviors (e.g. medication adherence), knowledge, and relationships with healthcare providers. No study in our review assessed the impact of mobile applications on health outcomes. Conclusion: The potential of mobile health applications to assist patients to more actively engage in the management of their care has resulted in a large number of applications being developed. Our review showed that human factors approaches are nearly always adopted to some extent in the design, development, and evaluation of mobile applications. PMID:26293851
Feng, Tao; Wang, Jizhe; Tsui, Benjamin M W
2018-04-01
The goal of this study was to develop and evaluate four post-reconstruction respiratory and cardiac (R&C) motion vector field (MVF) estimation methods for cardiac 4D PET data. In Method 1, the dual R&C motions were estimated directly from the dual R&C gated images. In Method 2, respiratory motion (RM) and cardiac motion (CM) were separately estimated from the respiratory-gated-only and cardiac-gated-only images. The effects of RM on CM estimation were modeled in Method 3 by applying an image-based RM correction on the cardiac gated images before CM estimation; the effects of CM on RM estimation were neglected. Method 4 iteratively models the mutual effects of RM and CM during dual R&C motion estimation. Realistic simulation data were generated for quantitative evaluation of the four methods. Almost noise-free PET projection data were generated from the 4D XCAT phantom with realistic R&C MVFs using Monte Carlo simulation. Poisson noise was added to the scaled projection data to generate additional datasets at two more noise levels. All the projection data were reconstructed using a 4D image reconstruction method to obtain dual R&C gated images. The four dual R&C MVF estimation methods were applied to the dual R&C gated images, and the accuracy of motion estimation was quantitatively evaluated using the root mean square error (RMSE) of the estimated MVFs. Results show that among the four estimation methods, Method 2 performed the worst for the noise-free case, while Method 1 performed the worst for the noisy cases in terms of quantitative accuracy of the estimated MVF. Methods 4 and 3 showed comparable results and achieved RMSEs up to 35% lower than Method 1 for the noisy cases. In conclusion, we have developed and evaluated four post-reconstruction R&C MVF estimation methods for use in 4D PET imaging. Comparison of the performance of the four methods on simulated data indicates that separate R&C estimation with modeling of RM before CM estimation (Method 3) is the best option for accurate estimation of dual R&C motion in clinical situations. © 2018 American Association of Physicists in Medicine.
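The RMSE figure of merit used above to compare an estimated motion vector field against the known ground truth can be computed as below; the two tiny vector fields are hypothetical.

```python
def mvf_rmse(estimated, truth):
    """Root mean square error between two motion vector fields,
    each given as a list of (dx, dy, dz) displacement vectors."""
    n = len(truth)
    se = 0.0
    for (ex, ey, ez), (tx, ty, tz) in zip(estimated, truth):
        se += (ex - tx) ** 2 + (ey - ty) ** 2 + (ez - tz) ** 2
    return (se / n) ** 0.5

# Hypothetical two-voxel fields: the second displacement is missed entirely.
truth = [(1.0, 0.0, 0.0), (0.0, 2.0, 0.0)]
est = [(1.0, 0.0, 0.0), (0.0, 0.0, 0.0)]
rmse = mvf_rmse(est, truth)  # sqrt((0 + 4) / 2) = sqrt(2)
```

In a phantom study such as this one the true MVF is known by construction, which is what makes RMSE usable as the evaluation metric.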
Wang, Yihong; Guo, Qing; Wang, Huafu; Qian, Kun; Tian, Liang; Yao, Chen; Song, Wei; Shu, Weixia; Chen, Ping; Qi, Jinxu
2017-02-01
Quaternized chitosan is a cationic biopolymer with good antibacterial activity, biocompatibility, and biodegradability, and it has been widely applied in many fields. We have developed a convenient method to evaluate the antibacterial activity of hydroxypropyltrimethylammonium chloride chitosan (HACC) with a nonionic surfactant poloxamer in aqueous solution by monitoring the change of the oxidation peak current in cyclic voltammetry. Increasing values of the oxidation peak current were positively correlated with the antibacterial activity of HACC-poloxamer solutions. Optical microscope images, the zeta potential, and fluorescence spectroscopy showed that the aggregation state of HACC-poloxamer was related to the ratio of the two polymers and also to the antibacterial activity and oxidation peak current. At an HACC-to-poloxamer ratio of 1:0.75, the maximum surface charge density and the smooth edge of HACC-poloxamer aggregates can accelerate diffusion in aqueous solution. It is expected that this convenient method can be applied for a quick evaluation of the antibacterial activity of cationic biopolymers in aqueous solution. Graphical Abstract: The cyclic voltammograms of MB in HACC/poloxamer solution, and the antibacterial efficiency against S. aureus after incubation with HACC (a) and with HACC/poloxamer at 1:0.75 (b).
Is STAPLE algorithm confident to assess segmentation methods in PET imaging?
NASA Astrophysics Data System (ADS)
Dewalle-Vignion, Anne-Sophie; Betrouni, Nacim; Baillet, Clio; Vermandel, Maximilien
2015-12-01
Accurate tumor segmentation in [18F]-fluorodeoxyglucose positron emission tomography is crucial for tumor response assessment and target volume definition in radiation therapy. Evaluation of segmentation methods from clinical data without ground truth is usually based on physicians' manual delineations. In this context, the simultaneous truth and performance level estimation (STAPLE) algorithm could be useful to manage the multi-observer variability. In this paper, we evaluated how accurately this algorithm can estimate the ground truth in PET imaging. A complete evaluation study using different criteria was performed on simulated data. The STAPLE algorithm was applied to manual and automatic segmentation results. A specific configuration of the implementation provided by the Computational Radiology Laboratory was used. The consensus obtained by the STAPLE algorithm from manual delineations appeared to be more accurate than the manual delineations themselves (80% overlap). An improvement in accuracy was also observed when applying the STAPLE algorithm to automatic segmentation results. The STAPLE algorithm, with the configuration used in this paper, is more appropriate than manual delineations alone or automatic segmentation results alone to estimate the ground truth in PET imaging. Therefore, it might be preferred to assess the accuracy of tumor segmentation methods in PET imaging.
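In spirit, STAPLE alternates between estimating a probabilistic consensus and re-estimating each observer's sensitivity and specificity. The toy EM loop below is a simplified binary-label sketch of that idea, not the Computational Radiology Laboratory implementation used in the paper; the initial performance values and iteration count are arbitrary choices.

```python
def staple_binary(segmentations, n_iter=20, prior=0.5):
    """Simplified EM in the spirit of STAPLE for binary voxel labels.

    segmentations: list of raters, each a list of 0/1 voxel decisions.
    Returns (consensus probabilities, per-rater (sensitivity, specificity)).
    """
    n_raters = len(segmentations)
    n_vox = len(segmentations[0])
    p = [0.9] * n_raters  # sensitivity, initialised optimistically
    q = [0.9] * n_raters  # specificity
    w = [prior] * n_vox   # P(true label = 1) per voxel
    for _ in range(n_iter):
        # E-step: posterior probability that each voxel is foreground.
        for i in range(n_vox):
            a, b = prior, 1.0 - prior
            for r in range(n_raters):
                d = segmentations[r][i]
                a *= p[r] if d == 1 else 1.0 - p[r]
                b *= 1.0 - q[r] if d == 1 else q[r]
            w[i] = a / (a + b)
        # M-step: re-estimate each rater's sensitivity and specificity.
        fg = sum(w)
        bg = n_vox - fg
        for r in range(n_raters):
            p[r] = sum(w[i] for i in range(n_vox) if segmentations[r][i] == 1) / fg
            q[r] = sum(1 - w[i] for i in range(n_vox) if segmentations[r][i] == 0) / bg
    return w, list(zip(p, q))
```

Raters who agree with the emerging consensus are credited with higher performance, so their votes are weighted more heavily in the next E-step; that feedback is what lets the consensus outperform any single delineation.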
Fuggle, Peter; Bevington, Dickon; Cracknell, Liz; Hanley, James; Hare, Suzanne; Lincoln, John; Richardson, Garry; Stevens, Nina; Tovey, Heather; Zlotowitz, Sally
2015-07-01
AMBIT (Adolescent Mentalization-Based Integrative Treatment) is a developing team approach to working with hard-to-reach adolescents. The approach applies the principle of mentalization to relationships with clients, team relationships and working across agencies. It places a high priority on the need for locally developed evidence-based practice, and proposes that outcome evaluation needs to be explicitly linked with processes of team learning using a learning organization framework. A number of innovative methods of team learning are incorporated into the AMBIT approach, particularly a system of web-based wiki-formatted AMBIT manuals individualized for each participating team. The paper describes early development work of the model and illustrates ways of establishing explicit links between outcome evaluation, team learning and manualization by describing these methods as applied to two AMBIT-trained teams; one team working with young people on the edge of care (AMASS - the Adolescent Multi-Agency Support Service) and another working with substance use (CASUS - Child and Adolescent Substance Use Service in Cambridgeshire). Measurement of the primary outcomes for each team (which were generally very positive) facilitated team learning and adaptations of methods of practice that were consolidated through manualization. © The Author(s) 2014.
Shugars, D A; Trent, P J; Heymann, H O
1979-08-01
Two instructional strategies, the traditional lecture method and a standardized self-instructional (ACORDE) format, were compared for efficiency and perceived usefulness in a preclinical restorative dentistry technique course through the use of a posttest-only control group research design. Control and experimental groups were compared on (a) technique grades, (b) didactic grades, (c) amount of time spent, (d) student and faculty perceptions, and (e) observation of social dynamics. The results of this study demonstrated the effectiveness of Project ACORDE materials in teaching dental students, provided an example of applied research designed to test contemplated instructional innovations prior to use, and employed a method that highlighted qualitative as well as quantitative techniques for data gathering in applied research.
Extreme data compression for the CMB
NASA Astrophysics Data System (ADS)
Zablocki, Alan; Dodelson, Scott
2016-04-01
We apply Karhunen-Loève methods to cosmic microwave background (CMB) data sets, and show that we can recover the input cosmology and obtain the marginalized likelihoods in Λ cold dark matter cosmologies in under a minute, much faster than Markov chain Monte Carlo methods. This is achieved by forming a linear combination of the power spectra at each multipole l, and solving a system of simultaneous equations such that the Fisher matrix is locally unchanged. Instead of carrying out a full likelihood evaluation over the whole parameter space, we need to evaluate the likelihood only for the parameter of interest, with the data compression effectively marginalizing over all other parameters. The weighting vectors contain insight about the physical effects of the parameters on the CMB anisotropy power spectrum Cl. The shape and amplitude of these vectors give an intuitive feel for the physics of the CMB, the sensitivity of the observed spectrum to cosmological parameters, and the relative sensitivity of different experiments to cosmological parameters. We test this method on exact theory Cl as well as on a Wilkinson Microwave Anisotropy Probe (WMAP)-like CMB data set generated from a random realization of a fiducial cosmology, comparing the compression results to those from a full likelihood analysis using CosmoMC. After showing that the method works, we apply it to the temperature power spectrum from the WMAP seven-year data release, and discuss the successes and limitations of our method as applied to a real data set.
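The core idea of this style of linear compression can be shown in a toy setting: for Gaussian data with a parameter-independent (here diagonal) covariance, weighting each bin by the derivative of its mean with respect to the parameter, divided by its variance, yields a single number carrying the full Fisher information about that parameter. This is a generic sketch under those assumptions, not the authors' CMB pipeline.

```python
def compress(data, dmu, var):
    """Compress a data vector to one number: y = sum_l (dmu_l / var_l) * x_l.
    For Gaussian data with fixed diagonal covariance this preserves the
    Fisher information F = sum_l dmu_l**2 / var_l about the parameter."""
    return sum((d / v) * x for d, v, x in zip(dmu, var, data))

# Toy model: two bins; dmu is the derivative of the mean w.r.t. the parameter.
dmu = [1.0, 2.0]
var = [1.0, 4.0]
fisher = sum(d * d / v for d, v in zip(dmu, var))  # = 2.0

# Shifting the data by eps * dmu shifts the compressed statistic by eps * F,
# so the parameter can be read off from the single compressed number.
eps = 0.1
y = compress([eps * d for d in dmu], dmu, var)
```

This is why likelihood evaluations become so cheap: each parameter of interest is confronted with one compressed number rather than the full data vector.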
Kaserer, Teresa; Temml, Veronika; Kutil, Zsofia; Vanek, Tomas; Landa, Premysl; Schuster, Daniela
2015-01-01
Computational methods can be applied in drug development for the identification of novel lead candidates, but also for the prediction of pharmacokinetic properties and potential adverse effects, thereby helping to prioritize and identify the most promising compounds. In principle, several techniques are available for this purpose; however, which one is most suitable for a specific research objective still requires further investigation. Within this study, the performance of several programs, representing common virtual screening methods, was compared in a prospective manner. First, we selected top-ranked virtual screening hits from the three methods pharmacophore modeling, shape-based modeling, and docking. For comparison, these hits were then additionally predicted by external pharmacophore- and 2D similarity-based bioactivity profiling tools. Subsequently, the biological activities of the selected hits were assessed in vitro, which allowed for evaluating and comparing the prospective performance of the applied tools. Although all methods performed well, considerable differences were observed concerning hit rates, true positive and true negative hits, and hitlist composition. Our results suggest that a rational selection of the applied method represents a powerful strategy to maximize the success of a research project, tightly linked to its aims. We employed cyclooxygenase as an application example; however, the focus of this study lay in highlighting the differences in the virtual screening tools' performance and not in the identification of novel COX inhibitors. Copyright © 2015 The Authors. Published by Elsevier Masson SAS. All rights reserved.
A field evaluation of an SO2 passive sampler in tropical industrial and urban air
NASA Astrophysics Data System (ADS)
Cruz, Lícia P. S.; Campos, Vânia P.; Silva, Adriana M. C.; Tavares, Tania M.
Passive samplers have been widely used for over 30 years in the measurement of personal exposure to vapours and gases in the workplace. Only recently have these samplers been applied to the monitoring of ambient air, where concentrations are normally much lower than those found in occupational environments. The locally constructed passive sampler was based on gas molecular diffusion through a static air layer; the design used minimizes particle interference and turbulent diffusion. After exposure, the SO2 trapped in filters impregnated with Na2CO3 was extracted in an ultrasonic bath for 15 min using 1.0×10^-2 mol L^-1 H2O2, and determined as SO4^2- by ion chromatography. The performance of the passive sampler was evaluated over different exposure periods in industrial and urban areas. Method precision, expressed as the relative standard deviation of three simultaneously applied passive samplers, was within 10%. Passive sampling, compared to active monitoring methods under real conditions in urban and industrial areas, showed an overall accuracy of 15%. A statistical comparison with an active method was performed to demonstrate the validity of the passive method. Sampler capacity varied between 98 and 421 μg SO2 m^-3 for exposure periods of one month and one week, respectively, which allows its use in highly polluted areas.
Veronese, Paola; Bogana, Gianna; Cerutti, Alessia; Yeo, Lami; Romero, Roberto; Gervasi, Maria Teresa
2016-01-01
Objective To evaluate the performance of Fetal Intelligent Navigation Echocardiography (FINE) applied to spatiotemporal image correlation (STIC) volume datasets of the normal fetal heart in generating standard fetal echocardiography views. Methods In this prospective cohort study of patients with normal fetal hearts (19-30 gestational weeks), one or more STIC volume datasets were obtained of the apical four-chamber view. Each STIC volume successfully obtained was evaluated by STICLoop™ to determine its appropriateness before applying the FINE method. Visualization rates for standard fetal echocardiography views using diagnostic planes and/or Virtual Intelligent Sonographer Assistance (VIS-Assistance®) were calculated. Results One or more STIC volumes (n=463 total) were obtained in 246 patients. A single STIC volume per patient was analyzed using the FINE method. In normal cases, FINE was able to generate nine fetal echocardiography views using: 1) diagnostic planes in 76-100% of cases; 2) VIS-Assistance® in 96-100% of cases; and 3) a combination of diagnostic planes and/or VIS-Assistance® in 96-100% of cases. Conclusion FINE applied to STIC volumes can successfully generate nine standard fetal echocardiography views in 96-100% of cases in the second and third trimesters. This suggests that the technology can be used as a method to screen for congenital heart disease. PMID:27309391
New parameters in adaptive testing of ferromagnetic materials utilizing magnetic Barkhausen noise
NASA Astrophysics Data System (ADS)
Pal'a, Jozef; Ušák, Elemír
2016-03-01
A new method of magnetic Barkhausen noise (MBN) measurement, together with an optimization of the measured-data processing with respect to non-destructive evaluation of ferromagnetic materials, was tested. Using this method, we investigated whether the sensitivity and stability of measurement results can be enhanced by replacing the traditional MBN parameter (root mean square) with a new parameter. In the tested method, a complete set of MBN data from minor hysteresis loops is measured. Afterward, the MBN data are collected into suitably designed matrices, and MBN parameters with maximum sensitivity to the evaluated variable are sought. The method was verified on plastically deformed steel samples. It was shown that the proposed measuring method and data processing improve sensitivity to the evaluated variable compared with measuring the traditional MBN parameter. Moreover, we found an MBN parameter that is highly resistant to changes in the applied field amplitude while at the same time being noticeably more sensitive to the evaluated variable.
Testolin, Renan C; Almeida, Tito C M; Polette, Marcus; Branco, Joaquim O; Fischer, Larissa L; Niero, Guilherme; Poyer-Radetski, Gabriel; Silva, Valéria C; Somensi, Cleder A; Corrêa, Albertina X R; Corrêa, Rogério; Rörig, Leonardo R; Itokazu, Ana Gabriela; Férard, Jean-François; Cotelle, Sylvie; Radetski, Claudemir M
2017-05-15
There is scientific evidence that beach sands are a significant contributor to the pathogen load to which visitors are exposed. To develop beach quality guidelines, all beach zones must be included in microbiological evaluations, but established monitoring methods for beach sand quality are expensive and laborious and require moderate laboratory infrastructure. This paper aimed to evaluate microorganism activity in different beach zones by applying and comparing a classical membrane filtration (MF) method with two colorimetric screening methods based on fluorescein (FDA) and tetrazolium (TTC) salt biotransformation, in order to assess a new rapid and low-cost method for beach sand microbiological contamination assessment. The colorimetric results can help beach managers to evaluate, rapidly and at low cost, the microbiological quality of different beach zones in order to decide whether remedial actions need to be adopted to prevent exposure of the public to microbes from beach sand and/or water contamination. Copyright © 2017. Published by Elsevier Ltd.
MRBrainS Challenge: Online Evaluation Framework for Brain Image Segmentation in 3T MRI Scans.
Mendrik, Adriënne M; Vincken, Koen L; Kuijf, Hugo J; Breeuwer, Marcel; Bouvy, Willem H; de Bresser, Jeroen; Alansary, Amir; de Bruijne, Marleen; Carass, Aaron; El-Baz, Ayman; Jog, Amod; Katyal, Ranveer; Khan, Ali R; van der Lijn, Fedde; Mahmood, Qaiser; Mukherjee, Ryan; van Opbroek, Annegreet; Paneri, Sahil; Pereira, Sérgio; Persson, Mikael; Rajchl, Martin; Sarikaya, Duygu; Smedby, Örjan; Silva, Carlos A; Vrooman, Henri A; Vyas, Saurabh; Wang, Chunliang; Zhao, Liang; Biessels, Geert Jan; Viergever, Max A
2015-01-01
Many methods have been proposed for tissue segmentation in brain MRI scans. The multitude of methods proposed complicates the choice of one method above others. We have therefore established the MRBrainS online evaluation framework for evaluating (semi)automatic algorithms that segment gray matter (GM), white matter (WM), and cerebrospinal fluid (CSF) on 3T brain MRI scans of elderly subjects (65-80 y). Participants apply their algorithms to the provided data, after which their results are evaluated and ranked. Full manual segmentations of GM, WM, and CSF are available for all scans and used as the reference standard. Five datasets are provided for training and fifteen for testing. The evaluated methods are ranked based on their overall performance to segment GM, WM, and CSF and evaluated using three evaluation metrics (Dice, H95, and AVD) and the results are published on the MRBrainS13 website. We present the results of eleven segmentation algorithms that participated in the MRBrainS13 challenge workshop at MICCAI, where the framework was launched, and three commonly used freeware packages: FreeSurfer, FSL, and SPM. The MRBrainS evaluation framework provides an objective and direct comparison of all evaluated algorithms and can aid in selecting the best performing method for the segmentation goal at hand.
NASA Astrophysics Data System (ADS)
El-Gafy, Inas
2017-10-01
Analyzing the water-food-energy nexus is the first step in assisting decision makers to develop and evaluate national strategies that take the nexus into account. The main objective of the current research is to provide a method for decision makers to analyze the water-food-energy nexus of the crop production system at the national level and to carry out a quantitative assessment of it. Through the proposed method, indicators covering water and energy consumption, mass productivity, and economic productivity were suggested. Based on these indicators, a water-food-energy nexus index (WFENI) was computed. The study showed that the calculated WFENI scores of the Egyptian summer crops range from 0.21 to 0.79. Compared to onion (the highest-scoring WFENI, i.e., the best score), rice has the lowest WFENI among the summer food crops. An analysis of the water-food-energy nexus of forty-two Egyptian crops in the year 2010 was carried out (energy consumed for irrigation represents 7.4% of the total energy footprint). WFENI can be applied to develop strategies for an optimal cropping pattern that minimizes water and energy consumption and maximizes their productivity. It can be applied as a holistic tool to evaluate progress in national water and agricultural strategies. Moreover, WFENI could be applied yearly to evaluate the performance of water-food-energy nexus management.
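The general mechanics of a composite index like the WFENI described above can be sketched as min-max normalization of several indicators followed by a weighted mean. The indicator names, values, and equal weights below are hypothetical illustrations, not the published WFENI formula or the Egyptian crop data:

```python
def minmax(values):
    """Rescale a list of indicator scores to [0, 1] (higher = better)."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

def nexus_index(indicators, weights=None):
    """Composite index per crop: weighted mean of normalized indicators.
    `indicators` maps an indicator name to one score per crop."""
    names = list(indicators)
    if weights is None:                       # equal weights by default
        weights = {k: 1.0 / len(names) for k in names}
    normalized = {k: minmax(v) for k, v in indicators.items()}
    n_crops = len(next(iter(indicators.values())))
    return [sum(weights[k] * normalized[k][i] for k in names)
            for i in range(n_crops)]

# hypothetical per-crop indicators for three crops
scores = nexus_index({
    "water_productivity":  [0.4, 1.2, 0.9],
    "energy_productivity": [0.2, 0.8, 1.0],
})
```

A crop scoring near 1 does well on all indicators at once, which is what makes such an index usable as a holistic screening tool for cropping patterns.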
He, Y; Zhang, W; Huang, T; Wang, X; Wang, M
2015-10-01
To evaluate a diagnostic flow chart applying medical thoracoscopy (MT), adenosine deaminase (ADA) and T-SPOT.TB to the diagnosis of tuberculous pleural effusion (TPE) in a high TB-burden country, 136 patients with pleural effusion (PE) were enrolled and divided into TPE and non-TPE groups. MT (histology), PE ADA and T-SPOT.TB were conducted on all patients. ROC analysis was performed to find the best cut-off value of PE ADA for detection of TPE. The diagnostic flow chart applying MT, ADA and T-SPOT.TB was evaluated for its ability to overcome the limitations of each individual diagnostic method. ROC analysis showed that the best cut-off value of PE ADA was 30 U/L. The sensitivity and specificity of these tests were calculated, respectively, to be: 71.4% (58.5-81.6%) and 100% (95.4-100.0%) for MT; 92.9% (83.0-97.2%) and 68.8% (57.9-77.9%) for T-SPOT.TB; and 80.0% (69.6-88.1%) and 92.9% (82.7-98.0%) for PE ADA. The sensitivity, specificity, positive likelihood ratio, negative likelihood ratio, positive predictive value and negative predictive value of the diagnostic flow chart were 96.4% (87.9-99.0%), 96.3% (89.6-98.7%), 25.714, 0.037, 97.4% and 94.9%, respectively. The diagnostic flow chart applying MT, ADA and T-SPOT.TB is an accurate and rapid method for the detection of TPE.
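Choosing a "best cut-off" from an ROC analysis, as done above for PE ADA, is commonly implemented by maximizing Youden's J statistic over candidate thresholds. The sketch below shows that standard procedure on made-up ADA values; the numbers are illustrative only and are not the study's patient data:

```python
def best_cutoff(positives, negatives):
    """Return the threshold maximizing Youden's J = sens + spec - 1,
    classifying a value as positive when it is >= the threshold."""
    best_t, best_j = None, -1.0
    for t in sorted(set(positives) | set(negatives)):
        sens = sum(v >= t for v in positives) / len(positives)
        spec = sum(v < t for v in negatives) / len(negatives)
        j = sens + spec - 1.0
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j

# hypothetical pleural-fluid ADA values (U/L)
tpe     = [45, 52, 38, 61, 33, 29, 48]   # TPE group
non_tpe = [12, 18, 25, 31, 9, 15, 22]    # non-TPE group
cutoff, j = best_cutoff(tpe, non_tpe)
```

Sensitivity and specificity at the chosen cut-off then feed directly into the likelihood ratios and predictive values reported in the abstract.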
A New Automated Design Method Based on Machine Learning for CMOS Analog Circuits
NASA Astrophysics Data System (ADS)
Moradi, Behzad; Mirzaei, Abdolreza
2016-11-01
A new simulation-based automated CMOS analog circuit design method, which applies a multi-objective non-Darwinian evolutionary algorithm based on the Learnable Evolution Model (LEM), is proposed in this article. The multi-objective property of this automated design of CMOS analog circuits is governed by a modified Strength Pareto Evolutionary Algorithm (SPEA) incorporated into the LEM algorithm presented here. LEM includes a machine learning method, such as decision trees, that distinguishes between high- and low-fitness areas of the design space. The learning process can detect the right directions of evolution, leading to large steps in the evolution of the individuals. The learning phase shortens the evolution process and markedly reduces the number of individual evaluations. The expert designer's knowledge of the circuit is applied in the design process to reduce the design space as well as the design time. Circuit evaluation is performed by the HSPICE simulator. To improve design accuracy, the bsim3v3 CMOS transistor model is adopted in the proposed design method. The proposed method is tested on three different operational amplifier circuits, and its performance is verified by comparing it with the evolutionary strategy algorithm and other similar methods.
Training needs for toxicity testing in the 21st century: a survey-informed analysis.
Lapenna, Silvia; Gabbert, Silke; Worth, Andrew
2012-12-01
Current training needs on the use of alternative methods in predictive toxicology, including new approaches based on mode-of-action (MoA) and adverse outcome pathway (AOP) concepts, are expected to evolve rapidly. In order to gain insight into stakeholder preferences for training, the European Commission's Joint Research Centre (JRC) conducted a single-question survey of twelve experts in regulatory agencies, industry, national research organisations, NGOs and consultancies. Stakeholder responses were evaluated by means of theory-based qualitative data analysis. Overall, a set of training topics was identified relating both to general background information and to guidance for applying alternative testing methods. In particular, for the use of in silico methods, stakeholders emphasised the need for training on data integration and evaluation, in order to increase confidence in applying these methods for regulatory purposes. Although the survey does not claim to offer an exhaustive overview of the training requirements, its findings support the conclusion that the development of well-targeted and tailor-made training opportunities that inform about the usefulness of alternative methods, in particular those offering practical experience in the application of in silico methods, deserves more attention. This should be complemented by transparent information and guidance on the interpretation of the results generated by these methods and software tools. 2012 FRAME.
ERIC Educational Resources Information Center
Leite, Walter L.; Zuo, Youzhen
2011-01-01
Among the many methods currently available for estimating latent variable interactions, the unconstrained approach is attractive to applied researchers because of its relatively easy implementation with any structural equation modeling (SEM) software. Using a Monte Carlo simulation study, we extended and evaluated the unconstrained approach to…
ERIC Educational Resources Information Center
Tai, Joanna Hong-Meng; Canny, Benedict J.; Haines, Terry P.; Molloy, Elizabeth K.
2016-01-01
This study explored the contribution of peer-assisted learning (PAL) in the development of evaluative judgement capacity; the ability to understand work quality and apply those standards to appraising performance. The study employed a mixed methods approach, collecting self-reported survey data, observations of, and reflective interviews with, the…
A Proposed Model for the Analysis and Interpretation of Focus Groups in Evaluation Research
ERIC Educational Resources Information Center
Massey, Oliver T.
2011-01-01
Focus groups have an established history in applied research and evaluation. The fundamental methods of the focus group technique have been well discussed, as have their potential advantages. Less guidance tends to be provided regarding the analysis of data resulting from focus groups or how to organize and defend conclusions drawn from the…
Based on long-term monitoring conducted in Chang-ning county, a pilot site of the ‘Grain for Green Program’ (GFGP), an integrated emergy and economic method was applied to evaluate the dynamic ecological-economic performance of 3 kinds of bamboo systems planted on slo...
Sensitizing Children to the Social and Emotional Mechanisms Involved in Racism: A Program Evaluation
ERIC Educational Resources Information Center
Triliva, Sofia; Anagnostopoulou, Tanya; Vleioras, Georgios
2014-01-01
This paper describes and discusses the results of an intervention aiming to sensitize children to the social and emotional processes involved in racism. The intervention was applied and evaluated in 10 Greek elementary schools. The goals and the intervention methods of the program modules are briefly outlined and the results of the program…
Computational approaches have been applied to studying the toxicology of environmental agents for more than 50 years. These approaches have been used to enhance existing data, to provide understanding of the mechanisms of toxicity and as an aid in the evaluation of risks. However...
Grassland response to herbicides and seeding of native grasses 6 years posttreatment
Bryan A. Endress; Catherine G. Parks; Bridgett J. Naylor; Steven R. Radosevich; Mark Porter
2012-01-01
Herbicides are the primary method used to control exotic, invasive plants. This study evaluated restoration efforts applied to grasslands dominated by an invasive plant, sulfur cinquefoil, 6 yr after treatments. Of the five herbicides we evaluated, picloram continued to provide the best control of sulfur cinquefoil over 6 yr. We found the timing of picloram...
A novel evaluation strategy for fatigue reliability of flexible nanoscale films
NASA Astrophysics Data System (ADS)
Zheng, Si-Xue; Luo, Xue-Mei; Wang, Dong; Zhang, Guang-Ping
2018-03-01
In order to evaluate the fatigue reliability of nanoscale metal films on flexible substrates, we propose an effective evaluation method for obtaining the critical fatigue cracking strain, based on direct observation of fatigue damage sites using a conventional dynamic bending testing technique. With this method, the fatigue properties and damage behaviors of 930 nm-thick Au films and 600 nm-thick Mo-W multilayers with an individual layer thickness of 100 nm on flexible polyimide substrates were investigated. A Coffin-Manson relationship between fatigue life and applied strain range was obtained for the Au films and Mo-W multilayers. The characterization of fatigue damage behaviors verifies the feasibility of this method, which appears easier and more effective than the other testing methods.
Wind Plant Performance Prediction (WP3) Project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Craig, Anna
The methods for analysis of operational wind plant data are highly variable across the wind industry, leading to high uncertainties in the validation and bias-correction of preconstruction energy estimation methods. Lack of credibility in the preconstruction energy estimates has significant impacts on project financing and therefore on the final levelized cost of energy for the plant. In this work, the variation in the evaluation of a wind plant's operational energy production that results from variations in the processing methods applied to the operational data is examined. Preliminary results indicate that the selection of the filters applied to the data, and of the filter parameters, can have significant impacts on the final computed assessment metrics.
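The sensitivity to filter choice described above can be made concrete with a toy sketch: the same operational records yield different assessment metrics depending on which filters are applied. The record fields, thresholds, and values below are hypothetical illustrations, not the project's actual data schema:

```python
# hypothetical SCADA-style records: (wind_speed_m_s, power_kW, curtailed_flag)
records = [(7.1, 610, False), (3.2, 90, False), (12.5, 1480, True),
           (9.8, 1020, False), (2.0, 0, False), (11.0, 1350, True)]

def mean_power(rows, min_wind=None, drop_curtailed=False):
    """Mean produced power after optional filtering steps. Each filter
    choice (and its parameter) changes the computed metric."""
    kept = [(w, p) for w, p, c in rows
            if (min_wind is None or w >= min_wind)
            and not (drop_curtailed and c)]
    return sum(p for _, p in kept) / len(kept)

unfiltered = mean_power(records)                                   # no filters
filtered = mean_power(records, min_wind=4.0, drop_curtailed=True)  # two filters
```

Even in this six-row toy, the two pipelines disagree substantially, which is exactly the source of uncertainty the project aims to quantify.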
Evaluation of normalization methods in mammalian microRNA-Seq data
Garmire, Lana Xia; Subramaniam, Shankar
2012-01-01
Simple total tag count normalization is inadequate for microRNA sequencing data generated by next-generation sequencing technology; however, a systematic evaluation of normalization methods on microRNA sequencing data has so far been lacking. We comprehensively evaluate seven commonly used normalization methods: global normalization, Lowess normalization, the Trimmed Mean Method (TMM), quantile normalization, scaling normalization, variance stabilization, and the invariant method. We assess these methods on two individual experimental data sets with the empirical statistical metrics of mean square error (MSE) and the Kolmogorov-Smirnov (K-S) statistic. Additionally, we evaluate the methods against results from quantitative PCR validation. Our results consistently show that Lowess normalization and quantile normalization perform the best, whereas TMM, a method applied to RNA-Seq normalization, performs the worst. The poor performance of TMM normalization is further evidenced by abnormal results from tests of differential expression (DE) on microRNA-Seq data. Compared with the models used for DE, the choice of normalization method is the primary factor affecting the results of DE. In summary, Lowess normalization and quantile normalization are recommended for normalizing microRNA-Seq data, whereas the TMM method should be used with caution. PMID:22532701
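Quantile normalization, one of the two methods the study recommends, is simple enough to sketch: every library (column) is forced onto a common reference distribution, the mean of the sorted columns. This is the generic textbook procedure on a toy count matrix, not the authors' evaluation code; ties are broken by position for brevity:

```python
import numpy as np

def quantile_normalize(counts):
    """Quantile-normalize a features-by-samples matrix: each column is
    replaced by the mean sorted column, reordered to the column's ranks."""
    order = np.argsort(counts, axis=0)
    ranks = np.argsort(order, axis=0)              # rank of each entry per column
    reference = np.sort(counts, axis=0).mean(axis=1)
    return reference[ranks]

# toy miRNA count matrix: 4 miRNAs (rows) x 2 libraries (columns)
counts = np.array([[5.0, 4.0],
                   [2.0, 1.0],
                   [3.0, 4.0],
                   [4.0, 2.0]])
normalized = quantile_normalize(counts)
```

After normalization, both columns share exactly the same set of values, so between-library distributional differences can no longer masquerade as differential expression.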
Identification of the traditional methods of newborn mothers regarding jaundice in Turkey.
Aydin, Diler; Karaca Ciftci, Esra; Karatas, Hulya
2014-02-01
To identify traditional methods applied by mothers in Turkey for the treatment of newborn jaundice. Traditional methods are widely used in our society: instead of using medical services, people often turn to familiar traditional methods to treat disease. In such cases, the prognosis of the disease generally becomes worse, the treatment period longer and healthcare costs higher, and more medicine is used. A cross-sectional descriptive study. The participants of this study were 229 mothers with newborn babies aged 0-28 days in one university hospital and one public children's hospital in Sanliurfa. The study was conducted between March and May 2012. The Beliefs and Traditional Methods of Mothers for Jaundice Questionnaire, developed by searching the relevant literature, was used as the data collection tool, and the data were evaluated by percentage distributions. Mothers apply traditional practices to health problems such as jaundice, and the application of these methods is important to them. Moreover, mothers reported applying hazardous traditional methods in cases of neonatal jaundice, such as cutting the area between the baby's eyebrows with a blade, cutting the back of the ear and the body, and burning the body, practices not applied in other cultures. Education regarding the effects of the traditional methods applied in families should be provided, and the results of this study should serve to guide further studies assessing the effects of such education. This approach can support beneficial practices involving individual care and prevent the negative health effects of hazardous practices. © 2013 John Wiley & Sons Ltd.
A method to improve the nutritional quality of foods and beverages based on dietary recommendations.
Nijman, C A J; Zijp, I M; Sierksma, A; Roodenburg, A J C; Leenen, R; van den Kerkhoff, C; Weststrate, J A; Meijer, G W
2007-04-01
The increasing consumer interest in health prompted Unilever to develop a globally applicable method (Nutrition Score) to evaluate and improve the nutritional composition of its foods and beverages portfolio. Based on (inter)national dietary recommendations, generic benchmarks were developed to evaluate foods and beverages on their content of trans fatty acids, saturated fatty acids, sodium and sugars, key nutrients whose high intakes are associated with undesirable health effects. In principle, the developed generic benchmarks can be applied globally to any food and beverage product. Product category-specific benchmarks were developed when it was not feasible to meet generic benchmarks because of technological and/or taste factors. The whole Unilever global foods and beverages portfolio has been evaluated, and actions have been taken to improve its nutritional quality. The advantages of this method over other initiatives to assess the nutritional quality of foods are its basis in the latest nutritional science and its global applicability. The Nutrition Score is the first simple, transparent and straightforward method that can be applied globally and across all food and beverage categories to evaluate nutritional composition. It can help food manufacturers to improve the nutritional value of their products. In addition, the Nutrition Score can be a starting point for a powerful front-of-pack health indicator. This can have a significant positive impact on public health, especially when implemented by all food manufacturers.
Schaarup, Clara; Hartvigsen, Gunnar; Larsen, Lars Bo; Tan, Zheng-Hua; Årsand, Eirik; Hejlesen, Ole Kristian
2015-01-01
The Online Diabetes Exercise System was developed to motivate people with Type 2 diabetes to do a 25-minute low-volume high-intensity interval training program. In a previous multi-method evaluation of the system, several usability issues were identified and corrected. Despite the thorough testing, it was unclear whether all usability problems had been identified using the multi-method evaluation. Our hypothesis was that adding eye-tracking triangulation to the multi-method evaluation would increase accuracy and completeness when testing the usability of the system. The study design was an eye-tracking triangulation: conventional eye-tracking with predefined tasks, followed by the Post-Experience Eye-Tracked Protocol (PEEP). Six Areas of Interest were the basis for the PEEP session. The eye-tracking triangulation gave objective and subjective results, which are believed to be highly relevant for designing, implementing, evaluating and optimizing systems in the field of health informatics. Future work should include testing the method on a larger and more representative group of users and applying the method to different system types.
Riley, William; Parsons, Helen; McCoy, Kim; Burns, Debra; Anderson, Donna; Lee, Suhna; Sainfort, François
2009-10-01
To test the feasibility and assess the preliminary impact of a unique statewide quality improvement (QI) training program designed for public health departments. One hundred and ninety-five public health employees/managers from 38 local health departments throughout Minnesota were selected to participate in a newly developed QI training program and 65 of those engaged in and completed eight expert-supported QI projects over a period of 10 months from June 2007 through March 2008. As part of the Minnesota Quality Improvement Initiative, a structured distance education QI training program was designed and deployed in a first large-scale pilot. To evaluate the preliminary impact of the program, a mixed-method evaluation design was used based on four dimensions: learner reaction, knowledge, intention to apply, and preliminary outcomes. Subjective ratings of three dimensions of training quality were collected from participants after each of the scheduled learning sessions. Pre- and post-QI project surveys were administered to collect participant reactions, knowledge, future intention to apply learning, and perceived outcomes. Monthly and final QI project reports were collected to further inform success and preliminary outcomes of the projects. The participants reported (1) high levels of satisfaction with the training sessions, (2) increased perception of the relevance of the QI techniques, (3) increased perceived knowledge of all specific QI methods and techniques, (4) increased confidence in applying QI techniques on future projects, (5) increased intention to apply techniques on future QI projects, and (6) high perceived success of, and satisfaction with, the projects. Finally, preliminary outcomes data show moderate to large improvements in quality and/or efficiency for six out of eight projects. 
QI methods and techniques can be successfully implemented in local public health agencies on a statewide basis using the collaborative model through distance training and expert facilitation. This unique training can improve both core and support processes and lead to favorable staff reactions, increased knowledge, and improved health outcomes. The program can be further improved and deployed and holds great promise to facilitate the successful dissemination of proven QI methods throughout local public health departments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Ning; Huang, Zhenyu; Tuffner, Francis K.
2010-02-28
Small signal stability problems are among the major threats to grid stability and reliability. Prony analysis has been successfully applied to ringdown data to monitor electromechanical modes of a power system using phasor measurement unit (PMU) data. To facilitate on-line application of mode estimation, this paper develops a recursive algorithm for implementing Prony analysis and proposes an oscillation detection method to detect ringdown data in real time. By automatically detecting ringdown data, the proposed method helps guarantee that Prony analysis is applied properly and in a timely manner, so that mode estimation can be performed reliably. The proposed method is tested using Monte Carlo simulations based on a 17-machine model and is shown to properly identify oscillation data for on-line application of Prony analysis. In addition, the proposed method is applied to field measurement data from WECC to show the performance of the proposed algorithm.
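The classic (batch, non-recursive) Prony step that the paper builds on can be sketched in a few lines: fit a linear-prediction model to the ringdown samples, then read each electromechanical mode's frequency and damping from the roots of the characteristic polynomial. This is the textbook procedure on a synthetic signal, not the paper's recursive algorithm; the 0.3 Hz mode and -0.1 1/s damping are made-up values in the typical inter-area range:

```python
import numpy as np

def prony_modes(y, order, dt):
    """Fit y[t] = a[0]*y[t-1] + ... + a[order-1]*y[t-order] by least
    squares, then recover mode frequencies (Hz) and damping (1/s)
    from the roots of the characteristic polynomial."""
    n = len(y)
    A = np.column_stack([y[order - 1 - k : n - 1 - k] for k in range(order)])
    a, *_ = np.linalg.lstsq(A, y[order:], rcond=None)
    roots = np.roots(np.concatenate(([1.0], -a)))
    freqs = np.angle(roots) / (2 * np.pi * dt)
    damping = np.log(np.abs(roots)) / dt
    return freqs, damping

# synthetic ringdown: one damped 0.3 Hz mode sampled at 10 Hz
dt = 0.1
t = np.arange(0.0, 20.0, dt)
y = np.exp(-0.1 * t) * np.cos(2 * np.pi * 0.3 * t)
freqs, damping = prony_modes(y, order=2, dt=dt)
```

A real damped sinusoid yields a conjugate pair of roots, so the mode appears at ±0.3 Hz with the common damping; the paper's contribution is making this estimation recursive and gating it on automatically detected ringdowns.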
Multilevel acceleration of scattering-source iterations with application to electron transport
Drumm, Clif; Fan, Wesley
2017-08-18
Acceleration/preconditioning strategies available in the SCEPTRE radiation transport code are described. A flexible transport synthetic acceleration (TSA) algorithm that uses a low-order discrete-ordinates (SN) or spherical-harmonics (PN) solve to accelerate convergence of a high-order SN source-iteration (SI) solve is described. Convergence of the low-order solves can be further accelerated by applying off-the-shelf incomplete-factorization or algebraic-multigrid methods. Also available is an algorithm that uses a generalized minimum residual (GMRES) iterative method rather than SI for convergence, using a parallel sweep-based solver to build up a Krylov subspace. TSA has been applied as a preconditioner to accelerate the convergence of the GMRES iterations. The methods are applied to several problems involving electron transport and to problems with artificial cross sections with large scattering ratios, and are compared and evaluated with respect to material discontinuities and scattering anisotropy. The observed accelerations are highly problem dependent, but speedup factors around 10 have been observed in typical applications.
Nekhay, Olexandr; Arriaza, Manuel; Boerboom, Luc
2009-07-01
The study presents an approach that combines objective information, such as sampling or experimental data, with subjective information, such as expert opinions. This combined approach was based on the Analytic Network Process (ANP) method. It was applied to evaluate soil erosion risk, overcoming one of the drawbacks of the USLE/RUSLE soil erosion models, namely that they do not consider interactions among soil erosion factors. Another advantage of this method is that it can be used when experimental data are insufficient: the lack of experimental data can be compensated for through the use of expert evaluations. As an example of the proposed approach, the risk of soil erosion was evaluated in olive groves in Southern Spain, showing the potential of the ANP method for modelling a complex physical process like soil erosion.
A methodology for evaluating the usability of audiovisual consumer electronic products.
Kwahk, Jiyoung; Han, Sung H
2002-09-01
Usability evaluation is now considered an essential procedure in consumer product development. Many studies have been conducted to develop various usability evaluation techniques and methods in the hope of helping evaluators choose appropriate ones. However, planning and conducting a usability evaluation requires consideration of a number of factors surrounding the evaluation process, including product, user, activity, and environmental characteristics. From this perspective, this study suggests a new usability evaluation methodology built on a simple, structured framework. The framework is outlined by three major components: the interface features of a product as design variables; the evaluation context, consisting of user, product, activity, and environment, as context variables; and the usability measures as dependent variables. Based on this framework, this study established methods to specify product interface features, to define the evaluation context, and to measure usability. The effectiveness of this methodology was demonstrated through case studies in which the usability of audiovisual products was evaluated using the methods developed in this study. This study is expected to help usability practitioners in the consumer electronics industry in various ways. Most directly, it supports evaluators in planning and conducting usability evaluation sessions in a systematic and structured manner. In addition, it can be applied to other categories of consumer products (such as appliances, automobiles, communication devices, etc.) with minor modifications as necessary.
DOT National Transportation Integrated Search
1999-12-01
This study evaluates the use of seal coating as a method to protect bituminous pavements from oxidation, water infiltration, and raveling. The Minnesota Department of Transportation (Mn/DOT) applied seal coating to a roadway segment of Trunk Highway ...
[Validation of Differential Extraction Kit in forensic sexual assault cases].
Wu, Dan; Cao, Yu; Xu, Yan; He, Bai-Fang; Bi, Gang; Zhou, Huai-Gu
2009-12-01
To evaluate the validity of the Differential Extraction Kit in isolating spermatozoa and epithelial-cell DNA from mixed samples. Selective lysis of sperm and epithelial cells combined with a paramagnetic particle method was applied to extract DNA from mock samples prepared under controlled conditions and from forensic case samples, and the template DNA was analyzed by STR genotyping. The Differential Extraction Kit efficiently yielded high-quality sperm and epithelial-cell DNA from mixed samples with different proportions of sperm to epithelial cells. The kit can be applied to DNA extraction from mixed stains in forensic sexual assault cases.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurnik, Charles W.; Agnew, Ken; Goldberg, Mimi
Whole-building retrofits involve the installation of multiple measures. Whole-building retrofit programs take many forms. With a focus on overall building performance, these programs usually begin with an energy audit to identify cost-effective energy efficiency measures for the home. Measures are then installed, either at no cost to the homeowner or partially paid for by rebates and/or financing. The methods described here may also be applied to evaluation of single-measure retrofit programs. Related methods exist for replace-on-failure programs and for new construction, but are not the subject of this chapter.
The biospeckle method for the investigation of agricultural crops: A review
NASA Astrophysics Data System (ADS)
Zdunek, Artur; Adamiak, Anna; Pieczywek, Piotr M.; Kurenda, Andrzej
2014-01-01
Biospeckle is a nondestructive method for the evaluation of living objects. It has been applied in medicine, agriculture, and microbiology for monitoring processes related to the movement of material particles, and recently it has been used extensively to evaluate the quality of agricultural crops. In botanical materials, the sources of apparent biospeckle activity are Brownian motion and biological processes such as cyclosis, growth, and transport. Applications have been demonstrated for monitoring aging and maturation of samples, organ development, and the detection and development of defects and diseases. This review focuses on three aspects: the image analysis and mathematical methods for evaluating biospeckle activity; published applications to botanical samples, with special attention to agricultural crops; and the interpretation of the phenomena from a biological point of view.
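As a rough illustration of the kind of activity index the review surveys, temporal speckle contrast is one simple per-pixel measure (a sketch on synthetic frame stacks; published work also uses co-occurrence-matrix and wavelet-based indices):

```python
import numpy as np

def biospeckle_activity(stack):
    """Temporal speckle contrast: per-pixel std/mean over the frame
    stack. Higher values indicate more particle motion, i.e. higher
    apparent biospeckle activity."""
    mean = stack.mean(axis=0)
    return stack.std(axis=0) / np.maximum(mean, 1e-12)

# Synthetic frame stacks: an 'active' sample fluctuates more in time
rng = np.random.default_rng(0)
static = 100.0 + rng.normal(0.0, 1.0, (50, 8, 8))
active = 100.0 + rng.normal(0.0, 10.0, (50, 8, 8))
c_static = biospeckle_activity(static).mean()
c_active = biospeckle_activity(active).mean()
```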
Petruseviciene, Daiva; Krisciūnas, Aleksandras; Sameniene, Jūrate
2002-01-01
In this article we analyze the influence of rehabilitation methods on the treatment of arm lymphedema. Sixty women who had undergone surgery for breast cancer were examined at the Kaunas oncological hospital. The objective was to evaluate the efficiency of rehabilitation methods in treating arm lymphedema and to evaluate the movement amplitude of the shoulder joint. Two groups of women, differing in when rehabilitation started, were evaluated. The same rehabilitation methods were applied to both groups: physical therapy, electrostimulation, massage, and apparatus lymph drainage. Our study indicated that women treated in the early period of rehabilitation (3 months) showed statistically significantly (p < 0.01) better increases in shoulder joint movement amplitude. However, the results of arm lymphedema treatment were equally successful in both groups: compared with women who started rehabilitation after 12 months, the results were not statistically significantly better (p > 0.05).
[Identification of Dens Draconis and Os Draconis by XRD method].
Chen, Guang-Yun; Wu, Qi-Nan; Shen, Bei; Chen, Rong
2012-04-01
To establish an XRD method for evaluating the quality of Os Draconis and Dens Draconis and for judging counterfeits. Dens Draconis, Os Draconis, and counterfeit Os Draconis were analyzed by XRD. Their diffraction patterns were subjected to cluster analysis and their degree of similarity was evaluated. An analytical method for Dens Draconis and Os Draconis was established based on the fingerprint information of the 10 common peaks in the XRD patterns, and the XRD pattern of counterfeit Os Draconis was obtained. The similarity among separate sources of Dens Draconis was high, while the similarity among separate sources of Os Draconis differed significantly. This method can be used for the identification and evaluation of Os Draconis and Dens Draconis, and can also effectively identify counterfeit Os Draconis.
Development Of Methodologies Using PhabrOmeter For Fabric Drape Evaluation
NASA Astrophysics Data System (ADS)
Lin, Chengwei
Evaluation of fabric drape is important for the textile industry, as drape reveals the aesthetics and functionality of cloth and apparel. Although fabric drape measuring methods have been developed over several decades, they are falling behind the industry's need for fast product development. To meet this requirement, it is necessary to develop an effective and reliable method to evaluate fabric drape. The purpose of the present study is to determine whether the PhabrOmeter, a fabric sensory performance evaluation instrument developed to provide fast and reliable quality testing results, can be applied to fabric drape evaluation. This study also sought to determine the relationship between fabric drape and other fabric attributes. In addition, a series of conventional methods, including AATCC, ASTM, and ISO standards, were used to characterize the fabric samples. All the data were compared and analyzed with a linear correlation method. The results indicate that the PhabrOmeter is a reliable and effective instrument for fabric drape evaluation. In addition, the effects of fabric structure and testing direction on fabric drape were examined.
Multi-observation PET image analysis for patient follow-up quantitation and therapy assessment
NASA Astrophysics Data System (ADS)
David, S.; Visvikis, D.; Roux, C.; Hatt, M.
2011-09-01
In positron emission tomography (PET) imaging, an early therapeutic response is usually characterized by variations of semi-quantitative parameters restricted to the maximum SUV measured in PET scans during treatment. Such measurements do not reflect overall tumor volume and radiotracer uptake variations. The proposed approach is based on multi-observation image analysis, merging several PET acquisitions to assess tumor metabolic volume and uptake variations. The fusion algorithm is based on iterative estimation using a stochastic expectation maximization (SEM) algorithm. The proposed method was applied to simulated and clinical follow-up PET images. We compared the multi-observation fusion performance to threshold-based methods proposed for the assessment of the therapeutic response based on functional volumes. On simulated datasets, the adaptive threshold applied independently on both images led to higher errors than the ASEM fusion, and on clinical datasets it failed to provide coherent measurements for four patients out of seven due to aberrant delineations. The ASEM method demonstrated improved and more robust estimation, leading to more pertinent measurements. Future work will consist of extending the methodology and applying it to clinical multi-tracer datasets in order to evaluate its potential impact on biological tumor volume definition for radiotherapy applications.
Morpho-functional implications of myofascial stretching applied to muscle chains: A case study.
Raţ, Bogdan Constantin; Raţă, Marinela; Antohe, Bogdan
2018-03-16
Most lesions of the soft tissues, especially those at the muscle level, are due to a lack of elasticity of the connective tissue and fascia. Stretching is one of the most commonly used methods of treatment for such musculoskeletal issues. This study tracks the effects of stretching on the electromyographic activity of muscle chains, applied to a 24-year-old athlete diagnosed with Haglund's disease. For the evaluation, we used visual examination and surface electromyography (maximum volumetric isometric contraction). The therapeutic intervention consisted of static stretching positions intended to elongate the shortened muscle chains. The treatment program lasted 2 months, with a frequency of 2 sessions per week and an average session duration of 60 minutes. The posterior muscle chains recorded an increase in EMG activity, while the anterior muscle chains tended to diminish their EMG activity. As a result of the applied treatment, all the evaluated muscle chains recorded a rebalancing of electromyographic activity, demonstrating the efficiency of stretching as a method of global treatment of muscle chains. Analysing all the data, we conclude that static stretching is an effective treatment method for shortened muscle chains.
Application of resistivity monitoring to evaluate cement grouting effect in earth filled dam
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Jin-Mo; Yoon, Wang-Jung
In this paper, we applied the electrical resistivity monitoring method to evaluate the effect of cement grouting. There are many ways to evaluate the grouting effect; to perform the evaluation with greater safety, higher efficiency, and lower cost, resistivity monitoring was found to be the most appropriate technique. We selected a dam site in Korea, acquired resistivity monitoring data, and compared the inversion results to estimate the cement grouting effect.
NASA Astrophysics Data System (ADS)
Reinhardt, Katja; Samimi, Cyrus
2018-01-01
While climatological data of high spatial resolution are largely available in most developed countries, the network of climatological stations in many other regions of the world still has large gaps. Especially for those regions, interpolation methods are important tools to fill these gaps and to improve the data base indispensable for climatological research. Over recent years, new hybrid methods combining machine learning and geostatistics have been developed which provide innovative prospects in spatial predictive modelling. This study focuses on evaluating the performance of 12 different interpolation methods for the wind components u and v in a mountainous region of Central Asia, with a special focus on applying the new hybrid methods to spatial interpolation of wind data. This study is the first to evaluate and compare the performance of several of these hybrid methods. The overall aim is to determine whether an optimal interpolation method exists which can equally be applied to all pressure levels, or whether different interpolation methods have to be used for different pressure levels. Deterministic (inverse distance weighting) and geostatistical (ordinary kriging) interpolation methods were explored, which take into account only the initial values of u and v. In addition, more complex methods (generalized additive model, support vector machine and neural networks, as single and as hybrid methods, as well as regression-kriging) that consider additional variables were applied. The analysis of the error indices revealed that regression-kriging provided the most accurate interpolation results for both wind components and all pressure heights.
At 200 and 500 hPa, regression-kriging is followed by the different kinds of neural networks and support vector machines and for 850 hPa it is followed by the different types of support vector machine and ordinary kriging. Overall, explanatory variables improve the interpolation results.
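Of the methods compared, inverse distance weighting is the simplest to sketch (hypothetical station coordinates and values; the kriging and hybrid machine-learning methods require far more machinery):

```python
import numpy as np

def idw(xy_known, values, xy_query, power=2.0):
    """Inverse distance weighting: each query point receives a
    distance-weighted average of the station values."""
    d = np.linalg.norm(xy_known[None, :, :] - xy_query[:, None, :], axis=2)
    w = 1.0 / np.maximum(d, 1e-12) ** power   # guard exact station hits
    return (w * values).sum(axis=1) / w.sum(axis=1)

# Hypothetical station coordinates and u-wind values
stations = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
u = np.array([2.0, 4.0, 6.0, 8.0])
u_center = idw(stations, u, np.array([[0.5, 0.5]]))
```

At the centroid all four stations are equidistant, so the estimate is simply their mean; at a station location the method reproduces the observed value, a property IDW shares with exact interpolators like kriging.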
Balbale, Salva N; Locatelli, Sara M; LaVela, Sherri L
2016-08-01
In this methodological article, we examine participatory methods in depth to demonstrate how these methods can be adopted for quality improvement (QI) projects in health care. We draw on existing literature and our QI initiatives in the Department of Veterans Affairs to discuss the application of photovoice and guided tours in QI efforts. We highlight lessons learned and several benefits of using participatory methods in this area. Using participatory methods, evaluators can engage patients, providers, and other stakeholders as partners to enhance care. Participant involvement helps yield actionable data that can be translated into improved care practices. Use of these methods also helps generate key insights to inform improvements that truly resonate with stakeholders. Using participatory methods is a valuable strategy to harness participant engagement and drive improvements that address individual needs. In applying these innovative methodologies, evaluators can transcend traditional approaches to uniquely support evaluations and improvements in health care. © The Author(s) 2015.
Method for assessing in-service motor efficiency and in-service motor/load efficiency
Kueck, John D.; Otaduy, Pedro J.
1997-01-01
A method and apparatus for assessing the efficiency of an in-service motor. The operating characteristics of the in-service motor are remotely measured. The operating characteristics are then applied to an equivalent circuit for electrical motors. Finally, the equivalent circuit is evaluated to determine the performance characteristics of said in-service motor. Based upon the evaluation, an individual is able to determine the rotor speed, power output, efficiency, and torque of the in-service motor. Additionally, an individual is able to confirm the calculations by comparing measured values with values obtained from the motor equivalent circuit evaluation.
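The patent evaluates an equivalent circuit; for an induction motor the standard per-phase circuit can be evaluated as follows (a sketch with illustrative, made-up parameter values, not the patent's procedure; friction and core losses are ignored):

```python
def motor_performance(V, f, poles, n_rpm, R1, X1, R2, X2, Xm):
    """Evaluate a standard per-phase induction-motor equivalent
    circuit: stator impedance R1+jX1 in series with the magnetizing
    branch jXm in parallel with the rotor branch R2/s + jX2. Returns
    slip, per-phase input power, converted mechanical power, and
    their ratio as an efficiency estimate."""
    n_sync = 120.0 * f / poles          # synchronous speed, rpm
    s = (n_sync - n_rpm) / n_sync       # slip
    Zrot = complex(R2 / s, X2)
    Zmag = complex(0.0, Xm)
    Z = complex(R1, X1) + Zrot * Zmag / (Zrot + Zmag)
    I1 = V / Z                          # stator current (V as phase reference)
    P_in = (V * I1.conjugate()).real
    I2 = I1 * Zmag / (Zrot + Zmag)      # rotor current via current divider
    P_airgap = abs(I2) ** 2 * R2 / s
    P_out = P_airgap * (1.0 - s)        # converted mechanical power
    return s, P_in, P_out, P_out / P_in

# Illustrative parameters for a small 4-pole, 60 Hz machine
s, P_in, P_out, eff = motor_performance(230.0, 60.0, 4, 1746.0,
                                        0.5, 1.0, 0.3, 1.0, 30.0)
```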
Evaluation of verifiability in HAL/S. [programming language for aerospace computers
NASA Technical Reports Server (NTRS)
Young, W. D.; Tripathi, A. R.; Good, D. I.; Browne, J. C.
1979-01-01
HAL/S provides limited support for writing verifiable programs, a characteristic highly desirable in aerospace applications, since many of its features do not lend themselves to existing verification techniques. The methods of language evaluation are described, along with the means by which language features are evaluated for verifiability. These methods are applied in this study to various features of HAL/S to identify specific areas in which the language falls short with respect to verifiability. Some conclusions are drawn for the design of programming languages for aerospace applications, and ongoing work to identify a verifiable subset of HAL/S is described.
Flipping one-shot library instruction: using Canvas and Pecha Kucha for peer teaching*†
Carroll, Alexander J.; Tchangalova, Nedelina; Harrington, Eileen G.
2016-01-01
Objective This study sought to determine whether a flipped classroom that facilitated peer learning would improve undergraduate health sciences students' abilities to find, evaluate, and use appropriate evidence for research assignments. Methods Students completed online modules in a learning management system, with librarians facilitating subsequent student-directed, in-person sessions. Mixed methods assessment was used to evaluate program outcomes. Results Students learned information literacy concepts but did not consistently apply them in research assignments. Faculty interviews revealed strengthened partnerships between librarians and teaching faculty. Conclusion This pedagogy shows promise for implementing and evaluating a successful flipped information literacy program. PMID:27076799
Ehlers, Jan P; Kaap-Fröhlich, Sylvia; Mahler, Cornelia; Scherer, Theresa; Huber, Marion
2017-01-01
Background: More and more institutions worldwide and in German-speaking countries are developing and establishing interprofessional seminars in the undergraduate education of health professions. In order to evaluate the different didactic approaches and the different outcomes regarding the anticipated interprofessional competencies, it is necessary to apply appropriate instruments. Cross-cultural instruments are particularly helpful for international comparability. The Interprofessional Education working group of the German Medical Association (GMA) aims at identifying existing instruments for the evaluation of interprofessional education in order to make recommendations for German-speaking countries. Methods: A systematic literature search was performed on the websites of international interprofessional organisations (CAIPE, EIPEN, AIPEN), as well as in the PubMed and Cinahl databases. Reviews focusing on quantitative instruments to evaluate competencies according to the modified Kirkpatrick levels were sought. Psychometrics, language/country, and the setting in which each instrument was applied were recorded. Results: Six reviews out of 73 search hits were included. A large number of instruments were identified; however, their psychometrics and the settings in which they were applied were very heterogeneous. The instruments can mainly be assigned to Kirkpatrick levels 1, 2a and 2b. Most instruments have been developed in English, but their psychometrics were not always reported rigorously. Only very few instruments are available in German. Conclusion: It is difficult to find appropriate instruments in German. Internationally, there are different approaches to and objectives in the measurement and evaluation of interprofessional competencies. The question arises whether it makes sense to translate existing instruments or to go through the lengthy process of developing new ones.
The evaluation of interprofessional seminars with quantitative instruments remains mainly on Kirkpatrick levels 1 and 2. Levels 3 and 4 can probably only be assessed with qualitative or mixed methods. German language instruments are necessary.
Freckmann, Guido; Baumstark, Annette; Schmid, Christina; Pleus, Stefan; Link, Manuela; Haug, Cornelia
2014-02-01
Systems for self-monitoring of blood glucose (SMBG) have to provide accurate and reproducible blood glucose (BG) values in order to ensure adequate therapeutic decisions by people with diabetes. Twelve SMBG systems were compared in a standardized manner under controlled laboratory conditions: nine systems were available on the German market and were purchased from a local pharmacy, and three systems were obtained from the manufacturer (two systems were available on the U.S. market, and one system was not yet introduced to the German market). System accuracy was evaluated following DIN EN ISO (International Organization for Standardization) 15197:2003. In addition, measurement reproducibility was assessed following a modified TNO (Netherlands Organization for Applied Scientific Research) procedure. Comparison measurements were performed with either the glucose oxidase method (YSI 2300 STAT Plus™ glucose analyzer; YSI Life Sciences, Yellow Springs, OH) or the hexokinase method (cobas(®) c111; Roche Diagnostics GmbH, Mannheim, Germany) according to the manufacturer's measurement procedure. The 12 evaluated systems showed between 71.5% and 100% of the measurement results within the required system accuracy limits. Ten systems, with the evaluated test strip lot, fulfilled the minimum accuracy requirements specified by DIN EN ISO 15197:2003. In addition, the accuracy limits of the recently published revision ISO 15197:2013 were applied, and between 54.5% and 100% of the systems' measurement results fell within the required accuracy limits. Regarding measurement reproducibility, each of the 12 tested systems met the applied performance criteria. In summary, 83% of the systems, with the evaluated test strip lot, fulfilled the minimum system accuracy requirements of DIN EN ISO 15197:2003. Each of the tested systems showed acceptable measurement reproducibility. In order to ensure sufficient measurement quality of each distributed test strip lot, regular evaluations are required.
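The ISO 15197:2013 accuracy criterion the study applies can be checked mechanically (a sketch with hypothetical paired readings; the standard also prescribes sample counts, glucose-range coverage, and lot replication not shown here):

```python
import numpy as np

def within_iso15197_2013(reference, measured):
    """Fraction of readings inside the ISO 15197:2013 system-accuracy
    limits: within +/-15 mg/dL of the comparison result below
    100 mg/dL, and within +/-15% at or above 100 mg/dL (the standard
    requires at least 95% of results to comply)."""
    ref = np.asarray(reference, dtype=float)
    err = np.abs(np.asarray(measured, dtype=float) - ref)
    limit = np.where(ref < 100.0, 15.0, 0.15 * ref)
    return (err <= limit).mean()

# Hypothetical paired comparison/meter readings in mg/dL
ref = np.array([60.0, 80.0, 150.0, 200.0])
meas = np.array([70.0, 60.0, 170.0, 240.0])
frac = within_iso15197_2013(ref, meas)
```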
A revision of the gamma-evaluation concept for the comparison of dose distributions.
Bakai, Annemarie; Alber, Markus; Nüsslin, Fridtjof
2003-11-07
A method for the quantitative four-dimensional (4D) evaluation of discrete dose data based on gradient-dependent local acceptance thresholds is presented. The method takes into account the local dose gradients of a reference distribution for critical appraisal of misalignment and collimation errors. These contribute to the maximum tolerable dose error at each evaluation point to which the local dose differences between comparison and reference data are compared. As shown, the presented concept is analogous to the gamma-concept of Low et al (1998a Med. Phys. 25 656-61) if extended to (3+1) dimensions. The pointwise dose comparisons of the reformulated concept are easier to perform and speed up the evaluation process considerably, especially for fine-grid evaluations of 3D dose distributions. The occurrences of false negative indications due to the discrete nature of the data are reduced with the method. The presented method was applied to film-measured, clinical data and compared with gamma-evaluations. 4D and 3D evaluations were performed. Comparisons prove that 4D evaluations have to be given priority, especially if complex treatment situations are verified, e.g., non-coplanar beam configurations.
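For context, the original gamma concept of Low et al (1998) that the paper reformulates can be sketched in 1D (synthetic Gaussian profiles; the paper's gradient-dependent local-threshold variant is not shown):

```python
import numpy as np

def gamma_1d(x_ref, d_ref, x_eval, d_eval, dta=3.0, dd=0.03):
    """Gamma index of Low et al (1998) in 1D: for each evaluated
    point, the minimum over all reference points of the combined
    distance-to-agreement / dose-difference metric. gamma <= 1 passes."""
    dmax = d_ref.max()
    gammas = np.empty_like(d_eval)
    for i, (xe, de) in enumerate(zip(x_eval, d_eval)):
        r2 = ((x_ref - xe) / dta) ** 2
        delta2 = ((d_ref - de) / (dd * dmax)) ** 2
        gammas[i] = np.sqrt((r2 + delta2).min())
    return gammas

# Synthetic Gaussian dose profile vs a copy misaligned by 1 mm
x = np.linspace(0.0, 100.0, 501)                  # position, mm
ref = 100.0 * np.exp(-((x - 50.0) / 15.0) ** 2)   # dose, %
ev = 100.0 * np.exp(-((x - 51.0) / 15.0) ** 2)
g = gamma_1d(x, ref, x, ev)
pass_rate = (g <= 1.0).mean()
```

A 1 mm misalignment passes comfortably under a 3 mm / 3% criterion; the pointwise reformulation in the paper avoids exactly this exhaustive search over reference points.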
Data-based adjoint and H2 optimal control of the Ginzburg-Landau equation
NASA Astrophysics Data System (ADS)
Banks, Michael; Bodony, Daniel
2017-11-01
Equation-free, reduced-order methods of control are desirable when the governing system of interest is of very high dimension or the control is to be applied to a physical experiment. Two-phase flow optimal control problems, our target application, fit these criteria. Dynamic Mode Decomposition (DMD) is a data-driven method for model reduction that can resolve the dynamics of very high dimensional systems and project them onto a smaller, more manageable basis. We evaluate the effectiveness of DMD-based forward and adjoint operator estimation when used in H2 optimal control of the linear and nonlinear Ginzburg-Landau equation. Perspectives on applying the data-driven adjoint to two-phase flow control will be given. Office of Naval Research (ONR) as part of the Multidisciplinary University Research Initiatives (MURI) Program, under Grant Number N00014-16-1-2617.
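A minimal exact-DMD sketch on rank-1 synthetic data (the abstract's adjoint estimation and H2 synthesis are well beyond this snippet):

```python
import numpy as np

def dmd(X, r):
    """Exact DMD: from a snapshot matrix X (state x time), estimate
    the rank-r linear operator with x_{k+1} ~ A x_k and return its
    eigenvalues and spatial modes."""
    X1, X2 = X[:, :-1], X[:, 1:]
    U, S, Vh = np.linalg.svd(X1, full_matrices=False)
    U, S, Vh = U[:, :r], S[:r], Vh[:r]
    Atilde = (U.conj().T @ X2 @ Vh.conj().T) / S   # projected operator
    eigvals, W = np.linalg.eig(Atilde)
    modes = ((X2 @ Vh.conj().T) / S) @ W / eigvals
    return eigvals, modes

# Rank-1 synthetic data: one decaying oscillatory mode
dt = 0.1
lam_true = np.exp((-0.05 + 2.0j) * dt)     # discrete-time eigenvalue
xgrid = np.linspace(-1.0, 1.0, 32)
X = np.outer(np.exp(-xgrid ** 2), lam_true ** np.arange(100))
eigvals, modes = dmd(X, r=1)
growth = np.log(eigvals[0]) / dt           # continuous rate sigma + i*omega
```

The recovered eigenvalue encodes the mode's growth rate and frequency, which is the low-dimensional information the control design then operates on.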
Fachi, Mariana Millan; Leonart, Letícia Paula; Cerqueira, Letícia Bonancio; Pontes, Flavia Lada Degaut; de Campos, Michel Leandro; Pontarolo, Roberto
2017-06-15
A systematic and critical review was conducted on bioanalytical methods validated to quantify combinations of antidiabetic agents in human blood. The aim of this article was to verify how the validation of bioanalytical methods is performed and to assess the quality of the published records. The validation assays were evaluated according to international guidelines. The main problems in the validation process are pointed out and discussed to help researchers choose methods that are truly reliable and can be successfully applied for their intended use. The combination of oral antidiabetic agents was chosen because these are among the most studied drugs and several methods are present in the literature. Moreover, the discussion may be applied to the validation process of any bioanalytical method. Copyright © 2017 Elsevier B.V. All rights reserved.
Performance evaluation of a mobile satellite system modem using an ALE method
NASA Technical Reports Server (NTRS)
Ohsawa, Tomoki; Iwasaki, Motoya
1990-01-01
Experimental performance of a newly designed demodulation concept is presented. This concept applies an adaptive line enhancer (ALE) to a carrier recovery circuit, which makes pull-in time significantly shorter under noisy, large carrier-offset conditions. The new demodulation concept was implemented as an INMARSAT Standard-C modem and evaluated. In the performance evaluation, a pull-in time of 50 symbols was confirmed under a 4 dB Eb/No condition.
Sanagi, M Marsin; Nasir, Zalilah; Ling, Susie Lu; Hermawan, Dadan; Ibrahim, Wan Aini Wan; Naim, Ahmedy Abu
2010-01-01
Linearity assessment as required in method validation has always been subject to different interpretations and definitions by various guidelines and protocols. However, there are very limited applicable implementation procedures that can be followed by a laboratory chemist in assessing linearity. Thus, this work proposes a simple method for linearity assessment in method validation by a regression analysis that covers experimental design, estimation of the parameters, outlier treatment, and evaluation of the assumptions according to the International Union of Pure and Applied Chemistry guidelines. The suitability of this procedure was demonstrated by its application to an in-house validation for the determination of plasticizers in plastic food packaging by GC.
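One concrete linearity check in this spirit is Mandel's fitting test, which compares linear and quadratic fits (a sketch with synthetic calibration data; the article's full IUPAC-based procedure also covers experimental design, outlier treatment, and assumption checks):

```python
import numpy as np

def mandel_test(x, y):
    """Mandel's fitting test, a common linearity check in method
    validation: compare residual sums of squares of linear and
    quadratic fits. A large F value indicates significant curvature,
    i.e. the calibration is not adequately described by a line."""
    n = len(x)
    sse1 = ((y - np.polyval(np.polyfit(x, y, 1), x)) ** 2).sum()
    sse2 = ((y - np.polyval(np.polyfit(x, y, 2), x)) ** 2).sum()
    return (sse1 - sse2) / (sse2 / (n - 3))

# Hypothetical 6-level calibration (e.g. plasticizer standards)
rng = np.random.default_rng(0)
conc = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 50.0])
linear = 3.0 * conc + 0.5 + rng.normal(0.0, 0.1, conc.size)
curved = 3.0 * conc + 0.02 * conc ** 2 + rng.normal(0.0, 0.1, conc.size)
F_linear = mandel_test(conc, linear)
F_curved = mandel_test(conc, curved)
```

The F statistic is compared against an F(1, n-3) critical value; the curved data set produces a far larger value than the truly linear one.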
Using a Linear Regression Method to Detect Outliers in IRT Common Item Equating
ERIC Educational Resources Information Center
He, Yong; Cui, Zhongmin; Fang, Yu; Chen, Hanwei
2013-01-01
Common test items play an important role in equating alternate test forms under the common item nonequivalent groups design. When the item response theory (IRT) method is applied in equating, inconsistent item parameter estimates among common items can lead to large bias in equated scores. It is prudent to evaluate inconsistency in parameter…
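A simplified version of such regression-based screening can be sketched as follows (hypothetical common-item difficulty estimates; operational procedures typically work with IRT-scale parameters and more robust flagging criteria):

```python
import numpy as np

def flag_outliers(old_params, new_params, z_crit=2.0):
    """Regress new-form common-item estimates on old-form estimates
    and flag items whose standardized residuals exceed z_crit."""
    slope, intercept = np.polyfit(old_params, new_params, 1)
    resid = new_params - (slope * old_params + intercept)
    z = (resid - resid.mean()) / resid.std(ddof=1)
    return np.abs(z) > z_crit

# Hypothetical common-item difficulties; item index 5 has drifted
old = np.array([-1.5, -1.0, -0.6, -0.2, 0.0, 0.3, 0.6, 0.9, 1.2, 1.6])
new = old + 0.1
new[5] = 1.8                      # drifted item
flags = flag_outliers(old, new)
```

Flagged items would then be removed from the anchor set before re-estimating the equating transformation.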
An Evaluation of a Computer-Based Training on the Visual Analysis of Single-Subject Data
ERIC Educational Resources Information Center
Snyder, Katie
2013-01-01
Visual analysis is the primary method of analyzing data in single-subject methodology, which is the predominant research method used in the fields of applied behavior analysis and special education. Previous research on the reliability of visual analysis suggests that judges often disagree about what constitutes an intervention effect. Considering…
Rudakov, M L
2000-01-01
The method of secondary sources (method of integral equations) was applied to calculate the specific absorbed intensity in the hands of operators working at non-shielded high-frequency (27.12 MHz) welding devices. The authors present calculations for "female" and "male" hand sizes and give recommendations on lowering the level of specific absorption.
ERIC Educational Resources Information Center
Said, Asnah; Syarif, Edy
2016-01-01
This research aimed to evaluate the design of an online tutorial program applying problem-based learning to the Research Methods course currently implemented in the Open Distance Learning (ODL) system. Students must take a Research Methods course to prepare themselves for academic writing projects. Problem-based learning basically emphasizes the process of…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-01
... following methods: Internet: Access the Federal e-rulemaking portal at http://www.regulations.gov . Follow... initial health evaluations, diagnostic and treatment services for residents, students, and others in the... rulemaking and the public will have the opportunity to consider and comment on the review methods applied in...
A Primer on Bootstrap Factor Analysis as Applied to Health Studies Research
ERIC Educational Resources Information Center
Lu, Wenhua; Miao, Jingang; McKyer, E. Lisako J.
2014-01-01
Objectives: To demonstrate how the bootstrap method could be conducted in exploratory factor analysis (EFA) with a syntax written in SPSS. Methods: The data obtained from the Texas Childhood Obesity Prevention Policy Evaluation project (T-COPPE project) were used for illustration. A 5-step procedure to conduct bootstrap factor analysis (BFA) was…
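The resampling logic behind bootstrap factor analysis can be sketched independently of the factor extraction itself. In the sketch below a plain mean stands in for a factor loading, since a full EFA routine is beyond a short example; the data are invented and the paper's SPSS syntax is not reproduced.

```python
import random

def bootstrap_ci(data, stat, n_boot=2000, alpha=0.05, seed=42):
    """Percentile bootstrap confidence interval for stat(data)."""
    rng = random.Random(seed)
    boots = []
    for _ in range(n_boot):
        resample = [rng.choice(data) for _ in data]  # sample cases with replacement
        boots.append(stat(resample))
    boots.sort()
    return (boots[int(n_boot * alpha / 2)],
            boots[int(n_boot * (1 - alpha / 2)) - 1])

scores = [2.1, 3.4, 2.9, 3.8, 3.1, 2.5, 3.6, 2.8, 3.3, 3.0]
mean = lambda s: sum(s) / len(s)
lo, hi = bootstrap_ci(scores, mean)
print(f"point estimate {mean(scores):.2f}, 95% bootstrap CI [{lo:.2f}, {hi:.2f}]")
```

In BFA proper, `stat` would rerun the factor extraction on each resample and collect the loadings, but the resample-recompute-percentile loop is the same.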
What Is Design-Based Causal Inference for RCTs and Why Should I Use It? NCEE 2017-4025
ERIC Educational Resources Information Center
Schochet, Peter Z.
2017-01-01
Design-based methods have recently been developed as a way to analyze data from impact evaluations of interventions, programs, and policies. The impact estimators are derived using the building blocks of experimental designs with minimal assumptions, and have good statistical properties. The methods apply to randomized controlled trials (RCTs) and…
ERIC Educational Resources Information Center
Sopromadze, Natia; Moorosi, Pontso
2017-01-01
The paper aims to demonstrate the value of cognitive interviewing (CI) as a survey pretesting method in comparative education research. Although rarely used by education researchers, CI has been successfully applied in different disciplines to evaluate and improve question performance. The method assumes that observing people's thought processes…
NASA Astrophysics Data System (ADS)
Zhao, Huifu; Chen, Yu; Liu, Dongmei
2017-08-01
There is a saying that "the teacher proselytizes, instructs, and dispels doubt." Traditional teaching methods have students repeatedly study material in pursuit of a solid grasp of knowledge, and then assess the teaching result by evaluating how much knowledge has been memorized. This approach cannot mobilize students' enthusiasm for learning and hinders the development of their innovative thinking; moreover, such assessment results have little practical significance, being decoupled from practical application. As is well known, the course of Applied Optics is built on abstract theory. If the same "duck-stuffing" (cramming) teaching methods are used for this course, students' learning initiative cannot be mobilized, and their results suffer from the passive acceptance of knowledge. How can the initiative to acquire knowledge in class be handed over to the students, their initiative and potential be fully mobilized, and the evaluation content be given more practical significance? Scholars continue to innovate teaching methods, as well as teaching evaluation indicators, in pursuit of the teaching effect that best promotes the development of students. Therefore, this paper puts forward a teaching evaluation model for autonomous teaching. In this so-called "autonomous teaching", the teacher poses questions or assigns tasks before the lesson; the students complete the learning content in groups through discussion and finish the assigned tasks; the teacher then reviews the students' learning achievements and answers questions. Every task is designed to evaluate the effectiveness of teaching.
Every lesson is combined with progress at the scientific and technological frontier of Applied Optics, so that students understand the relationship between research and future application. This mobilizes students' interest in learning, trains their abilities, and teaches them to explore on their own initiative and to cooperate in teams. Each evaluation also has practical significance, turning teaching into active learning, cultivating students' creative potential, and laying a deep, solid foundation for later study and work.
Wang, Penghao; Wilson, Susan R
2013-01-01
Mass spectrometry-based protein identification is a very challenging task. The main identification approaches are de novo sequencing and database searching. Both approaches have shortcomings, so an integrative approach has been developed. The integrative approach first infers partial peptide sequences, known as tags, directly from tandem spectra through de novo sequencing, and then feeds these sequences into a database search to see if a close peptide match can be found. However, the current implementation of this integrative approach has several limitations. Firstly, simplistic de novo sequencing is applied and only very short sequence tags are used. Secondly, most integrative methods apply an algorithm similar to BLAST to search for exact sequence matches and do not accommodate sequence errors well. Thirdly, with these methods the integrated de novo sequencing makes a limited contribution to the scoring model, which is still largely based on database searching. We have developed a new integrative protein identification method which integrates de novo sequencing more efficiently into database searching. Evaluated on large real datasets, our method outperforms popular identification methods.
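The tag-lookup step of such an integrative approach can be caricatured as a substring scan that tolerates a bounded number of mismatches, in contrast to exact-match seeding. The protein database, tag, and mismatch budget below are invented for illustration; real tools score matches with mass and error models rather than plain character comparison.

```python
# Toy version of the tag-based lookup step: scan a protein database for
# a de novo sequence tag, tolerating a bounded number of mismatches.

def tag_hits(tag, proteins, max_mismatch=1):
    """Return (protein_index, offset) pairs where the tag aligns with
    at most max_mismatch character mismatches."""
    hits = []
    for p_idx, seq in enumerate(proteins):
        for off in range(len(seq) - len(tag) + 1):
            mism = sum(1 for a, b in zip(tag, seq[off:off + len(tag)]) if a != b)
            if mism <= max_mismatch:
                hits.append((p_idx, off))
    return hits

db = ["MKTAYIAKQR", "GGSLAVPEPT", "MKSAYIAKQL"]
print(tag_hits("TAYIA", db))   # exact hit in db[0], 1-mismatch hit in db[2]
```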
Nonprobability and probability-based sampling strategies in sexual science.
Catania, Joseph A; Dolcini, M Margaret; Orellana, Roberto; Narayanan, Vasudah
2015-01-01
With few exceptions, much of sexual science builds upon data from opportunistic nonprobability samples of limited generalizability. Although probability-based studies are considered the gold standard in terms of generalizability, they are costly to apply to many of the hard-to-reach populations of interest to sexologists. The present article discusses recent conclusions by sampling experts that are relevant to the case for nonprobability methods in sexual science. In this regard, we provide an overview of Internet sampling as a useful, cost-efficient, nonprobability sampling method of value to sex researchers conducting modeling work or clinical trials. We also argue that probability-based sampling methods may be more readily applied in sex research with hard-to-reach populations than is typically thought. In this context, we provide three case studies that utilize qualitative and quantitative techniques directed at reducing limitations in applying probability-based sampling to hard-to-reach populations: indigenous Peruvians, African American youth, and urban men who have sex with men (MSM). Recommendations are made with regard to presampling studies, adaptive and disproportionate sampling methods, and strategies that may be utilized in evaluating nonprobability and probability-based sampling methods.
Parameter identification for structural dynamics based on interval analysis algorithm
NASA Astrophysics Data System (ADS)
Yang, Chen; Lu, Zixing; Yang, Zhenyu; Liang, Ke
2018-04-01
A parameter identification method using an interval analysis algorithm for structural dynamics is presented in this paper. The proposed uncertainty identification method is investigated using the central difference method and an ARMA system. With the help of the fixed-memory least squares method and the matrix inversion lemma, a set-membership identification technique is applied to obtain the best estimate of the identified parameters in a tight and accurate region. To cope with the lack of sufficient statistical descriptions of the uncertain parameters, this paper treats uncertainties as non-probabilistic intervals. As long as the bounds of the uncertainties are known, the algorithm can obtain not only the center estimates of the parameters but also the bounds of the errors. To improve the efficiency of the proposed method, a time-saving algorithm based on a recursive formula is presented. Finally, to verify the accuracy of the proposed method, two numerical examples are evaluated using three identification criteria.
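The non-probabilistic treatment of uncertainty rests on ordinary interval arithmetic: bounds are propagated instead of probability distributions, and every result carries a center estimate and an error radius. A minimal sketch with invented bounds (not the paper's set-membership identification algorithm itself):

```python
class Interval:
    """Closed interval [lo, hi] with the basic arithmetic needed to
    propagate bounded (non-probabilistic) parameter uncertainty."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)
    def __mul__(self, other):
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))
    @property
    def center(self):
        return (self.lo + self.hi) / 2
    @property
    def radius(self):
        return (self.hi - self.lo) / 2

# Invented example: stiffness k in [950, 1050] N/m, displacement x in
# [0.009, 0.011] m; the restoring force k*x then has guaranteed bounds.
k = Interval(950.0, 1050.0)
x = Interval(0.009, 0.011)
force = k * x
print(force.lo, force.hi, force.center, force.radius)
```

The center/radius pair mirrors the abstract's "center estimates of parameters" plus "bounds of errors".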
NASA Astrophysics Data System (ADS)
Jankowiak, Iwona; Madaj, Arkadiusz
2017-12-01
One of the methods to increase the load carrying capacity of a reinforced concrete (RC) structure is strengthening with carbon fiber reinforced polymer (CFRP) strips. There are two strengthening methods using CFRP strips: the passive method and the active method. In the passive method a strip is applied to the concrete surface without initial strain, whereas in the active method the strip is pretensioned before application. In the case of a steel-concrete composite beam, strips may be used to strengthen the concrete slab located in the tension zone (in the parts of beams with negative bending moments). A finite element model has been developed and validated by experimental tests to evaluate the strengthening efficiency of a composite girder with pretensioned CFRP strips applied to the concrete slab in its tension zone.
Evaluation of Hamaker coefficients using Diffusion Monte Carlo method
NASA Astrophysics Data System (ADS)
Maezono, Ryo; Hongo, Kenta
We evaluated the Hamaker constant for cyclohexasilane, which is used as an ink of 'liquid silicon' in printed electronics, to investigate its wettability. Taking three representative geometries of the dimer coalescence (parallel, lined, and T-shaped), we evaluated the binding curves using the diffusion Monte Carlo method. The parallel geometry gave the most long-ranged asymptotic behavior, ~1/r^6. The evaluated binding lengths are fairly consistent with the experimental density of the molecule. Fitting the asymptotic curve gave an estimate of the Hamaker constant of around 100 zJ. We also performed a CCSD(T) evaluation and obtained a similar result. As a check, we applied the same scheme to benzene and compared the estimate with those from other established methods, Lifshitz theory and SAPT (symmetry-adapted perturbation theory). The result from the fitting scheme turned out to be about twice as large as those from Lifshitz theory and SAPT, which coincide with each other. It is hence implied that the present evaluation for cyclohexasilane may be overestimated.
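The core of such a fitting scheme, extracting a dispersion coefficient from the long-range tail of a binding curve, can be sketched on synthetic data. Here exact -C6/r^6 values stand in for quantum Monte Carlo energies, and the fit is a least-squares line in log-log space; the connection from C6 to a Hamaker constant is not reproduced.

```python
import math

def fit_c6(rs, energies):
    """Least-squares fit of log(-E) = log(C6) - p*log(r);
    returns (C6, p). For a van der Waals tail p should be near 6."""
    xs = [math.log(r) for r in rs]
    ys = [math.log(-e) for e in energies]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    c6 = math.exp(my - slope * mx)
    return c6, -slope

rs = [6.0, 7.0, 8.0, 9.0, 10.0]
energies = [-100.0 / r ** 6 for r in rs]   # exact -C6/r^6 tail with C6 = 100
c6, exponent = fit_c6(rs, energies)
print(round(c6, 6), round(exponent, 6))
```

On noisy Monte Carlo energies the fitted exponent drifts from 6, which is why the abstract reports the recovered exponent alongside the constant.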
Schumacher, I; Zechmeister, I
2012-04-01
In Austria, research in Health Technology Assessment (HTA) has been conducted since the 1990s. HTA research aims at supporting an adequate and efficient use of health care resources in order to sustain a publicly financed, solidarity-based health care system; ultimately, it should result in better health of the population, and its results should provide independent information for decision makers. To legitimize further research resources, to prioritize future HTA research and to guarantee the value of future research, HTA research itself needs to undergo evaluation. The aim of this study is to design a conceptual framework for evaluating the impact of HTA research in Austria on the basis of the existing literature. An existing review presenting methods and concepts for evaluating HTA impact was updated by a systematic search covering the literature from 2004 to January 2010. Results were analysed with regard to 4 categories: definition of the term impact, target groups and system levels, operationalisation of indicators, and evaluation methods. Overall, 19 publications were included. With respect to the 4 categories, an explanation of impact has to take HTA's multidisciplinary setting into account and needs a context-related definition; target groups, system levels, indicators and methods depend on the impact defined. The studies investigated direct and indirect impact and focused on different target groups, such as physicians, nurses and decision makers on the micro and meso levels, as well as politicians and reimbursement institutions on the macro level. Except for one reference, all studies applied already known and mostly qualitative methods for measuring the impact of HTA research; thus, an appropriate pool of instruments seems to be available. There is, however, a lack of information about the validity of the applied methods and indicators.
By adapting adequate methods and concepts, a conceptual framework for the Austrian HTA impact evaluation has been designed. The paper presents an overview of existing methods for the evaluation of HTA research, which was used to identify useful approaches for measuring HTA impact in Austria. By providing a context-sensitive framework for impact evaluation, Austrian HTA research contributes to the international trend of impact evaluation. © Georg Thieme Verlag KG Stuttgart · New York.
Reeves, Anthony P; Xie, Yiting; Liu, Shuang
2017-04-01
With the advent of fully automated image analysis and modern machine learning methods, there is a need for very large image datasets having documented segmentations for both computer algorithm training and evaluation. This paper presents a method and implementation for facilitating such datasets that addresses the critical issue of size scaling for algorithm validation and evaluation; current evaluation methods that are usually used in academic studies do not scale to large datasets. This method includes protocols for the documentation of many regions in very large image datasets; the documentation may be incrementally updated by new image data and by improved algorithm outcomes. This method has been used for 5 years in the context of chest health biomarkers from low-dose chest CT images that are now being used with increasing frequency in lung cancer screening practice. The lung scans are segmented into over 100 different anatomical regions, and the method has been applied to a dataset of over 20,000 chest CT images. Using this framework, the computer algorithms have been developed to achieve over 90% acceptable image segmentation on the complete dataset.
Health systems research training enhances workplace research skills: a qualitative evaluation.
Adams, Jolene; Schaffer, Angela; Lewin, Simon; Zwarenstein, Merrick; van der Walt, Hester
2003-01-01
In-service education is a widely used means of enhancing the skills of health service providers, for example, in undertaking research. However, the transfer of skills acquired during an education course to the workplace is seldom evaluated. The objectives of this study were to assess learner, teacher, and health service manager perceptions of the usefulness, in the work setting, of skills taught on a health systems research education course in South Africa and to assess the extent to which the course stimulated awareness and development of health systems research in the work setting. The education course was evaluated using a qualitative approach. Respondents were selected for interview using purposive sampling. Interviews were conducted with 39 respondents, including all of the major stakeholders. The interviews lasted between 20 and 60 minutes and were conducted either face to face or over the telephone. Thematic analysis was applied to the data, and key themes were identified. The course demystified health systems research and stimulated interest in reading and applying research findings. The course also changed participants' attitudes to routine data collection and was reported to have facilitated the application of informal research or problem-solving methods to everyday work situations. However, inadequate support within the workplace was a significant obstacle to applying the skills learned. A 2-week intensive, experiential course in health systems research methods can provide a mechanism for introducing basic research skills to a wide range of learners. Qualitative evaluation is a useful approach for assessing the impacts of education courses.
Roberts-Ashby, Tina; Ashby, Brandon N.
2016-01-01
This paper demonstrates geospatial modification of the USGS methodology for assessing geologic CO2 storage resources, and was applied to the Pre-Punta Gorda Composite and Dollar Bay reservoirs of the South Florida Basin. The study provides detailed evaluation of porous intervals within these reservoirs and utilizes GIS to evaluate the potential spatial distribution of reservoir parameters and volume of CO2 that can be stored. This study also shows that incorporating spatial variation of parameters using detailed and robust datasets may improve estimates of storage resources when compared to applying uniform values across the study area derived from small datasets, like many assessment methodologies. Geospatially derived estimates of storage resources presented here (Pre-Punta Gorda Composite = 105,570 MtCO2; Dollar Bay = 24,760 MtCO2) were greater than previous assessments, which was largely attributed to the fact that detailed evaluation of these reservoirs resulted in higher estimates of porosity and net-porous thickness, and areas of high porosity and thick net-porous intervals were incorporated into the model, likely increasing the calculated volume of storage space available for CO2 sequestration. The geospatial method for evaluating CO2 storage resources also provides the ability to identify areas that potentially contain higher volumes of storage resources, as well as areas that might be less favorable.
Yang, Weichao; Xu, Kui; Lian, Jijian; Bin, Lingling; Ma, Chao
2018-05-01
Flooding is a serious challenge that increasingly affects residents as well as policymakers, and flood vulnerability assessment is becoming increasingly relevant worldwide. The purpose of this study is to develop an approach that reveals the relationship between exposure, sensitivity and adaptive capacity for better flood vulnerability assessment, based on the fuzzy comprehensive evaluation method (FCEM) and the coordinated development degree model (CDDM). The approach is organized into three parts: establishment of the index system; assessment of exposure, sensitivity and adaptive capacity; and multiple flood vulnerability assessment. A hydrodynamic model and statistical data are employed to establish the index system; FCEM is used to evaluate exposure, sensitivity and adaptive capacity; and CDDM is applied to express the relationship of the three components of vulnerability. Six multiple flood vulnerability types and four levels are proposed to assess flood vulnerability from multiple perspectives. The approach is then applied to assess the spatial pattern of flood vulnerability in the eastern area of Hainan, China. Based on the results of the multiple flood vulnerability assessment, a decision-making process for rational allocation of limited resources is proposed and applied to the study area. The study shows that multiple flood vulnerability assessment can evaluate vulnerability more completely and helps decision makers obtain the information needed to make decisions in a more comprehensive way. In summary, this study provides a new approach to flood vulnerability assessment and disaster prevention decision-making. Copyright © 2018 Elsevier Ltd. All rights reserved.
Dalvand, Mohammad Jafar; Mohtasebi, Seyed Saeid; Rafiee, Shahin
2014-01-01
The purpose of this article was to present a new drying method for agricultural products. Electrohydrodynamic (EHD) has been applied for drying of agricultural materials due to several advantages such as energy saving, low cost equipment, low drying temperatures, and superior material quality. To evaluate this method, an EHD dryer based on solar (photovoltaic) energy was designed and fabricated. Moreover, the optimum condition for the EHD drying of kiwi fruit was studied by applying the Box–Behnken design of response surface methodology. The desirability function was applied for optimization in case of single objective and multiobjective functions. By using the multiobjective optimization method, maximum desirability value of 0.865 was obtained based on the following: applied voltage of 15 kV, field strength of 5.2 kV cm−1, without forced air stream, and finally a combination of 17 discharge electrodes (needles). The results indicated that increasing the applied voltage from 6 to 15 kV, moisture ratio (MR) decreased, though energy efficiency and energy consumption were increasing. On the other hand, field strength of 5.2 kV cm−1 was the optimal point in terms of MR. PMID:25493195
Effective evaluation of privacy protection techniques in visible and thermal imagery
NASA Astrophysics Data System (ADS)
Nawaz, Tahir; Berg, Amanda; Ferryman, James; Ahlberg, Jörgen; Felsberg, Michael
2017-09-01
Privacy protection may be defined as replacing the original content in an image region with modified content that hides target appearance information, making the target less recognizable. The development of privacy protection techniques needs to be complemented with an established objective evaluation method to facilitate their assessment and comparison. Existing evaluation methods generally rely on subjective judgments, or assume a specific target type in the image data and use target detection and recognition accuracies to assess privacy protection. An annotation-free evaluation method that is neither subjective nor assumes a specific target type is proposed. It assesses two key aspects of privacy protection: "protection" and "utility." Protection is quantified as an appearance similarity, and utility is measured as a structural similarity between original and privacy-protected image regions. We performed extensive experimentation using six challenging datasets (comprising 12 video sequences), including a new dataset (six sequences) that contains visible and thermal imagery. The new dataset is made available online for the community. We demonstrate the effectiveness of the proposed method by evaluating six image-based privacy protection techniques, and we compare the proposed method with existing methods.
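The protection/utility split can be illustrated with toy stand-ins on one-dimensional pixel rows: protection as appearance dissimilarity (histogram overlap) and utility as preserved structure (agreement of local gradient signs). These simplified measures are chosen for brevity and are not the paper's similarity metrics; the example patches are invented.

```python
def protection_score(orig, prot):
    """1 minus normalized intensity-histogram overlap (higher = better hidden)."""
    bins = 8
    def hist(img):
        h = [0] * bins
        for v in img:
            h[min(v * bins // 256, bins - 1)] += 1
        total = sum(h)
        return [c / total for c in h]
    ho, hp = hist(orig), hist(prot)
    return 1.0 - sum(min(a, b) for a, b in zip(ho, hp))

def utility_score(orig, prot):
    """Fraction of adjacent-pixel steps whose sign (rising/falling) is preserved."""
    signs = lambda img: [(b > a) - (b < a) for a, b in zip(img, img[1:])]
    so, sp = signs(orig), signs(prot)
    return sum(1 for a, b in zip(so, sp) if a == b) / len(so)

orig = [10, 40, 90, 160, 220, 160, 90, 40, 10]
blurred = [30, 55, 95, 140, 170, 140, 95, 55, 30]  # structure kept, appearance shifted
print(round(protection_score(orig, blurred), 3), utility_score(orig, blurred))
```

A good privacy filter in this toy framing scores high on both: it changes appearance while preserving enough structure for the video to remain useful.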
Evaluation of the long-term performance of six alternative disposal methods for LLRW
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kossik, R.; Sharp, G.; Chau, T.
1995-12-31
The State of New York has carried out a comparison of six alternative disposal methods for low-level radioactive waste (LLRW). An important part of these evaluations involved quantitatively analyzing the long-term (10,000 yr) performance of the methods with respect to dose to humans, radionuclide concentrations in the environment, and cumulative release from the facility. Four near-surface methods (covered above-grade vault, uncovered above-grade vault, below-grade vault, augered holes) and two mine methods (vertical shaft mine and drift mine) were evaluated. Each method was analyzed for several generic site conditions applicable for the state. The evaluations were carried out using RIP (Repository Integration Program), an integrated, total system performance assessment computer code which has been applied to radioactive waste disposal facilities both in the U.S. (Yucca Mountain, WIPP) and worldwide. The evaluations indicate that mines in intact low-permeability rock and near-surface facilities with engineered covers generally have a high potential to perform well (within regulatory limits). Uncovered above-grade vaults and mines in highly fractured crystalline rock, however, have a high potential to perform poorly, exceeding regulatory limits.
Health state evaluation of shield tunnel SHM using fuzzy cluster method
NASA Astrophysics Data System (ADS)
Zhou, Fa; Zhang, Wei; Sun, Ke; Shi, Bin
2015-04-01
Shield tunnel SHM is developing rapidly, while processing massive monitoring data and quantitative health grading remain a real challenge, since multiple sensors of different types are employed in an SHM system. This paper addresses a fuzzy cluster method based on the fuzzy equivalence relationship for health evaluation in shield tunnel SHM. The method is optimized by exporting the FSV map to automatically generate the threshold value. A new holistic health score (HHS) is proposed and its effectiveness validated in a pilot test. A case study on the Nanjing Yangtze River Tunnel is presented to apply this method. Three types of indicators, namely soil pressure, pore pressure and steel strain, were used to develop the evaluation set U. The clustering results were verified by analyzing the engineering geological conditions; the applicability and validity of the proposed method were also demonstrated. In addition, the advantage of multi-factor evaluation over a single-factor model is discussed using the proposed HHS. This investigation indicates that the fuzzy cluster method and HHS are capable of characterizing the fuzziness of tunnel health and help clarify uncertainties in tunnel health evaluation.
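Fuzzy clustering via a fuzzy equivalence relation can be sketched as a max-min transitive closure of a similarity matrix followed by a lambda-cut. The similarity matrix and threshold below are invented, and the paper's FSV-based automatic threshold selection is not reproduced.

```python
def maxmin_compose(a, b):
    """Max-min composition of two square fuzzy relations."""
    n = len(a)
    return [[max(min(a[i][k], b[k][j]) for k in range(n))
             for j in range(n)] for i in range(n)]

def transitive_closure(r):
    """Iterate r -> r o r until the fuzzy relation is transitive."""
    while True:
        r2 = maxmin_compose(r, r)
        if r2 == r:
            return r
        r = r2

def lambda_cut(r, lam):
    """Group indices whose closure similarity is at least lam."""
    n, clusters, assigned = len(r), [], set()
    for i in range(n):
        if i in assigned:
            continue
        group = [j for j in range(n) if r[i][j] >= lam]
        clusters.append(group)
        assigned.update(group)
    return clusters

# Invented similarity matrix for four monitoring points:
sim = [[1.0, 0.8, 0.3, 0.2],
       [0.8, 1.0, 0.4, 0.1],
       [0.3, 0.4, 1.0, 0.9],
       [0.2, 0.1, 0.9, 1.0]]
closure = transitive_closure(sim)
print(lambda_cut(closure, 0.8))   # -> [[0, 1], [2, 3]]
```

Sweeping lambda from 1 down to 0 merges clusters progressively, which is what a dynamic cluster (FSV) map visualizes.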
NASA Astrophysics Data System (ADS)
Bergese, P.; Bontempi, E.; Depero, L. E.
2006-10-01
X-ray reflectivity (XRR) is a non-destructive, accurate and fast technique for evaluating film density. However, sample-goniometer alignment is a critical experimental factor and the dominant error source in XRR density determination. With commercial single-wavelength X-ray reflectometers, alignment is difficult to control and strongly depends on the operator. In the present work, the contribution of misalignment to the density evaluation error is discussed, and a novel procedure (named the XRR-density evaluation, or XRR-DE, method) to minimize the problem is presented. The method bypasses the alignment step by extrapolating the correct density value from appropriate non-specular XRR data sets. This procedure is operator independent and suitable for commercial single-wavelength X-ray reflectometers. To test the XRR-DE method, single crystals of TiO2 and SrTiO3 were used. In both cases the determined densities differed from the nominal ones by less than 5.5%. Thus, the XRR-DE method can be successfully applied to evaluate the density of thin films for which only optical reflectivity is currently used. A further advantage is that the method can be considered thickness independent.
2014-12-26
additive value function, which assumes mutual preferential independence (Gregory S. Parnell, 2013). In other words, this method can be used if the... additive value function method to calculate the aggregate value of multiple objectives. Step 9 : Sensitivity Analysis Once the global values are...gravity metric, the additive method will be applied using equal weights for each axis value function. Pilot Satisfaction (Usability) As expressed
NASA Technical Reports Server (NTRS)
Su, Shin-Yi; Kessler, Donald J.
1991-01-01
The present study examines a very fast method of calculating the collision frequency between two low-eccentricity orbiting bodies for evaluating the evolution of earth-orbiting objects such as space debris. The results are very accurate and the required computer time is negligible. The method is now applied without modification to calculate the collision frequencies for moderately and highly eccentric orbits.
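For orientation, the standard kinetic-gas approximation that such collision-frequency methods refine treats the collision rate as spatial density x relative velocity x collision cross-section. The sketch below uses this textbook form with invented numbers; it is not the fast low-eccentricity algorithm of the abstract.

```python
import math

def collision_rate(spatial_density, rel_velocity_km_s, radius_m):
    """Mean collisions per second for one object (kinetic-gas approximation).

    spatial_density   : objects per km^3 in the altitude shell
    rel_velocity_km_s : average relative velocity, km/s
    radius_m          : combined collision radius of the two bodies, metres
    """
    cross_section_km2 = math.pi * (radius_m / 1000.0) ** 2
    return spatial_density * rel_velocity_km_s * cross_section_km2

# Invented example: 1e-8 objects/km^3, 10 km/s, 2 m combined radius.
rate = collision_rate(1e-8, 10.0, 2.0)
per_year = rate * 365.25 * 24 * 3600
print(f"{per_year:.2e} collisions per year")
```

Methods like the one in the abstract effectively compute how two specific orbits share volume and relative velocity, rather than assuming a uniform gas.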
Shimasaki, Noriko; Hara, Masayuki; Kikuno, Ritsuko; Shinohara, Katsuaki
2016-01-01
To prevent nosocomial infections caused by pathogens such as Ebola virus or methicillin-resistant Staphylococcus aureus (MRSA), healthcare workers must wear appropriate protective clothing that can inhibit contact transmission of these pathogens. It is therefore necessary to evaluate the penetration resistance of protective clothing against infectious agents. In Japan, standard methods have been established to evaluate the penetration resistance of protective clothing fabric materials under applied pressure. However, these methods only roughly classify the penetration resistance of fabrics, and neither their detection sensitivity nor the relationship between the amounts of penetrating blood and pathogen has been studied in detail. Moreover, no standard method using bacteria for evaluation is known. Here, to evaluate the penetration resistance of protective clothing materials under applied pressure, the detection sensitivity and leak amount were investigated using synthetic blood containing bacteriophage phi-X174 or S. aureus, and the volume of leaked synthetic blood and the amount of test microbe penetration were quantified simultaneously. Our results showed that the penetration detection sensitivity achieved using a test microbial culture was higher than that achieved using synthetic blood at pressures producing invisible leaks. This finding suggests a potential risk of pathogen penetration even when no visible leak of contaminated blood through the protective clothing is observed. Moreover, at pressures producing visible leaks, the amount of test microbe penetration was found to vary at least ten-fold among protective clothing materials classified into the same penetration-resistance class.
Analysis of the penetration amounts revealed a significant correlation between the volume of penetrated synthetic blood and the amount of test microbe penetration, indicating that the leaked volume of synthetic blood can be considered a latent indicator of infection risk, in that the amount of exposure to contaminated blood corresponds to the risk of infection. Our study helped us ascertain, with high sensitivity, the differences among fabric materials with respect to their protective performance, which may facilitate effective selection of protective clothing based on risk assessment.
Development of a stiffness-angle law for simplifying the measurement of human hair stiffness.
Jung, I K; Park, S C; Lee, Y R; Bin, S A; Hong, Y D; Eun, D; Lee, J H; Roh, Y S; Kim, B M
2018-04-01
This research examines the effect of caffeine absorption on hair stiffness. To test hair stiffness, we have developed an evaluation method that is not only accurate but also inexpensive. Our evaluation method culminated in a model, called the Stiffness-Angle Law, which describes the elastic properties of hair and can be widely applied to the development of hair care products. Small molecules (≤500 g mol⁻¹) such as caffeine can be absorbed into hair. A common shampoo containing 4% caffeine was formulated and applied to hair 10 times, after which the hair stiffness was measured. The caffeine absorption of the treated hair was observed using Fourier-transform infrared spectroscopy (FTIR) with a focal plane array (FPA) detector. Our evaluation method for measuring hair stiffness consists of a regular camera and a support for single strands of hair. After attaching the hair to the support, the bending angle of the hair was observed with the camera and measured; the hair strand was then weighed. The stiffness of the hair was calculated from our proposed Stiffness-Angle Law using three variables: the bending angle, the weight of the hair, and the distance the hair was pulled across the support. The caffeine absorption was confirmed by FTIR analysis: the concentration of amide bonds in the hair clearly increased due to caffeine absorption. After caffeine was absorbed into the hair, the bending angle and weight of the hair changed. Applying these measured changes to the Stiffness-Angle Law, it was confirmed that the hair stiffness increased by 13.2% due to caffeine absorption. The theoretical results using the Stiffness-Angle Law agree with visual examinations of hair exposed to caffeine and with the known results of hair stiffness from a previous report. Our evaluation method, combined with the proposed Stiffness-Angle Law, provides an accurate and inexpensive technique for measuring the bending stiffness of human hair.
© 2018 Society of Cosmetic Scientists and the Société Française de Cosmétologie.
Seismic data fusion anomaly detection
NASA Astrophysics Data System (ADS)
Harrity, Kyle; Blasch, Erik; Alford, Mark; Ezekiel, Soundararajan; Ferris, David
2014-06-01
Detecting anomalies in non-stationary signals has valuable applications in many fields, including medicine and meteorology, such as identifying possible heart conditions from electrocardiography (ECG) signals or predicting earthquakes from seismographic data. Given the many available anomaly detection algorithms, it is important to compare candidate methods. In this paper, we examine and compare two approaches to anomaly detection and assess how data fusion methods may improve performance. The first approach uses an artificial neural network (ANN) to detect anomalies in a wavelet de-noised signal. The second uses a perspective neural network (PNN) to analyze an arbitrary number of "perspectives", or transformations, of the observed signal for anomalies. Possible perspectives include wavelet de-noising, the Fourier transform, peak filtering, etc. To evaluate these techniques with signal fusion metrics, we first apply signal preprocessing such as de-noising to the original signal and then use a neural network to find anomalies in the processed signal. From this secondary result, data fusion techniques can be applied and evaluated with existing data fusion metrics for single and multiple perspectives. The results show which anomaly detection method, according to the metrics, is better suited overall for anomaly detection applications. The method used in this study could be applied to compare other signal processing algorithms.
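The de-noise-then-detect pipeline described above can be illustrated with a minimal numpy sketch. The moving-average filter and z-score test below are simple stand-ins for the paper's wavelet de-noising and neural-network stages; all names and parameters are illustrative, not taken from the paper.

```python
import numpy as np

# Stand-in pipeline: a moving-average filter plays the role of wavelet
# de-noising, and a z-score test on the residual (what the smoother removed)
# plays the role of the neural-network anomaly detector.
def moving_average(signal, window=5):
    kernel = np.ones(window) / window
    return np.convolve(signal, kernel, mode="same")

def residual_anomalies(signal, window=5, threshold=3.0):
    residual = signal - moving_average(signal, window)
    z = (residual - residual.mean()) / residual.std()
    return np.flatnonzero(np.abs(z) > threshold)

rng = np.random.default_rng(0)
x = np.sin(np.linspace(0, 20, 1000)) + 0.1 * rng.standard_normal(1000)
x[500] += 5.0                      # inject a spike anomaly
flagged = residual_anomalies(x)    # flags the neighborhood of index 500
```

Swapping in a real wavelet de-noiser or a trained network only changes the two helper functions; the fusion step would then compare `flagged` sets produced from different "perspectives" of the same signal.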
Generalized Gilat-Raubenheimer method for density-of-states calculation in photonic crystals
NASA Astrophysics Data System (ADS)
Liu, Boyuan; Johnson, Steven G.; Joannopoulos, John D.; Lu, Ling
2018-04-01
An efficient numerical algorithm is key to accurate evaluation of the density of states (DOS) in band theory. The Gilat-Raubenheimer (GR) method, proposed in 1966, is an efficient linear extrapolation method but was limited to specific lattices. Here, using an affine transformation, we provide a new generalization of the original GR method to any Bravais lattice and show that it is superior to the tetrahedron method and the adaptive Gaussian broadening method. Finally, we apply our generalized GR method to compute the DOS of various gyroid photonic crystals with topological degeneracies.
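For contrast with the linear-extrapolation GR scheme, the Gaussian-broadening baseline it is compared against can be sketched in a few lines: broaden each band frequency sampled on a k-grid by a fixed-width Gaussian and sum. The toy band and all parameters below are illustrative, not taken from the paper.

```python
import numpy as np

def dos_gaussian(frequencies, omega_grid, sigma):
    """DOS by fixed-width Gaussian broadening of sampled band frequencies.
    frequencies: eigenfrequencies from a k-point grid; omega_grid: evaluation points."""
    diff = omega_grid[:, None] - frequencies[None, :]
    weights = np.exp(-0.5 * (diff / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    return weights.sum(axis=1) / frequencies.size

# Toy free-photon-like band omega = |k|, sampled on a cubic k-grid.
k = np.linspace(-1, 1, 30)
kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
freqs = np.sqrt(kx**2 + ky**2 + kz**2).ravel()
omega = np.linspace(0, 1, 50)
dos = dos_gaussian(freqs, omega, sigma=0.05)
# for omega = |k| the exact DOS grows roughly as omega**2 inside the sphere
```

The GR method replaces this smearing with a linear extrapolation of the band inside each grid cell, which is what makes it converge faster than broadening at the same grid resolution.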
Aggregation in Network Models for Transportation Planning
DOT National Transportation Integrated Search
1978-02-01
This report documents research performed on techniques of aggregation applied to network models used in transportation planning. The central objective of this research has been to identify, extend, and evaluate methods of aggregation so as to improve...
Guided Learning Applied to Optical Mineralogy
ERIC Educational Resources Information Center
Driver, S. C.; Hunter, W. R.
1975-01-01
Describes an individual programmed study method used in a second year Geology course at the University of Melbourne. Outlines the criteria that make this instructional style useful and presents the student questionnaire used to evaluate the course. (GS)
Fairchild, Amanda J.; McQuillin, Samuel D.
2017-01-01
Third variable effects elucidate the relation between two other variables, and can describe why they are related or under what conditions they are related. This article demonstrates methods to analyze two third-variable effects: moderation and mediation. The utility of examining moderation and mediation effects in school psychology is described and current use of the analyses in applied school psychology research is reviewed and evaluated. Proper statistical methods to test the effects are presented, and different effect size measures for the models are provided. Extensions of the basic moderator and mediator models are also described. PMID:20006988
DOE Office of Scientific and Technical Information (OSTI.GOV)
Skiles, S. K.
1994-12-22
An inductive double-contingency analysis (DCA) method, developed by the criticality safety function at the Savannah River Site, was applied in Criticality Safety Evaluations (CSEs) of five major plant process systems at the Westinghouse Electric Corporation's Commercial Nuclear Fuel Manufacturing Plant in Columbia, South Carolina (WEC-Cola.). The method emphasizes a thorough evaluation of the controls intended to provide barriers against criticality for postulated initiating events, and has been demonstrated effective at identifying common-mode failure potential and interdependence among multiple controls. A description of the method and an example of its application are provided.
Improved profiling of estrogen metabolites by orbitrap LC/MS
Li, Xingnan; Franke, Adrian A.
2015-01-01
Estrogen metabolites are important biomarkers for evaluating cancer risk and metabolic diseases. Due to their low physiological levels, a sensitive and accurate method is required, especially for quantifying the unconjugated forms of endogenous steroids and their metabolites in humans. Here, we evaluated various derivatives of estrogens for improved analysis by orbitrap LC/MS in human serum samples. A new chemical derivatization reagent was applied to modify phenolic steroids into 1-methylimidazole-2-sulfonyl adducts. The method significantly improves sensitivity, by 2-100 fold in full-scan MS and targeted selected ion monitoring MS, over other derivatization methods including dansyl, picolinoyl, and pyridine-3-sulfonyl products. PMID:25543003
NASA Technical Reports Server (NTRS)
Fehrman, A. L.; Masek, R. V.
1972-01-01
Quantitative estimates of the uncertainty in predicting aerodynamic heating rates for a fully reusable space shuttle system are developed and the impact of these uncertainties on Thermal Protection System (TPS) weight are discussed. The study approach consisted of statistical evaluations of the scatter of heating data on shuttle configurations about state-of-the-art heating prediction methods to define the uncertainty in these heating predictions. The uncertainties were then applied as heating rate increments to the nominal predicted heating rate to define the uncertainty in TPS weight. Separate evaluations were made for the booster and orbiter, for trajectories which included boost through reentry and touchdown. For purposes of analysis, the vehicle configuration is divided into areas in which a given prediction method is expected to apply, and separate uncertainty factors and corresponding uncertainty in TPS weight derived for each area.
A trust region approach with multivariate Padé model for optimal circuit design
NASA Astrophysics Data System (ADS)
Abdel-Malek, Hany L.; Ebid, Shaimaa E. K.; Mohamed, Ahmed S. A.
2017-11-01
Since the optimization process requires a significant number of consecutive function evaluations, it is recommended to replace the function by an easily evaluated approximation model during the optimization process. The model suggested in this article is based on a multivariate Padé approximation. This model is constructed using data points of ?, where ? is the number of parameters. The model is updated over a sequence of trust regions. This model avoids the slow convergence of linear models of ? and has features of quadratic models that need interpolation data points of ?. The proposed approach is tested by applying it to several benchmark problems. Yield optimization using such a direct method is applied to some practical circuit examples. Minimax solution leads to a suitable initial point to carry out the yield optimization process. The yield is optimized by the proposed derivative-free method for active and passive filter examples.
Nondestructive Evaluation of Carbon Fiber Bicycle Frames Using Infrared Thermography
Ibarra-Castanedo, Clemente; Klein, Matthieu; Maldague, Xavier; Sanchez-Beato, Alvaro
2017-01-01
Bicycle frames made of carbon fibre are extremely popular for high-performance cycling due to their stiffness-to-weight ratio, which enables greater power transfer. However, products manufactured from carbon fibre are sensitive to impact damage, so intelligent nondestructive evaluation is a required step to prevent failures and ensure safe use of the bicycle. This work proposes an inspection method based on active thermography, a proven technique successfully applied to other materials. Different inspection configurations are tested, varying power and heating time, and experiments are performed on a real bicycle frame with impact damage generated at different energies. Tests show excellent results, detecting the generated damage during the inspection. When the results are combined with advanced image post-processing methods, the signal-to-noise ratio (SNR) is greatly increased, and the size and location of the defects are clearly visible in the images. PMID:29156650
Costing evidence for health care decision-making in Austria: A systematic review
Mayer, Susanne; Kiss, Noemi; Łaszewska, Agata
2017-01-01
Background: With rising healthcare costs comes an increasing demand for evidence-informed resource allocation using economic evaluations worldwide. Furthermore, standardization of costing and reporting methods at both international and national levels is imperative to make economic evaluations a valid tool for decision-making. The aim of this review is to assess the availability and consistency of costing evidence that could be used for decision-making in Austria. It systematically describes the current economic evaluation and costing studies landscape, focusing on the applied costing methods and their reporting standards. Findings are discussed in terms of their likely impact on evidence-based decision-making and potential areas for development. Methods: A systematic literature review of English- and German-language peer-reviewed and grey literature (2004–2015) was conducted to identify Austrian economic analyses. The databases MEDLINE, EMBASE, SSCI, EconLit, NHS EED and Scopus were searched. Publication and study characteristics, costing methods, reporting standards and valuation sources were systematically synthesised and assessed. Results: A total of 93 studies were included: 87% were journal articles and 13% were reports. 41% of all studies were full economic evaluations, mostly cost-effectiveness analyses. Based on the relevant standards, the most commonly observed limitations were that 60% of the studies did not clearly state an analytical perspective, 25% did not provide the year of costing, 27% did not comprehensively list all valuation sources, and 38% did not report all applied unit costs. Conclusion: There are substantial inconsistencies in the costing methods and reporting standards of economic analyses in Austria, which may contribute to low acceptance of, and lack of interest in, economic-evaluation-informed decision-making.
To improve comparability and quality of future studies, national costing guidelines should be updated with more specific methodological guidance and a national reference cost library should be set up to allow harmonisation of valuation methods. PMID:28806728
NASA Astrophysics Data System (ADS)
Guiraldello, Rafael T.; Martins, Marcelo L.; Mancera, Paulo F. A.
2016-08-01
We present a mathematical model based on partial differential equations that is applied to understand tumor development and its response to chemotherapy. Our primary aim is to evaluate comparatively the efficacies of two chemotherapeutic protocols, Maximum Tolerated Dose (MTD) and metronomic, as well as two methods of drug delivery. Concerning therapeutic outcomes, the metronomic protocol proves more effective in prolonging the patient's life than MTD. Moreover, a uniform drug delivery method combined with the metronomic protocol is the most efficient strategy to reduce tumor density.
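A model of the kind the abstract describes can be illustrated, at a cartoon level, by a one-dimensional reaction-diffusion equation for tumor density with a constant-rate drug kill term. The explicit finite-difference scheme and every parameter below are illustrative placeholders, not the authors' actual PDE system or treatment protocols.

```python
import numpy as np

# Cartoon 1-D tumor model: density n(x,t) evolves by diffusion, logistic
# growth, and a constant-rate drug kill term (a crude stand-in for a
# continuously delivered, metronomic-style dose). All parameters invented.
def step(n, D=0.1, r=0.5, kill=0.3, dx=0.1, dt=0.001):
    lap = (np.roll(n, 1) - 2 * n + np.roll(n, -1)) / dx**2   # periodic Laplacian
    return n + dt * (D * lap + r * n * (1 - n) - kill * n)

n = np.exp(-np.linspace(-5, 5, 100) ** 2)   # initial localized tumor seed
for _ in range(5000):
    n = step(n)
total = n.sum() * 0.1                       # tumor burden after the window
```

With these toy rates the kill term lowers the logistic plateau (to 1 - kill/r), which is the qualitative sense in which a sustained low dose can hold tumor density down; comparing protocols would amount to making `kill` time-dependent.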
Evaluating a vessel for suitability for containing fluid
Barefield, II, James E.; Judge, Elizabeth J.; Le, Loan A.; Lopez, Leon N.; Beveridge, Andrew C.; Chapman, Daniel R.; Taylor, Seth T.
2017-05-30
A method for evaluating a vessel for suitability to contain a fluid includes providing a vessel and forming a polished surface portion of the vessel by removing oxidation and/or contaminants from a portion of the vessel. The method further includes applying a focused laser to the polished surface portion to form plasma on the polished surface portion, and determining whether the vessel is suitable for containing a fluid based on silicon content of the polished surface portion. The silicon content is estimated based on light emitted from the plasma.
Python package for model STructure ANalysis (pySTAN)
NASA Astrophysics Data System (ADS)
Van Hoey, Stijn; van der Kwast, Johannes; Nopens, Ingmar; Seuntjens, Piet
2013-04-01
The selection and identification of a suitable hydrological model structure is more than fitting parameters of a model structure to reproduce a measured hydrograph. The procedure is highly dependent on various criteria, i.e. the modelling objective, the characteristics and the scale of the system under investigation as well as the available data. Rigorous analysis of the candidate model structures is needed to support and objectify the selection of the most appropriate structure for a specific case (or eventually justify the use of a proposed ensemble of structures). This holds both in the situation of choosing between a limited set of different structures as well as in the framework of flexible model structures with interchangeable components. Many different methods to evaluate and analyse model structures exist. This leads to a sprawl of available methods, all characterized by different assumptions, changing conditions of application and various code implementations. Methods typically focus on optimization, sensitivity analysis or uncertainty analysis, with backgrounds from optimization, machine-learning or statistics amongst others. These methods also need an evaluation metric (objective function) to compare the model outcome with some observed data. However, for current methods described in literature, implementations are not always transparent and reproducible (if available at all). No standard procedures exist to share code and the popularity (and amount of applications) of the methods is sometimes more dependent on the availability than the merits of the method. Moreover, new implementations of existing methods are difficult to verify and the different theoretical backgrounds make it difficult for environmental scientists to decide about the usefulness of a specific method. A common and open framework with a large set of methods can support users in deciding about the most appropriate method. 
Hence, it enables users to apply and compare different methods simultaneously on a fair basis. We developed and present pySTAN (python framework for STructure ANalysis), a python package containing a set of functions for evaluating (hydrological) model structures. A selected set of algorithms for optimization, uncertainty and sensitivity analysis is currently available, together with a set of evaluation (objective) functions and input distributions to sample from. The methods are implemented in a model-independent way, and the python language provides the wrapper functions to administer external model codes. Different objective functions can be considered simultaneously, with both statistical metrics and more hydrology-specific metrics. By using reStructuredText (the sphinx documentation generator) and Python documentation strings (docstrings), generation of the manual pages is semi-automated, and a dedicated environment enhances both the readability and transparency of the code. This enables a larger group of users to apply and compare these methods and to extend the functionalities.
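pySTAN's exact API is not shown in the abstract, but an evaluation (objective) function of the kind such a package bundles, for example the Nash-Sutcliffe efficiency widely used to compare simulated and measured hydrographs, is a few lines of numpy:

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 performs no better
    than the mean of the observations, negative is worse than the mean."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum(
        (observed - observed.mean()) ** 2)

obs = np.array([1.0, 2.0, 3.0, 4.0])
print(nash_sutcliffe(obs, obs))              # 1.0 for a perfect simulation
print(nash_sutcliffe(obs, np.full(4, 2.5)))  # 0.0 for the mean predictor
```

Because such metrics take only two arrays, they are naturally model-independent: any external model code that produces a simulated series can be scored through the same wrapper.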
ERIC Educational Resources Information Center
Thiem, Alrik
2017-01-01
The search for necessary and sufficient causes of some outcome of interest, referred to as "configurational comparative research," has long been one of the main preoccupations of evaluation scholars and practitioners. However, only the last three decades have witnessed the evolution of a set of formal methods that are sufficiently…
ERIC Educational Resources Information Center
Kocakulah, Mustafa Sabri
2010-01-01
This study aims to develop and apply a rubric to evaluate the solutions of pre-service primary science teachers to questions about Newton's Laws of Motion. Two groups were taught the topic using the same teaching methods and administered four questions before and after teaching. Furthermore, 76 students in the experiment group were instructed…
ERIC Educational Resources Information Center
Kaysi, Feyzi; Bavli, Bünyamin; Gürol, Aysun
2016-01-01
The study evaluates the flight simulators course, which was opened to fulfill the intermediate-staff need of the sector. To collect data, qualitative techniques were applied; within this scope, the case study method was employed. The study group consisted of students and instructors. In-depth and focus group interviews were conducted…
Erin S. Brooks; Mariana Dobre; William J. Elliot; Joan Q. Wu; Jan Boll
2016-01-01
Forest managers need methods to evaluate the impacts of management at the watershed scale. The Water Erosion Prediction Project (WEPP) has the ability to model disturbed forested hillslopes, but has difficulty addressing some of the critical processes that are important at a watershed scale, including baseflow and water yield. In order to apply WEPP to...
ERIC Educational Resources Information Center
Dressler, William W.; Balieiro, Mauro C.; dos Santos, José Ernesto
2015-01-01
This article reports the replication after 10 years of cultural consensus analyses in four cultural domains in the city of Ribeirão Preto, Brazil. Additionally, two methods for evaluating residual agreement are applied to the data, and a new technique for evaluating how cultural knowledge is represented by residual agreement is introduced. We…
Space-frame connection for small-diameter round timber
Ronald W. Wolfe; Agron E. Gjinolli; John R. King
2000-01-01
To promote more efficient use of small-diameter timber, research efforts are being focused on the development and evaluation of connection methods that can be easily applied to non-standard round wood profiles. This report summarizes an evaluation of a "dowel-nut connection" as an option for the use of Douglas-fir peeler cores in three-dimensional truss or "space-...
ERIC Educational Resources Information Center
Fallon, Lindsay M.; Collier-Meek, Melissa A.; Maggin, Daniel M.; Sanetti, Lisa M. H.; Johnson, Austin H.
2015-01-01
Optimal levels of treatment fidelity, a critical moderator of intervention effectiveness, are often difficult to sustain in applied settings. It is unknown whether performance feedback, a widely researched method for increasing educators' treatment fidelity, is an evidence-based practice. The purpose of this review was to evaluate the current…
ERIC Educational Resources Information Center
Gallagher, Rosina Mena
This study evaluates the counseling-learning approach to foreign language instruction as compared with traditional methods in terms of language achievement and change in personal orientation and in attitude toward learning. Twelve students volunteered to learn Spanish or German under simultaneous exposure to both languages using the…
ERIC Educational Resources Information Center
Karoulis, Athanasis; Demetriadis, Stavros; Pombortsis, Andreas
2006-01-01
This paper compares several interface evaluation methods applied in the case of a computer based learning (CBL) environment, during a longitudinal study performed in three European countries, Greece, Germany, and Holland, and within the framework of an EC funded Leonardo da Vinci program. The paper firstly considers the particularities of the CBL…
Brzezińska-Wcisło, L; Bogdanowski, T; Suwała-Jurczyk, B
1990-01-01
The therapeutic results are presented for cases of basocellular epithelioma treated by three methods. The best and most radical results were obtained by the surgical method, followed in order of effectiveness by radiotherapy (45-55 kV) in a total dose of 4500-6000 R. In cases of contraindications to these methods, local chemotherapy was applied, which was associated with a high proportion of failures (28.6%).
Air powder abrasive treatment as an implant surface cleaning method: a literature review.
Tastepe, Ceylin S; van Waas, Rien; Liu, Yuelian; Wismeijer, Daniel
2012-01-01
To evaluate the air powder abrasive treatment as an implant surface cleaning method for peri-implantitis based on the existing literature. A PubMed search was conducted to find articles that reported on air powder abrasive treatment as an implant surface cleaning method for peri-implantitis. The studies evaluated cleaning efficiency and surface change as a result of the method. Furthermore, cell response toward the air powder abrasive-treated discs, reosseointegration, and clinical outcome after treatment is also reported. The PubMed search resulted in 27 articles meeting the inclusion criteria. In vitro cleaning efficiency of the method is reported to be high. The method resulted in minor surface changes on titanium specimens. Although the air powder abrasive-treated specimens showed sufficient levels of cell attachment and cell viability, the cell response decreased compared with sterile discs. Considerable reosseointegration between 39% and 46% and improved clinical parameters were reported after treatment when applied in combination with surgical treatment. The results of the treatment are influenced by the powder type used, the application time, and whether powder was applied surgically or nonsurgically. The in vivo data on air powder abrasive treatment as an implant surface cleaning method is not sufficient to draw definitive conclusions. However, in vitro results allow the clinician to consider the method as a promising option for implant surface cleaning in peri-implantitis treatment.
Gu, Zhi-rong; Wang, Ya-li; Sun, Yu-jing; Dind, Jun-xia
2014-09-01
To investigate the establishment and application of an entropy-weight TOPSIS model for comprehensive quality evaluation of traditional Chinese medicine, with Angelica sinensis grown in Gansu Province as an example. The contents of ferulic acid, 3-butylphthalide, Z-butylidenephthalide, Z-ligustilide, linolic acid, volatile oil, and ethanol-soluble extractive were used as the evaluation index set. The weight of each evaluation index was determined by the information entropy method. The entropy-weight TOPSIS model was then established to comprehensively evaluate the quality of Angelica sinensis grown in Gansu Province by Euclidean closeness degree. The results based on the established model were in line with the daodi meaning and clinical experience. The established model is simple to calculate, objective, and reliable, and can be applied to comprehensive quality evaluation of traditional Chinese medicine.
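The entropy-weight TOPSIS procedure itself is standard and can be sketched directly: entropy-derived weights score how informative each index is, and the closeness degree ranks each sample by its Euclidean distances to the ideal and anti-ideal solutions. The index matrix below is a made-up example, not the Angelica sinensis data.

```python
import numpy as np

def entropy_weight_topsis(X):
    """Rank samples (rows) on benefit criteria (columns) by entropy-weight
    TOPSIS; returns the Euclidean closeness degree of each row (0..1)."""
    X = np.asarray(X, dtype=float)
    m, n = X.shape
    # Entropy weights from column-normalized proportions (assumes entries > 0).
    P = X / X.sum(axis=0)
    E = -(P * np.log(P)).sum(axis=0) / np.log(m)
    w = (1 - E) / (1 - E).sum()
    # Weighted, vector-normalized decision matrix.
    V = w * X / np.linalg.norm(X, axis=0)
    best, worst = V.max(axis=0), V.min(axis=0)
    d_plus = np.linalg.norm(V - best, axis=1)
    d_minus = np.linalg.norm(V - worst, axis=1)
    return d_minus / (d_plus + d_minus)

# Three hypothetical samples scored on four quality indices.
scores = entropy_weight_topsis([[0.9, 0.8, 0.7, 0.9],
                                [0.5, 0.6, 0.4, 0.5],
                                [0.7, 0.9, 0.6, 0.7]])
ranking = np.argsort(scores)[::-1]   # best-quality sample first
```

In the paper's setting the rows would be Angelica sinensis samples and the columns the seven chemical indices; the sample with the highest closeness degree is judged the best overall quality.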
NASA Astrophysics Data System (ADS)
Green, David L.; Berry, Lee A.; Simpson, Adam B.; Younkin, Timothy R.
2018-04-01
We present the KINETIC-J code, a computational kernel for evaluating the linearized Vlasov equation, applied to calculating the kinetic plasma response (current) to an applied time-harmonic wave electric field. This code addresses the need for a configuration-space evaluation of the plasma current, enabling kinetic full-wave solvers for waves in hot plasmas to move beyond the limitations of traditional Fourier spectral methods. We benchmark the kernel via comparison with the standard k-space forms of the hot plasma conductivity tensor.
Apparatus for combinatorial screening of electrochemical materials
Kepler, Keith Douglas [Belmont, CA; Wang, Yu [Foster City, CA
2009-12-15
A high throughput combinatorial screening method and apparatus for the evaluation of electrochemical materials using a single voltage source (2) is disclosed wherein temperature changes arising from the application of an electrical load to a cell array (1) are used to evaluate the relative electrochemical efficiency of the materials comprising the array. The apparatus may include an array of electrochemical cells (1) that are connected to each other in parallel or in series, an electronic load (2) for applying a voltage or current to the electrochemical cells (1), and a device (3), external to the cells, for monitoring the relative temperature of each cell when the load is applied.
NASA Astrophysics Data System (ADS)
Moustafa, Azza Aziz; Salem, Hesham; Hegazy, Maha; Ali, Omnia
2015-02-01
Simple, accurate, and selective methods have been developed and validated for the simultaneous determination of a ternary mixture of Chlorpheniramine maleate (CPM), Pseudoephedrine HCl (PSE) and Ibuprofen (IBF) in tablet dosage form. Four univariate methods manipulating ratio spectra were applied: method A is the double divisor-ratio difference spectrophotometric method (DD-RD); method B is the double divisor-derivative ratio spectrophotometric method (DD-DR); method C is the derivative ratio spectrum-zero crossing method (DRZC); and method D is mean centering of ratio spectra (MCR). Two multivariate methods were also developed and validated: methods E and F are Principal Component Regression (PCR) and Partial Least Squares (PLS). The proposed methods have the advantage of determining the mentioned drugs simultaneously without prior separation steps. They were successfully applied to laboratory-prepared mixtures and to a commercial pharmaceutical preparation without any interference from additives. The proposed methods were validated according to the ICH guidelines, and the results were statistically compared with the official methods, with no significant difference observed in either accuracy or precision.
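The multivariate side of such calibrations can be illustrated with a numpy-only Principal Component Regression sketch on synthetic mixture "spectra": each spectrum is a linear combination of three made-up component bands plus noise, and PCR recovers the component concentrations. The bands, wavelengths and numbers are illustrative, not the CPM/PSE/IBF data from the paper.

```python
import numpy as np

# Synthetic three-component mixtures with Gaussian-shaped "absorption bands".
rng = np.random.default_rng(1)
wl = np.linspace(200, 400, 120)
pure = np.stack([np.exp(-((wl - c) / 15) ** 2) for c in (240, 290, 340)])
C = rng.uniform(0.1, 1.0, size=(40, 3))                   # concentrations
X = C @ pure + 0.001 * rng.standard_normal((40, 120))     # mixture spectra

# PCR: project centered spectra onto the leading principal components,
# then solve an ordinary least-squares fit in that low-dimensional space.
Xm, Cm = X.mean(axis=0), C.mean(axis=0)
U, s, Vt = np.linalg.svd(X - Xm, full_matrices=False)
k = 3                                                     # retained components
T = (X - Xm) @ Vt[:k].T                                   # scores
B, *_ = np.linalg.lstsq(T, C - Cm, rcond=None)

new = np.array([[0.5, 0.3, 0.8]]) @ pure                  # unseen mixture
pred = ((new - Xm) @ Vt[:k].T) @ B + Cm                   # recovered concentrations
```

PLS differs only in choosing latent directions that also maximize covariance with the concentrations, which typically needs fewer components when the spectra contain variance unrelated to the analytes.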
Low cost MATLAB-based pulse oximeter for deployment in research and development applications.
Shokouhian, M; Morling, R C S; Kale, I
2013-01-01
Problems such as motion artifact and the effects of ambient light have forced developers to design different signal processing techniques and algorithms to increase the reliability and accuracy of the conventional pulse oximeter. To evaluate the robustness of these techniques, they are applied either to recorded data or implemented on chip and applied to real-time data. Recorded data is the most common evaluation approach; however, it is not as reliable as real-time measurement. On the other hand, hardware implementation can be both expensive and time consuming. This paper presents a low-cost MATLAB-based pulse oximeter that can be used for rapid evaluation of newly developed signal processing techniques and algorithms. Flexibility in applying different signal processing techniques, availability of both processed and unprocessed data, and low implementation cost are the important features of this design, making it ideal for research and development purposes as well as commercial, hospital and healthcare applications.
NASA Astrophysics Data System (ADS)
Song, Bongyong; Park, Justin C.; Song, William Y.
2014-11-01
The Barzilai-Borwein (BB) 2-point step size gradient method is receiving attention for accelerating Total Variation (TV)-based CBCT reconstructions. To become truly viable for clinical applications, however, its convergence property needs to be properly addressed. We propose a novel fast-converging gradient projection BB method that requires at most one function evaluation in each iterative step. This Selective Function Evaluation method, referred to as GPBB-SFE in this paper, exhibits the desired convergence property when combined with a 'smoothed TV' or any other differentiable prior. This way, the proposed GPBB-SFE algorithm offers fast and guaranteed convergence to the desired 3D CBCT image with minimal computational complexity. We first applied the algorithm to a Shepp-Logan numerical phantom, and then to a CatPhan 600 physical phantom (The Phantom Laboratory, Salem, NY) and a clinically treated head-and-neck patient, both acquired on the TrueBeam™ system (Varian Medical Systems, Palo Alto, CA). Furthermore, we accelerated the reconstruction by implementing the algorithm on an NVIDIA GTX 480 GPU card. We first compared GPBB-SFE with three recently proposed BB-based CBCT reconstruction methods from the literature using the Shepp-Logan numerical phantom with 40 projections. GPBB-SFE shows either faster convergence speed/time or superior convergence properties compared to the existing BB-based algorithms. With the CatPhan 600 physical phantom, the GPBB-SFE algorithm requires only 3 function evaluations in 30 iterations and reconstructs an image of standard 364-projection FDK quality using only 60 projections. For the clinically treated head-and-neck patient, the algorithm requires only 18 function evaluations in 30 iterations. Compared with the FDK algorithm with 364 projections, GPBB-SFE produces a visibly equivalent quality CBCT image with only 180 projections, in 131.7 s, further supporting its clinical applicability.
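The BB 2-point step size at the heart of this family of methods is compact enough to show on a toy problem. The sketch below is plain BB gradient descent on a small quadratic; it illustrates only the step-size rule, not the full GPBB-SFE algorithm (no projection, no TV prior, no function-evaluation control), and the test problem is invented.

```python
import numpy as np

def bb_minimize(grad, x0, iters=100, alpha0=1e-3):
    """Barzilai-Borwein gradient descent: the step size is chosen from the
    last two iterates (BB1 rule), giving quasi-Newton-like acceleration
    without any line search or extra function evaluations."""
    x_prev, g_prev = x0, grad(x0)
    x = x_prev - alpha0 * g_prev          # one fixed-step start-up move
    for _ in range(iters):
        g = grad(x)
        s, y = x - x_prev, g - g_prev     # iterate and gradient differences
        if s @ y == 0:                    # converged to numerical precision
            break
        alpha = (s @ s) / (s @ y)         # BB1 step size
        x_prev, g_prev = x, g
        x = x - alpha * g
    return x

A = np.diag([1.0, 10.0, 100.0])           # ill-conditioned quadratic
b = np.ones(3)
x = bb_minimize(lambda x: A @ x - b, np.zeros(3))
# converges to the solution of A x = b, i.e. [1, 0.1, 0.01]
```

The "at most one function evaluation per step" property the paper targets comes from controlling when the objective (not just the gradient) must be evaluated to safeguard this nonmonotone iteration.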
Reliability verification of vehicle speed estimate method in forensic videos.
Kim, Jong-Hyuk; Oh, Won-Taek; Choi, Ji-Hun; Park, Jong-Chan
2018-06-01
In various types of traffic accidents, including car-to-car crashes, vehicle-pedestrian collisions, and hit-and-run accidents, driver overspeed is one of the critical issues in traffic accident analysis. Hence, analysis of vehicle speed at the moment of the accident is necessary. The present article proposes a vehicle speed estimate method (VSEM) that applies a virtual plane and a virtual reference line to a forensic video. The reliability of the VSEM was verified by comparing speeds obtained by applying the VSEM to videos of a test-vehicle drive against global positioning system (GPS)-based Vbox speed measurements. The VSEM verified by these procedures was then applied to real traffic accident cases to evaluate its usability. Copyright © 2018 Elsevier B.V. All rights reserved.
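The arithmetic underlying any video-based speed estimate is distance travelled divided by elapsed frame time. The sketch below shows only that back-of-envelope core; the actual VSEM adds a virtual plane and reference line to correct for camera perspective, which this deliberately omits, and the numbers are invented.

```python
def speed_kmh(reference_m, frames_elapsed, fps):
    """Speed from a known real-world reference distance visible in the frame
    and the number of video frames the vehicle takes to cover it."""
    seconds = frames_elapsed / fps
    return reference_m / seconds * 3.6     # m/s -> km/h

# A car covering a 10 m lane-marking span in 12 frames of 30 fps video:
estimate = speed_kmh(10.0, 12, 30)          # about 90 km/h
```

Frame-rate error and perspective distortion both enter this formula directly, which is why the paper validates against GPS-based Vbox ground truth.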
Topical dissolved oxygen penetrates skin: model and method.
Roe, David F; Gibbins, Bruce L; Ladizinsky, Daniel A
2010-03-01
It has been commonly perceived that skin receives its oxygen supply from the internal circulation. However, recent investigations have shown that a significant amount of oxygen may enter skin from the overlying external surface. A method has been developed for measuring the transcutaneous penetration of human skin by oxygen, as described herein, and was used to determine both the depth and the magnitude of penetration of skin by topically applied oxygen. The apparatus consists of human skin samples interposed between a topical oxygen source and a fluid-filled chamber that registers changes in dissolved oxygen. Viable human skin samples of variable thickness, with and without epidermis, were used to evaluate the depth and magnitude of oxygen penetration from either topical dissolved oxygen (TDO) or topical gaseous oxygen (TGO) devices. The model effectively demonstrates transcutaneous penetration of topically applied oxygen: topically applied dissolved oxygen penetrates through >700 microm of human skin, topically applied oxygen penetrates better through dermis than epidermis, and TDO devices deliver oxygen more effectively than TGO devices. Copyright (c) 2010 Elsevier Inc. All rights reserved.
Gely, P; Drouin, G; Thiry, P S; Tremblay, G R
1984-11-01
A new composite prosthesis was recently proposed for the anterior cruciate ligament. It is implanted in the femur and the tibia through two anchoring channels. Its intra-articular portion, composed of a fiber mesh sheath wrapped around a silicone rubber cylindrical core, satisfactorily reproduces the ligament response in tension. However, the prosthesis does not only undergo elongation; in addition, it is submitted to torsion in its intra-articular portion and bending at its ends. This paper presents a new method to evaluate these two types of deformations throughout a knee flexion by means of a geometric model of the implanted prosthesis. Input data originate from two sources: (i) a three-dimensional anatomic topology of the knee joint in full extension, providing the localization of the prosthesis anchoring channels, and (ii) a kinematic model of the knee describing the motion of these anchoring channels during a physiological flexion of the knee joint. The evaluation method is independent of the way input data are obtained. This method, applied to a right cadaveric knee, shows that the orientation of the anchoring channels has a large effect on the extent of torsion and bending applied to the implanted prosthesis throughout a knee flexion, especially on the femoral side. The study also suggests the best choice for the orientation of the anchoring channel axes.
NASA Astrophysics Data System (ADS)
Zhang, X.; Srinivasan, R.
2008-12-01
In this study, a user-friendly GIS tool was developed for evaluating and improving NEXRAD precipitation estimates using rain gauge data. This GIS tool can automatically read in rain gauge and NEXRAD data, evaluate the accuracy of NEXRAD for each time unit, implement several geostatistical methods to improve the accuracy of NEXRAD using rain gauge data, and output spatial precipitation maps for distributed hydrologic models. The geostatistical methods incorporated in this tool include Simple Kriging with varying local means, Kriging with External Drift, Regression Kriging, Co-Kriging, and a new geostatistical method developed by Li et al. (2008). The tool was applied in two test watersheds at hourly and daily temporal scales. Preliminary cross-validation results show that incorporating rain gauge data to calibrate NEXRAD can markedly change the spatial pattern of NEXRAD and improve its accuracy. Using different geostatistical methods, the GIS tool was applied to produce long-term precipitation input for a distributed hydrologic model, the Soil and Water Assessment Tool (SWAT). Animated video was generated to vividly illustrate the effect of different precipitation inputs on distributed hydrologic modeling. Currently, this GIS tool is implemented as an extension of SWAT, which is used as a water quantity and quality modeling tool by the USDA and EPA. The flexible, module-based design of this tool also makes it easy to adapt to other hydrologic models for hydrological modeling and water resources management.
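The abstract names several kriging variants without detail; as an illustration of the core idea these gauge-radar merging methods share, the sketch below ordinary-kriges the gauge-minus-radar residuals onto grid points. The exponential variogram and all parameter values are assumptions for illustration, not taken from the tool.

```python
import numpy as np

def exp_variogram(h, sill=1.0, rng=20.0, nugget=0.0):
    """Exponential variogram model gamma(h) (parameters are assumed)."""
    return nugget + sill * (1.0 - np.exp(-h / rng))

def krige_residuals(gauge_xy, residuals, grid_xy, sill=1.0, rng=20.0):
    """Ordinary kriging of gauge-minus-radar residuals onto grid points."""
    n = len(gauge_xy)
    d = np.linalg.norm(gauge_xy[:, None, :] - gauge_xy[None, :, :], axis=-1)
    # Ordinary kriging system: [Gamma 1; 1' 0] [w; mu] = [gamma0; 1]
    A = np.zeros((n + 1, n + 1))
    A[:n, :n] = exp_variogram(d, sill, rng)
    A[:n, n] = 1.0
    A[n, :n] = 1.0
    est = np.empty(len(grid_xy))
    for i, p in enumerate(grid_xy):
        g0 = exp_variogram(np.linalg.norm(gauge_xy - p, axis=1), sill, rng)
        w = np.linalg.solve(A, np.append(g0, 1.0))[:n]
        est[i] = w @ residuals
    return est

# Corrected field at a grid point = radar value + kriged residual there.
```

Regression kriging and kriging with external drift elaborate on this scheme by modeling a trend (e.g., the radar value itself) before kriging the remainder.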
Specific algorithm method of scoring the Clock Drawing Test applied in cognitively normal elderly
Mendes-Santos, Liana Chaves; Mograbi, Daniel; Spenciere, Bárbara; Charchat-Fichman, Helenice
2015-01-01
The Clock Drawing Test (CDT) is an inexpensive, fast and easily administered measure of cognitive function, especially in the elderly. This instrument is a popular clinical tool widely used in screening for cognitive disorders and dementia. The CDT can be applied in different ways, and scoring procedures also vary. Objective: The aims of this study were to analyze the performance of elderly adults on the CDT and to evaluate the inter-rater reliability of the CDT scored using a specific algorithm method adapted from Sunderland et al. (1989). Methods: We analyzed the CDT of 100 cognitively normal elderly adults aged 60 years or older. The CDT ("free-drawn") and Mini-Mental State Examination (MMSE) were administered to all participants. Six independent examiners scored the CDT of 30 participants to evaluate inter-rater reliability. Results and Conclusion: A score of 5 on the proposed algorithm ("Numbers in reverse order or concentrated"), equivalent to 5 points on the original Sunderland scale, was the most frequent (53.5%). The specific CDT algorithm method used had high inter-rater reliability (p<0.01), and mean scores across examiners ranged from 5.06 to 5.96. The high frequency of an overall score of 5 points may suggest the need to create more nuanced evaluation criteria that are sensitive to differences in levels of impairment in visuoconstructive and executive abilities during aging. PMID:29213954
Effect of portfolio assessment on student learning in prenatal training for midwives.
Kariman, Nourossadat; Moafi, Farnoosh
2011-01-01
The tendency to use portfolios for evaluation has developed with the aim of optimizing the culture of assessment. The present study was carried out to determine the effect of using portfolios as an evaluation method on midwifery students' learning and satisfaction in prenatal practical training. In this prospective cohort study, all midwifery students in semester four (n=40) were randomly allocated to portfolio and routine evaluation groups. Based on their educational goals, the portfolio group prepared packages consisting of a complete report of the history, physical examinations, and methods of patient management (as evaluated by a checklist) for women who visited a prenatal clinic. On the last day of the course, a posttest, a clinical exam, and a student satisfaction form were completed. The two groups' mean ages, mean pretest scores, and prerequisite courses taken in the previous semester were similar. The mean differences between pre- and post-test scores at the knowledge and comprehension levels did not differ significantly between the two groups (P>0.05). The average scores on questions at levels 2 and 3 of Bloom's taxonomy were significantly greater in the portfolio group than in the routine evaluation group (P=0.002 and P=0.03, respectively). The two groups' mean clinical exam scores were significantly different: the portfolio group's mean scores on generating diagnostic and therapeutic solutions and on the ability to apply theory in practice were higher than those of the routine group. Overall, students' satisfaction scores with the two evaluation methods were relatively similar. Portfolio evaluation provides the opportunity for more learning by increasing students' participation in the learning process and helping them to apply theory in practice.
Kageyama, Shinji; Shinmura, Kazuya; Yamamoto, Hiroko; Goto, Masanori; Suzuki, Koichi; Tanioka, Fumihiko; Tsuneyoshi, Toshihiro; Sugimura, Haruhiko
2008-04-01
The PCR-based DNA fingerprinting method called methylation-sensitive amplified fragment length polymorphism (MS-AFLP) analysis is used for genome-wide scanning of methylation status. In this study, we developed a method of fluorescence-labeled MS-AFLP (FL-MS-AFLP) analysis by applying a fluorescence-labeled primer and a fluorescence-detecting electrophoresis apparatus to the existing MS-AFLP method. FL-MS-AFLP analysis enables quantitative evaluation of more than 350 random CpG loci per run. It was shown to allow evaluation of differences in the methylation level of blood DNA from gastric cancer patients, and of hypermethylation and hypomethylation in DNA from gastric cancer tissue in comparison with adjacent non-cancerous tissue.
Fuzzy Comprehensive Evaluation Method Applied in the Real Estate Investment Risks Research
NASA Astrophysics Data System (ADS)
Zhang, Minli; Yang, Wenpo
Real estate investment is a high-risk, high-return economic activity; the key to real estate risk analysis is identifying the types of investment risk and effectively preventing each of them. As the financial crisis sweeps the world, the real estate industry also faces enormous risks, and how to evaluate real estate investment risks effectively and correctly has become a concern of many scholars [1]. In this paper, real estate investment risks are summarized and analyzed, comparative analysis methods are discussed, and a fuzzy comprehensive evaluation method is finally presented. The method is not only theoretically sound but also reliable in application; it provides an effective means of risk assessment for real estate investment and offers investors guidance on risk factors and forecasts.
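The abstract does not spell out the computation; a minimal sketch of a standard fuzzy comprehensive evaluation, with hypothetical risk factors, grades, weights, and membership degrees invented for illustration, might look like:

```python
import numpy as np

# Hypothetical risk factors: policy, market, finance, operation
weights = np.array([0.3, 0.3, 0.2, 0.2])   # factor weight vector W

# Membership matrix R: each row gives one factor's membership degrees
# in the evaluation grades (low, medium, high risk) -- illustrative values
R = np.array([
    [0.2, 0.5, 0.3],
    [0.1, 0.4, 0.5],
    [0.5, 0.3, 0.2],
    [0.3, 0.4, 0.3],
])

B = weights @ R                      # weighted-average composite operator M(*, +)
B = B / B.sum()                      # normalize the composite membership vector
grades = ["low", "medium", "high"]
verdict = grades[int(np.argmax(B))]  # maximum-membership principle
```

Other composite operators (e.g., the max-min operator) are also common; the weighted-average form shown here retains information from all factors.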
Cumulative Risk and Impact Modeling on Environmental Chemical and Social Stressors.
Huang, Hongtai; Wang, Aolin; Morello-Frosch, Rachel; Lam, Juleen; Sirota, Marina; Padula, Amy; Woodruff, Tracey J
2018-03-01
The goal of this review is to identify cumulative modeling methods used to evaluate combined effects of exposures to environmental chemicals and social stressors. The specific review question is: What are the existing quantitative methods used to examine the cumulative impacts of exposures to environmental chemical and social stressors on health? There has been an increase in literature that evaluates combined effects of exposures to environmental chemicals and social stressors on health using regression models; very few studies applied other data mining and machine learning techniques to this problem. The majority of studies we identified used regression models to evaluate combined effects of multiple environmental and social stressors. With proper study design and appropriate modeling assumptions, additional data mining methods may be useful to examine combined effects of environmental and social stressors.
Synthesis, lipophilicity and antimicrobial activity evaluation of some new thiazolyl-oxadiazolines
STOICA, CRISTINA IOANA; IONUȚ, IOANA; PÎRNĂU, ADRIAN; POP, CARMEN; ROTAR, ANCUȚA; VLASE, LAURIAN; ONIGA, SMARANDA; ONIGA, OVIDIU
2015-01-01
Background and aims Synthesis of new potential antimicrobial agents and evaluation of their lipophilicity. Methods Ten new thiazolyl-oxadiazoline derivatives were synthesized and their structures were validated by 1H-NMR and mass spectrometry. The lipophilicity of the compounds was evaluated using the principal component analysis (PCA) method. The necessary data for applying this method were obtained by reverse-phase thin-layer chromatography (RP-TLC). The antimicrobial activities were tested in vitro against four bacterial strains and one fungal strain. Results The lipophilicity varied with the structure but could not be correlated with the antimicrobial activity, since this was modest. Conclusions We have synthesized ten new heterocyclic compounds. After their physical and chemical characterization, we determined their lipophilicity and screened their antimicrobial activity. PMID:26733751
Retrieval evaluation and distance learning from perceived similarity between endomicroscopy videos.
André, Barbara; Vercauteren, Tom; Buchner, Anna M; Wallace, Michael B; Ayache, Nicholas
2011-01-01
Evaluating content-based retrieval (CBR) is challenging because it requires an adequate ground-truth. When the available ground-truth is limited to textual metadata such as pathological classes, retrieval results can only be evaluated indirectly, for example in terms of classification performance. In this study we first present a tool to generate perceived-similarity ground-truth that enables direct evaluation of endomicroscopic video retrieval. This tool uses a four-point Likert scale and collects subjective pairwise similarities perceived by multiple expert observers. We then evaluate against the generated ground-truth a previously developed dense bag-of-visual-words method for endomicroscopic video retrieval. Confirming the results of previous indirect evaluation based on classification, our direct evaluation shows that this method significantly outperforms several other state-of-the-art CBR methods. In a second step, we propose to improve the CBR method by learning an adjusted similarity metric from the perceived-similarity ground-truth. By minimizing a margin-based cost function that differentiates similar and dissimilar video pairs, we learn a weight vector applied to the visual word signatures of videos. Using cross-validation, we demonstrate that the learned similarity distance is significantly better correlated with the perceived similarity than the original visual-word-based distance.
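The paper's exact cost function is not given in the abstract; one way to sketch the idea, learning a non-negative weight vector for a weighted L1 distance by subgradient descent on a margin-based hinge cost over similar/dissimilar pairs (the distance form, update rule, and parameters are all illustrative assumptions), is:

```python
import numpy as np

def learn_metric_weights(X, sim_pairs, dis_pairs, margin=1.0, lr=0.01, epochs=200):
    """Learn per-dimension weights w for a weighted L1 distance
    d_w(x, y) = sum_k w_k * |x_k - y_k|, by subgradient descent on a
    margin-based hinge cost that pushes similar pairs closer together
    than dissimilar ones."""
    w = np.ones(X.shape[1])
    for _ in range(epochs):
        for (i, j), (k, l) in zip(sim_pairs, dis_pairs):
            a = np.abs(X[i] - X[j])          # feature gaps of the similar pair
            b = np.abs(X[k] - X[l])          # feature gaps of the dissimilar pair
            if margin + w @ a - w @ b > 0:   # margin violated: update w
                w -= lr * (a - b)
        w = np.clip(w, 0.0, None)            # keep the metric non-negative
    return w
```

After training, dimensions that reliably separate dissimilar videos receive larger weights, while noisy dimensions are down-weighted.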
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goebel, J
2004-02-27
Without stable hardware any program will fail. The frustration and expense of supporting bad hardware can drain an organization, delay progress, and frustrate everyone involved. At Stanford Linear Accelerator Center (SLAC), we have created a testing method that helps our group, SLAC Computer Services (SCS), weed out potentially bad hardware and purchase the best hardware at the best possible cost. Commodity hardware changes often, so new evaluations happen periodically each time we purchase systems, and minor re-evaluations happen for revised systems for our clusters, about twice a year. This general framework helps SCS perform correct, efficient evaluations. This article outlines SCS's computer testing methods and our system acceptance criteria. We expanded the basic ideas to other evaluations such as storage, and we think the methods outlined in this article have helped us choose hardware that is much more stable and supportable than our previous purchases. We have found that commodity hardware ranges in quality, so systematic methods and tools for hardware evaluation were necessary. This article is based on one instance of a hardware purchase, but the guidelines apply to the general problem of purchasing commodity computer systems for production computational work.
Evaluation of fuzzy inference systems using fuzzy least squares
NASA Technical Reports Server (NTRS)
Barone, Joseph M.
1992-01-01
Efforts to develop evaluation methods for fuzzy inference systems which are not based on crisp, quantitative data or processes (i.e., where the phenomenon the system is built to describe or control is inherently fuzzy) are just beginning. This paper suggests that the method of fuzzy least squares can be used to perform such evaluations. Regressing the desired outputs onto the inferred outputs can provide both global and local measures of success. The global measures have some value in an absolute sense, but they are particularly useful when competing solutions (e.g., different numbers of rules, different fuzzy input partitions) are being compared. The local measure described here can be used to identify specific areas of poor fit where special measures (e.g., the use of emphatic or suppressive rules) can be applied. Several examples are discussed which illustrate the applicability of the method as an evaluation tool.
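The paper's exact formulation is not reproduced in the abstract; one common concrete instance is Diamond's fuzzy least squares for triangular fuzzy numbers, sketched below under the assumption of a non-negative slope, with an R²-style global measure of fit of the kind the abstract describes. The (low, mode, high) representation and the stacking trick are assumptions of this sketch.

```python
import numpy as np

def fuzzy_least_squares(X, Y):
    """Fit Y ~ a + b*X for triangular fuzzy numbers given as (low, mode, high)
    triples. Under Diamond's distance and a non-negative slope, this reduces
    to ordinary least squares on the three stacked components."""
    x = X.reshape(-1)                       # stack low/mode/high values
    y = Y.reshape(-1)
    A = np.column_stack([np.ones_like(x), x])
    (a, b), *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - (a + b * x)
    ss_tot = np.sum((y - y.mean()) ** 2)
    r2 = 1.0 - np.sum(resid ** 2) / ss_tot  # global goodness-of-fit measure
    return a, b, r2
```

Here r2 plays the role of the global measure of success: regressing desired outputs onto inferred outputs, a value near 1 indicates the fuzzy inference system tracks the desired behavior closely, and competing rule bases can be compared by their r2.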
Criteria for evaluating the condition of a tropical cyclone warning system.
Parker, D
1999-09-01
This paper evaluates the condition (i.e. health) of a tropical cyclone warning system (TCWS) during a 'quiet period' between infrequent intense cyclones. The capacity to make pre-disaster evaluations is important--disaster warning systems need to be in sound condition before, not after, disaster. The research--part of the UK's International Decade of Natural Disaster Reduction Flagship Programme--focuses upon an evaluative method first used on flood warning systems. The Criteria-development Matrix comprises social, organisational and institutional criteria by which a TCWS may be assessed using a five-stage development scale. This method is used to evaluate Mauritius's TCWS using in-depth interview data. Ways to enhance the method and apply it to other disaster warning systems are discussed. The TCWS in Mauritius is a relatively sound one from which others can learn. Weaknesses requiring attention for Mauritius's TCWS to progress to an advanced level of development are identified.
NASA Astrophysics Data System (ADS)
Li, Cunbin; Wang, Yi; Lin, Shuaishuai
2017-09-01
With the rapid development of the energy internet and the deepening of electric power reform, the traditional electric power marketing mode no longer suits most electric power enterprises, so a breakthrough must be sought. However, in the face of increasingly complex marketing information, how to make a quick, reasonable transformation that renders the assessment of electric power marketing competitiveness more accurate and objective becomes a major problem. In this paper, a cloud model and TOPSIS method is proposed. First, the electric power marketing competitiveness evaluation index system is built. Then the cloud model is used to transform qualitative evaluations of the marketing data into quantitative values, and the entropy weight method is used to weaken the subjective factors in the evaluation index weights. Finally, the closeness degrees of the alternatives are obtained by the TOPSIS method. This method provides a novel solution for the evaluation of electric power marketing competitiveness. A case analysis verifies the effectiveness and feasibility of this model.
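A minimal sketch of the entropy-weight and TOPSIS steps described above (the index matrix and enterprise data are hypothetical, and the cloud-model transformation that would produce these quantitative values is omitted):

```python
import numpy as np

def entropy_weights(M):
    """Objective index weights by the entropy method (columns = indices)."""
    P = M / M.sum(axis=0)
    E = -(P * np.log(P + 1e-12)).sum(axis=0) / np.log(len(M))
    d = 1.0 - E                                # degree of divergence per index
    return d / d.sum()

def topsis(M, w):
    """Closeness of each alternative to the ideal solution (benefit indices)."""
    V = M / np.linalg.norm(M, axis=0) * w      # weighted normalized matrix
    best, worst = V.max(axis=0), V.min(axis=0)
    d_best = np.linalg.norm(V - best, axis=1)
    d_worst = np.linalg.norm(V - worst, axis=1)
    return d_worst / (d_best + d_worst)

# Hypothetical 3 enterprises x 4 benefit-type marketing indices
M = np.array([[0.7, 0.6, 0.8, 0.5],
              [0.9, 0.5, 0.6, 0.7],
              [0.6, 0.8, 0.7, 0.6]])
scores = topsis(M, entropy_weights(M))
ranking = np.argsort(scores)[::-1]             # best alternative first
```

Cost-type indices would be reversed (or the worst point taken as the ideal for them) before applying the same machinery.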
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kanemoto, S.; Andoh, Y.; Sandoz, S.A.
1984-10-01
A method for evaluating reactor stability in boiling water reactors has been developed. The method is based on multivariate autoregressive (M-AR) modeling of steady-state neutron and process noise signals. In this method, two kinds of power spectral densities (PSDs), for the measured neutron signal and the corresponding noise source signal, are separately identified by the M-AR modeling. The closed- and open-loop stability parameters are evaluated from these PSDs. The method is applied to actual plant noise data that were measured together with artificial perturbation test data. Stability parameters identified from noise data are compared to those from perturbation test data, and it is shown that both results are in good agreement. In addition to these stability estimations, driving noise sources for the neutron signal are evaluated by the M-AR modeling. Contributions from void, core flow, and pressure noise sources are quantitatively evaluated, and the void noise source is shown to be the most dominant.
Questel, E; Durbise, E; Bardy, A-L; Schmitt, A-M; Josse, G
2015-05-01
To assess an objective method for evaluating the effects of a retinaldehyde-based cream (RA-cream) on solar lentigines, 29 women randomly applied RA-cream to lentigines on one hand and a control cream to the other, once daily for 3 months. A specific method enabling reliable visualisation of the lesions was proposed, using high-magnification colour-calibrated camera imaging. Assessment was performed using clinical evaluation by Physician Global Assessment score and image analysis. Luminance determination on the numeric images was performed either on the basis of five independent experts' consensus borders or by probability map analysis via an algorithm automatically detecting the pigmented area. Both image analysis methods showed a similar lightening of ΔL* = 2 after a 3-month treatment with RA-cream, in agreement with single-blind clinical evaluation. High-magnification colour-calibrated camera imaging combined with probability map analysis is a fast and precise method to follow lentigo depigmentation. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Optical Fourier diffractometry applied to degraded bone structure recognition
NASA Astrophysics Data System (ADS)
Galas, Jacek; Godwod, Krzysztof; Szawdyn, Jacek; Sawicki, Andrzej
1993-09-01
Image processing and recognition methods are useful in many fields. This paper presents a hybrid optical and digital method applied to the recognition of pathological changes in bones caused by metabolic bone diseases. The trabecular bone structure, registered by X-ray on photographic film, is analyzed in a new type of computer-controlled diffractometer. The set of image parameters extracted from the diffractogram is evaluated by statistical analysis. The synthetic image descriptors in discriminant space, constructed by discriminant analysis on the basis of three training groups of images (control, osteoporosis, and osteomalacia groups), allow us to recognize bone samples with degraded bone structure and to identify the disease. About 89% of the images were classified correctly. After an optimization process, this method will be verified in medical investigations.
The detection of local irreversibility in time series based on segmentation
NASA Astrophysics Data System (ADS)
Teng, Yue; Shang, Pengjian
2018-06-01
We propose a strategy for the detection of local irreversibility in stationary time series based on multiple scales. The detection is beneficial for evaluating the displacement of irreversibility toward local skewness. By means of this method, we can effectively discuss the local irreversible fluctuations of time series as the scale changes. The method was applied to simulated nonlinear signals generated by the ARFIMA process and the logistic map to show how the irreversibility functions react to the increase of the multiple scale. The method was also applied to financial market series, i.e., American, Chinese and European markets. The local irreversibility for different markets demonstrates distinct characteristics. Simulations and real data support the need for exploring local irreversibility.
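The paper's specific statistic is not given in the abstract; a simple stand-in that captures the idea is the skewness of the increments at each scale, which vanishes for a time-reversible series and flips sign when the series is reversed in time:

```python
import numpy as np

def irreversibility(x, scales):
    """Per-scale irreversibility statistic: skewness of the increments
    x[t+s] - x[t]. It is zero (in expectation) for a time-reversible
    series and changes sign under time reversal."""
    out = {}
    for s in scales:
        inc = x[s:] - x[:-s]
        out[s] = np.mean(inc ** 3) / np.mean(inc ** 2) ** 1.5
    return out

# e.g. on the logistic map, a classic irreversible nonlinear signal:
v, traj = 0.4, []
for _ in range(1000):
    v = 4.0 * v * (1.0 - v)
    traj.append(v)
stats = irreversibility(np.array(traj), scales=[1, 2, 4])
```

A "local" variant in the spirit of the paper would evaluate this statistic over sliding segments rather than the whole series, tracking how the asymmetry moves along the record.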
Extreme data compression for the CMB
Zablocki, Alan; Dodelson, Scott
2016-04-28
We apply the Karhunen-Loève methods to cosmic microwave background (CMB) data sets, and show that we can recover the input cosmology and obtain the marginalized likelihoods in Λ cold dark matter cosmologies in under a minute, much faster than Markov chain Monte Carlo methods. This is achieved by forming a linear combination of the power spectra at each multipole l, and solving a system of simultaneous equations such that the Fisher matrix is locally unchanged. Instead of carrying out a full likelihood evaluation over the whole parameter space, we need evaluate the likelihood only for the parameter of interest, with the data compression effectively marginalizing over all other parameters. The weighting vectors contain insight about the physical effects of the parameters on the CMB anisotropy power spectrum C_l. The shape and amplitude of these vectors give an intuitive feel for the physics of the CMB, the sensitivity of the observed spectrum to cosmological parameters, and the relative sensitivity of different experiments to cosmological parameters. We test this method on exact theory C_l as well as on a Wilkinson Microwave Anisotropy Probe (WMAP)-like CMB data set generated from a random realization of a fiducial cosmology, comparing the compression results to those from a full likelihood analysis using CosmoMC. Furthermore, after showing that the method works, we apply it to the temperature power spectrum from the WMAP seven-year data release, and discuss the successes and limitations of our method as applied to a real data set.
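A toy single-parameter version of this kind of Karhunen-Loève compression (a MOPED-style weighting; the template, noise level, and amplitude are invented for illustration, and the real analysis handles many parameters and a full CMB covariance) can be sketched as:

```python
import numpy as np

rng = np.random.default_rng(42)
npix, A_true, sigma = 300, 2.0, 0.5
template = np.sin(np.linspace(0, 6 * np.pi, npix))   # toy "spectrum" shape
C = np.eye(npix) * sigma ** 2                        # noise covariance
data = A_true * template + rng.normal(0.0, sigma, npix)

# Karhunen-Loeve / MOPED-style weight vector for the amplitude parameter:
# b = C^-1 dmu/dA, normalized so that b.mu(A) = A, hence b.data is an
# unbiased estimate of A. One compressed number replaces all npix pixels.
dmu = template                        # derivative of the mean w.r.t. A
b = np.linalg.solve(C, dmu)
b /= b @ dmu
A_hat = b @ data                      # compressed statistic

# The compression preserves the Fisher information on A:
var_A = 1.0 / (dmu @ np.linalg.solve(C, dmu))
```

The shape of b shows directly which pixels (multipoles, in the CMB case) carry the parameter's signature, which is the "intuitive feel" the abstract refers to.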
APPLYING TOXICITY IDENTIFICATION PROCEDURES TO FIELD COLLECTED SEDIMENTS
Identification of specific causes of sediment toxicity can allow for much more focused risk assessment and management decision making. We have been developing toxicity identification evaluation (TIE) methods for contaminated sediments and focusing on three toxicant groups (ammoni...
Estimating Sobol Sensitivity Indices Using Correlations
Sensitivity analysis is a crucial tool in the development and evaluation of complex mathematical models. Sobol's method is a variance-based global sensitivity analysis technique that has been applied to computational models to assess the relative importance of input parameters on...
NASA Astrophysics Data System (ADS)
Miyachi, Yukiya; Arakawa, Mototaka; Kanai, Hiroshi
2018-07-01
In our studies on ultrasonic elasticity assessment, minute changes in the thickness of the arterial wall were measured by the phased-tracking method. However, most images in carotid artery examinations contain multiple-reflection noise, making it difficult to evaluate arterial wall elasticity precisely. In the present study, a modified phased-tracking method using the pulse inversion method was examined to reduce the influence of the multiple-reflection noise. Moreover, aliasing in the harmonic components was corrected using the fundamental components. The conventional and proposed methods were applied to a pulsated tube phantom mimicking the arterial wall. With the conventional method, the elasticity was 298 kPa without multiple-reflection noise and 353 kPa with multiple-reflection noise on the posterior wall. With the proposed method, it was 302 kPa without multiple-reflection noise and 297 kPa with multiple-reflection noise on the posterior wall. Therefore, the proposed method is very robust against multiple-reflection noise.
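The pulse inversion principle itself can be sketched with a toy quadratic nonlinearity (frequencies, bandwidth, and the distortion coefficient are assumed, not taken from the paper): summing the echoes of a pulse and its inverted copy cancels the odd-order (fundamental) terms while the even harmonics add coherently.

```python
import numpy as np

fs, f0 = 100e6, 5e6                  # sampling rate and center frequency (assumed)
t = np.arange(0.0, 2e-6, 1.0 / fs)
envelope = np.exp(-((t - 1e-6) ** 2) / (0.2e-6) ** 2)
pulse = envelope * np.sin(2 * np.pi * f0 * t)

def echo(p, a2=0.3):
    """Toy nonlinear propagation: linear term plus quadratic distortion."""
    return p + a2 * p ** 2

# Pulse inversion: transmit the pulse and its inverted copy, sum the echoes.
summed = echo(pulse) + echo(-pulse)  # equals 2 * a2 * pulse**2 exactly

spec = np.abs(np.fft.rfft(summed))
freqs = np.fft.rfftfreq(len(t), 1.0 / fs)
```

In the summed signal, the spectrum is concentrated near DC and the second harmonic (10 MHz) with essentially nothing at the fundamental (5 MHz), which is why multiple reflections carried by the fundamental are suppressed.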
2012-01-01
Background There are numerous applications for Health Information Systems (HIS) that support specific tasks in the clinical workflow. The Lean method has been used increasingly to optimize clinical workflows, by removing waste and shortening the delivery cycle time. There are a limited number of studies on Lean applications related to HIS. Therefore, we applied the Lean method to evaluate the clinical processes related to HIS, in order to evaluate its efficiency in removing waste and optimizing the process flow. This paper presents the evaluation findings of these clinical processes, with regards to a critical care information system (CCIS), known as IntelliVue Clinical Information Portfolio (ICIP), and recommends solutions to the problems that were identified during the study. Methods We conducted a case study under actual clinical settings, to investigate how the Lean method can be used to improve the clinical process. We used observations, interviews, and document analysis, to achieve our stated goal. We also applied two tools from the Lean methodology, namely the Value Stream Mapping and the A3 problem-solving tools. We used eVSM software to plot the Value Stream Map and A3 reports. Results We identified a number of problems related to inefficiency and waste in the clinical process, and proposed an improved process model. Conclusions The case study findings show that the Value Stream Mapping and the A3 reports can be used as tools to identify waste and integrate the process steps more efficiently. We also proposed a standardized and improved clinical process model and suggested an integrated information system that combines database and software applications to reduce waste and data redundancy. PMID:23259846
Takabatake, Reona; Akiyama, Hiroshi; Sakata, Kozue; Onishi, Mari; Koiwa, Tomohiro; Futo, Satoshi; Minegishi, Yasutaka; Teshima, Reiko; Mano, Junichi; Furui, Satoshi; Kitta, Kazumi
2011-01-01
A novel real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) soybean event; A2704-12. During the plant transformation, DNA fragments derived from pUC19 plasmid were integrated in A2704-12, and the region was found to be A2704-12 specific. The pUC19-derived DNA sequences were used as primers for the specific detection of A2704-12. We first tried to construct a standard plasmid for A2704-12 quantification using pUC19. However, non-specific signals appeared with both qualitative and quantitative PCR analyses using the specific primers with pUC19 as a template, and we then constructed a plasmid using pBR322. The conversion factor (C(f)), which is required to calculate the amount of the genetically modified organism (GMO), was experimentally determined with two real-time PCR instruments, the Applied Biosystems 7900HT and the Applied Biosystems 7500. The determined C(f) values were both 0.98. The quantitative method was evaluated by means of blind tests in multi-laboratory trials using the two real-time PCR instruments. The limit of quantitation for the method was estimated to be 0.1%. The trueness and precision were evaluated as the bias and reproducibility of relative standard deviation (RSD(R)), and the determined bias and RSD(R) values for the method were each less than 20%. These results suggest that the developed method would be suitable for practical analyses for the detection and quantification of A2704-12.
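The conversion-factor arithmetic implied above can be sketched as follows (the copy numbers are invented for illustration, and the exact formula used in official protocols may include additional factors):

```python
def gmo_amount(event_copies, endogenous_copies, cf=0.98):
    """GMO amount (%) from event-specific and endogenous (taxon-specific)
    copy numbers measured by real-time PCR; cf is the experimentally
    determined conversion factor (0.98 for A2704-12 on both instruments)."""
    return (event_copies / endogenous_copies) / cf * 100.0

# e.g. 49 event copies against 10,000 taxon-specific copies -> 0.5 %
pct = gmo_amount(49, 10000)
```

The conversion factor corrects the raw copy-number ratio for the event's integration structure, so that the reported percentage reflects the mass fraction of GM material.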
NDT evaluation of long-term bond durability of CFRP-structural systems applied to RC highway bridges
NASA Astrophysics Data System (ADS)
Crawford, Kenneth C.
2016-06-01
The long-term durability of CFRP structural systems applied to reinforced-concrete (RC) highway bridges is a function of the system bond behavior over time. The sustained structural load performance of strengthened bridges depends on the carbon fiber-reinforced polymer (CFRP) laminates remaining 100 % bonded to concrete bridge members. Periodic testing of the CFRP-concrete bond condition is necessary to sustain load performance. The objective of this paper is to present a non-destructive testing (NDT) method designed to evaluate the bond condition and long-term durability of CFRP laminate (plate) systems applied to RC highway bridges. Using the impact-echo principle, a mobile mechanical device using light impact hammers moving along the length of a bonded CFRP plate produces unique acoustic frequencies which are a function of existing CFRP plate-concrete bond conditions. The purpose of this method is to test and locate CFRP plates de-bonded from bridge structural members to identify associated deterioration in bridge load performance. Laboratory tests of this NDT device on a CFRP plate bonded to concrete with staged voids (de-laminations) produced different frequencies for bonded and de-bonded areas of the plate. The spectra (bands) of frequencies obtained in these tests show a correlation to the CFRP-concrete bond condition and identify bonded and de-bonded areas of the plate. The results of these tests indicate that this NDT impact machine, with design improvements, can potentially provide bridge engineers a means to rapidly evaluate long lengths of CFRP laminates applied to multiple highway bridges within a national transportation infrastructure.
Identifying Audiences of E-Infrastructures - Tools for Measuring Impact
van den Besselaar, Peter
2012-01-01
Research evaluation should take into account the intended scholarly and non-scholarly audiences of the research output. This holds too for research infrastructures, which often aim at serving a large variety of audiences. With research and research infrastructures moving to the web, new possibilities are emerging for evaluation metrics. This paper proposes a feasible indicator for measuring the scope of audiences who use web-based e-infrastructures, as well as the frequency of use. In order to apply this indicator, a method is needed for classifying visitors to e-infrastructures into relevant user categories. The paper proposes such a method, based on an inductive logic program and a Bayesian classifier. The method is tested, showing that the visitors are efficiently classified with 90% accuracy into the selected categories. Consequently, the method can be used to evaluate the use of the e-infrastructure within and outside academia. PMID:23239995
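The paper combines an inductive logic program with a Bayesian classifier; the Bayesian half can be sketched as a minimal multinomial naive Bayes over discrete visit features (the feature encoding, class labels, and data are hypothetical, not from the paper):

```python
import numpy as np

class NaiveBayes:
    """Multinomial naive Bayes over count features of a visit
    (e.g., pages viewed per section) -- a minimal sketch."""

    def fit(self, X, y):
        self.classes = np.unique(y)
        self.prior = np.array([(y == c).mean() for c in self.classes])
        # Laplace-smoothed per-class feature probabilities
        self.theta = np.array([(X[y == c].sum(axis=0) + 1) /
                               (X[y == c].sum() + X.shape[1])
                               for c in self.classes])
        return self

    def predict(self, X):
        # Log-posterior up to a constant: log prior + sum of count * log prob
        logp = np.log(self.prior) + X @ np.log(self.theta).T
        return self.classes[np.argmax(logp, axis=1)]
```

Applied to visitor logs, each class would be a user category (e.g., academic vs. non-academic), and the predicted labels yield the audience-scope indicator the paper proposes.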
NASA Technical Reports Server (NTRS)
Chen, L. T.
1975-01-01
A general method for analyzing aerodynamic flows around complex configurations is presented. By applying the Green function method, a linear integral equation relating the unknown small-perturbation potential on the surface of the body to the known downwash is obtained. The surfaces of the aircraft, wake, and diaphragm (if necessary) are divided into small quadrilateral elements which are approximated with hyperboloidal surfaces. The potential and its normal derivative are assumed to be constant within each element. This yields a set of linear algebraic equations whose coefficients are evaluated analytically. Using the Gaussian elimination method, the equations are solved for the potentials at the centroids of the elements. The pressure coefficient is evaluated by the finite difference method; the lift and moment coefficients are evaluated by numerical integration. Numerical results are presented, and applications to flutter are also included.
Bardinet, Eric; Bhattacharjee, Manik; Dormont, Didier; Pidoux, Bernard; Malandain, Grégoire; Schüpbach, Michael; Ayache, Nicholas; Cornu, Philippe; Agid, Yves; Yelnik, Jérôme
2009-02-01
The localization of any given target in the brain has become a challenging issue because of the increased use of deep brain stimulation to treat Parkinson disease, dystonia, and nonmotor diseases (for example, Tourette syndrome, obsessive compulsive disorders, and depression). The aim of this study was to develop an automated method of adapting an atlas of the human basal ganglia to the brains of individual patients. Magnetic resonance images of the brain specimen were obtained before extraction from the skull and histological processing. Adaptation of the atlas to individual patient anatomy was performed by reshaping the atlas MR images to the images obtained in the individual patient using a hierarchical registration applied to a region of interest centered on the basal ganglia, and then applying the reshaping matrix to the atlas surfaces. Results were evaluated by direct visual inspection of the structures visible on MR images and atlas anatomy, by comparison with electrophysiological intraoperative data, and with previous atlas studies in patients with Parkinson disease. The method was both robust and accurate, never failing to provide an anatomically reliable atlas to patient registration. The registration obtained did not exceed a 1-mm mismatch with the electrophysiological signatures in the region of the subthalamic nucleus. This registration method applied to the basal ganglia atlas forms a powerful and reliable method for determining deep brain stimulation targets within the basal ganglia of individual patients.
[Identification of care needs of patients with and without the use of a classification instrument].
Perroca, Marcia Galan; Jericó, Marli de Carvalho; Paschoal, Josi Vaz de Lima
2014-08-01
To analyze the agreement and disagreement between assessments made with and without a patient classification instrument, and to investigate the association between agreement and the personal and professional characteristics of the evaluators. This descriptive exploratory study included 105 patients hospitalized in a teaching hospital in the state of Sao Paulo; agreement was analyzed using the weighted kappa statistic and the bootstrap method. The agreement between the assessments was: k(w) 0.87 (instrument x internal evaluator), k(w) 0.78 (instrument x external evaluator) and k(w) 0.76 (between evaluators), with some influence of personal and professional characteristics. The assessments conducted with the instrument contemplated a greater number of areas of care compared with those made without it. The use of this instrument is recommended in order to identify patients' care needs more effectively.
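For readers unfamiliar with the statistic, a minimal sketch of weighted kappa as used to quantify inter-rater agreement. The abstract does not specify the weighting scheme, so linear weights are an assumption here, and the toy ratings are illustrative.

```python
def weighted_kappa(r1, r2, n_cat):
    """Linearly weighted Cohen's kappa for two raters scoring the same patients
    on ordinal categories 0 .. n_cat-1 (1.0 = perfect agreement, 0 = chance)."""
    n = len(r1)
    obs = [[0.0] * n_cat for _ in range(n_cat)]       # joint rating proportions
    for a, b in zip(r1, r2):
        obs[a][b] += 1.0 / n
    p1 = [sum(row) for row in obs]                                # rater-1 marginals
    p2 = [sum(obs[i][j] for i in range(n_cat)) for j in range(n_cat)]
    w = lambda i, j: 1.0 - abs(i - j) / (n_cat - 1)               # linear weights
    po = sum(w(i, j) * obs[i][j] for i in range(n_cat) for j in range(n_cat))
    pe = sum(w(i, j) * p1[i] * p2[j] for i in range(n_cat) for j in range(n_cat))
    return (po - pe) / (1.0 - pe)

# Five patients rated on three care-need levels by two evaluators (toy data)
print(round(weighted_kappa([0, 1, 2, 1, 0], [0, 2, 2, 1, 0], 3), 2))   # 0.78
```

Linear weights give partial credit for near-misses on the ordinal scale, which is why weighted kappa is preferred over plain kappa for graded care-need classifications.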
Koskinen, Johanna; Isohanni, Matti; Paajala, Henna; Jääskeläinen, Erika; Nieminen, Pentti; Koponen, Hannu; Tienari, Pekka; Miettunen, Jouko
2008-01-01
We present bibliometric methods that can be utilized in evaluation processes of scientific work. In this paper, we present some practical clues, using Finnish schizophrenia research as an example and comparing the research output of different institutions. Bibliometric data and indicators, including publication counts, impact factors and received citations, were used as tools for evaluating research performance in Finnish schizophrenia research. The articles and citations were retrieved from the Web of Science database, using "schizophrenia" as a keyword, restricting the address field to Finland, and limiting the years to 1996-2005. Altogether, 265 articles met our criteria. There were differences in impact factors and received citations between institutions. The number of annually published Finnish schizophrenia articles has tripled since the mid-1990s. International co-operation was common (43%). Bibliometric methods revealed differences between institutions, indicating that the methods can be applied in research evaluation. The coverage of databases, as well as the precision of their search engines, can be seen as limitations. Bibliometric methods offer a practical and impartial way to estimate the publication profiles of researchers and research groups. In our experience, these methods can be used as an evaluation instrument together with other methods, such as expert opinions and panels.
Yuan, Shi-Jie; He, Hui; Sheng, Guo-Ping; Chen, Jie-Jie; Tong, Zhong-Hua; Cheng, Yuan-Yuan; Li, Wen-Wei; Lin, Zhi-Qi; Zhang, Feng; Yu, Han-Qing
2013-01-01
Electrochemically active bacteria (EAB) are ubiquitous in the environment and have important applications in the fields of biogeochemistry, environmental science, microbiology and bioenergy. However, rapid and sensitive methods for EAB identification and for evaluating their extracellular electron transfer ability are still lacking. Herein we report a novel photometric method for visual detection of EAB using an electrochromic material, WO3 nanoclusters, as the probe. This method allowed rapid identification of EAB within 5 min and a quantitative evaluation of their extracellular electron transfer abilities. In addition, it was also successfully applied to the isolation of EAB from environmental samples. Owing to its rapidity, high reliability, easy operation and low cost, this method has high potential for practical implementation in EAB detection and investigations.
ERIC Educational Resources Information Center
Liu, Boquan; Polce, Evan; Sprott, Julien C.; Jiang, Jack J.
2018-01-01
Purpose: The purpose of this study is to introduce a chaos level test to evaluate linear and nonlinear voice type classification method performances under varying signal chaos conditions without subjective impression. Study Design: Voice signals were constructed with differing degrees of noise to model signal chaos. Within each noise power, 100…
Regional forest cover estimation via remote sensing: the calibration center concept
Louis R. Iverson; Elizabeth A. Cook; Robin L. Graham; Robin L. Graham
1994-01-01
A method for combining Landsat Thematic Mapper (TM), Advanced Very High Resolution Radiometer (AVHRR) imagery, and other biogeographic data to estimate forest cover over large regions is applied and evaluated at two locations. In this method, TM data are used to classify a small area (calibration center) into forest/nonforest; the resulting forest cover map is then...
A general method for computing the total solar radiation force on complex spacecraft structures
NASA Technical Reports Server (NTRS)
Chan, F. K.
1981-01-01
The method circumvents many of the existing difficulties in computational logic presently encountered in the direct analytical or numerical evaluation of the appropriate surface integral. It may be applied to complex spacecraft structures for computing the total force arising from either specular or diffuse reflection or even from non-Lambertian reflection and re-radiation.
Chapelle, Frank H.; Robertson, John F.; Landmeyer, James E.; Bradley, Paul M.
2000-01-01
These two sites illustrate how the efficiency of natural attenuation processes acting on petroleum hydrocarbons can be systematically evaluated using hydrologic, geochemical, and microbiologic methods. These methods, in turn, can be used to assess the role that the natural attenuation of petroleum hydrocarbons can play in achieving overall site remediation.
Nondestructive structural evaluation of wood floor systems with a vibration technique.
Xiping Wang; Robert J. Ross; Lawrence Andrew Soltis
2002-01-01
The objective of this study was to determine if transverse vibration methods could be used to effectively assess the structural integrity of wood floors as component systems. A total of 10 wood floor systems, including 3 laboratory-built floor sections and 7 in-place floors in historic buildings, were tested. A forced vibration method was applied to the floor systems...
Aytona, Maria Corazon; Dudley, Karlene
2013-01-01
The McKenzie method, also known as Mechanical Diagnosis and Therapy (MDT), is primarily recognized as an evaluation and treatment method for the spine. However, McKenzie suggested that this method could also be applied to the extremities. Derangement is an MDT classification defined as an anatomical disturbance in the normal resting position of the joint, and McKenzie proposed that repeated movements could be applied to reduce internal joint displacement and rapidly reduce derangement symptoms. However, the current literature on MDT application to shoulder disorders is limited. Here, we present a case series involving four patients with chronic shoulder pain of 2–18 months' duration classified as derangement and treated using MDT principles. Each patient underwent mechanical assessment and was treated with repeated movements based on their directional preference. All patients demonstrated rapid and clinically significant improvement in baseline measures and Quick Disabilities of the Arm, Shoulder, and Hand (QuickDASH) scores, from an average of 38% at initial evaluation to 5% at discharge within 3–5 visits. Our findings suggest that MDT may be an effective treatment approach for shoulder pain. PMID:24421633
Pirsa, Sajad
2017-04-01
A portable chromatography device and a method were developed to analyze a gas mixture. The device comprises a chromatographic column for separating components of a sample of the gas mixture. It has an air pump coupled to the inlet of a chromatographic column for pumping air and an injector coupled to the inlet of chromatographic column for feeding the sample using the air as a carrier gas. A detector is arranged downstream from and coupled to the outlet of the chromatographic column. The detector is a nanostructure semiconductive microfiber. The device further comprises an evaluation unit arranged and configured to evaluate each detected component to determine the concentration. The designed portable system was used for simultaneous detection of amines. The possibility of applying dispersive liquid-liquid microextraction for the determination of analytes in trace levels is demonstrated. The reproducibility of this method is acceptable, and good standard deviations were obtained. The relative standard deviation value is less than 6% for all analytes. Finally, the method was successfully applied to the extraction and determination of analytes in water samples. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Zanchetti Meneghini, Leonardo; Rübensam, Gabriel; Claudino Bica, Vinicius; Ceccon, Amanda; Barreto, Fabiano; Flores Ferrão, Marco; Bergold, Ana Maria
2014-01-01
A simple and inexpensive method based on solvent extraction followed by low-temperature clean-up was applied for the determination of seven pyrethroid residues in bovine raw milk using gas chromatography coupled to tandem mass spectrometry (GC-MS/MS) and gas chromatography with electron-capture detection (GC-ECD). The sample extraction procedure was established through the evaluation of seven different extraction protocols, assessed in terms of analyte recovery and clean-up efficiency. Sample preparation optimization was based on a Doehlert design using fifteen runs with three different variables. Response surface methodologies and polynomial analysis were used to define the best extraction conditions. Method validation was carried out based on SANCO guide parameters and assessed by multivariate analysis. Method performance was considered satisfactory, since mean recoveries were between 87% and 101% at three distinct concentrations. Accuracy and precision were within ±20%, and there were no significant differences (p < 0.05) between results obtained by the GC-ECD and GC-MS/MS techniques. The method has been applied to routine analysis for the determination of pyrethroid residues in bovine raw milk in the Brazilian National Residue Control Plan since 2013, in which a total of 50 samples were analyzed. PMID:25380457
Treatment of cervical spine fractures with halo vest method in children and young people.
Tomaszewski, Ryszard; Pyzińska, Marta
2014-01-01
The Halo Vest method is a non-invasive treatment for cervical spine fractures. It is successfully applied in adults, as supported by numerous studies, but has rarely been used in children and young people, and there is little published research in this field. The aim of the paper is to present the effectiveness of Halo Vest external fixation in children and to evaluate the complication rate of this method. A retrospective study was conducted of 6 patients with cervical spine fractures, with an average age of 13.3 years (range: 10 to 17 years), treated with Halo Vest external fixation between 2004 and 2013. The type and cause of fracture, treatment outcome and complications were evaluated. The average duration of follow-up was 55 months. In 5 cases, the treatment result was satisfactory. In one case, there were complications in the form of an external infection around the cranial pins. 1. The Halo Vest system can be applied as a non-operative method of treating cervical spine fractures in children and young people. 2. The criteria of eligibility for specific types of cervical spine fracture treatment in children and young people require further investigation, especially with regard to eliminating complications.
Estimating groundwater recharge
Healy, Richard W.; Scanlon, Bridget R.
2010-01-01
Understanding groundwater recharge is essential for successful management of water resources and modeling fluid and contaminant transport within the subsurface. This book provides a critical evaluation of the theory and assumptions that underlie methods for estimating rates of groundwater recharge. Detailed explanations of the methods are provided - allowing readers to apply many of the techniques themselves without needing to consult additional references. Numerous practical examples highlight benefits and limitations of each method. Approximately 900 references allow advanced practitioners to pursue additional information on any method. For the first time, theoretical and practical considerations for selecting and applying methods for estimating groundwater recharge are covered in a single volume with uniform presentation. Hydrogeologists, water-resource specialists, civil and agricultural engineers, earth and environmental scientists and agronomists will benefit from this informative and practical book. It can serve as the primary text for a graduate-level course on groundwater recharge or as an adjunct text for courses on groundwater hydrology or hydrogeology.
Superiorization with level control
NASA Astrophysics Data System (ADS)
Cegielski, Andrzej; Al-Musallam, Fadhel
2017-04-01
The convex feasibility problem is to find a common point of a finite family of closed convex subsets. In many applications one requires something more, namely finding a common point of closed convex subsets which minimizes a continuous convex function. The latter requirement leads to an application of the superiorization methodology, which sits between methods for the convex feasibility problem and convex constrained minimization. Inspired by the superiorization idea, we introduce a method which sequentially applies a long-step algorithm to a sequence of convex feasibility problems; the method employs quasi-nonexpansive operators as well as subgradient projections with level control and does not require evaluation of the metric projection. We replace a perturbation of the iterations (applied in the superiorization methodology) by a perturbation of the current level in minimizing the objective function. We consider the method in Euclidean space in order to guarantee strong convergence, although the method is well defined in a Hilbert space.
GPS/DR Error Estimation for Autonomous Vehicle Localization.
Lee, Byung-Hyun; Song, Jong-Hwa; Im, Jun-Hyuck; Im, Sung-Hyuck; Heo, Moon-Beom; Jee, Gyu-In
2015-08-21
Autonomous vehicles require highly reliable navigation capabilities. For example, a lane-following method cannot be applied in an intersection without lanes, and since typical lane detection is performed using a straight-line model, errors can occur when the lateral distance is estimated in curved sections due to a model mismatch. Therefore, this paper proposes a localization method that uses GPS/DR error estimation based on a lane detection method with curved lane models, stop line detection, and curve matching in order to improve the performance during waypoint following procedures. The advantage of using the proposed method is that position information can be provided for autonomous driving through intersections, in sections with sharp curves, and in curved sections following a straight section. The proposed method was applied in autonomous vehicles at an experimental site to evaluate its performance, and the results indicate that the positioning achieved accuracy at the sub-meter level.
NASA Astrophysics Data System (ADS)
Balcerzak, Marek; Dąbrowski, Artur; Pikunov, Danylo
2018-01-01
This paper presents a practical application of a new, simplified method of Lyapunov exponent estimation. The method has been applied to the optimization of a real, nonlinear inverted pendulum system. The authors present how the algorithm for Largest Lyapunov Exponent (LLE) estimation can be applied to evaluate control system performance, and a new LLE-based control performance index is proposed. Equations of the fourth-order inverted pendulum system have been derived. The nonlinear friction of the controlled object has been identified by means of the nonlinear least squares method. Three different friction models have been tested: linear, cubic and Coulomb. The Differential Evolution (DE) algorithm has been used to search for the best set of parameters of the general linear regulator. This work shows that the proposed method is efficient and results in faster perturbation rejection, especially when disturbances are significant.
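The authors' simplified LLE algorithm is not detailed in the abstract. As background, a generic Benettin-style two-trajectory estimate of the largest Lyapunov exponent can be sketched as follows, using the r=4 logistic map as a stand-in for the pendulum dynamics (the map, step count, and initial offset are all illustrative choices, not the paper's).

```python
import math

def logistic(x, r=4.0):
    """Stand-in chaotic dynamics: the r=4 logistic map, whose LLE is ln 2."""
    return r * x * (1.0 - x)

def largest_lyapunov(x0, steps=20000, d0=1e-9):
    """Benettin-style estimate: evolve a reference and a nearby trajectory,
    accumulate the log growth of their separation, renormalize every step."""
    x, y = x0, x0 + d0
    acc = 0.0
    for _ in range(steps):
        x, y = logistic(x), logistic(y)
        d = abs(y - x)
        if d == 0.0:                       # trajectories coincided; re-seed offset
            d, y = d0, x + d0
        acc += math.log(d / d0)
        y = x + d0 * (y - x) / d           # rescale separation back to d0
    return acc / steps

print(round(largest_lyapunov(0.3), 2))     # close to ln 2 ≈ 0.69
```

A positive estimate flags divergence of nearby trajectories, which is the property the LLE-based performance index exploits: a well-tuned regulator should pull perturbed trajectories together, i.e. drive the exponent negative.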
Olney, Cynthia A
2005-10-01
After arguing that most community-based organizations (CBOs) function as complex adaptive systems, this white paper describes the evaluation goals, questions, indicators, and methods most important at different stages of community-based health information outreach. This paper presents the basic characteristics of complex adaptive systems and argues that the typical CBO can be considered this type of system. It then presents evaluation as a tool for helping outreach teams adapt their outreach efforts to the CBO environment and thus maximize success. Finally, it describes the goals, questions, indicators, and methods most important or helpful at each stage of evaluation (community assessment, needs assessment and planning, process evaluation, and outcomes assessment). Literature from complex adaptive systems as applied to health care, business, and evaluation settings is presented. Evaluation models and applications, particularly those based on participatory approaches, are presented as methods for maximizing the effectiveness of evaluation in dynamic CBO environments. If one accepts that CBOs function as complex adaptive systems-characterized by dynamic relationships among many agents, influences, and forces-then effective evaluation at the stages of community assessment, needs assessment and planning, process evaluation, and outcomes assessment is critical to outreach success.
Nondestructive evaluation of the preservation state of stone columns in the Hospital Real of Granada
NASA Astrophysics Data System (ADS)
Moreno de Jong van Coevorden, C.; Cobos Sánchez, C.; Rubio Bretones, A.; Fernández Pantoja, M.; García, Salvador G.; Gómez Martín, R.
2012-12-01
This paper describes the results of employing two nondestructive evaluation methods for diagnosing the preservation state of stone elements. The first method is based on ultrasonic (US) pulses, while the second uses short electromagnetic pulses. Specifically, these methods were applied to a set of columns, some of them previously restored. These columns are part of the architectural heritage of the University of Granada; in particular, they are located in the patio de la capilla of the Hospital Real of Granada. The objective of this work was the application of systems based on US pulses (in transmission mode) and ground-penetrating radar (electromagnetic tomography) to the diagnosis and detection of possible faults in the interior of the columns.
Application of econometric and ecology analysis methods in physics software
NASA Astrophysics Data System (ADS)
Han, Min Cheol; Hoff, Gabriela; Kim, Chan Hyeong; Kim, Sung Hun; Grazia Pia, Maria; Ronchieri, Elisabetta; Saracco, Paolo
2017-10-01
Some data analysis methods typically used in econometric studies and in ecology have been evaluated and applied in physics software environments. They concern the evolution of observables through objective identification of change points and trends, and measurements of inequality, diversity and evenness across a data set. Within each analysis area, various statistical tests and measures have been examined. This conference paper presents a brief overview of some of these methods.
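As an illustration of the inequality and evenness measures mentioned, here is a sketch of the Gini index and Pielou's evenness; whether these are the exact measures examined in the paper is an assumption, and the contributor numbers are invented.

```python
import math

def gini(values):
    """Gini inequality index over non-negative values:
    0 = perfectly even, approaching 1 = fully concentrated."""
    xs = sorted(values)
    n, total = len(xs), sum(xs)
    return sum((2 * i - n + 1) * x for i, x in enumerate(xs)) / (n * total)

def pielou_evenness(values):
    """Shannon entropy of the shares, normalized by its maximum, log n."""
    total = sum(values)
    ps = [v / total for v in values if v > 0]
    return -sum(p * math.log(p) for p in ps) / math.log(len(values))

# e.g. lines of code contributed per developer (illustrative numbers)
print(gini([100, 100, 100, 100]))        # 0.0: perfectly even
print(round(gini([0, 0, 0, 100]), 2))    # 0.75: concentrated in one contributor
```

Applied to a physics code base, such measures could quantify, say, how evenly maintenance effort or citations are spread across modules or institutions.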
Determination of copper by isotopic dilution.
Faquim, E S; Munita, C S
1994-01-01
A rapid and selective method was used for the determination of copper by isotopic dilution employing substoichiometric extraction with dithizone in carbon tetrachloride. The appropriate pH range for the substoichiometric extraction was 2-7. In the analysis, even a large excess of elements forming extractable complexes with dithizone does not interfere. The accuracy and precision of the method were evaluated. The method has been applied to analysis of reference materials, wheat flour, wine, and beer.
Application of single-step genomic evaluation for crossbred performance in pig.
Xiang, T; Nielsen, B; Su, G; Legarra, A; Christensen, O F
2016-03-01
Crossbreeding is predominant and intensively used in commercial meat production systems, especially in poultry and swine. Genomic evaluation has been successfully applied for breeding within purebreds, but it also offers opportunities for selecting purebreds for crossbred performance by combining information from purebreds with information from crossbreds. However, it generally requires that all relevant animals are genotyped, which is costly and presently does not seem to be feasible in practice. Recently, a novel single-step BLUP method for genomic evaluation of both purebred and crossbred performance has been developed that can incorporate marker genotypes into a traditional animal model. This new method had not been validated on real data sets. In this study, we applied this single-step method to analyze data for the maternal trait of total number of piglets born in Danish Landrace, Yorkshire, and two-way crossbred pigs in different scenarios. The genetic correlation between purebred and crossbred performances was investigated first, and then the impact of (crossbred) genomic information on prediction reliability for crossbred performance was explored. The results confirm the existence of a moderate genetic correlation, and the standard errors of the estimates were reduced when genomic information was included. Models with marker information, especially crossbred genomic information, improved model-based reliabilities for crossbred performance of purebred boars, improved the predictive ability for crossbred animals and, to some extent, reduced the bias of prediction. We conclude that the new single-step BLUP method is a good tool in the genetic evaluation of purebred animals for crossbred performance.
NASA Astrophysics Data System (ADS)
Krstulović-Opara, Lovre; Surjak, Martin; Vesenjak, Matej; Tonković, Zdenko; Kodvanj, Janoš; Domazet, Željko
2015-11-01
To investigate the applicability of infrared thermography as a tool for acquiring dynamic yielding in metals, a comparison of infrared thermography with three-dimensional digital image correlation has been made. Dynamic tension tests and three-point bending tests of aluminum alloys have been performed to evaluate the results obtained by IR thermography and to establish the capabilities and limits of the two methods. Both approaches detect plastification zone migrations during the yielding process. The results of the tension test and three-point bending test proved the validity of the IR approach as a method for evaluating the dynamic yielding process when used on complex structures such as cellular porous materials. The stability of the yielding process in the three-point bending test, in contrast to the fluctuation of the plastification front in the tension test, is of great importance for the validation of numerical constitutive models. The research demonstrated the strong performance, robustness and reliability of the IR approach when used to evaluate yielding during dynamic loading processes, while the 3D DIC method proved superior in the low-velocity loading regimes. This research, based on two basic tests, confirmed the conclusions and suggestions presented in our previous research on porous materials, where middle-wave infrared thermography was applied.
Text mining by Tsallis entropy
NASA Astrophysics Data System (ADS)
Jamaati, Maryam; Mehri, Ali
2018-01-01
Long-range correlations between the elements of natural languages enable them to convey very complex information. The complex structure of human language, as a manifestation of natural languages, motivates us to apply nonextensive statistical mechanics to text mining. Tsallis entropy appropriately ranks terms' relevance to the document subject, taking advantage of their spatial correlation length. We apply this statistical concept as a new, powerful word-ranking metric for extracting keywords from a single document. We carry out an experimental evaluation, which shows the capability of the presented method in keyword extraction. We find that Tsallis entropy has reliable word-ranking performance, at the same level as the best previous ranking methods.
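A rough sketch of the idea, under the assumption that keyword candidates are words whose occurrences cluster spatially (low Tsallis entropy over text windows) while function words spread evenly; the paper's exact metric, windowing, and value of q may differ, and the sample sentence is invented.

```python
import math

def tsallis_entropy(probs, q=1.5):
    """S_q = (1 - sum p_i^q) / (q - 1); recovers Shannon entropy as q -> 1."""
    return (1.0 - sum(p ** q for p in probs)) / (q - 1.0)

def rank_words(tokens, n_windows=4, q=1.5):
    """Score each word by the Tsallis entropy of its distribution over equal
    text windows. Evenly spread function words score high; spatially clustered
    content words score low, so ascending order surfaces keyword candidates."""
    size = max(1, len(tokens) // n_windows)
    scores = {}
    for w in set(tokens):
        counts = [tokens[i:i + size].count(w) for i in range(0, len(tokens), size)]
        total = sum(counts)
        scores[w] = tsallis_entropy([c / total for c in counts if c], q)
    return sorted(scores, key=scores.get)

tokens = "the laser the cavity the beam the mode".split()
ranked = rank_words(tokens)
# clustered words ("laser", "cavity", ...) precede the evenly spread "the"
```

The nonextensivity parameter q tunes how strongly concentration is rewarded, which is presumably where the "spatial correlation length" advantage over plain frequency ranking enters.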
NASA Technical Reports Server (NTRS)
Consoli, Robert David; Sobieszczanski-Sobieski, Jaroslaw
1990-01-01
Advanced multidisciplinary analysis and optimization methods, namely system sensitivity analysis and non-hierarchical system decomposition, are applied to reduce the cost and improve the visibility of an automated vehicle design synthesis process. This process is inherently complex due to the large number of functional disciplines and associated interdisciplinary couplings. Recent developments in system sensitivity analysis as applied to complex non-hierarchic multidisciplinary design optimization problems enable the decomposition of these complex interactions into sub-processes that can be evaluated in parallel. The application of these techniques results in significant cost, accuracy, and visibility benefits for the entire design synthesis process.
The Elicitation Interview Technique: Capturing People's Experiences of Data Representations.
Hogan, Trevor; Hinrichs, Uta; Hornecker, Eva
2016-12-01
Information visualization has become a popular tool to facilitate sense-making, discovery and communication in a large range of professional and casual contexts. However, evaluating visualizations is still a challenge. In particular, we lack techniques to help understand how visualizations are experienced by people. In this paper we discuss the potential of the Elicitation Interview technique to be applied in the context of visualization. The Elicitation Interview is a method for gathering detailed and precise accounts of human experience. We argue that it can be applied to help understand how people experience and interpret visualizations as part of exploration and data analysis processes. We describe the key characteristics of this interview technique and present a study we conducted to exemplify how it can be applied to evaluate data representations. Our study illustrates the types of insights this technique can bring to the fore, for example, evidence for deep interpretation of visual representations and the formation of interpretations and stories beyond the represented data. We discuss general visualization evaluation scenarios where the Elicitation Interview technique may be beneficial and specify what needs to be considered when applying this technique in a visualization context specifically.
Application of ride quality technology to predict ride satisfaction for commuter-type aircraft
NASA Technical Reports Server (NTRS)
Jacobson, I. D.; Kuhlthau, A. R.; Richards, L. G.
1975-01-01
A method was developed to predict passenger satisfaction with the ride environment of a transportation vehicle. This general approach was applied to a commuter-type aircraft for illustrative purposes, and the effects of terrain, altitude and seat location were examined. The method predicts the variation in the proportion of passengers satisfied for any set of flight conditions. In addition, several noncommuter aircraft were analyzed for comparison, and other uses of the model are described. The method has advantages for design, evaluation, and operating decisions.
Uncertainty factors in screening ecological risk assessments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duke, L.D.; Taggart, M.
2000-06-01
The hazard quotient (HQ) method is commonly used in screening ecological risk assessments (ERAs) to estimate risk to wildlife at contaminated sites. Many ERAs use uncertainty factors (UFs) in the HQ calculation to incorporate uncertainty associated with predicting wildlife responses to contaminant exposure using laboratory toxicity data. The overall objective was to evaluate the current UF methodology as applied to screening ERAs in California, USA. Specific objectives included characterizing current UF methodology, evaluating the degree of conservatism in UFs as applied, and identifying limitations to the current approach. Twenty-four of 29 evaluated ERAs used the HQ approach; 23 of these used UFs in the HQ calculation. All 24 made interspecies extrapolations, and 21 compensated for its uncertainty, most using allometric adjustments and some using RFs. Most also incorporated uncertainty for same-species extrapolations. Twenty-one ERAs used UFs extrapolating from lowest observed adverse effect level (LOAEL) to no observed adverse effect level (NOAEL), and 18 used UFs extrapolating from subchronic to chronic exposure. Values and application of all UF types were inconsistent. Maximum cumulative UFs ranged from 10 to 3,000. Results suggest UF methodology is widely used but inconsistently applied and is not uniformly conservative relative to UFs recommended in regulatory guidelines and academic literature. The method is limited by lack of consensus among scientists, regulators, and practitioners about magnitudes, types, and conceptual underpinnings of the UF methodology.
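The HQ arithmetic described above is simple enough to sketch: exposure dose divided by a toxicity reference value, with each uncertainty factor effectively dividing the reference value. The numbers below are illustrative only, not drawn from any of the reviewed ERAs.

```python
def hazard_quotient(dose, trv, ufs=()):
    """HQ = estimated exposure dose / toxicity reference value (TRV).
    Each uncertainty factor divides the TRV, i.e. multiplies the quotient."""
    uf = 1
    for f in ufs:               # e.g. 10 for LOAEL->NOAEL, 10 for subchronic->chronic
        uf *= f
    return dose * uf / trv

# Illustrative numbers only: 0.5 mg/kg-day estimated exposure against a
# 20 mg/kg-day subchronic LOAEL, with two tenfold uncertainty factors.
print(hazard_quotient(0.5, 20.0, ufs=(10, 10)))   # 2.5 -> exceeds 1, flags risk
```

The example makes the paper's point concrete: without the UFs the quotient would be 0.025 and the site would screen out, so the choice and stacking of UFs (here a cumulative factor of 100) can single-handedly flip the screening conclusion.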
Aerodynamics Characteristics of Multi-Element Airfoils at -90 Degrees Incidence
NASA Technical Reports Server (NTRS)
Stremel, Paul M.; Schmitz, Fredric H. (Technical Monitor)
1994-01-01
A previously developed method has been applied to accurately calculate the viscous flow about airfoils normal to the free-stream flow. This method has special application to the analysis of tilt rotor aircraft in the evaluation of download. In particular, the flow about an XV-15 airfoil with and without deflected leading and trailing edge flaps at -90 degrees incidence is evaluated. The multi-element aspect of the method provides for the evaluation of slotted flap configurations, which may lead to decreased drag. The method solves for turbulent flow at flight Reynolds numbers. The flow about the XV-15 airfoil with and without flap deflections has been calculated and compared with experimental data at a Reynolds number of one million. The agreement between the calculated and measured pressure distributions is very good, thereby verifying the method. The aerodynamic evaluation of multi-element airfoils will be conducted to determine airfoil/flap configurations for reduced airfoil drag. Comparisons between the calculated lift, drag, and pitching moment on the airfoil and the airfoil surface pressure will also be presented.
Development and Validation of New Discriminative Dissolution Method for Carvedilol Tablets
Raju, V.; Murthy, K. V. R.
2011-01-01
The objective of the present study was to develop and validate a discriminative dissolution method for evaluation of carvedilol tablets. Different conditions such as type of dissolution medium, volume of dissolution medium, and rotation speed of the paddle were evaluated. The best in vitro dissolution profile was obtained using Apparatus II (paddle), 50 rpm, and 900 ml of pH 6.8 phosphate buffer as the dissolution medium. Drug release was evaluated by a high-performance liquid chromatographic method. The dissolution method was validated according to current ICH and FDA guidelines; parameters such as specificity, accuracy, precision, and stability were evaluated, and the results obtained were within the acceptable range. The dissolution profiles of three different products were compared using ANOVA-based, model-dependent, and model-independent methods; the results showed a significant difference between the products. The dissolution test developed and validated was adequate for its high discriminative capacity in differentiating the release characteristics of the products tested and could be applied for development and quality control of carvedilol tablets. PMID:22923865
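A widely used model-independent profile comparison of the kind the abstract mentions is the f2 similarity factor from regulatory dissolution guidance; a minimal sketch (the percent-dissolved values below are hypothetical, not the carvedilol data):

```python
import math

def f2_similarity(reference, test):
    """Model-independent similarity factor:
    f2 = 50 * log10(100 / sqrt(1 + mean squared difference)).
    By convention, f2 >= 50 indicates similar dissolution profiles.
    """
    assert len(reference) == len(test)
    n = len(reference)
    msd = sum((r - t) ** 2 for r, t in zip(reference, test)) / n
    return 50.0 * math.log10(100.0 / math.sqrt(1.0 + msd))

ref = [25, 45, 70, 90]   # % dissolved at each time point (hypothetical)
tst = [20, 40, 65, 85]
print(round(f2_similarity(ref, tst), 1))
```

Identical profiles give f2 = 100; larger pointwise differences drive f2 down, so a discriminative method is one that pushes f2 below 50 for genuinely different formulations.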
Evaluation and selection of 3PL provider using fuzzy AHP and grey TOPSIS in group decision making
NASA Astrophysics Data System (ADS)
Garside, Annisa Kesy; Saputro, Thomy Eko
2017-11-01
Selection of a 3PL provider is a multi-criteria decision-making problem, in which the decision maker has to select among several 3PL provider alternatives based on several evaluation criteria. A decision maker will have difficulty expressing judgments in exact numerical values because information is often incomplete and the decision environment is uncertain. This paper presents an integrated fuzzy AHP and grey TOPSIS method for the evaluation and selection of a 3PL provider. Fuzzy AHP is used to determine the importance weights of the evaluation criteria. For the final selection, grey TOPSIS is used to evaluate the alternatives and obtain the overall performance, measured as a closeness coefficient. This method is applied to solve the selection of a 3PL provider at PT. X. Five criteria and twelve sub-criteria were determined, and the best alternative among four 3PL providers was then selected by the proposed method.
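The closeness coefficient at the heart of TOPSIS can be sketched as follows. This is the crisp formulation, not the grey variant used in the paper, and the decision matrix, weights, and criteria are hypothetical placeholders:

```python
import numpy as np

def topsis_closeness(matrix, weights, benefit):
    """matrix: alternatives x criteria; benefit[j] is True if larger is better."""
    norm = matrix / np.linalg.norm(matrix, axis=0)   # vector normalization
    v = norm * weights                               # weighted normalized matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti  = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)        # distance to ideal solution
    d_neg = np.linalg.norm(v - anti, axis=1)         # distance to anti-ideal
    return d_neg / (d_pos + d_neg)                   # closeness coefficient in [0, 1]

# Four hypothetical 3PL providers scored on three benefit criteria;
# the weights stand in for importance weights from fuzzy AHP.
scores  = np.array([[7., 9., 9.], [8., 7., 8.], [9., 6., 8.], [6., 7., 8.]])
weights = np.array([0.5, 0.3, 0.2])
cc = topsis_closeness(scores, weights, benefit=np.array([True, True, True]))
print(cc.argmax())  # index of the best-ranked provider
```

The grey variant replaces the crisp scores with interval (grey) numbers and distances with grey distances, but the ranking logic, distance to ideal versus anti-ideal, is the same.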
Application of Conjugate Gradient methods to tidal simulation
Barragy, E.; Carey, G.F.; Walters, R.A.
1993-01-01
A harmonic decomposition technique is applied to the shallow water equations to yield a complex, nonsymmetric, nonlinear, Helmholtz-type problem for the sea surface and an accompanying complex, nonlinear diagonal problem for the velocities. The equation for the sea surface is linearized using successive approximation and then discretized with linear, triangular finite elements. The study focuses on applying iterative methods to solve the resulting complex linear systems. The comparative evaluation includes both standard iterative methods for the real subsystems and complex versions of the well-known Bi-Conjugate Gradient and Bi-Conjugate Gradient Squared methods. Several incomplete LU type preconditioners are discussed, and the effects of node ordering, rejection strategy, domain geometry, and Coriolis parameter (affecting asymmetry) are investigated. Implementation details for the complex case are discussed. Performance studies are presented and comparisons made with a frontal solver.
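A minimal illustration of the kind of complex, nonsymmetric iterative solve involved, using SciPy's BiCGStab (a stabilized relative of the BiCG methods compared in the paper) on a toy diagonally dominant tridiagonal system, not the paper's shallow-water discretization:

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import bicgstab

# Toy complex tridiagonal system A x = b; the complex shift k loosely mimics
# the Helmholtz-type structure. All values are illustrative only.
n = 200
k = 1.0 + 1.0j
A = diags([-1.0, 4.0 - k, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n, dtype=complex)

x, info = bicgstab(A, b)             # info == 0 means the iteration converged
residual = np.linalg.norm(A @ x - b)
print(info, residual < 1e-3)
```

In practice the convergence of such Krylov iterations on the actual tidal systems depends heavily on the incomplete-LU preconditioning, node ordering, and asymmetry effects that the paper investigates.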
NASA Technical Reports Server (NTRS)
Mcclain, W. D.
1977-01-01
A recursively formulated, first-order, semianalytic artificial satellite theory, based on the generalized method of averaging, is presented in two volumes. Volume I comprehensively discusses the theory of the generalized method of averaging applied to the artificial satellite problem. Volume II presents the explicit development in the nonsingular equinoctial elements of the first-order averaged equations of motion. The recursive algorithms used to evaluate the first-order averaged equations of motion are also presented in Volume II. This semianalytic theory is, in principle, valid for a term of arbitrary degree in the expansion of the third-body disturbing function (nonresonant cases only) and for a term of arbitrary degree and order in the expansion of the nonspherical gravitational potential function.
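In generic form, first-order averaging replaces the fast-varying equations of motion by their mean over the fast angle. A schematic statement, using a generic element vector a, fast variable λ, and small parameter ε rather than the report's equinoctial-element notation, is:

```latex
\dot{a} = \epsilon\, F(a, \lambda)
\quad\Longrightarrow\quad
\dot{\bar{a}} = \epsilon\, \bar{F}(\bar{a}),
\qquad
\bar{F}(\bar{a}) = \frac{1}{2\pi} \int_{0}^{2\pi} F(\bar{a}, \lambda)\, d\lambda .
```

The averaged elements ā then evolve slowly, which is what permits the large integration steps characteristic of semianalytic satellite theories.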
NASA Astrophysics Data System (ADS)
Guo, Hongbo; He, Xiaowei; Liu, Muhan; Zhang, Zeyu; Hu, Zhenhua; Tian, Jie
2017-03-01
Cerenkov luminescence tomography (CLT), a promising optical molecular imaging modality, can be applied to cancer diagnosis and therapy. Most research on CLT reconstruction is based on the finite element method (FEM) framework. However, the quality of the FEM mesh grid remains a vital factor restricting the accuracy of the CLT reconstruction result. In this paper, we propose a multi-grid finite element method framework that is able to improve the accuracy of reconstruction. In addition, the multilevel scheme adaptive algebraic reconstruction technique (MLS-AART), based on a modified iterative algorithm, is applied to improve the reconstruction accuracy. The feasibility of our proposed method was evaluated in numerical simulation experiments. Results showed that the multi-grid strategy could obtain 3D spatial information of the Cerenkov source more accurately than the traditional single-grid FEM.
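MLS-AART builds on the classic algebraic reconstruction technique (ART, the Kaczmarz iteration). A minimal sketch of the base ART update on a toy 2x2 system, not a CLT reconstruction or the paper's multilevel variant:

```python
import numpy as np

def art(A, b, iterations=50, relax=1.0):
    """Basic ART/Kaczmarz: cycle through the rows of A, projecting the
    current estimate x onto each hyperplane a_i . x = b_i in turn."""
    x = np.zeros(A.shape[1])
    for _ in range(iterations):
        for i in range(A.shape[0]):
            a_i = A[i]
            x += relax * (b[i] - a_i @ x) / (a_i @ a_i) * a_i
    return x

# Toy consistent system with exact solution x = [1, 1] (illustrative only)
A = np.array([[2.0, 1.0], [1.0, 3.0]])
b = np.array([3.0, 4.0])
print(np.round(art(A, b), 3))
```

Adaptive variants such as MLS-AART modify the relaxation and row-selection scheme across levels; the core projection step above is the common building block.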
Water quality assessment with hierarchical cluster analysis based on Mahalanobis distance.
Du, Xiangjun; Shao, Fengjing; Wu, Shunyao; Zhang, Hanlin; Xu, Si
2017-07-01
Water quality assessment is crucial for assessment of marine eutrophication, prediction of harmful algal blooms, and environment protection. Previous studies have developed many numeric modeling methods and data-driven approaches for water quality assessment. Cluster analysis, an approach widely used for grouping data, has also been employed. However, there are complex correlations between water quality variables, which play important roles in water quality assessment but have usually been overlooked. In this paper, we analyze correlations between water quality variables and propose an alternative method for water quality assessment: hierarchical cluster analysis based on Mahalanobis distance. Further, we cluster water quality data collected from coastal waters of the Bohai Sea and North Yellow Sea of China, and apply the clustering results to evaluate water quality. To evaluate validity, we also cluster the water quality data with cluster analysis based on Euclidean distance, which is widely adopted in previous studies. The results show that our method is more suitable for water quality assessment with many correlated water quality variables. To our knowledge, this is the first attempt to apply Mahalanobis distance to coastal water quality assessment.
Estimation and prediction under local volatility jump-diffusion model
NASA Astrophysics Data System (ADS)
Kim, Namhyoung; Lee, Younhee
2018-02-01
Volatility is an important factor in operating a company and managing risk. In portfolio optimization and in risk hedging with options, the value of an option is evaluated using a volatility model, and various attempts have been made to predict option value. Recent studies have shown that stochastic volatility models and jump-diffusion models reflect stock price movements accurately. However, these models have practical limitations. Combining them with the local volatility model, which is widely used among practitioners, may lead to better performance. In this study, we propose a more effective and efficient method of estimating option prices by combining the local volatility model with the jump-diffusion model, and we apply it to both artificial and actual market data to evaluate its performance. The calibration process for estimating the jump parameters and local volatility surfaces is divided into three stages. We apply the local volatility model, the stochastic volatility model, and the local volatility jump-diffusion model estimated by the proposed method to KOSPI 200 index option pricing. The proposed method displays good estimation and prediction performance.
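As background for the jump-diffusion component, the classic Merton jump-diffusion call price can be written as a Poisson-weighted sum of Black-Scholes prices. This is the standard constant-volatility baseline, not the paper's local-volatility hybrid, and all parameter values below are hypothetical:

```python
import math

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """Black-Scholes European call price."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

def merton_call(S, K, T, r, sigma, lam, mu_j, sigma_j, n_terms=50):
    """Merton jump-diffusion call: condition on the number of jumps n; each
    term is a Black-Scholes price with jump-adjusted volatility and rate."""
    kappa = math.exp(mu_j + 0.5 * sigma_j**2) - 1.0   # mean relative jump size
    lam_p = lam * (1.0 + kappa)
    price = 0.0
    for n in range(n_terms):
        sigma_n = math.sqrt(sigma**2 + n * sigma_j**2 / T)
        r_n = r - lam * kappa + n * (mu_j + 0.5 * sigma_j**2) / T
        weight = math.exp(-lam_p * T) * (lam_p * T) ** n / math.factorial(n)
        price += weight * bs_call(S, K, T, r_n, sigma_n)
    return price

p = merton_call(S=100, K=100, T=1.0, r=0.05, sigma=0.2,
                lam=0.5, mu_j=-0.1, sigma_j=0.15)
print(round(p, 2))
```

In the local volatility jump-diffusion setting, the constant sigma above is replaced by a calibrated surface sigma(S, t), which is what the paper's three-stage calibration estimates.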
Shegog, Ross; Bartholomew, L Kay; Gold, Robert S; Pierrel, Elaine; Parcel, Guy S; Sockrider, Marianna M; Czyzewski, Danita I; Fernandez, Maria E; Berlin, Nina J; Abramson, Stuart
2006-01-01
The value of translating behavioral theories, models, and strategies to guide the development and structure of computer-based health applications is well recognized, although doing so remains a continuing challenge for program developers. A stepped approach to translating behavioral theory in the design of simulations to teach chronic disease management to children is described. This includes the translation steps to: 1) define target behaviors and their determinants, 2) identify theoretical methods to optimize behavioral change, and 3) choose educational strategies to effectively apply these methods and combine them into a cohesive computer-based simulation for health education. Asthma is used to exemplify a chronic health management problem, and a computer-based asthma management simulation (Watch, Discover, Think and Act) that has been evaluated and shown to improve asthma self-management in children is used to exemplify the application of theory to practice. Impact and outcome evaluation studies have indicated the effectiveness of these steps in providing increased rigor and accountability, suggesting their utility for educators and developers seeking to apply simulations to enhance self-management behaviors in patients.