Sample records for cross validation technique

  1. Validity Evidence in Scale Development: The Application of Cross Validation and Classification-Sequencing Validation

    ERIC Educational Resources Information Center

    Acar, Tülin

    2014-01-01

    In the literature, it has been observed that many enhanced criteria are limited to factor-analysis techniques. Besides examinations of statistical structure and/or psychological structure, validity studies such as cross-validation and classification-sequencing studies should be performed frequently. The purpose of this study is to examine cross…

  2. Cross-Validation of Survival Bump Hunting by Recursive Peeling Methods.

    PubMed

    Dazard, Jean-Eudes; Choe, Michael; LeBlanc, Michael; Rao, J Sunil

    2014-08-01

    We introduce a survival/risk bump hunting framework to build a bump hunting model with a possibly censored time-to-event type of response and to validate model estimates. First, we describe the use of adequate survival peeling criteria to build a survival/risk bump hunting model based on recursive peeling methods. Our method, called "Patient Recursive Survival Peeling", is a rule-induction method that makes use of specific peeling criteria such as hazard ratio or log-rank statistics. Second, to validate our model estimates and improve survival prediction accuracy, we describe a resampling-based validation technique specifically designed for the joint task of decision rule making by recursive peeling (i.e., decision box) and survival estimation. This alternative technique, called "combined" cross-validation, is performed by combining test samples over the cross-validation loops, a design allowing for bump hunting by recursive peeling in a survival setting. We provide empirical results showing the importance of cross-validation and replication.
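
The "combined" scheme described above, in which held-out samples are pooled across the cross-validation loops before the statistic is computed, can be sketched in plain Python. Everything here is synthetic and simplified: the cohort, the event indicator, and the fixed "box" rule x < 0.2 are illustrative stand-ins (the actual method refits a peeling rule on each training loop):

```python
import random

random.seed(3)
# Synthetic cohort: a covariate x and a rare event indicator (1 = event).
cohort = [(random.random(), 1 if random.random() < 0.08 else 0)
          for _ in range(100)]

def fold_indices(n, k=5, seed=0):
    """Shuffled k-fold partition of indices 0..n-1."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    return [idx[i::k] for i in range(k)]

def in_box(x):
    """Fixed stand-in for the peeled decision box (the real method refits it)."""
    return x < 0.2

pooled = []  # "combined" CV: accumulate held-out box members across loops
for test in fold_indices(len(cohort)):
    members = [cohort[j] for j in test if in_box(cohort[j][0])]
    print("held-out box members in this loop:", len(members))
    pooled.extend(members)

# One statistic over the combined test samples, instead of averaging
# unstable per-fold estimates computed from a handful of samples each.
rate = sum(event for _, event in pooled) / len(pooled)
print("combined-CV event rate in box:", round(rate, 3))
```

The per-loop counts show why pooling helps: each held-out fold contributes only a few box members, too few for a stable estimate on their own, whereas the combined set supports one.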

  3. Cross-Validation of Survival Bump Hunting by Recursive Peeling Methods

    PubMed Central

    Dazard, Jean-Eudes; Choe, Michael; LeBlanc, Michael; Rao, J. Sunil

    2015-01-01

    We introduce a survival/risk bump hunting framework to build a bump hunting model with a possibly censored time-to-event type of response and to validate model estimates. First, we describe the use of adequate survival peeling criteria to build a survival/risk bump hunting model based on recursive peeling methods. Our method, called “Patient Recursive Survival Peeling”, is a rule-induction method that makes use of specific peeling criteria such as hazard ratio or log-rank statistics. Second, to validate our model estimates and improve survival prediction accuracy, we describe a resampling-based validation technique specifically designed for the joint task of decision rule making by recursive peeling (i.e., decision box) and survival estimation. This alternative technique, called “combined” cross-validation, is performed by combining test samples over the cross-validation loops, a design allowing for bump hunting by recursive peeling in a survival setting. We provide empirical results showing the importance of cross-validation and replication. PMID:26997922

  4. Joint use of over- and under-sampling techniques and cross-validation for the development and assessment of prediction models.

    PubMed

    Blagus, Rok; Lusa, Lara

    2015-11-04

    Prediction models are used in clinical research to develop rules that can be used to accurately predict the outcome of patients based on some of their characteristics. They represent a valuable tool in the decision-making process of clinicians and health policy makers, as they enable them to estimate the probability that patients have or will develop a disease, will respond to a treatment, or that their disease will recur. The interest devoted to prediction models in the biomedical community has been growing in the last few years. Often the data used to develop prediction models are class-imbalanced, as only a few patients experience the event (and therefore belong to the minority class). Prediction models developed using class-imbalanced data tend to achieve sub-optimal predictive accuracy in the minority class. This problem can be diminished by using sampling techniques aimed at balancing the class distribution. These techniques include under- and oversampling, where a fraction of the majority class samples are retained in the analysis or new samples from the minority class are generated. The correct assessment of how the prediction model is likely to perform on independent data is of crucial importance; in the absence of an independent data set, cross-validation is normally used. While the importance of correct cross-validation is well documented in the biomedical literature, the challenges posed by the joint use of sampling techniques and cross-validation have not been addressed. We show that care must be taken to ensure that cross-validation is performed correctly on sampled data, and that the risk of overestimating the predictive accuracy is greater when oversampling techniques are used. Examples based on the re-analysis of real datasets and simulation studies are provided. We identify some results from the biomedical literature where incorrect cross-validation was performed and where we expect that the performance of oversampling techniques was heavily overestimated.
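
The leakage mechanism behind this overestimation can be demonstrated with a small self-contained sketch (plain Python, synthetic data): duplicating minority samples before splitting lets identical copies land on both sides of a fold boundary, so the test folds are no longer independent of the training data.

```python
import random

random.seed(0)
# Synthetic imbalanced data: 90 majority (label 0), 10 minority (label 1).
data = [((random.random(), random.random()), 0) for _ in range(90)]
data += [((random.random(), random.random()), 1) for _ in range(10)]

def oversample(samples):
    """Balance classes by duplicating randomly chosen minority samples."""
    minority = [s for s in samples if s[1] == 1]
    n_major = sum(1 for s in samples if s[1] == 0)
    out = list(samples)
    while sum(1 for s in out if s[1] == 1) < n_major:
        out.append(random.choice(minority))
    return out

def kfold(samples, k=5):
    """Shuffled k-fold partition, yielding (train, test) index lists."""
    idx = list(range(len(samples)))
    random.shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    for i in range(k):
        train = [j for f in folds[:i] + folds[i + 1:] for j in f]
        yield train, folds[i]

def leakage(samples):
    """Fraction of test samples that have an exact twin in the training fold."""
    leaks = total = 0
    for train, test in kfold(samples):
        train_set = {samples[j] for j in train}
        for j in test:
            total += 1
            leaks += samples[j] in train_set
    return leaks / total

# Incorrect: oversample first, then split; duplicates straddle fold boundaries.
print("leakage, oversample-then-split:", leakage(oversample(data)))
# Correct order: split first and oversample only within each training fold,
# so no test sample can have a training-fold twin (leakage is zero here).
print("leakage, split-then-oversample:", leakage(data))
```

With leakage like this, the classifier is partly evaluated on samples it has literally seen during training, which is exactly how minority-class accuracy gets overestimated.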

  5. A cross-validation package driving Netica with python

    USGS Publications Warehouse

    Fienen, Michael N.; Plant, Nathaniel G.

    2014-01-01

    Bayesian networks (BNs) are powerful tools for probabilistically simulating natural systems and emulating process models. Cross-validation is a technique to avoid the overfitting that results from overly complex BNs. Overfitting reduces predictive skill. Cross-validation for BNs is known but rarely implemented, due partly to a lack of software tools designed to work with available BN packages. CVNetica is open-source, written in Python, and extends the Netica software package to perform cross-validation and read, rebuild, and learn BNs from data. Insights gained from cross-validation, and its implications for prediction versus description, are illustrated with a data-driven oceanographic application and a model-emulation application. These examples show that overfitting occurs when BNs become more complex than the supporting data allow, and that overfitting incurs computational costs as well as reducing prediction skill. CVNetica evaluates overfitting using several complexity metrics (we used level of discretization) and its impact on performance metrics (we used skill).

  6. Validation of cross-sectional time series and multivariate adaptive regression splines models for the prediction of energy expenditure in children and adolescents using doubly labeled water

    USDA-ARS's Scientific Manuscript database

    Accurate, nonintrusive, and inexpensive techniques are needed to measure energy expenditure (EE) in free-living populations. Our primary aim in this study was to validate cross-sectional time series (CSTS) and multivariate adaptive regression splines (MARS) models based on observable participant cha...

  7. Mind your crossings: Mining GIS imagery for crosswalk localization.

    PubMed

    Ahmetovic, Dragan; Manduchi, Roberto; Coughlan, James M; Mascetti, Sergio

    2017-04-01

    For blind travelers, finding crosswalks and remaining within their borders while traversing them is a crucial part of any trip involving street crossings. While standard Orientation & Mobility (O&M) techniques allow blind travelers to safely negotiate street crossings, additional information about crosswalks and other important features at intersections would be helpful in many situations, resulting in greater safety and/or comfort during independent travel. For instance, in planning a trip a blind pedestrian may wish to be informed of the presence of all marked crossings near a desired route. We have conducted a survey of several O&M experts from the United States and Italy to determine the role that crosswalks play in travel by blind pedestrians. The results show stark differences between survey respondents from the U.S. compared with Italy: the former group emphasized the importance of following standard O&M techniques at all legal crossings (marked or unmarked), while the latter group strongly recommended crossing at marked crossings whenever possible. These contrasting opinions reflect differences in the traffic regulations of the two countries and highlight the diversity of needs that travelers in different regions may have. To address the challenges faced by blind pedestrians in negotiating street crossings, we devised a computer vision-based technique that mines existing spatial image databases for discovery of zebra crosswalks in urban settings. Our algorithm first searches for zebra crosswalks in satellite images; all candidates thus found are validated against spatially registered Google Street View images. This cascaded approach enables fast and reliable discovery and localization of zebra crosswalks in large image datasets. While fully automatic, our algorithm can be improved by a final crowdsourcing validation. 
To this end, we developed a Pedestrian Crossing Human Validation (PCHV) web service, which supports crowdsourcing to rule out false positives and identify false negatives.

  8. Mind your crossings: Mining GIS imagery for crosswalk localization

    PubMed Central

    Ahmetovic, Dragan; Manduchi, Roberto; Coughlan, James M.; Mascetti, Sergio

    2017-01-01

    For blind travelers, finding crosswalks and remaining within their borders while traversing them is a crucial part of any trip involving street crossings. While standard Orientation & Mobility (O&M) techniques allow blind travelers to safely negotiate street crossings, additional information about crosswalks and other important features at intersections would be helpful in many situations, resulting in greater safety and/or comfort during independent travel. For instance, in planning a trip a blind pedestrian may wish to be informed of the presence of all marked crossings near a desired route. We have conducted a survey of several O&M experts from the United States and Italy to determine the role that crosswalks play in travel by blind pedestrians. The results show stark differences between survey respondents from the U.S. compared with Italy: the former group emphasized the importance of following standard O&M techniques at all legal crossings (marked or unmarked), while the latter group strongly recommended crossing at marked crossings whenever possible. These contrasting opinions reflect differences in the traffic regulations of the two countries and highlight the diversity of needs that travelers in different regions may have. To address the challenges faced by blind pedestrians in negotiating street crossings, we devised a computer vision-based technique that mines existing spatial image databases for discovery of zebra crosswalks in urban settings. Our algorithm first searches for zebra crosswalks in satellite images; all candidates thus found are validated against spatially registered Google Street View images. This cascaded approach enables fast and reliable discovery and localization of zebra crosswalks in large image datasets. While fully automatic, our algorithm can be improved by a final crowdsourcing validation. 
To this end, we developed a Pedestrian Crossing Human Validation (PCHV) web service, which supports crowdsourcing to rule out false positives and identify false negatives. PMID:28757907

  9. Computer simulation of Cerebral Arteriovenous Malformation-validation analysis of hemodynamics parameters.

    PubMed

    Kumar, Y Kiran; Mehta, Shashi Bhushan; Ramachandra, Manjunath

    2017-01-01

    The purpose of this work is to provide some validation methods for evaluating the hemodynamic assessment of Cerebral Arteriovenous Malformation (CAVM). This article emphasizes the importance of validating noninvasive measurements for CAVM patients, which are designed using lumped models for complex vessel structure. The validation of the hemodynamics assessment is based on invasive clinical measurements and cross-validation techniques with the Philips proprietary validated software packages Qflow and 2D Perfusion. The modeling results are validated for 30 CAVM patients at 150 vessel locations. Mean flow, diameter, and pressure were compared between modeling results and clinical/cross-validation measurements, using an independent two-tailed Student t test. Exponential regression analysis was used to assess the relationship between blood flow, vessel diameter, and pressure. Univariate analysis, with linear or exponential regression, was used to assess the relationships between vessel diameter, vessel cross-sectional area, AVM volume, AVM pressure, and AVM flow. Modeling results were compared with clinical measurements from vessel locations of cerebral regions. The model was also cross-validated with the Philips proprietary validated software packages Qflow and 2D Perfusion. Our results show that modeling results and clinical results nearly match, with only small deviation. In this article, we have validated our modeling results with clinical measurements. A new approach for cross-validation is proposed by demonstrating the accuracy of our results against a validated product in a clinical environment.

  10. Applicability of Monte Carlo cross validation technique for model development and validation using generalised least squares regression

    NASA Astrophysics Data System (ADS)

    Haddad, Khaled; Rahman, Ataur; A Zaman, Mohammad; Shrestha, Surendra

    2013-03-01

    In regional hydrologic regression analysis, model selection and validation are regarded as important steps. Here, model selection is usually based on some measure of goodness-of-fit between the model prediction and observed data. In Regional Flood Frequency Analysis (RFFA), leave-one-out (LOO) validation or a fixed-percentage leave-out validation (e.g., 10%) is commonly adopted to assess the predictive ability of regression-based prediction equations. This paper develops a Monte Carlo Cross Validation (MCCV) technique (which has been widely adopted in chemometrics and econometrics) in RFFA using Generalised Least Squares Regression (GLSR) and compares it with the most commonly adopted LOO validation approach. The study uses simulated and regional flood data from the state of New South Wales in Australia. It is found that when developing hydrologic regression models, application of the MCCV is likely to result in a more parsimonious model than the LOO. It has also been found that the MCCV can provide a more realistic estimate of a model's predictive ability when compared with the LOO.
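
The two validation schemes being compared can be sketched as follows (plain Python; a trivial mean-only predictor stands in for the GLSR model, and the data are synthetic):

```python
import random
import statistics

random.seed(1)
y = [10 + random.gauss(0, 2) for _ in range(40)]  # synthetic site values

def loo_mse(values):
    """Leave-one-out: hold out each point once; the 'model' is the mean
    of the remaining points."""
    errs = []
    for i, v in enumerate(values):
        rest = values[:i] + values[i + 1:]
        errs.append((v - statistics.mean(rest)) ** 2)
    return statistics.mean(errs)

def mccv_mse(values, leave_out=0.3, repeats=200):
    """Monte Carlo CV: repeatedly leave out a random fraction of the data,
    refit on the remainder, and average the test errors over all repeats."""
    n_test = max(1, int(len(values) * leave_out))
    errs = []
    for _ in range(repeats):
        idx = list(range(len(values)))
        random.shuffle(idx)
        test, train = idx[:n_test], idx[n_test:]
        mu = statistics.mean(values[j] for j in train)
        errs.extend((values[j] - mu) ** 2 for j in test)
    return statistics.mean(errs)

print("LOO MSE :", round(loo_mse(y), 3))
print("MCCV MSE:", round(mccv_mse(y), 3))
```

Because MCCV leaves out a sizeable random fraction (here 30%) rather than a single point, it stresses the model's predictive ability more strongly, which is consistent with the paper's finding that MCCV favours more parsimonious models.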

  11. Cross-cultural equivalence of the patient- and parent-reported quality of life in short stature youth (QoLISSY) questionnaire.

    PubMed

    Bullinger, Monika; Quitmann, Julia; Silva, Neuza; Rohenkohl, Anja; Chaplin, John E; DeBusk, Kendra; Mimoun, Emmanuelle; Feigerlova, Eva; Herdman, Michael; Sanz, Dolores; Wollmann, Hartmut; Pleil, Andreas; Power, Michael

    2014-01-01

    Testing cross-cultural equivalence of patient-reported outcomes requires sufficiently large samples per country, which is difficult to achieve in rare endocrine paediatric conditions. We describe a novel approach to cross-cultural testing of the Quality of Life in Short Stature Youth (QoLISSY) questionnaire in five countries by sequentially taking one country out (TOCO) from the total sample and iteratively comparing the resulting psychometric performance. Development of the QoLISSY proceeded from focus group discussions through pilot testing to field testing in 268 short-statured patients and their parents. To explore cross-cultural equivalence, the iterative TOCO technique was used to examine and compare the validity, reliability, and convergence of patient and parent responses on QoLISSY in the field test dataset, and to predict QoLISSY scores from clinical, socio-demographic and psychosocial variables. Validity and reliability indicators were satisfactory for each sample after iteratively omitting one country. Comparisons with the total sample revealed cross-cultural equivalence in internal consistency and construct validity for patients and parents, high inter-rater agreement and a substantial proportion of QoLISSY variance explained by predictors. The TOCO technique is a powerful method to overcome problems of country-specific testing of patient-reported outcome instruments. It provides empirical support for QoLISSY's cross-cultural equivalence and is recommended for future research.
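
A minimal sketch of the iterative TOCO idea (plain Python, entirely synthetic data; the country codes and the use of Cronbach's alpha as the psychometric indicator are illustrative assumptions, not details taken from the study):

```python
import random
import statistics

random.seed(2)
COUNTRIES = ["DE", "ES", "FR", "SE", "UK"]  # illustrative country codes

def respondent():
    """Five synthetic item scores driven by one shared latent trait."""
    trait = random.gauss(0, 1)
    return [trait + random.gauss(0, 0.8) for _ in range(5)]

sample = [(random.choice(COUNTRIES), respondent()) for _ in range(250)]

def cronbach_alpha(rows):
    """Internal-consistency estimate (alpha) for a list of item-score rows."""
    k = len(rows[0])
    item_vars = [statistics.variance([r[i] for r in rows]) for i in range(k)]
    total_var = statistics.variance([sum(r) for r in rows])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# TOCO: recompute the indicator with each country removed in turn and
# compare it against the full-sample value.
full = cronbach_alpha([items for _, items in sample])
for left_out in COUNTRIES:
    rest = [items for country, items in sample if country != left_out]
    print(f"without {left_out}: alpha = {cronbach_alpha(rest):.3f}"
          f" (full sample: {full:.3f})")
```

Equivalence is supported when the indicator stays stable no matter which country is omitted, which is the comparison the abstract describes.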

  12. A diagnostic technique used to obtain cross range radiation centers from antenna patterns

    NASA Technical Reports Server (NTRS)

    Lee, T. H.; Burnside, W. D.

    1988-01-01

    A diagnostic technique to obtain cross range radiation centers based on antenna radiation patterns is presented. This method is similar to the synthetic aperture processing of scattered fields in radar applications. Coherent processing of the radiated fields is used to determine the various radiation centers associated with the far-zone pattern of an antenna for a given radiation direction. This technique can be used to identify an unexpected radiation center that creates an undesired effect in a pattern; on the other hand, it can improve a numerical simulation of the pattern by identifying other significant mechanisms. Cross range results for two 8-foot reflector antennas are presented to illustrate and validate the technique.

  13. Measurement of the $$\\mathrm{Z}/\\gamma^{*} \\to \\tau\\tau$$ cross section in pp collisions at $$\\sqrt{s} = $$ 13 TeV and validation of $$\\tau$$ lepton analysis techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sirunyan, Albert M; et al.

    A measurement is presented of the $$\\mathrm{Z}/\\gamma^{*} \\to \\tau\\tau$$ cross section in pp collisions at $$\\sqrt{s} = $$ 13 TeV, using data recorded by the CMS experiment at the LHC, corresponding to an integrated luminosity of 2.3 fb$$^{-1}$$. The product of the inclusive cross section and branching fraction is measured to be $$\\sigma(\\mathrm{pp} \\to \\mathrm{Z}/\\gamma^{*}\\text{+X}) \\, \\mathcal{B}(\\mathrm{Z}/\\gamma^{*} \\to \\tau\\tau) = $$ 1848 $$\\pm$$ 12 (stat) $$\\pm$$ 67 (syst+lumi) pb, in agreement with the standard model expectation, computed at next-to-next-to-leading-order accuracy in perturbative quantum chromodynamics. The measurement is used to validate new analysis techniques relevant for future measurements of $$\\tau$$ lepton production. The measurement also provides the reconstruction efficiency and energy scale for $$\\tau$$ decays to hadrons…

  14. K-Fold Crossvalidation in Canonical Analysis.

    ERIC Educational Resources Information Center

    Liang, Kun-Hsia; And Others

    1995-01-01

    A computer-assisted, K-fold cross-validation technique is discussed in the framework of canonical correlation analysis of randomly generated data sets. Analysis results suggest that this technique can effectively reduce the contamination of canonical variates and canonical correlations by sample-specific variance components. (Author/SLD)
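
The K-fold scheme itself is generic; a minimal sketch in plain Python:

```python
import random

def k_fold_indices(n, k, seed=0):
    """Partition indices 0..n-1 into k shuffled folds and yield
    (train, holdout) index lists, one pair per fold."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    for i, holdout in enumerate(folds):
        train = [j for fold in folds[:i] + folds[i + 1:] for j in fold]
        yield train, holdout

# Each index is held out exactly once, and fold sizes differ by at most one.
splits = list(k_fold_indices(20, 4))
assert sorted(j for _, holdout in splits for j in holdout) == list(range(20))
for train, holdout in splits:
    print(len(train), len(holdout))  # prints "15 5" for each of the 4 folds
```

In the canonical-correlation setting of the paper, the canonical weights would be estimated on each training partition and the canonical correlations evaluated on the corresponding holdout, reducing the sample-specific variance the abstract mentions.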

  15. Prediction of adult height in girls: the Beunen-Malina-Freitas method.

    PubMed

    Beunen, Gaston P; Malina, Robert M; Freitas, Duarte L; Thomis, Martine A; Maia, José A; Claessens, Albrecht L; Gouveia, Elvio R; Maes, Hermine H; Lefevre, Johan

    2011-12-01

    The purpose of this study was to validate and cross-validate the Beunen-Malina-Freitas method for non-invasive prediction of adult height in girls. A sample of 420 girls aged 10-15 years from the Madeira Growth Study were measured at yearly intervals and then 8 years later. Anthropometric dimensions (lengths, breadths, circumferences, and skinfolds) were measured; skeletal age was assessed using the Tanner-Whitehouse 3 method, and menarcheal status (present or absent) was recorded. Adult height was measured and predicted using stepwise, forward, and maximum R² regression techniques. Multiple correlations, mean differences, standard errors of prediction, and error boundaries were calculated. A sample from the Leuven Longitudinal Twin Study was used to cross-validate the regressions. Age-specific coefficients of determination (R²) between predicted and measured adult height varied between 0.57 and 0.96, while standard errors of prediction varied between 1.1 and 3.9 cm. The cross-validation confirmed the validity of the Beunen-Malina-Freitas method in girls aged 12-15 years, but at lower ages the cross-validation was less consistent. We conclude that the Beunen-Malina-Freitas method is valid for the prediction of adult height in girls aged 12-15 years. It is applicable to European populations or populations of European ancestry.

  16. Validated spectrophotometric methods for simultaneous determination of Omeprazole, Tinidazole and Doxycycline in their ternary mixture

    NASA Astrophysics Data System (ADS)

    Lotfy, Hayam M.; Hegazy, Maha A.; Mowaka, Shereen; Mohamed, Ekram Hany

    2016-01-01

    A comparative study of smart spectrophotometric techniques for the simultaneous determination of Omeprazole (OMP), Tinidazole (TIN) and Doxycycline (DOX) without prior separation steps is developed. These techniques consist of several consecutive steps utilizing zero-order, ratio, or derivative spectra. The proposed techniques adopt nine simple different methods, namely direct spectrophotometry, dual wavelength, first derivative-zero crossing, amplitude factor, spectrum subtraction, ratio subtraction, derivative ratio-zero crossing, constant center, and successive derivative ratio method. The calibration graphs are linear over the concentration ranges of 1-20 μg/mL, 5-40 μg/mL and 2-30 μg/mL for OMP, TIN and DOX, respectively. These methods are tested by analyzing synthetic mixtures of the above drugs and successfully applied to a commercial pharmaceutical preparation. The methods were validated according to the ICH guidelines; accuracy, precision, and repeatability were found to be within acceptable limits.

  17. Cross validation issues in multiobjective clustering

    PubMed Central

    Brusco, Michael J.; Steinley, Douglas

    2018-01-01

    The implementation of multiobjective programming methods in combinatorial data analysis is an emergent area of study with a variety of pragmatic applications in the behavioural sciences. Most notably, multiobjective programming provides a tool for analysts to model trade-offs among competing criteria in clustering, seriation, and unidimensional scaling tasks. Although multiobjective programming has considerable promise, the technique can produce numerically appealing results that lack empirical validity. With this issue in mind, the purpose of this paper is to briefly review viable areas of application for multiobjective programming and, more importantly, to outline the importance of cross-validation when using this method in cluster analysis. PMID:19055857

  18. Validating LES for Jet Aeroacoustics

    NASA Technical Reports Server (NTRS)

    Bridges, James

    2011-01-01

    Engineers charged with making jet aircraft quieter have long dreamed of being able to see exactly how turbulent eddies produce sound, and this dream is now coming true with the advent of large eddy simulation (LES). Two obvious challenges remain: validating the LES codes at the resolution required to see the fluid-acoustic coupling, and interpreting the massive datasets that result. This paper primarily addresses the former: the use of advanced experimental techniques such as particle image velocimetry (PIV) and Raman and Rayleigh scattering to validate the computer codes and procedures used to create LES solutions. It also addresses the latter problem in discussing which measures critical for aeroacoustics should be used in validating LES codes. These new diagnostic techniques deliver measurements and flow statistics of increasing sophistication and capability, but what of their accuracy? And what are the measures to be used in validation? This paper argues that the issue of accuracy be addressed by cross-facility and cross-disciplinary examination of modern datasets, along with increased reporting of internal quality checks in PIV analysis. Further, it is argued that the appropriate validation metrics for aeroacoustic applications are increasingly complicated statistics that have been shown in aeroacoustic theory to be critical to flow-generated sound.

  19. Validity and reliability of rectus femoris ultrasound measurements: Comparison of curved-array and linear-array transducers.

    PubMed

    Hammond, Kendra; Mampilly, Jobby; Laghi, Franco A; Goyal, Amit; Collins, Eileen G; McBurney, Conor; Jubran, Amal; Tobin, Martin J

    2014-01-01

    Muscle-mass loss augurs increased morbidity and mortality in critically ill patients. Muscle-mass loss can be assessed by wide linear-array ultrasound transducers connected to cumbersome, expensive console units. Whether cheaper, hand-carried units equipped with curved-array transducers can be used as alternatives is unknown. Accordingly, our primary aim was to investigate in 15 nondisabled subjects the validity of measurements of rectus femoris cross-sectional area obtained with a curved-array transducer against a linear-array transducer, the reference-standard technique. In these subjects, we also determined the reliability of measurements obtained by a novice operator versus measurements obtained by an experienced operator. Lastly, the relationship between quadriceps strength and rectus area recorded by two experienced operators with a curved-array transducer was assessed in 17 patients with chronic obstructive pulmonary disease (COPD). In nondisabled subjects, the rectus cross-sectional area measured with the curved-array transducer by the novice and experienced operators was valid (intraclass correlation coefficient [ICC]: 0.98, typical percentage error [%TE]: 3.7%) and reliable (ICC: 0.79, %TE: 9.7%). In the subjects with COPD, both reliability (ICC: 0.99) and repeatability (%TE: 7.6% and 9.8%) were high. Rectus area was related to quadriceps strength in COPD for both experienced operators (coefficients of determination: 0.67 and 0.70). In conclusion, measurements of rectus femoris cross-sectional area recorded with a curved-array transducer connected to a hand-carried unit are valid, reliable, and reproducible, leading us to contend that this technique is suitable for cross-sectional and longitudinal studies.

  20. MSX-3D: a tool to validate 3D protein models using mass spectrometry.

    PubMed

    Heymann, Michaël; Paramelle, David; Subra, Gilles; Forest, Eric; Martinez, Jean; Geourjon, Christophe; Deléage, Gilbert

    2008-12-01

    The technique of chemical cross-linking followed by mass spectrometry has proven to bring valuable information about protein structure and the interactions between protein subunits. It is an effective and efficient way to experimentally investigate some aspects of a protein structure when NMR and X-ray crystallography data are lacking. We introduce MSX-3D, a tool specifically geared to validate protein models using mass spectrometry. In addition to classical peptide identification, it allows an interactive 3D visualization of the distance constraints derived from a cross-linking experiment. Freely available at http://proteomics-pbil.ibcp.fr

  1. Validation of bioelectrical impedance analysis for total body water assessment against the deuterium dilution technique in Asian children.

    PubMed

    Liu, A; Byrne, N M; Ma, G; Nasreddine, L; Trinidad, T P; Kijboonchoo, K; Ismail, M N; Kagawa, M; Poh, B K; Hills, A P

    2011-12-01

    To develop and cross-validate bioelectrical impedance analysis (BIA) prediction equations of total body water (TBW) and fat-free mass (FFM) for Asian pre-pubertal children from China, Lebanon, Malaysia, the Philippines and Thailand. Height, weight, age, gender, resistance and reactance measured by BIA were collected from 948 Asian children (492 boys and 456 girls) aged 8-10 years from the five countries. The deuterium dilution technique was used as the criterion method for the estimation of TBW and FFM. The BIA equations were developed using stepwise multiple regression analysis and cross-validated using the Bland-Altman approach. The BIA prediction equation for the estimation of TBW was as follows: TBW = 0.231 × height²/resistance + 0.066 × height + 0.188 × weight + 0.128 × age + 0.500 × sex − 0.316 × ethnicity (Thai = 1, others = 0) − 4.574 (R² = 88.0%, root mean square error (RMSE) = 1.3 kg), and for the estimation of FFM: FFM = 0.299 × height²/resistance + 0.086 × height + 0.245 × weight + 0.260 × age + 0.901 × sex − 0.415 × ethnicity (Thai = 1, others = 0) − 6.952 (R² = 88.3%, RMSE = 1.7 kg). No significant difference between measured and predicted values for the whole cross-validation sample was found. However, the prediction equations tended to overestimate TBW/FFM at lower levels and to underestimate it at higher levels of TBW/FFM. The general equations for TBW and FFM were also valid within each body mass index category. Ethnicity influences the relationship between BIA and body composition in Asian pre-pubertal children. The newly developed BIA prediction equations are valid for use in Asian pre-pubertal children.
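
The published TBW equation can be evaluated directly. A sketch (the 0/1 coding shown here for sex and for Thai ethnicity in the TBW equation is an assumption for illustration; the abstract states the coding explicitly only for the FFM equation):

```python
def predict_tbw(height_cm, resistance_ohm, weight_kg, age_yr, sex, thai):
    """TBW (kg) from the BIA prediction equation reported by Liu et al.
    sex and thai are 0/1 indicators; sex=1 for boys and thai=1 for Thai
    children are assumed codings, not confirmed by the abstract."""
    return (0.231 * height_cm ** 2 / resistance_ohm
            + 0.066 * height_cm
            + 0.188 * weight_kg
            + 0.128 * age_yr
            + 0.500 * sex
            - 0.316 * thai
            - 4.574)

# Illustrative values: a 9-year-old boy, 135 cm, 30 kg, resistance 700 ohms.
print(round(predict_tbw(135, 700, 30, 9, sex=1, thai=0), 1), "kg")  # → 17.6 kg
```

The dominant term is the impedance index height²/resistance, which is the standard BIA predictor of body water.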

  2. Uncertainty Quantification Techniques of SCALE/TSUNAMI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T; Mueller, Don

    2011-01-01

    The Standardized Computer Analysis for Licensing Evaluation (SCALE) code system developed at Oak Ridge National Laboratory (ORNL) includes Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI). The TSUNAMI code suite can quantify the predicted change in system responses, such as k_eff, reactivity differences, or ratios of fluxes or reaction rates, due to changes in the energy-dependent, nuclide-reaction-specific cross-section data. Where uncertainties in the neutron cross-section data are available, the sensitivity of the system to the cross-section data can be applied to propagate the uncertainties in the cross-section data to an uncertainty in the system response. Uncertainty quantification is useful for identifying potential sources of computational biases and highlighting parameters important to code validation. Traditional validation techniques often examine one or more average physical parameters to characterize a system and identify applicable benchmark experiments. However, with TSUNAMI, correlation coefficients are developed by propagating the uncertainties in neutron cross-section data to uncertainties in the computed responses for experiments and safety applications through sensitivity coefficients. The bias in the experiments, as a function of their correlation coefficient with the intended application, is extrapolated to predict the bias and bias uncertainty in the application through trending analysis or generalized linear least squares techniques, often referred to as 'data adjustment.' Even with advanced tools to identify benchmark experiments, analysts occasionally find that the application models include some feature or material for which adequately similar benchmark experiments do not exist to support validation. For example, a criticality safety analyst may want to take credit for the presence of fission products in spent nuclear fuel. In such cases, analysts sometimes rely on 'expert judgment' to select an additional administrative margin to account for the gap in the validation data, or to conclude that the impact on the calculated bias and bias uncertainty is negligible. As a result of advances in computer programs and the evolution of cross-section covariance data, analysts can use the sensitivity and uncertainty analysis tools in the TSUNAMI codes to estimate the potential impact on the application-specific bias and bias uncertainty resulting from nuclides not represented in available benchmark experiments. This paper presents the application of methods described in a companion paper.

  3. [Cross-Mapping: diagnostic labels formulated according to the ICNP® versus diagnosis of NANDA International].

    PubMed

    Tannure, Meire Chucre; Salgado, Patrícia de Oliveira; Chianca, Tânia Couto Machado

    2014-01-01

    This descriptive study aimed at elaborating nursing diagnostic labels according to the ICNP®; conducting a cross-mapping between the diagnostic formulations and the diagnostic labels of NANDA-I; identifying the diagnostic labels thus obtained that were also listed in NANDA-I; and mapping them according to Basic Human Needs. The workshop technique was applied with 32 intensive care nurses, and the cross-mapping was validated on the basis of agreement with experts. The workshop produced 1665 diagnostic labels, which were further refined into 120 labels. These were then submitted to a cross-mapping process with both the NANDA-I diagnostic labels and the Basic Human Needs. The mapping results underwent content validation by two expert nurses, leading to concordance rates of 92% and 100%. It was found that 63 labels were listed in NANDA-I and 47 were not.

  4. Semi-Empirical Validation of the Cross-Band Relative Absorption Technique for the Measurement of Molecular Mixing Ratios

    NASA Technical Reports Server (NTRS)

    Pliutau, Denis; Prasad, Narasimha S

    2013-01-01

    Studies were performed to carry out a semi-empirical validation of a new measurement approach we propose for the determination of molecular mixing ratios. The approach is based on relative measurements in bands of O2 and other molecules and as such may be best described as cross-band relative absorption (CoBRA). The current validation studies rely upon well-verified and established theoretical and experimental databases, satellite data assimilations, and modeling codes such as HITRAN, the line-by-line radiative transfer model (LBLRTM), and the modern-era retrospective analysis for research and applications (MERRA). The approach holds promise for atmospheric mixing ratio measurements of CO2 and a variety of other molecules currently under investigation for several future satellite lidar missions. One of the advantages of the method is a significant reduction of the temperature sensitivity uncertainties, which is illustrated with application to the ASCENDS mission for the measurement of CO2 mixing ratios (XCO2). Additional advantages of the method include the possibility to closely match cross-band weighting function combinations, which is harder to achieve using conventional differential absorption techniques, and the potential for additional corrections for water vapor and other interferences without using the data from numerical weather prediction (NWP) models.

  5. An intercomparison of a large ensemble of statistical downscaling methods for Europe: Overall results from the VALUE perfect predictor cross-validation experiment

    NASA Astrophysics Data System (ADS)

    Gutiérrez, Jose Manuel; Maraun, Douglas; Widmann, Martin; Huth, Radan; Hertig, Elke; Benestad, Rasmus; Roessler, Ole; Wibig, Joanna; Wilcke, Renate; Kotlarski, Sven

    2016-04-01

    VALUE is an open European network to validate and compare downscaling methods for climate change research (http://www.value-cost.eu). A key deliverable of VALUE is the development of a systematic validation framework to enable the assessment and comparison of both dynamical and statistical downscaling methods. This framework is based on a user-focused validation tree, guiding the selection of relevant validation indices and performance measures for different aspects of the validation (marginal, temporal, spatial, multi-variable). Moreover, several experiments have been designed to isolate specific points in the downscaling procedure where problems may occur (assessment of intrinsic performance, effect of errors inherited from the global models, effect of non-stationarity, etc.). The list of downscaling experiments includes 1) cross-validation with perfect predictors, 2) GCM predictors -aligned with EURO-CORDEX experiment- and 3) pseudo reality predictors (see Maraun et al. 2015, Earth's Future, 3, doi:10.1002/2014EF000259, for more details). The results of these experiments are gathered, validated and publicly distributed through the VALUE validation portal, allowing for a comprehensive community-open downscaling intercomparison study. In this contribution we describe the overall results from Experiment 1), consisting of a European wide 5-fold cross-validation (with consecutive 6-year periods from 1979 to 2008) using predictors from ERA-Interim to downscale precipitation and temperatures (minimum and maximum) over a set of 86 ECA&D stations representative of the main geographical and climatic regions in Europe. As a result of the open call for contribution to this experiment (closed in Dec. 2015), over 40 methods representative of the main approaches (MOS and Perfect Prognosis, PP) and techniques (linear scaling, quantile mapping, analogs, weather typing, linear and generalized regression, weather generators, etc.) 
were submitted, including both data (downscaled values) and metadata (characterizing different aspects of the downscaling methods). This constitutes the largest and most comprehensive intercomparison of statistical downscaling methods to date. Here, we present an overall validation, analyzing marginal and temporal aspects to assess the intrinsic performance and added value of statistical downscaling methods at both annual and seasonal levels. This validation takes into account the different properties/limitations of different approaches and techniques (as reported in the provided metadata) in order to perform a fair comparison. It is pointed out that this experiment alone is not sufficient to evaluate the limitations of (MOS) bias correction techniques. Moreover, it also does not fully validate PP, since we do not learn whether we have the right predictors and whether the PP assumption is valid. These problems will be analyzed in the subsequent community-open VALUE experiments 2) and 3), which will be open for participation during the present year.
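The fold structure of Experiment 1 above can be sketched in a few lines: 30 years of data (1979-2008) are split into five consecutive 6-year blocks, each held out once while the remaining 24 years train the downscaling method. This is a minimal illustration of the design, not code from the VALUE portal.

```python
import numpy as np

# Blocked 5-fold cross-validation over consecutive 6-year periods,
# mirroring the VALUE Experiment 1 design (illustrative sketch only).

years = np.arange(1979, 2009)        # 1979..2008, 30 years
folds = np.array_split(years, 5)     # five consecutive 6-year test blocks

for k, test_years in enumerate(folds, 1):
    train_years = np.setdiff1d(years, test_years)   # remaining 24 years
    print(f"fold {k}: test {test_years[0]}-{test_years[-1]}, "
          f"train {train_years.size} years")
```

Using contiguous blocks rather than random folds preserves the temporal autocorrelation structure of the held-out climate data.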

  6. Testing and Validating Machine Learning Classifiers by Metamorphic Testing☆

    PubMed Central

    Xie, Xiaoyuan; Ho, Joshua W. K.; Murphy, Christian; Kaiser, Gail; Xu, Baowen; Chen, Tsong Yueh

    2011-01-01

    Machine learning algorithms have provided core functionality to many application domains, such as bioinformatics and computational linguistics. However, it is difficult to detect faults in such applications because often there is no “test oracle” to verify the correctness of the computed outputs. To help address software quality, in this paper we present a technique for testing the implementations of machine learning classification algorithms which support such applications. Our approach is based on the technique of “metamorphic testing”, which has been shown to be effective in alleviating the oracle problem. We also present a case study on a real-world machine learning application framework, and a discussion of how programmers implementing machine learning algorithms can avoid the common pitfalls discovered in our study. We also conduct mutation analysis and cross-validation, which reveal that our method is highly effective in killing mutants, and that observing the expected cross-validation result alone is not sufficient to detect faults in a supervised classification program. The effectiveness of metamorphic testing is further confirmed by the detection of real faults in a popular open-source classification program. PMID:21532969
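The core idea of metamorphic testing is checkable in a few lines: with no oracle for the "correct" label, one instead checks a relation that must hold between runs. Below is an illustrative sketch (the tiny 1-NN classifier is ours, not the paper's subject programs): permuting the training examples must leave a nearest-neighbour classifier's predictions unchanged.

```python
import random

# Metamorphic relation for a classifier: permutation of the training set
# must not change predictions. The 1-NN implementation is illustrative only.

def nn_predict(train, x):
    """1-nearest-neighbour prediction; train is a list of (point, label)."""
    def dist(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))
    return min(train, key=lambda t: dist(t[0], x))[1]

train = [((0.0, 0.0), "a"), ((1.0, 1.0), "b"), ((0.0, 1.0), "a")]
tests = [(0.1, 0.2), (0.9, 0.8), (0.2, 0.9)]

before = [nn_predict(train, x) for x in tests]
shuffled = train[:]
random.Random(42).shuffle(shuffled)            # follow-up (metamorphic) input
after = [nn_predict(shuffled, x) for x in tests]
assert before == after                          # relation must hold
print("metamorphic relation holds:", before)
```

A buggy implementation that, say, depends on training order would violate the relation and be flagged without any labelled oracle.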

  7. Validating LES for Jet Aeroacoustics

    NASA Technical Reports Server (NTRS)

    Bridges, James; Wernet, Mark P.

    2011-01-01

    Engineers charged with making jet aircraft quieter have long dreamed of being able to see exactly how turbulent eddies produce sound and this dream is now coming true with the advent of large eddy simulation (LES). Two obvious challenges remain: validating the LES codes at the resolution required to see the fluid-acoustic coupling, and the interpretation of the massive datasets that are produced. This paper addresses the former, the use of advanced experimental techniques such as particle image velocimetry (PIV) and Raman and Rayleigh scattering, to validate the computer codes and procedures used to create LES solutions. This paper argues that the issue of accuracy of the experimental measurements be addressed by cross-facility and cross-disciplinary examination of modern datasets along with increased reporting of internal quality checks in PIV analysis. Further, it argues that the appropriate validation metrics for aeroacoustic applications are increasingly complicated statistics that have been shown in aeroacoustic theory to be critical to flow-generated sound, such as two-point space-time velocity correlations. A brief review of data sources available is presented along with examples illustrating cross-facility and internal quality checks required of the data before it should be accepted for validation of LES.
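The two-point space-time correlations mentioned above as validation metrics can be illustrated on synthetic signals (these are random stand-ins, not jet velocimetry data): a delayed, noisy copy of a velocity trace produces a correlation peak at the convection time lag.

```python
import numpy as np

# Normalized two-point space-time correlation of two synthetic "velocity"
# signals: R(lag) = <u1'(t) u2'(t+lag)> / (rms(u1) rms(u2)).
# u2 is a delayed, noise-contaminated copy of u1 (illustrative only).

rng = np.random.default_rng(0)
n = 20000
u1 = rng.standard_normal(n)
u2 = np.roll(u1, 5) + 0.3 * rng.standard_normal(n)   # 5-sample delay + noise

def xcorr(a, b, lag):
    a = a - a.mean()
    b = b - b.mean()
    b = np.roll(b, -lag)                 # shift second probe in time
    return (a * b).mean() / (a.std() * b.std())

lags = range(-10, 11)
corr = [xcorr(u1, u2, k) for k in lags]
best = max(zip(lags, corr), key=lambda t: t[1])
print(f"peak correlation {best[1]:.2f} at time lag {best[0]} samples")
```

In a real PIV or hot-wire dataset the location and decay of this peak carry the convection velocity and eddy lifetime information that aeroacoustic theory ties to radiated sound.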

  8. R package PRIMsrc: Bump Hunting by Patient Rule Induction Method for Survival, Regression and Classification

    PubMed Central

    Dazard, Jean-Eudes; Choe, Michael; LeBlanc, Michael; Rao, J. Sunil

    2015-01-01

    PRIMsrc is a novel implementation of a non-parametric bump hunting procedure, based on the Patient Rule Induction Method (PRIM), offering a unified treatment of outcome variables, including censored time-to-event (Survival), continuous (Regression) and discrete (Classification) responses. To fit the model, it uses a recursive peeling procedure with specific peeling criteria and stopping rules depending on the response. To validate the model, it provides an objective function based on prediction error or another specific statistic, as well as two alternative cross-validation techniques, adapted to the task of decision-rule making and estimation in the three types of settings. PRIMsrc comes as an open source R package, including at this point: (i) a main function for fitting a Survival Bump Hunting model with various options allowing cross-validated model selection to control model size (#covariates) and model complexity (#peeling steps) and generation of cross-validated end-point estimates; (ii) parallel computing; (iii) various S3-generic and specific plotting functions for data visualization, diagnostics, prediction, summary and display of results. It is available on CRAN and GitHub. PMID:26798326
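The recursive peeling idea behind PRIM-style bump hunting can be sketched schematically (this is a simplified Python illustration for a continuous response, not the PRIMsrc R code): at each step, peel off a small fraction of observations at one edge of one covariate, keeping the peel that maximizes the mean response inside the remaining box.

```python
import numpy as np

# Schematic top-down peeling: at each step, try trimming the lowest or
# highest alpha-fraction of each covariate and keep the trim that most
# increases the mean response inside the box. Data are synthetic: the
# "bump" lives in the region X0 > 0.7.

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(500, 2))
y = (X[:, 0] > 0.7).astype(float) + 0.1 * rng.standard_normal(500)

inside = np.ones(500, dtype=bool)
alpha = 0.1                              # peeling fraction per step
for _ in range(10):                      # fixed number of peeling steps
    best = None
    for j in range(X.shape[1]):          # candidate edge: each covariate...
        for side in ("low", "high"):     # ...from either side
            q = np.quantile(X[inside, j], alpha if side == "low" else 1 - alpha)
            keep = inside & ((X[:, j] >= q) if side == "low" else (X[:, j] <= q))
            if keep.sum() >= 50 and (best is None or y[keep].mean() > best[0]):
                best = (y[keep].mean(), keep)
    if best is None:                     # minimum-support stopping rule
        break
    inside = best[1]

print(f"box mean {y[inside].mean():.2f} vs overall {y.mean():.2f}, "
      f"support {inside.mean():.2f}")
```

PRIMsrc extends this idea with survival-specific peeling criteria and cross-validated choice of the number of peeling steps.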

  9. SERS quantitative urine creatinine measurement of human subject

    NASA Astrophysics Data System (ADS)

    Wang, Tsuei Lian; Chiang, Hui-hua K.; Lu, Hui-hsin; Hung, Yung-da

    2005-03-01

    The SERS method for biomolecular analysis has several potential advantages over traditional biochemical approaches, including reduced specimen contact, non-destructive analysis of the specimen, and multi-component analysis. Urine is an easily available body fluid for monitoring the metabolites and renal function of the human body. We developed a surface-enhanced Raman scattering (SERS) technique using 50 nm gold colloidal particles for quantitative human urine creatinine measurements. This paper shows that the SERS shifts of creatinine (104 mg/dl) in artificial urine lie between 1400 cm-1 and 1500 cm-1, a region which was analyzed for quantitative creatinine measurement. Ten human urine samples were obtained from ten healthy persons and analyzed by the SERS technique. The partial least squares cross-validation (PLSCV) method was utilized to obtain the estimated creatinine concentration in the clinically relevant concentration range (55.9 mg/dl to 208 mg/dl). The root-mean-square error of cross-validation (RMSECV) is 26.1 mg/dl. This research demonstrates the feasibility of using SERS for human subject urine creatinine detection, and establishes the SERS platform technique for measurements in bodily fluids.
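The RMSECV figure of merit used above can be illustrated with a leave-one-out loop. For simplicity this sketch uses a univariate linear calibration in place of the paper's partial least squares model, and made-up concentrations rather than the study's spectra.

```python
import numpy as np

# Leave-one-out RMSECV for a simple linear calibration (illustrative
# stand-in for PLS). Concentrations and signals are synthetic.

conc = np.array([55.9, 80.0, 104.0, 130.0, 155.0, 180.0, 208.0])   # mg/dl
signal = 0.01 * conc + 0.05 * np.random.default_rng(3).standard_normal(7)

errors = []
for i in range(len(conc)):
    mask = np.arange(len(conc)) != i                # leave sample i out
    slope, intercept = np.polyfit(signal[mask], conc[mask], 1)
    pred = slope * signal[i] + intercept            # predict held-out sample
    errors.append(pred - conc[i])

rmsecv = np.sqrt(np.mean(np.square(errors)))
print(f"RMSECV = {rmsecv:.1f} mg/dl")
```

Each sample is predicted by a model that never saw it, so RMSECV estimates out-of-sample prediction error rather than fit quality.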

  10. Automatic welding detection by an intelligent tool pipe inspection

    NASA Astrophysics Data System (ADS)

    Arizmendi, C. J.; Garcia, W. L.; Quintero, M. A.

    2015-07-01

    This work provides a model based on machine learning techniques for weld recognition, using signals obtained through an in-line inspection tool called a “smart pig” in oil and gas pipelines. The model uses a signal noise reduction phase by means of pre-processing algorithms and attribute-selection techniques. The noise reduction techniques were selected after a literature review and testing with survey data. Subsequently, the model was trained using recognition and classification algorithms, specifically artificial neural networks and support vector machines. Finally, the trained model was validated with different data sets and the performance was measured with cross-validation and ROC analysis. The results show that it is possible to identify welds automatically with an efficiency between 90 and 98 percent.
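The ROC analysis used to score the classifier above can be illustrated without any ML library. This sketch uses made-up detection scores (not the paper's pipeline data) and computes the ROC AUC via the rank-sum (Mann-Whitney) identity.

```python
import numpy as np

# ROC AUC from the rank-sum identity:
# AUC = (sum of ranks of positives - n_pos(n_pos+1)/2) / (n_pos * n_neg).
# Scores are synthetic: "weld" scores drawn higher than "no-weld" scores.

rng = np.random.default_rng(7)
labels = np.r_[np.ones(100), np.zeros(100)]          # weld / no-weld
scores = np.r_[rng.normal(2.0, 1.0, 100),            # welds score higher
               rng.normal(0.0, 1.0, 100)]

ranks = scores.argsort().argsort() + 1               # 1-based ranks, no ties
n_pos, n_neg = 100, 100
auc = (ranks[labels == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)
print(f"ROC AUC = {auc:.2f}")
```

An AUC near 1 means almost every weld signal outranks every non-weld signal, which is the regime the 90-98 percent efficiencies above describe.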

  11. Measurement of stream channel habitat using sonar

    USGS Publications Warehouse

    Flug, Marshall; Seitz, Heather; Scott, John

    1998-01-01

    An efficient and low-cost technique using a sonar system was evaluated for describing channel geometry and quantifying inundated area in a large river. The boat-mounted portable sonar equipment was used to record water depths and river width measurements for direct storage on a laptop computer. The field data collected from repeated traverses at a cross-section were evaluated to determine the precision of the system and field technique. Results from validation at two different sites showed average sample standard deviations (S.D.s) of 0.12 m for these complete cross-sections, with coefficients of variation of 10%. Validation using only the mid-channel river cross-section data yielded an average sample S.D. of 0.05 m, with a coefficient of variation below 5%, at a stable and gauged river site using only measurements of water depths greater than 0.6 m. Accuracy of the sonar system was evaluated by comparison to traditionally surveyed transect data from a regularly gauged site. We observed an average mean squared deviation of 46.0 cm2, considering only that portion of the cross-section inundated by more than 0.6 m of water. Our procedure proved to be a reliable, accurate, safe, quick, and economic method to record river depths, discharges, bed conditions, and substratum composition necessary for stream habitat studies.

  12. Cascade Back-Propagation Learning in Neural Networks

    NASA Technical Reports Server (NTRS)

    Duong, Tuan A.

    2003-01-01

    The cascade back-propagation (CBP) algorithm is the basis of a conceptual design for accelerating learning in artificial neural networks. The neural networks would be implemented as analog very-large-scale integrated (VLSI) circuits, and circuits to implement the CBP algorithm would be fabricated on the same VLSI circuit chips with the neural networks. Heretofore, artificial neural networks have learned slowly because it has been necessary to train them via software, for lack of a good on-chip learning technique. The CBP algorithm is an on-chip technique that provides for continuous learning in real time. Artificial neural networks are trained by example: A network is presented with training inputs for which the correct outputs are known, and the algorithm strives to adjust the weights of synaptic connections in the network to make the actual outputs approach the correct outputs. The input data are generally divided into three parts. Two of the parts, called the "training" and "cross-validation" sets, respectively, must be such that the corresponding input/output pairs are known. During training, the cross-validation set enables verification of the status of the input-to-output transformation learned by the network to avoid over-learning. The third part of the data, termed the "test" set, consists of the inputs that are required to be transformed into outputs; this set may or may not include the training set and/or the cross-validation set. Proposed neural-network circuitry for on-chip learning would be divided into two distinct networks: one for training and one for validation. Both networks would share the same synaptic weights.
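The role of the cross-validation set described above, detecting over-learning, amounts to early stopping: halt training once validation error stops improving. The sketch below uses a synthetic U-shaped validation-error curve in place of a real network's errors, purely to illustrate the stopping logic.

```python
# Early stopping on a cross-validation error curve (synthetic stand-in for
# a real network's per-epoch validation error, which typically falls and
# then rises again as the network over-learns the training set).

val_err = [0.5 + 0.002 * (e - 8) ** 2 for e in range(20)]   # minimum at e=8

best_epoch, best_val, patience, wait = 0, float("inf"), 3, 0
for epoch, err in enumerate(val_err):
    if err < best_val - 1e-9:            # validation error still improving
        best_epoch, best_val, wait = epoch, err, 0
    else:
        wait += 1
        if wait >= patience:             # over-learning detected: stop
            break

print(f"stopped at epoch {epoch}, best validation error at epoch {best_epoch}")
```

The weights saved at `best_epoch` are the ones that generalize; training beyond it only fits noise in the training set.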

  13. Cross-validation and Peeling Strategies for Survival Bump Hunting using Recursive Peeling Methods

    PubMed Central

    Dazard, Jean-Eudes; Choe, Michael; LeBlanc, Michael; Rao, J. Sunil

    2015-01-01

    We introduce a framework to build a survival/risk bump hunting model with a censored time-to-event response. Our Survival Bump Hunting (SBH) method is based on a recursive peeling procedure that uses a specific survival peeling criterion derived from non/semi-parametric statistics such as the hazard ratio, the log-rank test or the Nelson-Aalen estimator. To optimize the tuning parameter of the model and validate it, we introduce an objective function based on survival or prediction-error statistics, such as the log-rank test and the concordance error rate. We also describe two alternative cross-validation techniques adapted to the joint task of decision-rule making by recursive peeling and survival estimation. Numerical analyses show the importance of replicated cross-validation and the differences between criteria and techniques in both low- and high-dimensional settings. Although several non-parametric survival models exist, none addresses the problem of directly identifying local extrema. We show how SBH efficiently estimates extreme survival/risk subgroups unlike other models. This provides an insight into the behavior of commonly used models and suggests alternatives to be adopted in practice. Finally, our SBH framework was applied to a clinical dataset. In it, we identified subsets of patients characterized by clinical and demographic covariates with a distinct extreme survival outcome, for which tailored medical interventions could be made. An R package PRIMsrc (Patient Rule Induction Method in Survival, Regression and Classification settings) is available on CRAN (Comprehensive R Archive Network) and GitHub. PMID:27034730

  14. The anatomy of floating shock fitting. [shock waves computation for flow field

    NASA Technical Reports Server (NTRS)

    Salas, M. D.

    1975-01-01

    The floating shock fitting technique is examined. Second-order difference formulas are developed for the computation of discontinuities. A procedure is developed to compute mesh points that are crossed by discontinuities. The technique is applied to the calculation of internal two-dimensional flows with arbitrary number of shock waves and contact surfaces. A new procedure, based on the coalescence of characteristics, is developed to detect the formation of shock waves. Results are presented to validate and demonstrate the versatility of the technique.

  15. Efficient generalized cross-validation with applications to parametric image restoration and resolution enhancement.

    PubMed

    Nguyen, N; Milanfar, P; Golub, G

    2001-01-01

    In many image restoration/resolution enhancement applications, the blurring process, i.e., the point spread function (PSF) of the imaging system, is not known or is known only to within a set of parameters. We estimate these PSF parameters for this ill-posed class of inverse problems from raw data, along with the regularization parameters required to stabilize the solution, using the generalized cross-validation (GCV) method. We propose efficient approximation techniques based on the Lanczos algorithm and Gauss quadrature theory, reducing the computational complexity of the GCV. Data-driven PSF and regularization parameter estimation experiments with synthetic and real image sequences are presented to demonstrate the effectiveness and robustness of our method.
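The GCV criterion used above can be shown on a small dense regularized least-squares problem. This is a toy analogue only: the paper's contribution is precisely to avoid forming the influence matrix explicitly (via Lanczos and Gauss quadrature), whereas the sketch below builds it directly for clarity, on synthetic data.

```python
import numpy as np

# GCV for choosing a ridge regularization parameter lambda:
# GCV(lam) = n ||(I - A)y||^2 / (n - trace(A))^2, with influence matrix
# A = X (X^T X + lam I)^{-1} X^T. Data are synthetic.

rng = np.random.default_rng(2)
n, p = 80, 10
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:3] = [2.0, -1.0, 0.5]                        # sparse true coefficients
y = X @ beta + 0.5 * rng.standard_normal(n)

def gcv(lam):
    A = X @ np.linalg.solve(X.T @ X + lam * np.eye(p), X.T)  # influence matrix
    resid = y - A @ y
    return n * (resid @ resid) / (n - np.trace(A)) ** 2

lams = np.logspace(-3, 3, 25)
best = min(lams, key=gcv)                          # lambda minimizing GCV
print(f"GCV-selected lambda = {best:.3g}")
```

The appeal of GCV here is that it needs no held-out data and no knowledge of the noise variance, which is what makes it attractive for blind PSF estimation.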

  16. Psychometric validation of a condom self-efficacy scale in Korean.

    PubMed

    Cha, EunSeok; Kim, Kevin H; Burke, Lora E

    2008-01-01

    When an instrument is translated for use in cross-cultural research, it needs to account for cultural factors without distorting the psychometric properties of the instrument. To validate the psychometric properties of the condom self-efficacy scale (CSE), originally developed for American adolescents and young adults, after translating the scale to Korean (CSE-K), to determine its suitability for cross-cultural research among Korean college students. A cross-sectional, correlational design was used with an exploratory survey methodology through self-report questionnaires. A convenience sample of 351 students, aged 18 to 25 years, was recruited at a university in Seoul, Korea. The participants completed the CSE-K and the intention of condom use scales after they were translated from English to Korean using a combined translation technique. A demographic and sex history questionnaire, which included an item to assess actual condom usage, was also administered. Mean, variance, reliability, criterion validity, and factorial validity using confirmatory factor analysis were assessed for the CSE-K. Norms for the CSE-K were similar, but not identical, to norms for the English version. The means of all three subscales were lower for the CSE-K than for the original CSE; however, the obtained variance in the CSE-K was roughly similar to that of the original CSE. The Cronbach's alpha coefficient for the total scale was higher for the CSE-K (.91) than for either the CSE (.85) or the CSE in Thai (.85). Criterion validity and construct validity of the CSE-K were confirmed. The CSE-K was a reliable and valid scale for measuring condom self-efficacy among Korean college students. The findings suggest that the CSE is an appropriate instrument for conducting cross-cultural research on sexual behavior in adolescents and young adults.
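The Cronbach's alpha reported above (.91 for the CSE-K) has a simple closed form that can be computed directly. The item responses below are synthetic stand-ins generated from a single latent trait, not the study's questionnaire data.

```python
import numpy as np

# Cronbach's alpha: alpha = k/(k-1) * (1 - sum(item variances)/var(total)).
# Items are synthetic: each is the same latent trait plus independent noise.

rng = np.random.default_rng(5)
n, k = 200, 10
trait = rng.standard_normal((n, 1))                 # latent trait per person
items = trait + 0.8 * rng.standard_normal((n, k))   # k correlated items

item_vars = items.var(axis=0, ddof=1)               # per-item variances
total_var = items.sum(axis=1).var(ddof=1)           # variance of total score
alpha = k / (k - 1) * (1 - item_vars.sum() / total_var)
print(f"Cronbach's alpha = {alpha:.2f}")
```

Alpha rises with both the inter-item correlation and the number of items, which is why a 10-item scale with moderately correlated items can reach the .9 range.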

  17. Unremarked or Unperformed? Systematic Review on Reporting of Validation Efforts of Health Economic Decision Models in Seasonal Influenza and Early Breast Cancer.

    PubMed

    de Boer, Pieter T; Frederix, Geert W J; Feenstra, Talitha L; Vemer, Pepijn

    2016-09-01

    Transparent reporting of validation efforts of health economic models gives stakeholders better insight into the credibility of model outcomes. In this study we reviewed recently published studies on seasonal influenza and early breast cancer in order to gain insight into the reporting of model validation efforts in the overall health economic literature. A literature search was performed in Pubmed and Embase to retrieve health economic modelling studies published between 2008 and 2014. Reporting on model validation was evaluated by checking for the word validation, and by using AdViSHE (Assessment of the Validation Status of Health Economic decision models), a tool containing a structured list of relevant items for validation. Additionally, we contacted corresponding authors to ask whether additional validation efforts had been performed beyond those reported in the manuscripts. A total of 53 studies on seasonal influenza and 41 studies on early breast cancer were included in our review. The word validation was used in 16 studies (30 %) on seasonal influenza and 23 studies (56 %) on early breast cancer; however, in a minority of studies, this referred to a model validation technique. Fifty-seven percent of seasonal influenza studies and 71 % of early breast cancer studies reported one or more validation techniques. Cross-validation of study outcomes was found most often. A limited number of studies reported on model validation efforts, although good examples were identified. Author comments indicated that more validation techniques were performed than those reported in the manuscripts. Although validation is deemed important by many researchers, this is not reflected in the reporting habits of health economic modelling studies. Systematic reporting of validation efforts would be desirable to further enhance decision makers' confidence in health economic models and their outcomes.

  18. Continuum Modeling of Inductor Hysteresis and Eddy Current Loss Effects in Resonant Circuits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pries, Jason L.; Tang, Lixin; Burress, Timothy A.

    This paper presents experimental validation of a high-fidelity toroid inductor modeling technique. The aim of this research is to accurately model the instantaneous magnetization state and core losses in ferromagnetic materials. Quasi-static hysteresis effects are captured using a Preisach model. Eddy currents are included by coupling the associated quasi-static Everett function to a simple finite element model representing the inductor cross-sectional area. The modeling technique is validated against the nonlinear frequency response from two different series RLC resonant circuits using inductors made of electrical steel and soft ferrite. The method is shown to accurately model shifts in resonant frequency and quality factor. The technique also successfully predicts a discontinuity in the frequency response of the ferrite inductor resonant circuit.

  19. Elastic modulus measurements at variable temperature: Validation of atomic force microscopy techniques

    NASA Astrophysics Data System (ADS)

    Natali, Marco; Reggente, Melania; Passeri, Daniele; Rossi, Marco

    2016-06-01

    The development of polymer-based nanocomposites to be used in critical thermal environments requires the characterization of their mechanical properties, which are related to their chemical composition, size, morphology and operating temperature. Atomic force microscopy (AFM) has been proven to be a useful tool to develop techniques for the mechanical characterization of these materials, thanks to its nanometer lateral resolution and to the capability of exerting ultra-low loads, down to the piconewton range. In this work, we demonstrate two techniques, one quasi-static, i.e., AFM-based indentation (I-AFM), and one dynamic, i.e., contact resonance AFM (CR-AFM), for the mechanical characterization of compliant materials at variable temperature. A cross-validation of I-AFM and CR-AFM has been performed by comparing the results obtained on two reference materials, i.e., low-density polyethylene (LDPE) and polycarbonate (PC), which demonstrated the accuracy of the techniques.

  20. Measurement Properties of the Persian Translated Version of Graves Orbitopathy Quality of Life Questionnaire: A Validation Study.

    PubMed

    Kashkouli, Mohsen Bahmani; Karimi, Nasser; Aghamirsalim, Mohamadreza; Abtahi, Mohammad Bagher; Nojomi, Marzieh; Shahrad-Bejestani, Hadi; Salehi, Masoud

    2017-02-01

    To determine the measurement properties of the Persian language version of the Graves orbitopathy quality of life questionnaire (GO-QOL). Following a systematic translation and cultural adaptation process, 141 consecutive unselected thyroid eye disease (TED) patients answered the Persian GO-QOL and underwent complete ophthalmic examination. The questionnaire was again completed by 60 patients on the second visit, 2-4 weeks later. Construct validity (cross-cultural validity, structural validity and hypotheses testing), reliability (internal consistency and test-retest reliability), and floor and ceiling effects of the Persian version of the GO-QOL were evaluated. Furthermore, Rasch analysis was used to assess its psychometric properties. Cross-cultural validity was established by back-translation techniques, committee review and pretesting techniques. Bi-dimensionality of the questionnaire was confirmed by factor analysis. Construct validity was also supported through confirmation of 6 out of 8 predefined hypotheses. Cronbach's α and intraclass correlation coefficient (ICC) were 0.650 and 0.859 for visual functioning and 0.875 and 0.896 for appearance subscale, respectively. Mean quality of life (QOL) scores for visual functioning and appearance were 78.18 (standard deviation, SD, 21.57) and 56.25 (SD 26.87), respectively. Person reliabilities from the Rasch rating scale model for both visual functioning and appearance revealed an acceptable internal consistency for the Persian GO-QOL. The Persian GO-QOL questionnaire is a valid and reliable tool with good psychometric properties in evaluation of Persian-speaking patients with TED. Applying Rasch analysis to future versions of the GO-QOL is recommended in order to perform tests for linearity between the estimated item measures in different versions.

  1. Pitfalls in Prediction Modeling for Normal Tissue Toxicity in Radiation Therapy: An Illustration With the Individual Radiation Sensitivity and Mammary Carcinoma Risk Factor Investigation Cohorts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mbah, Chamberlain; Thierens, Hubert

    Purpose: To identify the main causes underlying the failure of prediction models for radiation therapy toxicity to replicate. Methods and Materials: Data were used from two German cohorts, Individual Radiation Sensitivity (ISE) (n=418) and Mammary Carcinoma Risk Factor Investigation (MARIE) (n=409), of breast cancer patients with similar characteristics and radiation therapy treatments. The toxicity endpoint chosen was telangiectasia. The LASSO (least absolute shrinkage and selection operator) logistic regression method was used to build a predictive model for a dichotomized endpoint (Radiation Therapy Oncology Group/European Organization for the Research and Treatment of Cancer score 0, 1, or ≥2). Internal areas under the receiver operating characteristic curve (inAUCs) were calculated by a naïve approach whereby the training data (ISE) were also used for calculating the AUC. Cross-validation was also applied to calculate the AUC within the same cohort, a second type of inAUC. Internal AUCs from cross-validation were calculated within ISE and MARIE separately. Models trained on one dataset (ISE) were applied to a test dataset (MARIE) and AUCs calculated (exAUCs). Results: Internal AUCs from the naïve approach were generally larger than inAUCs from cross-validation owing to overfitting the training data. Internal AUCs from cross-validation were also generally larger than the exAUCs, reflecting heterogeneity in the predictors between cohorts. The best models with largest inAUCs from cross-validation within both cohorts had a number of common predictors: hypertension, normalized total boost, and presence of estrogen receptors. Surprisingly, the effect (coefficient in the prediction model) of hypertension on telangiectasia incidence was positive in ISE and negative in MARIE. Other predictors were also not common between the 2 cohorts, illustrating that overcoming overfitting does not solve the problem of replication failure of prediction models completely.
Conclusions: Overfitting and cohort heterogeneity are the 2 main causes of replication failure of prediction models across cohorts. Cross-validation and similar techniques (eg, bootstrapping) cope with overfitting, but the development of validated predictive models for radiation therapy toxicity requires strategies that deal with cohort heterogeneity.
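The gap the authors observe between naive and cross-validated internal performance is easy to reproduce. The sketch below substitutes a 1-nearest-neighbour classifier and plain accuracy for the paper's LASSO logistic models and AUC, on pure-noise synthetic data: scoring on the training data itself looks perfect, while cross-validation reveals there is no signal.

```python
import numpy as np

# Naive (train = test) performance vs 5-fold cross-validation for a 1-NN
# classifier on noise. Illustrative stand-in: accuracy instead of AUC,
# 1-NN instead of LASSO logistic regression.

rng = np.random.default_rng(4)
X = rng.standard_normal((100, 5))          # noise predictors
y = rng.integers(0, 2, 100)                # labels independent of X

def nn_label(train_X, train_y, x):
    return train_y[np.argmin(((train_X - x) ** 2).sum(axis=1))]

# Naive estimate: every point's nearest neighbour is itself, so accuracy is 1.
naive_acc = np.mean([nn_label(X, y, X[i]) == y[i] for i in range(100)])

folds = np.array_split(np.arange(100), 5)  # 5-fold cross-validation
cv_hits = []
for f in folds:
    mask = np.ones(100, dtype=bool)
    mask[f] = False                        # hold fold f out of training
    cv_hits += [nn_label(X[mask], y[mask], X[i]) == y[i] for i in f]
cv_acc = np.mean(cv_hits)

print(f"naive accuracy {naive_acc:.2f}, cross-validated accuracy {cv_acc:.2f}")
```

Cross-validation removes this optimism, but, as the paper stresses, it cannot fix cross-cohort heterogeneity: only external validation on an independent cohort can reveal that.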

  2. Aqueous two-phase system patterning of detection antibody solutions for cross-reaction-free multiplex ELISA

    NASA Astrophysics Data System (ADS)

    Frampton, John P.; White, Joshua B.; Simon, Arlyne B.; Tsuei, Michael; Paczesny, Sophie; Takayama, Shuichi

    2014-05-01

    Accurate disease diagnosis, patient stratification and biomarker validation require the analysis of multiple biomarkers. This paper describes cross-reactivity-free multiplexing of enzyme-linked immunosorbent assays (ELISAs) using aqueous two-phase systems (ATPSs) to confine detection antibodies at specific locations in fully aqueous environments. Antibody cross-reactions are eliminated because the detection antibody solutions are co-localized only to corresponding surface-immobilized capture antibody spots. This multiplexing technique is validated using plasma samples from allogeneic bone marrow recipients. Patients with acute graft versus host disease (GVHD), a common and serious condition associated with allogeneic bone marrow transplantation, display higher mean concentrations for four multiplexed biomarkers (HGF, elafin, ST2 and TNFR1) relative to healthy donors and transplant patients without GVHD. The antibody co-localization capability of this technology is particularly useful when using inherently cross-reactive reagents such as polyclonal antibodies, although monoclonal antibody cross-reactivity can also be reduced. Because ATPS-ELISA adapts readily available antibody reagents, plate materials and detection instruments, it should be easily transferable into other research and clinical settings.

  3. Aqueous two-phase system patterning of detection antibody solutions for cross-reaction-free multiplex ELISA

    PubMed Central

    Frampton, John P.; White, Joshua B.; Simon, Arlyne B.; Tsuei, Michael; Paczesny, Sophie; Takayama, Shuichi

    2014-01-01

    Accurate disease diagnosis, patient stratification and biomarker validation require the analysis of multiple biomarkers. This paper describes cross-reactivity-free multiplexing of enzyme-linked immunosorbent assays (ELISAs) using aqueous two-phase systems (ATPSs) to confine detection antibodies at specific locations in fully aqueous environments. Antibody cross-reactions are eliminated because the detection antibody solutions are co-localized only to corresponding surface-immobilized capture antibody spots. This multiplexing technique is validated using plasma samples from allogeneic bone marrow recipients. Patients with acute graft versus host disease (GVHD), a common and serious condition associated with allogeneic bone marrow transplantation, display higher mean concentrations for four multiplexed biomarkers (HGF, elafin, ST2 and TNFR1) relative to healthy donors and transplant patients without GVHD. The antibody co-localization capability of this technology is particularly useful when using inherently cross-reactive reagents such as polyclonal antibodies, although monoclonal antibody cross-reactivity can also be reduced. Because ATPS-ELISA adapts readily available antibody reagents, plate materials and detection instruments, it should be easily transferable into other research and clinical settings. PMID:24786974

  4. Neutron activation analysis of certified samples by the absolute method

    NASA Astrophysics Data System (ADS)

    Kadem, F.; Belouadah, N.; Idiri, Z.

    2015-07-01

    The nuclear reaction analysis technique is mainly based on the relative method or on the use of activation cross sections. In order to validate nuclear data for the calculated cross section evaluated from systematic studies, we used the neutron activation analysis technique (NAA) to determine the various constituent concentrations of certified samples of animal blood, milk and hay. In this analysis, the absolute method is used. The neutron activation technique involves irradiating the sample and subsequently measuring its activity. The fundamental activation equation connects several physical parameters, including the cross section, which is essential for the quantitative determination of the different elements composing the sample without resorting to a standard sample. This so-called absolute method allows measurements as accurate as the relative method. The results obtained by the absolute method showed that the values are as precise as those from the relative method, which requires a standard sample for each element to be quantified.
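The fundamental activation equation referred to above can be sketched in one common textbook form (the notation here is the generic NAA one, not necessarily the authors'):

```latex
A \;=\; \frac{m\,N_A\,\theta}{M}\;\sigma\,\Phi\;\bigl(1 - e^{-\lambda t_i}\bigr)\,e^{-\lambda t_d}
```

where A is the measured activity, m the elemental mass, N_A Avogadro's number, θ the isotopic abundance, M the molar mass, σ the activation cross section, Φ the neutron flux, λ the decay constant, and t_i and t_d the irradiation and decay times. Solving for m gives the elemental concentration directly, which is why the absolute method needs no standard sample.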

  5. Typification of cider brandy on the basis of cider used in its manufacture.

    PubMed

    Rodríguez Madrera, Roberto; Mangas Alonso, Juan J

    2005-04-20

    A study of typification of cider brandies on the basis of the origin of the raw material used in their manufacture was conducted using chemometric techniques (principal component analysis, linear discriminant analysis, and Bayesian analysis) together with their composition in volatile compounds, as analyzed by gas chromatography with flame ionization detection for the major volatiles and mass spectrometric detection for the minor ones. Significant principal components computed by a double cross-validation procedure allowed the structure of the database to be visualized as a function of the raw material, that is, cider made from fresh apple juice versus cider made from apple juice concentrate. Feasible and robust discriminant rules were computed and validated by a cross-validation procedure that allowed the authors to classify fresh and concentrate cider brandies, obtaining classification hits of >92%. The most discriminating variables for typifying cider brandies according to their raw material were 1-butanol and ethyl hexanoate.
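A minimal sketch of the kind of chemometric pipeline described above: dimensionality reduction followed by linear discriminant analysis, assessed by cross-validation. The synthetic "volatile compound" data, and the two shifted columns standing in for discriminating compounds such as 1-butanol and ethyl hexanoate, are assumptions for illustration only.

```python
# Sketch: PCA + LDA classification of two brandy classes, cross-validated.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(5)
n_per_class = 30
# Two classes ("fresh-juice" vs "concentrate"), 12 simulated volatile compounds.
fresh = rng.normal(0.0, 1.0, size=(n_per_class, 12))
conc = rng.normal(0.0, 1.0, size=(n_per_class, 12))
conc[:, [0, 3]] += 2.0  # two discriminating compounds shifted between classes
X = np.vstack([fresh, conc])
y = np.array([0] * n_per_class + [1] * n_per_class)

clf = make_pipeline(PCA(n_components=5), LinearDiscriminantAnalysis())
scores = cross_val_score(clf, X, y,
                         cv=StratifiedKFold(5, shuffle=True, random_state=0))
print(round(scores.mean(), 3))
```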

  6. Turbulence measurements in a complex flowfield using a crossed hot-wire. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Mckillop, B. E.

    1983-01-01

    Turbulence was quantified in complex axisymmetric, nonreacting, nonswirling flowfields using a crossed hot-wire anemometer. Mean velocity, turbulence intensities, turbulent viscosity, and Reynolds stresses were measured in round free jet and confined jet flowfields. The confined jet, a model of an axisymmetric can combustor, had an expansion ratio D/d=2 and an expansion angle of 90 deg, with measurements taken at axial location increments of 0.5 diameters. The confined jet was studied with and without a contraction nozzle. Free jet measurements validated the experimental technique and data reduction, and results show good agreement with those of previous research. Measurements in the confined jet indicate that the crossed hot-wire used cannot handle axial flow reversal and that the experimental technique is inadequate for measuring time-mean radial velocity. Other quantities show a high level of comparability.

  7. Modification of the random forest algorithm to avoid statistical dependence problems when classifying remote sensing imagery

    NASA Astrophysics Data System (ADS)

    Cánovas-García, Fulgencio; Alonso-Sarría, Francisco; Gomariz-Castillo, Francisco; Oñate-Valdivieso, Fernando

    2017-06-01

    Random forest is a classification technique widely used in remote sensing. One of its advantages is that it produces an estimation of classification accuracy based on the so-called out-of-bag cross-validation method. It is usually assumed that such estimation is not biased and may be used instead of validation based on an external data-set or a cross-validation external to the algorithm. In this paper we show that this is not necessarily the case when classifying remote sensing imagery using training areas with several pixels or objects. According to our results, out-of-bag cross-validation clearly overestimates accuracy, both overall and per class. The reason is that, in a training patch, pixels or objects are not independent (from a statistical point of view) of each other; however, they are split by bootstrapping into in-bag and out-of-bag as if they were really independent. We believe that assigning whole patches, rather than individual pixels/objects, to one set or the other would produce a less biased out-of-bag cross-validation. To deal with the problem, we propose a modification of the random forest algorithm to split training patches instead of the pixels (or objects) that compose them. This modified algorithm does not overestimate accuracy and has no lower predictive capability than the original. When its results are validated with an external data-set, the accuracy is not different from that obtained with the original algorithm. We analysed three remote sensing images with different classification approaches (pixel and object based); in the three cases reported, the modification we propose produces a less biased accuracy estimation.
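The in-bag/out-of-bag dependence problem described above can be demonstrated with a small sketch (illustrative data, not the paper's imagery): pixels are generated as near-duplicates within patches, and patch labels are assigned at random, so the honest patch-wise accuracy should be near chance while the out-of-bag estimate is inflated.

```python
# Sketch: out-of-bag accuracy vs. patch-wise (grouped) cross-validation.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GroupKFold, cross_val_score

rng = np.random.default_rng(42)
n_patches, pixels_per_patch = 40, 15
patch_centers = rng.normal(size=(n_patches, 5))
patch_labels = rng.integers(0, 2, size=n_patches)  # random labels per patch
# Each pixel is its patch center plus small noise -> pixels are dependent.
X = np.repeat(patch_centers, pixels_per_patch, axis=0) + \
    rng.normal(scale=0.05, size=(n_patches * pixels_per_patch, 5))
y = np.repeat(patch_labels, pixels_per_patch)
groups = np.repeat(np.arange(n_patches), pixels_per_patch)

rf = RandomForestClassifier(n_estimators=200, oob_score=True, random_state=0)
rf.fit(X, y)
oob_acc = rf.oob_score_  # optimistic: sibling pixels leak into the bags

# Honest estimate: whole patches are kept together in each fold.
patch_cv_acc = cross_val_score(
    RandomForestClassifier(n_estimators=200, random_state=0),
    X, y, cv=GroupKFold(n_splits=5), groups=groups).mean()

print(oob_acc, patch_cv_acc)
```

Because the labels carry no real signal, the grouped estimate hovers near 0.5 while out-of-bag accuracy approaches 1.0, which is the bias the paper's patch-splitting modification removes.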

  8. Computation of leaky guided waves dispersion spectrum using vibroacoustic analyses and the Matrix Pencil Method: a validation study for immersed rectangular waveguides.

    PubMed

    Mazzotti, M; Bartoli, I; Castellazzi, G; Marzani, A

    2014-09-01

    The paper aims at validating a recently proposed Semi Analytical Finite Element (SAFE) formulation coupled with a 2.5D Boundary Element Method (2.5D BEM) for the extraction of dispersion data in immersed waveguides of generic cross-section. To this end, three-dimensional vibroacoustic analyses are carried out on two waveguides of square and rectangular cross-section immersed in water using the commercial Finite Element software Abaqus/Explicit. Real wavenumber and attenuation dispersive data are extracted by means of a modified Matrix Pencil Method. It is demonstrated that the results obtained using the two techniques are in very good agreement. Copyright © 2014 Elsevier B.V. All rights reserved.

  9. Considerations in cross-validation type density smoothing with a look at some data

    NASA Technical Reports Server (NTRS)

    Schuster, E. F.

    1982-01-01

    Experience gained in applying nonparametric maximum likelihood techniques of density estimation to judge the comparative quality of various estimators is reported. Two univariate data sets of one hundred samples (one Cauchy, one natural normal) are considered, as well as studies in the multivariate case.

  10. Prediction of Recidivism in Juvenile Offenders Based on Discriminant Analysis.

    ERIC Educational Resources Information Center

    Proefrock, David W.

    The recent development of strong statistical techniques has made accurate predictions of recidivism possible. To investigate the utility of discriminant analysis methodology in making predictions of recidivism in juvenile offenders, the court records of 271 male and female juvenile offenders, aged 12-16, were reviewed. A cross validation group…

  11. Non-destructive Techniques for Classifying Aircraft Coating Degradation

    DTIC Science & Technology

    2015-03-26

    …model is the bidirectional reflectance distribution function (BRDF), which describes how much radiation is reflected for each solid angle and each incident angle. An intermediate model between ideal reflectors and BRDF is to assume all reflectance is a combination of diffuse and specular reflectance… K-Fold Cross Validation

  12. Translation of scales in cross-cultural research: issues and techniques.

    PubMed

    Cha, Eun-Seok; Kim, Kevin H; Erlen, Judith A

    2007-05-01

    This paper is a report of a study designed to: (i) describe issues and techniques of translation of standard measures for use in international research; (ii) identify a user-friendly and valid translation method when researchers have limited resources during the translation procedure; and (iii) discuss translation issues using data from a pilot study as an example. The process of translation is an important part of cross-cultural studies. Cross-cultural researchers are often confronted by the need to translate scales from one language to another and to do this with limited resources. The lessons learned from our experience in a pilot study are presented to underline the importance of using appropriate translation procedures. The issues of the back-translation method are discussed to identify strategies to ensure success when translating measures. A combined technique is an appropriate method to maintain content equivalence between the original and translated instruments in international research. There are several possible combinations of translation techniques. However, there is no gold standard of translation techniques because the research environment (e.g. accessibility and availability of bilingual people) and the research questions differ. It is important to use appropriate translation procedures and to employ a combined translation technique based on the research environment and questions.

  13. Learning to recognize rat social behavior: Novel dataset and cross-dataset application.

    PubMed

    Lorbach, Malte; Kyriakou, Elisavet I; Poppe, Ronald; van Dam, Elsbeth A; Noldus, Lucas P J J; Veltkamp, Remco C

    2018-04-15

    Social behavior is an important aspect of rodent models. Automated measuring tools that make use of video analysis and machine learning are an increasingly attractive alternative to manual annotation. Because machine learning-based methods need to be trained, it is important that they are validated using data from different experiment settings. To develop and validate automated measuring tools, there is a need for annotated rodent interaction datasets. Currently, the availability of such datasets is limited to two mouse datasets. We introduce the first, publicly available rat social interaction dataset, RatSI. We demonstrate the practical value of the novel dataset by using it as the training set for a rat interaction recognition method. We show that behavior variations induced by the experiment setting can lead to reduced performance, which illustrates the importance of cross-dataset validation. Consequently, we add a simple adaptation step to our method and improve the recognition performance. Most existing methods are trained and evaluated in one experimental setting, which limits the predictive power of the evaluation to that particular setting. We demonstrate that cross-dataset experiments provide more insight in the performance of classifiers. With our novel, public dataset we encourage the development and validation of automated recognition methods. We are convinced that cross-dataset validation enhances our understanding of rodent interactions and facilitates the development of more sophisticated recognition methods. Combining them with adaptation techniques may enable us to apply automated recognition methods to a variety of animals and experiment settings. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. A cross-sectional observational study to assess inhaler technique in Saudi hospitalized patients with asthma and chronic obstructive pulmonary disease

    PubMed Central

    Ammari, Maha Al; Sultana, Khizra; Yunus, Faisal; Ghobain, Mohammed Al; Halwan, Shatha M. Al

    2016-01-01

    Objectives: To assess the proportion of critical errors committed while demonstrating the inhaler technique in hospitalized patients diagnosed with asthma and chronic obstructive pulmonary disease (COPD). Methods: This cross-sectional observational study was conducted in 47 asthmatic and COPD patients using inhaler devices. The study took place at King Abdulaziz Medical City, Riyadh, Saudi Arabia between September and December 2013. Two pharmacists independently assessed inhaler technique with a validated checklist. Results: Seventy percent of patients made at least one critical error while demonstrating their inhaler technique, and the mean number of critical errors per patient was 1.6. Most patients used metered dose inhaler (MDI), and 73% of MDI users and 92% of dry powder inhaler users committed at least one critical error. Conclusion: Inhaler technique in hospitalized Saudi patients was inadequate. Health care professionals should understand the importance of reassessing and educating patients on a regular basis for inhaler technique, recommend the use of a spacer when needed, and regularly assess and update their own inhaler technique skills. PMID:27146622

  15. Measurement of conjugated linoleic acid (CLA) in CLA-rich soy oil by attenuated total reflectance-Fourier transform infrared spectroscopy (ATR-FTIR).

    PubMed

    Kadamne, Jeta V; Jain, Vishal P; Saleh, Mohammed; Proctor, Andrew

    2009-11-25

    Conjugated linoleic acid (CLA) isomers in oils are currently measured as fatty acid methyl esters by a gas chromatography-flame ionization detector (GC-FID) technique, which requires approximately 2 h to complete the analysis. Hence, we aim to develop a method to rapidly determine CLA isomers in CLA-rich soy oil. Soy oil with 0.38-25.11% total CLA was obtained by photo-isomerization of 96 soy oil samples for 24 h. A sample was withdrawn at 30 min intervals with repeated processing using a second batch of oil. Six replicates of GC-FID fatty acid analysis were conducted for each oil sample. The oil samples were scanned using attenuated total reflectance-Fourier transform infrared spectroscopy (ATR-FTIR), and the spectrum was collected. Calibration models were developed using partial least-squares (PLS-1) regression using Unscrambler software. Models were validated using a full cross-validation technique and tested using samples that were not included in the calibration sample set. Measured and predicted total CLA, trans,trans CLA isomers, total mono trans CLA isomers, trans-10,cis-12 CLA, trans-9,cis-11 CLA and cis-10,trans-12 CLA, and cis-9,trans-11 CLA had cross-validated coefficients of determination (R²v) of 0.97, 0.98, 0.97, 0.98, 0.97, and 0.99, with corresponding root-mean-square errors of validation (RMSEV) of 1.14, 0.69, 0.27, 0.07, 0.14, and 0.07% CLA, respectively. The ATR-FTIR technique is a rapid and less expensive method than GC-FID for determining CLA isomers in linoleic acid photo-isomerized soy oil.

  16. Solid-state NMR characterization of cross-linking in EPDM/PP blends from 1H-13C polarization transfer dynamics.

    PubMed

    Aluas, Mihaela; Filip, Claudiu

    2005-05-01

    A novel approach for solid-state NMR characterization of cross-linking in polymer blends from the analysis of (1)H-(13)C polarization transfer dynamics is introduced. It extends the model of residual dipolar couplings under permanent cross-linking, typically used to describe (1)H transverse relaxation techniques, by considering a more realistic distribution of the order parameter along a polymer chain in rubbers. Based on a systematic numerical analysis, the extended model was shown to accurately reproduce all the characteristic features of the cross-polarization curves measured on such materials. This is particularly important for investigating blends of great technological potential, like thermoplastic elastomers, where (13)C high-resolution techniques, such as CP-MAS, are indispensable to selectively investigate structural and dynamical properties of the desired component. The validity of the new approach was demonstrated using the example of the CP build-up curves measured on a well resolved EPDM resonance line in a series of EPDM/PP blends.

  17. Hekate: Software Suite for the Mass Spectrometric Analysis and Three-Dimensional Visualization of Cross-Linked Protein Samples

    PubMed Central

    2013-01-01

    Chemical cross-linking of proteins combined with mass spectrometry provides an attractive and novel method for the analysis of native protein structures and protein complexes. Analysis of the data however is complex. Only a small number of cross-linked peptides are produced during sample preparation and must be identified against a background of more abundant native peptides. To facilitate the search and identification of cross-linked peptides, we have developed a novel software suite, named Hekate. Hekate is a suite of tools that address the challenges involved in analyzing protein cross-linking experiments when combined with mass spectrometry. The software is an integrated pipeline for the automation of the data analysis workflow and provides a novel scoring system based on principles of linear peptide analysis. In addition, it provides a tool for the visualization of identified cross-links using three-dimensional models, which is particularly useful when combining chemical cross-linking with other structural techniques. Hekate was validated by the comparative analysis of cytochrome c (bovine heart) against previously reported data.1 Further validation was carried out on known structural elements of DNA polymerase III, the catalytic α-subunit of the Escherichia coli DNA replisome along with new insight into the previously uncharacterized C-terminal domain of the protein. PMID:24010795

  18. Compressive Sensing with Cross-Validation and Stop-Sampling for Sparse Polynomial Chaos Expansions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huan, Xun; Safta, Cosmin; Sargsyan, Khachik

    Compressive sensing is a powerful technique for recovering sparse solutions of underdetermined linear systems, which is often encountered in uncertainty quantification analysis of expensive and high-dimensional physical models. We perform numerical investigations employing several compressive sensing solvers that target the unconstrained LASSO formulation, with a focus on linear systems that arise in the construction of polynomial chaos expansions. With core solvers of l1_ls, SpaRSA, CGIST, FPC_AS, and ADMM, we develop techniques to mitigate overfitting through an automated selection of regularization constant based on cross-validation, and a heuristic strategy to guide the stop-sampling decision. Practical recommendations on parameter settings for these techniques are provided and discussed. The overall method is applied to a series of numerical examples of increasing complexity, including large eddy simulations of supersonic turbulent jet-in-crossflow involving a 24-dimensional input. Through empirical phase-transition diagrams and convergence plots, we illustrate sparse recovery performance under structures induced by polynomial chaos, accuracy and computational tradeoffs between polynomial bases of different degrees, and practicability of conducting compressive sensing for a realistic, high-dimensional physical application. Across test cases studied in this paper, we find ADMM to have demonstrated empirical advantages through consistent lower errors and faster computational times.
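The automated, cross-validation-based selection of the regularization constant can be sketched as follows. Note this is an illustration only: scikit-learn's coordinate-descent LASSO stands in for the paper's solvers (l1_ls, SpaRSA, CGIST, FPC_AS, ADMM), and the system sizes are invented.

```python
# Sketch: recovering a sparse solution of an underdetermined system, with
# the LASSO regularization constant chosen by internal cross-validation.
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)
n_rows, n_cols, n_nonzero = 80, 200, 8  # underdetermined: cols > rows
A = rng.normal(size=(n_rows, n_cols))
x_true = np.zeros(n_cols)
support = rng.choice(n_cols, n_nonzero, replace=False)
x_true[support] = rng.normal(size=n_nonzero)
b = A @ x_true + 0.01 * rng.normal(size=n_rows)

# LassoCV sweeps a grid of regularization constants and picks the one with
# the best k-fold cross-validated error, mitigating overfitting.
fit = LassoCV(cv=5, n_alphas=50).fit(A, b)
recovered = np.flatnonzero(np.abs(fit.coef_) > 1e-3)
print(fit.alpha_, sorted(recovered))
```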

  19. Generation of 238U Covariance Matrices by Using the Integral Data Assimilation Technique of the CONRAD Code

    NASA Astrophysics Data System (ADS)

    Privas, E.; Archier, P.; Bernard, D.; De Saint Jean, C.; Destouche, C.; Leconte, P.; Noguère, G.; Peneliau, Y.; Capote, R.

    2016-02-01

    A new IAEA Coordinated Research Project (CRP) aims to test, validate and improve the IRDF library. Among the isotopes of interest, modeling the 238U capture and fission cross sections represents a challenging task. A new description of the 238U neutron-induced reactions in the fast energy range is in progress within the framework of an IAEA evaluation consortium. The Nuclear Data group of Cadarache participates in this effort utilizing the 238U spectral indices measurements and Post Irradiated Experiments (PIE) carried out in the fast reactors MASURCA (CEA Cadarache) and PHENIX (CEA Marcoule). Such a collection of experimental results provides reliable integral information on the (n,γ) and (n,f) cross sections. This paper presents the Integral Data Assimilation (IDA) technique of the CONRAD code used to propagate the uncertainties of the integral data on the 238U cross sections of interest for dosimetry applications.

  20. Rapid determination of Swiss cheese composition by Fourier transform infrared/attenuated total reflectance spectroscopy.

    PubMed

    Rodriguez-Saona, L E; Koca, N; Harper, W J; Alvarez, V B

    2006-05-01

    There is a need for rapid and simple techniques that can be used to predict the quality of cheese. The aim of this research was to develop a simple and rapid screening tool for monitoring Swiss cheese composition by using Fourier transform infrared spectroscopy. Twenty Swiss cheese samples from different manufacturers and degrees of maturity were evaluated. Direct measurements of Swiss cheese slices (approximately 0.5 g) were made using a MIRacle 3-reflection diamond attenuated total reflectance (ATR) accessory. Reference methods for moisture (vacuum oven), protein content (Kjeldahl), and fat (Babcock) were used. Calibration models were developed based on a cross-validated (leave-one-out approach) partial least squares regression. The information-rich infrared spectral range for Swiss cheese samples was from 3,000 to 2,800 cm(-1) and 1,800 to 900 cm(-1). The performance statistics for cross-validated models gave estimates for standard error of cross-validation of 0.45, 0.25, and 0.21% for moisture, protein, and fat, respectively, and correlation coefficients r > 0.96. Furthermore, the ATR infrared protocol allowed for the classification of cheeses according to manufacturer and aging based on unique spectral information, especially of carbonyl groups, probably due to their distinctive lipid composition. Attenuated total reflectance infrared spectroscopy allowed for the rapid (approximately 3-min analysis time) and accurate analysis of the composition of Swiss cheese. This technique could contribute to the development of simple and rapid protocols for monitoring complex biochemical changes, and predicting the final quality of the cheese.

  1. Using patient data similarities to predict radiation pneumonitis via a self-organizing map

    NASA Astrophysics Data System (ADS)

    Chen, Shifeng; Zhou, Sumin; Yin, Fang-Fang; Marks, Lawrence B.; Das, Shiva K.

    2008-01-01

    This work investigates the use of the self-organizing map (SOM) technique for predicting lung radiation pneumonitis (RP) risk. SOM is an effective method for projecting and visualizing high-dimensional data in a low-dimensional space (map). By projecting patients with similar data (dose and non-dose factors) onto the same region of the map, commonalities in their outcomes can be visualized and categorized. Once built, the SOM may be used to predict pneumonitis risk by identifying the region of the map that is most similar to a patient's characteristics. Two SOM models were developed from a database of 219 lung cancer patients treated with radiation therapy (34 clinically diagnosed with Grade 2+ pneumonitis). The models were: SOMall built from all dose and non-dose factors and, for comparison, SOMdose built from dose factors alone. Both models were tested using ten-fold cross validation and Receiver Operating Characteristics (ROC) analysis. Models SOMall and SOMdose yielded ten-fold cross-validated ROC areas of 0.73 (sensitivity/specificity = 71%/68%) and 0.67 (sensitivity/specificity = 63%/66%), respectively. The significant difference between the cross-validated ROC areas of these two models (p < 0.05) implies that non-dose features add important information toward predicting RP risk. Among the input features selected by model SOMall, the two with highest impact for increasing RP risk were: (a) higher mean lung dose and (b) chemotherapy prior to radiation therapy. The SOM model developed here may not be extrapolated to treatment techniques outside that used in our database, such as several-field lung intensity modulated radiation therapy or gated radiation therapy.
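A minimal self-organizing map can be sketched in NumPy as below. This is an illustration of the SOM technique only: the study's clinical dose/non-dose features, map configuration, and training schedule are not reproduced, and the synthetic data and all parameters are assumptions.

```python
# Sketch: minimal SOM training loop plus quantization error as a sanity check.
import numpy as np

def train_som(data, grid=(6, 6), n_iters=2000, lr0=0.5, sigma0=2.0, seed=0):
    rng = np.random.default_rng(seed)
    h, w = grid
    weights = rng.normal(size=(h * w, data.shape[1]))
    # Grid coordinates of each map node, used by the neighborhood function.
    coords = np.array([(i, j) for i in range(h) for j in range(w)], float)
    for t in range(n_iters):
        x = data[rng.integers(len(data))]
        bmu = np.argmin(((weights - x) ** 2).sum(axis=1))  # best-matching unit
        frac = t / n_iters
        lr = lr0 * (1 - frac)                  # decaying learning rate
        sigma = sigma0 * (1 - frac) + 0.5      # shrinking neighborhood radius
        dist2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
        neigh = np.exp(-dist2 / (2 * sigma ** 2))[:, None]
        weights += lr * neigh * (x - weights)  # pull neighborhood toward x
    return weights

def quantization_error(data, weights):
    d = np.linalg.norm(data[:, None, :] - weights[None, :, :], axis=2)
    return d.min(axis=1).mean()

rng = np.random.default_rng(1)
# Two synthetic "patient" clusters in a 4-dimensional feature space.
data = np.vstack([rng.normal(-2, 0.5, (100, 4)), rng.normal(2, 0.5, (100, 4))])
w0 = rng.normal(size=(36, 4))  # untrained reference weights
w = train_som(data)
print(quantization_error(data, w0), quantization_error(data, w))
```

Prediction then amounts to finding the best-matching unit for a new sample and reading off the outcome rate of the training samples mapped to that region.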

  2. Parkinson's disease detection based on dysphonia measurements

    NASA Astrophysics Data System (ADS)

    Lahmiri, Salim

    2017-04-01

    Assessing dysphonic symptoms is a noninvasive and effective approach to detect Parkinson's disease (PD) in patients. The main purpose of this study is to investigate the effect of different dysphonia measurements on PD detection by support vector machine (SVM). Seven categories of dysphonia measurements are considered. Experimental results from the ten-fold cross-validation technique demonstrate that vocal fundamental frequency statistics yield the highest accuracy of 88% ± 0.04. When all dysphonia measurements are employed, the SVM classifier achieves 94% ± 0.03 accuracy. A refinement of the original pattern space by removing dysphonia measurements with similar variation across healthy and PD subjects allows achieving 97.03% ± 0.03 accuracy. The latter performance exceeds what is reported in the literature on the same dataset with the ten-fold cross-validation technique. Finally, it was found that measures of the ratio of noise to tonal components in the voice are the most suitable dysphonic symptoms for detecting PD subjects, as they achieve 99.64% ± 0.01 specificity. This finding is highly promising for understanding PD symptoms.
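A hedged sketch of ten-fold cross-validated SVM classification as described above; the feature matrix is a random stand-in for the dysphonia measurements, and the sample size and class-generating rule are assumptions.

```python
# Sketch: ten-fold cross-validated SVM accuracy on synthetic voice features.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(3)
n = 200
X = rng.normal(size=(n, 22))  # synthetic stand-in for dysphonia measurements
y = (X[:, 0] + 0.8 * X[:, 1] + rng.normal(scale=1.0, size=n) > 0).astype(int)

# Scaling inside the pipeline keeps each fold's preprocessing leakage-free.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y,
                         cv=StratifiedKFold(10, shuffle=True, random_state=0))
print(scores.mean().round(3), scores.std().round(3))
```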

  3. QSPR for predicting chloroform formation in drinking water disinfection.

    PubMed

    Luilo, G B; Cabaniss, S E

    2011-01-01

    Chlorination is the most widely used technique for water disinfection, but may lead to the formation of chloroform (trichloromethane; TCM) and other by-products. This article reports the first quantitative structure-property relationship (QSPR) for predicting the formation of TCM in chlorinated drinking water. Model compounds (n = 117) drawn from 10 literature sources were divided into training data (n = 90, analysed by five-way leave-many-out internal cross-validation) and external validation data (n = 27). QSPR internal cross-validation had Q² = 0.94 and root mean square error (RMSE) of 0.09 moles TCM per mole compound, consistent with an external validation Q² of 0.94 and RMSE of 0.08 moles TCM per mole compound, and met criteria for high predictive power and robustness. In contrast, the log TCM QSPR performed poorly and did not meet the criteria for predictive power. The QSPR predictions were consistent with experimental values for TCM formation from tannic acid and for model fulvic acid structures. The descriptors used are consistent with a relatively small number of important TCM precursor structures based upon 1,3-dicarbonyls or 1,3-diphenols.

  4. NPOESS Preparatory Project Validation Program for the Cross-track Infrared Sounder

    NASA Astrophysics Data System (ADS)

    Barnet, C.; Gu, D.; Nalli, N. R.

    2009-12-01

    The National Polar-orbiting Operational Environmental Satellite System (NPOESS) Program, in partnership with the National Aeronautics and Space Administration (NASA), will launch the NPOESS Preparatory Project (NPP), a risk reduction and data continuity mission, prior to the first operational NPOESS launch. The NPOESS Program, in partnership with Northrop Grumman Aerospace Systems, will execute the NPP Calibration and Validation (Cal/Val) program to ensure the data products comply with the requirements of the sponsoring agencies. The Cross-track Infrared Sounder (CrIS) and the Advanced Technology Microwave Sounder (ATMS) are two of the instruments that make up the suite of sensors on NPP. Together, CrIS and ATMS will produce three Environmental Data Records (EDRs) including the Atmospheric Vertical Temperature Profile (AVTP), Atmospheric Vertical Moisture Profile (AVMP), and the Atmospheric Vertical Pressure Profile (AVPP). The AVTP and the AVMP are both NPOESS Key Performance Parameters (KPPs). The validation plans establish science and user community leadership and participation, and demonstrated, cost-effective Cal/Val approaches. This presentation will provide an overview of the collaborative data, techniques, and schedule for the validation of the NPP CrIS and ATMS environmental data products.

  5. Measurement of the static and dynamic coefficients of a cross-type parachute in subsonic flow

    NASA Technical Reports Server (NTRS)

    Shpund, Zalman; Levin, Daniel

    1991-01-01

    An experimental parametric investigation of the aerodynamic qualities of cross-type parachutes was performed in a subsonic wind tunnel, using a new experimental technique. This investigation included the measurement of the static and dynamic aerodynamic coefficients, utilizing the measuring apparatus modified specifically for this type of testing. It is shown that the static aerodynamic coefficients of several configurations are in good agreement with available data, and assisted in validating the experimental technique employed. Two configuration parameters were varied in the static tests, the cord length and the canopy aspect ratio, with both parameters having a similar effect on the drag measurement, i.e., any increase in either of them increased the effective blocking area, and therefore the axial force.

  6. A Machine Learning Framework for Plan Payment Risk Adjustment.

    PubMed

    Rose, Sherri

    2016-12-01

    To introduce cross-validation and a nonparametric machine learning framework for plan payment risk adjustment and then assess whether they have the potential to improve risk adjustment. 2011-2012 Truven MarketScan database. We compare the performance of multiple statistical approaches within a broad machine learning framework for estimation of risk adjustment formulas. Total annual expenditure was predicted using age, sex, geography, inpatient diagnoses, and hierarchical condition category variables. The methods included regression, penalized regression, decision trees, neural networks, and an ensemble super learner, all in concert with screening algorithms that reduce the set of variables considered. The performance of these methods was compared based on cross-validated R². Our results indicate that a simplified risk adjustment formula selected via this nonparametric framework maintains much of the efficiency of a traditional larger formula. The ensemble approach also outperformed classical regression and all other algorithms studied. The implementation of cross-validated machine learning techniques provides novel insight into risk adjustment estimation, possibly allowing for a simplified formula, thereby reducing incentives for increased coding intensity as well as the ability of insurers to "game" the system with aggressive diagnostic upcoding. © Health Research and Educational Trust.
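The model-comparison idea above, scoring several learners by cross-validated R² and averaging their out-of-fold predictions as a crude stand-in for the ensemble super learner, can be sketched as follows (all data synthetic; the paper's MarketScan features and algorithms are not reproduced).

```python
# Sketch: comparing learners by cross-validated R^2, plus a naive ensemble.
import numpy as np
from sklearn.linear_model import LinearRegression, Lasso
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import KFold, cross_val_predict

rng = np.random.default_rng(7)
n = 400
X = rng.normal(size=(n, 10))
# Expenditure-like target: linear effect plus a nonlinear one plus noise.
y = 3.0 * X[:, 0] + 2.0 * np.maximum(X[:, 1], 0) + rng.normal(scale=1.0, size=n)

cv = KFold(5, shuffle=True, random_state=0)
learners = {
    "ols": LinearRegression(),
    "lasso": Lasso(alpha=0.1),
    "tree": DecisionTreeRegressor(max_depth=4, random_state=0),
}
preds = {name: cross_val_predict(est, X, y, cv=cv)
         for name, est in learners.items()}
preds["average"] = np.mean(list(preds.values()), axis=0)  # naive ensemble

def cv_r2(p):
    return 1.0 - np.sum((y - p) ** 2) / np.sum((y - y.mean()) ** 2)

for name, p in preds.items():
    print(name, round(cv_r2(p), 3))
```

A true super learner would instead fit cross-validated weights for the combination; the plain average here only conveys the flavor of the approach.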

  7. Triacylglycerol stereospecific analysis and linear discriminant analysis for milk speciation.

    PubMed

    Blasi, Francesca; Lombardi, Germana; Damiani, Pietro; Simonetti, Maria Stella; Giua, Laura; Cossignani, Lina

    2013-05-01

    Product authenticity is an important topic in the dairy sector. Dairy products sold for public consumption must be accurately labelled in accordance with the milk species they contain. Linear discriminant analysis (LDA), a common chemometric procedure, was applied to fatty acid percentage composition to classify pure milk samples (cow, ewe, buffalo, donkey, goat). All original grouped cases were correctly classified, while 90% of cross-validated grouped cases were correctly classified. Another objective of this research was the characterisation of cow-ewe milk mixtures in order to reveal a common fraud in the dairy field, namely the addition of cow milk to ewe milk. Stereospecific analysis of triacylglycerols (TAG), a method based on chemical-enzymatic procedures coupled with chromatographic techniques, was carried out to detect fraudulent milk additions, in particular 1, 3 and 5% cow milk added to ewe milk. When only TAG composition data were used for the elaboration, 75% of original grouped cases were correctly classified, while fully correct classification was obtained when both total and intrapositional TAG data were used. The cross-validation results were also better when TAG stereospecific analysis data were used as LDA variables. In particular, 100% of cross-validated grouped cases were obtained when 5% cow milk mixtures were considered.
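    The classification step can be sketched with scikit-learn: LDA scored by leave-one-out cross-validation, so that each milk sample is classified by a model trained on all the others. The two synthetic "composition" classes below are illustrative stand-ins for real fatty acid or TAG profiles.

```python
# Hedged sketch of LDA with leave-one-out cross-validation; synthetic features
# stand in for fatty acid / triacylglycerol composition variables.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)
# Two classes, e.g. pure ewe milk vs. ewe milk with added cow milk,
# each sample described by five composition variables.
X = np.vstack([rng.normal(0.0, 1.0, (30, 5)),
               rng.normal(2.0, 1.0, (30, 5))])
y = np.array([0] * 30 + [1] * 30)

# Leave-one-out: each "milk sample" is classified by a model fit to the rest.
acc = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=LeaveOneOut()).mean()
print(f"cross-validated accuracy: {acc:.2f}")
```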

  8. Discrimination among populations of sockeye salmon fry with Fourier analysis of otolith banding patterns formed during incubation

    USGS Publications Warehouse

    Finn, James E.; Burger, Carl V.; Holland-Bartels, Leslie E.

    1997-01-01

    We used otolith banding patterns formed during incubation to discriminate among hatchery- and wild-incubated fry of sockeye salmon Oncorhynchus nerka from Tustumena Lake, Alaska. Fourier analysis of otolith luminance profiles was used to describe banding patterns: the amplitudes of individual Fourier harmonics were the discriminant variables. Correct classification of otoliths to either hatchery or wild origin was 83.1% (cross-validation) and 72.7% (test data) with the use of quadratic discriminant function analysis on 10 Fourier amplitudes. Overall classification rates among the six test groups (one hatchery and five wild groups) were 46.5% (cross-validation) and 39.3% (test data) with the use of linear discriminant function analysis on 16 Fourier amplitudes. Although classification rates for wild-incubated fry from any one site never exceeded 67% (cross-validation) or 60% (test data), location-specific information was evident for all groups because the probability of classifying an individual to its true incubation location was significantly greater than chance. Results indicate phenotypic differences in otolith microstructure among incubation sites separated by less than 10 km. Analysis of otolith luminance profiles is a potentially useful technique for discriminating among populations of hatchery and wild fish.
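    The feature-extraction step, turning a one-dimensional luminance profile into Fourier harmonic amplitudes suitable for discriminant analysis, can be sketched as follows; the banding profile here is synthetic, not an otolith measurement.

```python
# Hedged sketch: Fourier amplitudes of a 1-D luminance profile as discriminant
# variables. The synthetic profile stands in for a real otolith transect.
import numpy as np

def fourier_amplitudes(profile, n_harmonics=10):
    """Return amplitudes of the first n_harmonics of a 1-D profile."""
    spectrum = np.fft.rfft(profile - profile.mean())
    return np.abs(spectrum[1:n_harmonics + 1])

rng = np.random.default_rng(0)
x = np.linspace(0, 2 * np.pi, 256, endpoint=False)
# Synthetic banding: two periodic components (3 and 7 cycles) plus noise.
profile = np.sin(3 * x) + 0.5 * np.sin(7 * x) + rng.normal(0, 0.1, 256)

amps = fourier_amplitudes(profile)
print(amps.round(1))  # the 3rd and 7th harmonics dominate
```

    In the study, vectors of such amplitudes (10 or 16 per otolith) were then passed to quadratic or linear discriminant function analysis.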

  9. Optimization of the volume reconstruction for classical Tomo-PIV algorithms (MART, BIMART and SMART): synthetic and experimental studies

    NASA Astrophysics Data System (ADS)

    Thomas, L.; Tremblais, B.; David, L.

    2014-03-01

    Optimization of the multiplicative algebraic reconstruction technique (MART), simultaneous MART (SMART) and block-iterative MART (BIMART) reconstruction techniques was carried out on synthetic and experimental data. Different criteria were defined to improve the preprocessing of the initial images. Knowing how each reconstruction parameter influences the quality of particle volume reconstruction and the computing time is key in Tomo-PIV. These criteria were applied to a real case, a jet in cross flow, and were validated.

  10. 3D-QSAR and molecular docking studies on HIV protease inhibitors

    NASA Astrophysics Data System (ADS)

    Tong, Jianbo; Wu, Yingji; Bai, Min; Zhan, Pei

    2017-02-01

    To better understand the chemical-biological interactions governing their activity toward HIV protease, QSAR models of 34 cyclic-urea derivatives with HIV-inhibitory activity were developed. The quantitative structure-activity relationship (QSAR) model was built using the comparative molecular similarity indices analysis (CoMSIA) technique. The best CoMSIA model has r²cv and r²ncv values of 0.586 and 0.931 for the cross-validated and non-cross-validated correlations, respectively. The predictive ability of the CoMSIA model was further validated by a test set of 7 compounds, giving an r²pred value of 0.973. Docking studies were used to find the actual conformations of the chemicals in the active site of HIV protease, as well as the binding mode pattern at the binding site of the protease enzyme. The information provided by the 3D-QSAR model and molecular docking may lead to a better understanding of the structural requirements of the 34 cyclic-urea derivatives and help in the design of potential anti-HIV protease molecules.

  11. Validation and upgrading of physically based mathematical models

    NASA Technical Reports Server (NTRS)

    Duval, Ronald

    1992-01-01

    The validation of the results of physically-based mathematical models against experimental results was discussed. Systematic techniques are used for: (1) isolating subsets of the simulator mathematical model and comparing the response of each subset to its experimental response for the same input conditions; (2) evaluating the response error to determine whether it is the result of incorrect parameter values, incorrect structure of the model subset, or unmodeled external effects of cross coupling; and (3) modifying and upgrading the model and its parameter values to determine the most physically appropriate combination of changes.

  12. A General Method for Targeted Quantitative Cross-Linking Mass Spectrometry.

    PubMed

    Chavez, Juan D; Eng, Jimmy K; Schweppe, Devin K; Cilia, Michelle; Rivera, Keith; Zhong, Xuefei; Wu, Xia; Allen, Terrence; Khurgel, Moshe; Kumar, Akhilesh; Lampropoulos, Athanasios; Larsson, Mårten; Maity, Shuvadeep; Morozov, Yaroslav; Pathmasiri, Wimal; Perez-Neut, Mathew; Pineyro-Ruiz, Coriness; Polina, Elizabeth; Post, Stephanie; Rider, Mark; Tokmina-Roszyk, Dorota; Tyson, Katherine; Vieira Parrine Sant'Ana, Debora; Bruce, James E

    2016-01-01

    Chemical cross-linking mass spectrometry (XL-MS) provides protein structural information by identifying covalently linked proximal amino acid residues on protein surfaces. The information gained by this technique is complementary to other structural biology methods such as X-ray crystallography, NMR and cryo-electron microscopy [1]. The extension of traditional quantitative proteomics methods with chemical cross-linking can provide information on the structural dynamics of protein structures and protein complexes. The identification and quantitation of cross-linked peptides remains challenging for the general community, requiring specialized expertise that ultimately limits more widespread adoption of the technique. We describe a general method for targeted quantitative mass spectrometric analysis of cross-linked peptide pairs. We report the adaptation of the widely used, open-source software package Skyline for the analysis of quantitative XL-MS data as a means for data analysis and sharing of methods. We demonstrate the utility and robustness of the method with a cross-laboratory study and present data that are supported by, and validate, previously published data on quantified cross-linked peptide pairs. This advance provides an easy-to-use resource so that any lab with access to an LC-MS system capable of performing targeted quantitative analysis can quickly and accurately measure dynamic changes in protein structure and protein interactions.

  13. Precipitation interpolation in mountainous areas

    NASA Astrophysics Data System (ADS)

    Kolberg, Sjur

    2015-04-01

    Different precipitation interpolation techniques as well as external drift covariates are tested and compared in a 26,000 km² mountainous area in Norway, using daily data from 60 stations. The main method of assessment is cross-validation. Annual precipitation in the area varies from below 500 mm to more than 2000 mm. The data were corrected for wind-driven undercatch according to operational standards. While temporal evaluation produces seemingly acceptable at-station correlation values (on average around 0.6), the average daily spatial correlation is less than 0.1. When bias is also penalised, Nash-Sutcliffe R² values are negative for spatial correspondence, and around 0.15 for temporal. Despite largely violated assumptions, plain kriging produces better results than simple inverse distance weighting. More surprisingly, the presumably 'worst-case' benchmark of no interpolation at all, simply averaging all 60 stations for each day, actually outperformed the standard interpolation techniques. For logistic reasons, high altitudes are under-represented in the gauge network. The possible effect of this was investigated by a) fitting a precipitation lapse rate as an external drift, and b) applying a linear model of orographic enhancement (Smith and Barstad, 2004). These techniques improved the results only marginally. The gauge density in the region is one gauge per 433 km², higher than the overall density of the Norwegian national network. Admittedly the cross-validation technique reduces the effective gauge density; still, the results suggest that we are far from able to provide hydrological models with adequate data for this main driving force.
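    The assessment scheme can be sketched as a leave-one-out loop over stations: each gauge is predicted from the remaining ones, and an interpolator (inverse distance weighting here, for simplicity) is scored against the no-interpolation benchmark of averaging all stations. Station coordinates and values below are synthetic stand-ins.

```python
# Hedged sketch: leave-one-out cross-validation of spatial interpolation,
# comparing inverse distance weighting (IDW) to the all-station-mean benchmark.
# The 60 synthetic stations stand in for the real Norwegian gauge network.
import numpy as np

rng = np.random.default_rng(1)
coords = rng.uniform(0, 100, (60, 2))                         # station positions (km)
values = 500.0 + 10.0 * coords[:, 0] + rng.normal(0, 50, 60)  # precipitation (mm)

def idw(target, coords, values, power=2.0):
    """Inverse-distance-weighted estimate at a target location."""
    d = np.linalg.norm(coords - target, axis=1)
    w = 1.0 / np.maximum(d, 1e-9) ** power
    return np.sum(w * values) / np.sum(w)

err_idw, err_mean = [], []
for i in range(len(values)):                 # leave one station out at a time
    mask = np.arange(len(values)) != i
    err_idw.append((idw(coords[i], coords[mask], values[mask]) - values[i]) ** 2)
    err_mean.append((values[mask].mean() - values[i]) ** 2)

rmse_idw = float(np.sqrt(np.mean(err_idw)))
rmse_mean = float(np.sqrt(np.mean(err_mean)))
print(f"RMSE: idw={rmse_idw:.1f} mm, station-mean={rmse_mean:.1f} mm")
```

    With the strong spatial gradient built into this synthetic example, IDW beats the station mean; the study's striking finding is that on real daily data the ranking can reverse.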

  14. Validity and reliability of a self-report instrument to assess social support and physical environmental correlates of physical activity in adolescents

    PubMed Central

    2012-01-01

    Background The purpose of this study was to examine the internal consistency, test-retest reliability, construct validity and predictive validity of a new German self-report instrument to assess the influence of social support and the physical environment on physical activity in adolescents. Methods Based on theoretical considerations, the short scales on social support and physical environment were developed and cross-validated in two independent study samples of 9 to 17 year-old girls and boys. The longitudinal sample of Study I (n = 196) was recruited from a German comprehensive school, and subjects in this study completed the questionnaire twice with a between-test interval of seven days. Cronbach’s alphas were computed to determine the internal consistency of the factors. Test-retest reliability of the latent factors was assessed using intra-class coefficients. Factorial validity of the scales was assessed using principal component analysis. Construct validity was determined using a cross-validation technique by performing confirmatory factor analysis with the independent nationwide cross-sectional sample of Study II (n = 430). Correlations between factors and three measures of physical activity (objectively measured moderate-to-vigorous physical activity (MVPA), self-reported habitual MVPA and self-reported recent MVPA) were calculated to determine the predictive validity of the instrument. Results Construct validity of the social support scale (two factors: parental support and peer support) and the physical environment scale (four factors: convenience, public recreation facilities, safety and private sport providers) was shown. Both scales had moderate test-retest reliability. The factors of the social support scale also had good internal consistency and predictive validity. Internal consistency and predictive validity of the physical environment scale were low to acceptable.
Conclusions The results of this study indicate moderate to good reliability and construct validity of the social support scale and physical environment scale. Predictive validity was only confirmed for the social support scale but not for the physical environment scale. Hence, it remains unclear if a person’s physical environment has a direct or an indirect effect on physical activity behavior or a moderation function. PMID:22928865

  15. Validity and reliability of a self-report instrument to assess social support and physical environmental correlates of physical activity in adolescents.

    PubMed

    Reimers, Anne K; Jekauc, Darko; Mess, Filip; Mewes, Nadine; Woll, Alexander

    2012-08-29

    The purpose of this study was to examine the internal consistency, test-retest reliability, construct validity and predictive validity of a new German self-report instrument to assess the influence of social support and the physical environment on physical activity in adolescents. Based on theoretical considerations, the short scales on social support and physical environment were developed and cross-validated in two independent study samples of 9 to 17 year-old girls and boys. The longitudinal sample of Study I (n = 196) was recruited from a German comprehensive school, and subjects in this study completed the questionnaire twice with a between-test interval of seven days. Cronbach's alphas were computed to determine the internal consistency of the factors. Test-retest reliability of the latent factors was assessed using intra-class coefficients. Factorial validity of the scales was assessed using principal component analysis. Construct validity was determined using a cross-validation technique by performing confirmatory factor analysis with the independent nationwide cross-sectional sample of Study II (n = 430). Correlations between factors and three measures of physical activity (objectively measured moderate-to-vigorous physical activity (MVPA), self-reported habitual MVPA and self-reported recent MVPA) were calculated to determine the predictive validity of the instrument. Construct validity of the social support scale (two factors: parental support and peer support) and the physical environment scale (four factors: convenience, public recreation facilities, safety and private sport providers) was shown. Both scales had moderate test-retest reliability. The factors of the social support scale also had good internal consistency and predictive validity. Internal consistency and predictive validity of the physical environment scale were low to acceptable.
The results of this study indicate moderate to good reliability and construct validity of the social support scale and physical environment scale. Predictive validity was only confirmed for the social support scale but not for the physical environment scale. Hence, it remains unclear if a person's physical environment has a direct or an indirect effect on physical activity behavior or a moderation function.

  16. Body fat measurement by bioelectrical impedance and air displacement plethysmography: a cross-validation study to design bioelectrical impedance equations in Mexican adults

    PubMed Central

    Macias, Nayeli; Alemán-Mateo, Heliodoro; Esparza-Romero, Julián; Valencia, Mauro E

    2007-01-01

    Background The study of body composition in specific populations by techniques such as bio-impedance analysis (BIA) requires validation based on standard reference methods. The aim of this study was to develop and cross-validate a predictive equation for bioelectrical impedance using air displacement plethysmography (ADP) as the standard method to measure body composition in Mexican adult men and women. Methods This study included 155 male and female subjects from northern Mexico, 20–50 years of age, from low, middle, and upper income levels. Body composition was measured by ADP. Body weight (BW, kg) and height (Ht, cm) were obtained by standard anthropometric techniques. Resistance, R (ohms) and reactance, Xc (ohms) were also measured. A random-split method was used to obtain two samples: one was used to derive the equation by the "all possible regressions" procedure and was cross-validated in the other sample to test predicted versus measured values of fat-free mass (FFM). Results and Discussion The final model was: FFM (kg) = 0.7374 * (Ht²/R) + 0.1763 * (BW) - 0.1773 * (Age) + 0.1198 * (Xc) - 2.4658. R² was 0.97; the square root of the mean square error (SRMSE) was 1.99 kg, and the pure error (PE) was 2.96. There was no difference between FFM predicted by the new equation (48.57 ± 10.9 kg) and that measured by ADP (48.43 ± 11.3 kg). The new equation did not differ from the line of identity, had a high R² and a low SRMSE, and showed no significant bias (0.87 ± 2.84 kg). Conclusion The new bioelectrical impedance equation based on the two-compartment model (2C) was accurate, precise, and free of bias. This equation can be used to assess body composition and nutritional status in populations similar in anthropometric and physical characteristics to this sample. PMID:17697388
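    The reported equation can be coded directly. The inputs follow the abstract's units (height in cm, resistance and reactance in ohms, weight in kg, age in years); the example subject below is illustrative, not drawn from the study sample.

```python
# The published prediction equation, transcribed from the abstract:
# FFM (kg) = 0.7374*(Ht^2/R) + 0.1763*BW - 0.1773*Age + 0.1198*Xc - 2.4658
def fat_free_mass(height_cm, resistance_ohm, weight_kg, age_yr, reactance_ohm):
    return (0.7374 * (height_cm ** 2 / resistance_ohm)
            + 0.1763 * weight_kg
            - 0.1773 * age_yr
            + 0.1198 * reactance_ohm
            - 2.4658)

# Illustrative subject (not from the study sample).
ffm = fat_free_mass(height_cm=170, resistance_ohm=500,
                    weight_kg=70, age_yr=30, reactance_ohm=60)
print(f"predicted FFM: {ffm:.1f} kg")  # -> predicted FFM: 54.4 kg
```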

  17. Predictive modeling of outcomes following definitive chemoradiotherapy for oropharyngeal cancer based on FDG-PET image characteristics

    NASA Astrophysics Data System (ADS)

    Folkert, Michael R.; Setton, Jeremy; Apte, Aditya P.; Grkovski, Milan; Young, Robert J.; Schöder, Heiko; Thorstad, Wade L.; Lee, Nancy Y.; Deasy, Joseph O.; Oh, Jung Hun

    2017-07-01

    In this study, we investigate the use of imaging feature-based outcomes research (‘radiomics’) combined with machine learning techniques to develop robust predictive models for the risk of all-cause mortality (ACM), local failure (LF), and distant metastasis (DM) following definitive chemoradiation therapy (CRT). One hundred seventy-four patients with stage III-IV oropharyngeal cancer (OC) treated at our institution with CRT with retrievable pre- and post-treatment 18F-fluorodeoxyglucose positron emission tomography (FDG-PET) scans were identified. From pre-treatment PET scans, 24 representative imaging features of FDG-avid disease regions were extracted. Using machine learning-based feature selection methods, multiparameter logistic regression models were built incorporating clinical factors and imaging features. All model building methods were tested by cross-validation to avoid overfitting, and final outcome models were validated on an independent dataset from a collaborating institution. Multiparameter models were statistically significant on 5-fold cross-validation, with area under the receiver operating characteristic curve (AUC) = 0.65 (p = 0.004), 0.73 (p = 0.026), and 0.66 (p = 0.015) for ACM, LF, and DM, respectively. The model for LF retained significance on the independent validation cohort with AUC = 0.68 (p = 0.029), whereas the models for ACM and DM did not reach statistical significance but resulted in comparable predictive power to the 5-fold cross-validation, with AUC = 0.60 (p = 0.092) and 0.65 (p = 0.062), respectively. In the largest study of its kind to date, predictive features including increasing metabolic tumor volume, increasing image heterogeneity, and increasing tumor surface irregularity significantly correlated with mortality, LF, and DM on 5-fold cross-validation in a relatively uniform single-institution cohort. The LF model also retained significance in an independent population.

  18. Zebra Crossing Spotter: Automatic Population of Spatial Databases for Increased Safety of Blind Travelers

    PubMed Central

    Ahmetovic, Dragan; Manduchi, Roberto; Coughlan, James M.; Mascetti, Sergio

    2016-01-01

    In this paper we propose a computer vision-based technique that mines existing spatial image databases for discovery of zebra crosswalks in urban settings. Knowing the location of crosswalks is critical for a blind person planning a trip that includes street crossing. By augmenting existing spatial databases (such as Google Maps or OpenStreetMap) with this information, a blind traveler may make more informed routing decisions, resulting in greater safety during independent travel. Our algorithm first searches for zebra crosswalks in satellite images; all candidates thus found are validated against spatially registered Google Street View images. This cascaded approach enables fast and reliable discovery and localization of zebra crosswalks in large image datasets. While fully automatic, our algorithm could also be complemented by a final crowdsourcing validation stage for increased accuracy. PMID:26824080

  19. Three-dimensional quantitative structure-activity relationship studies on c-Src inhibitors based on different docking methods.

    PubMed

    Bairy, Santhosh Kumar; Suneel Kumar, B V S; Bhalla, Joseph Uday Tej; Pramod, A B; Ravikumar, Muttineni

    2009-04-01

    c-Src kinase plays an important role in cell growth and differentiation, and its inhibitors can be useful for the treatment of various diseases, including cancer, osteoporosis, and metastatic bone disease. Three-dimensional quantitative structure-activity relationship (3D-QSAR) studies were carried out on quinazoline derivatives inhibiting c-Src kinase. Molecular field analysis (MFA) models with four different alignment techniques, namely GLIDE, GOLD, LIGANDFIT and least-squares based methods, were developed. The GLIDE-based MFA model showed better results (leave-one-out cross-validated correlation coefficient r²cv = 0.923 and non-cross-validated correlation coefficient r² = 0.958) than the other models. These results help us to understand the nature of the descriptors required for the activity of these compounds and thereby provide guidelines to design novel and potent c-Src kinase inhibitors.

  20. Cross-correlation Doppler global velocimetry (CC-DGV)

    NASA Astrophysics Data System (ADS)

    Cadel, Daniel R.; Lowe, K. Todd

    2015-08-01

    A flow velocimetry method, cross-correlation Doppler global velocimetry (CC-DGV), is presented as a robust, simplified, and high dynamic range implementation of the Doppler global/planar Doppler velocimetry technique. A sweep of several gigahertz of the vapor absorption spectrum is used for each velocity sample, with signals acquired from both Doppler-shifted scattered light within the flow and a non-Doppler-shifted reference beam. Cross-correlation of these signals yields the Doppler shift between them, averaged over the duration of the scan. With presently available equipment, velocities from 0 m s⁻¹ to over 3000 m s⁻¹ can notionally be measured simultaneously, making the technique ideal for high speed flows. The processing routine is shown to be robust against large changes in the vapor pressure of the iodine cell, benefiting performance of the system in facilities where ambient conditions cannot be easily regulated. Validation of the system was performed with measurements of a model wind turbine blade boundary layer made in a 1.83 m by 1.83 m subsonic wind tunnel, for which laser Doppler velocimetry (LDV) measurements were acquired alongside the CC-DGV results. CC-DGV uncertainties of ±1.30 m s⁻¹, ±0.64 m s⁻¹, and ±1.11 m s⁻¹ were determined for the orthogonal stream-wise, transverse-horizontal, and transverse-vertical velocity components, and root-mean-square deviations of 2.77 m s⁻¹ and 1.34 m s⁻¹ from the LDV validation results were observed for Reynolds numbers of 1.5 million and 2 million, respectively. Volumetric mean velocity measurements are also presented for a supersonic jet, with velocity uncertainties of ±4.48 m s⁻¹, ±16.93 m s⁻¹, and ±0.50 m s⁻¹ for the orthogonal components, and self-validation done by collapsing the data with a physical scaling.
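    The central idea, recovering the Doppler shift as the lag that maximizes the cross-correlation of a reference scan with a Doppler-shifted scan, can be sketched as follows; the Gaussian absorption line and scan axis are illustrative, not the real iodine spectrum.

```python
# Hedged sketch of the CC-DGV principle: the frequency shift between two
# absorption-line scans is the lag at the cross-correlation peak.
import numpy as np

n = 1024
f = np.linspace(-4.0, 4.0, n)           # scan axis in GHz (illustrative)

def absorption_line(center_ghz):
    """Toy Gaussian absorption feature; a stand-in for the iodine spectrum."""
    return np.exp(-((f - center_ghz) / 0.5) ** 2)

reference = absorption_line(0.0)        # non-Doppler-shifted reference beam
signal = absorption_line(0.75)          # scattered light, shifted by 0.75 GHz

# Cross-correlate; the lag of the peak gives the shift in samples.
corr = np.correlate(signal - signal.mean(), reference - reference.mean(), "full")
lag = int(corr.argmax()) - (n - 1)
shift_ghz = lag * (f[1] - f[0])
print(f"recovered shift: {shift_ghz:.3f} GHz")
```

    In the instrument, the recovered frequency shift is then converted to velocity through the Doppler relation for the known laser and observation geometry.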

  1. [The Use of FTIR Coupled with Partial Least Square for Quantitative Analysis of the Main Composition of Bamboo/Polypropylene Composites].

    PubMed

    Lao, Wan-li; He, Yu-chan; Li, Gai-yun; Zhou, Qun

    2016-01-01

    The biomass-to-plastic ratio in wood plastic composites (WPCs) greatly affects their physical and mechanical properties and price. Fast and accurate evaluation of the biomass-to-plastic ratio is important for the further development of WPCs. Quantitative analysis of the WPC main composition currently relies primarily on thermo-analytical methods. However, these methods have some inherent disadvantages, including long analysis times, high analytical error and procedural complexity, which severely limit their application. Therefore, in this study, Fourier transform infrared (FTIR) spectroscopy in combination with partial least squares (PLS) was used for rapid prediction of the bamboo and polypropylene (PP) content in bamboo/PP composites. The bamboo powders were used as filler after being dried at 105 °C for 24 h. PP was used as the matrix material, and some chemical reagents were used as additives. Then 42 WPC samples with different ratios of bamboo and PP were prepared by extrusion. FTIR spectral data of the 42 WPC samples were collected by means of the KBr pellet technique. The model for bamboo and PP content prediction was developed by PLS-2 and full cross-validation. Results of internal cross-validation showed that the first-derivative spectra in the range of 1800-800 cm⁻¹ corrected by standard normal variate (SNV) yielded the optimal model. For both the bamboo and PP calibrations, the coefficients of determination (R²) were 0.955. The standard errors of calibration (SEC) were 1.872 for bamboo content and 1.848 for PP content, respectively. For both the bamboo and PP validations, the R² values were 0.950. The standard errors of cross-validation (SECV) were 1.927 for bamboo content and 1.950 for PP content, respectively. The ratios of performance to deviation (RPD) were 4.45 for both the biomass and PP examinations.
    The results of external validation showed that the relative prediction deviations for both biomass and PP contents were within ±6%. FTIR combined with PLS can be used for rapid and accurate determination of bamboo and PP content in bamboo/PP composites.

  2. Automatic Classification of Sub-Techniques in Classical Cross-Country Skiing Using a Machine Learning Algorithm on Micro-Sensor Data

    PubMed Central

    Seeberg, Trine M.; Tjønnås, Johannes; Haugnes, Pål; Sandbakk, Øyvind

    2017-01-01

    The automatic classification of sub-techniques in classical cross-country skiing provides unique possibilities for analyzing the biomechanical aspects of outdoor skiing. This is currently possible due to the miniaturization and flexibility of wearable inertial measurement units (IMUs) that allow researchers to bring the laboratory to the field. In this study, we aimed to optimize the accuracy of the automatic classification of classical cross-country skiing sub-techniques by using two IMUs attached to the skier’s arm and chest together with a machine learning algorithm. The novelty of our approach is the reliable detection of individual cycles using a gyroscope on the skier’s arm, while a neural network machine learning algorithm robustly classifies each cycle to a sub-technique using sensor data from an accelerometer on the chest. In this study, 24 datasets from 10 different participants were separated into the categories training-, validation- and test-data. Overall, we achieved a classification accuracy of 93.9% on the test-data. Furthermore, we illustrate how an accurate classification of sub-techniques can be combined with data from standard sports equipment including position, altitude, speed and heart rate measuring systems. Combining this information has the potential to provide novel insight into physiological and biomechanical aspects valuable to coaches, athletes and researchers. PMID:29283421

  3. Primary central nervous system lymphoma and glioblastoma differentiation based on conventional magnetic resonance imaging by high-throughput SIFT features.

    PubMed

    Chen, Yinsheng; Li, Zeju; Wu, Guoqing; Yu, Jinhua; Wang, Yuanyuan; Lv, Xiaofei; Ju, Xue; Chen, Zhongping

    2018-07-01

    Because the therapeutic regimens needed for primary central nervous system lymphoma (PCNSL) and glioblastoma (GBM) are totally different, accurate differentiation of the two diseases by noninvasive imaging techniques is important for clinical decision-making. Thirty cases of PCNSL and 66 cases of GBM with conventional T1-contrast magnetic resonance imaging (MRI) were analyzed in this study. A convolutional neural network was used to segment tumors automatically. A modified scale-invariant feature transform (SIFT) method was utilized to extract three-dimensional local voxel arrangement information from the segmented tumors. A Fisher vector was proposed to normalize the dimension of the SIFT features. An improved genetic algorithm (GA) was used to extract SIFT features with PCNSL-GBM discrimination ability. The dataset was divided into a cross-validation cohort and an independent validation cohort at a ratio of 2:1. A support vector machine with leave-one-out cross-validation based on 20 cases of PCNSL and 44 cases of GBM was employed to build and validate the differentiation model. Among 16,384 high-throughput features, 1356 features show significant differences between PCNSL and GBM with p < 0.05, and 420 features with p < 0.001. A total of 496 features were finally chosen by the improved GA algorithm. The proposed method produces PCNSL vs. GBM differentiation with an area under the curve (AUC) of 99.1% (98.2%), accuracy of 95.3% (90.6%), sensitivity of 85.0% (80.0%) and specificity of 100% (95.5%) on the cross-validation cohort (and independent validation cohort). Owing to the local voxel arrangement characterization provided by the SIFT features, the proposed method produced more competitive PCNSL-GBM differentiation performance using conventional MRI than methods based on advanced MRI.
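    The validation design, leave-one-out cross-validation on one cohort followed by a single evaluation on an independent cohort, can be sketched as follows; the features are synthetic stand-ins for the selected SIFT features, and the 2:1 split mirrors the abstract.

```python
# Hedged sketch of the validation design: leave-one-out cross-validation on one
# cohort, then a single evaluation on an independent cohort. Features are
# synthetic stand-ins for the selected SIFT features.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneOut, cross_val_score, train_test_split

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (48, 30)),     # class 0, e.g. PCNSL-like
               rng.normal(1, 1, (96, 30))])    # class 1, e.g. GBM-like
y = np.array([0] * 48 + [1] * 96)

# 2:1 split into cross-validation and independent validation cohorts.
X_cv, X_ind, y_cv, y_ind = train_test_split(
    X, y, test_size=1 / 3, stratify=y, random_state=0)

clf = SVC(kernel="linear")
loo_acc = cross_val_score(clf, X_cv, y_cv, cv=LeaveOneOut()).mean()
ind_acc = clf.fit(X_cv, y_cv).score(X_ind, y_ind)
print(f"LOO accuracy = {loo_acc:.2f}, independent-cohort accuracy = {ind_acc:.2f}")
```

    Keeping the independent cohort untouched until the final evaluation is what guards against the optimism that feature selection can otherwise introduce.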

  4. [A short form of the positions on nursing diagnosis scale: development and psychometric testing].

    PubMed

    Romero-Sánchez, José Manuel; Paloma-Castro, Olga; Paramio-Cuevas, Juan Carlos; Pastor-Montero, Sonia María; O'Ferrall-González, Cristina; Gabaldón-Bravo, Eva Maria; González-Domínguez, Maria Eugenia; Castro-Yuste, Cristina; Frandsen, Anna J; Martínez-Sabater, Antonio

    2013-06-01

    The Positions on Nursing Diagnosis (PND) is a scale that uses the semantic differential technique to measure nurses' attitudes towards the nursing diagnosis concept. The aim of this study was to develop a shortened form of the Spanish version of this scale and evaluate its psychometric properties and efficiency. A double theoretical-empirical approach was used to obtain a short form of the PND, the PND-7-SV, which would be equivalent to the original. Using a cross-sectional survey design, the reliability (internal consistency and test-retest reliability), construct (exploratory factor analysis, known-groups technique and discriminant validity) and criterion-related validity (concurrent validity), sensitivity to change and efficiency of the PND-7-SV were assessed in a sample of 476 Spanish nursing students. The results endorsed the utility of the PND-7-SV to measure attitudes toward nursing diagnosis in an equivalent manner to the complete form of the scale and in a shorter time.

  5. Real-time sensor data validation

    NASA Technical Reports Server (NTRS)

    Bickmore, Timothy W.

    1994-01-01

    This report describes the status of an on-going effort to develop software capable of detecting sensor failures on rocket engines in real time. This software could be used in a rocket engine controller to prevent the erroneous shutdown of an engine due to sensor failures which would otherwise be interpreted as engine failures by the control software. The approach taken combines analytical redundancy with Bayesian belief networks to provide a solution which has well defined real-time characteristics and well-defined error rates. Analytical redundancy is a technique in which a sensor's value is predicted by using values from other sensors and known or empirically derived mathematical relations. A set of sensors and a set of relations among them form a network of cross-checks which can be used to periodically validate all of the sensors in the network. Bayesian belief networks provide a method of determining if each of the sensors in the network is valid, given the results of the cross-checks. This approach has been successfully demonstrated on the Technology Test Bed Engine at the NASA Marshall Space Flight Center. Current efforts are focused on extending the system to provide a validation capability for 100 sensors on the Space Shuttle Main Engine.

  6. Identifying intervals of temporally invariant field-aligned currents from Swarm: Assessing the validity of single-spacecraft methods

    NASA Astrophysics Data System (ADS)

    Forsyth, C.; Rae, I. J.; Mann, I. R.; Pakhotin, I. P.

    2017-03-01

    Field-aligned currents (FACs) are a fundamental component of the coupled solar wind-magnetosphere-ionosphere system. By assuming that FACs can be approximated by stationary infinite current sheets that do not change over the spacecraft crossing time, single-spacecraft magnetic field measurements can be used to estimate the currents flowing in space. By combining data from multiple spacecraft on similar orbits, these stationarity assumptions can be tested. In this technical report, we present a new technique that combines cross correlation and linear fitting of multiple spacecraft measurements to determine the reliability of the FAC estimates. We show that this technique can identify those intervals in which the currents estimated from single-spacecraft techniques are both well correlated and have similar amplitudes, thus meeting the spatial and temporal stationarity requirements. Using data from the European Space Agency's Swarm mission from 2014 to 2015, we show that larger-scale currents (>450 km) are well correlated and have a one-to-one fit up to 50% of the time, whereas small-scale (<50 km) currents show similar amplitudes only 1% of the time despite being well correlated 18% of the time. It is thus imperative to examine both the correlation and amplitude of the calculated FACs in order to assess the validity of the underlying assumptions and hence the reliability of such single-spacecraft FAC estimates.
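The combined cross-correlation and linear-fit check described above can be sketched as follows. This is a minimal illustration, not the authors' implementation; any acceptance thresholds (e.g. correlation near 1 and slope near 1) would be assumed choices:

```python
def fac_agreement(j1, j2):
    """Pearson correlation and least-squares slope between FAC estimates
    from two spacecraft on similar orbits. Both values near 1 suggest the
    currents meet the spatial/temporal stationarity assumptions."""
    n = len(j1)
    m1, m2 = sum(j1) / n, sum(j2) / n
    cov = sum((a - m1) * (b - m2) for a, b in zip(j1, j2))
    v1 = sum((a - m1) ** 2 for a in j1)
    v2 = sum((b - m2) ** 2 for b in j2)
    r = cov / (v1 * v2) ** 0.5   # cross-correlation of the two series
    slope = cov / v1             # linear fit of j2 against j1
    return r, slope
```

A well-correlated pair with a slope far from one would correspond to the abstract's small-scale case: good correlation but dissimilar amplitudes.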

  7. Uniqueness, Integration or Separation? Exploring the Nature of Creativity through Creative Writing by Elementary School Students in Taiwan

    ERIC Educational Resources Information Center

    Chu, Tsai-Ling; Lin, Wei-Wen

    2013-01-01

    The primary goal of our study was to investigate the importance of originality in divergent thinking (DT) tests and to determine whether originality is the best reflection of creativity. To accomplish this, we cross-validated the DT test and creative writing task rating by consensual assessment technique (CAT). Thirty-seven elementary school…

  8. Artificial intelligence techniques: An efficient new approach to challenge the assessment of complex clinical fields such as airway clearance techniques in patients with cystic fibrosis?

    PubMed

    Slavici, Titus; Almajan, Bogdan

    2013-04-01

    To construct an artificial intelligence application to assist untrained physiotherapists in determining the appropriate physiotherapy exercises to improve the quality of life of patients with cystic fibrosis. A total of 42 children (21 boys and 21 girls), age range 6-18 years, participated in a clinical survey between 2001 and 2005. Data collected during the clinical survey were entered into a neural network in order to correlate the health state indicators of the patients and the type of physiotherapy exercise to be followed. Cross-validation of the network was carried out by comparing the health state indicators achieved after following a certain physiotherapy exercise and the health state indicators predicted by the network. The lifestyle and health state indicators of the survey participants improved. The network predicted the health state indicators of the participants with an accuracy of 93%. The results of the cross-validation test were within the error margins of the real-life indicators. Using data on the clinical state of individuals with cystic fibrosis, it is possible to determine the most effective type of physiotherapy exercise for improving overall health state indicators.

  9. High wavenumber Raman spectroscopy in the characterization of urinary metabolites of normal subjects, oral premalignant and malignant patients

    NASA Astrophysics Data System (ADS)

    Brindha, Elumalai; Rajasekaran, Ramu; Aruna, Prakasarao; Koteeswaran, Dornadula; Ganesan, Singaravelu

    2017-01-01

    Urine has emerged as a diagnostically promising biofluid, as it contains many metabolites. Since the concentration and physicochemical properties of urinary metabolites may vary under pathological transformation, Raman spectroscopic characterization of urine has been exploited as a significant tool for identifying several diseased conditions, including cancers. In the present study, an attempt was made to study the high wavenumber (HWVN) Raman spectroscopic characterization of urine samples from normal subjects and oral premalignant and malignant patients. It is concluded that the urinary metabolites flavoproteins, tryptophan and phenylalanine are responsible for the observed spectral variations between the normal and abnormal groups. Principal component analysis-based linear discriminant analysis was carried out to verify the diagnostic potential of the present technique. The discriminant analysis performed across normal and oral premalignant subjects classifies 95.6% of the original and 94.9% of the cross-validated grouped cases correctly. In the second analysis, performed across normal and oral malignant groups, the accuracy of the original and cross-validated grouped cases was 96.4% and 92.1%, respectively. Similarly, the third analysis, performed across the three groups (normal, oral premalignant and malignant), classifies 93.3% and 91.2% of the original and cross-validated grouped cases correctly.

  10. Ultrasonic linear array validation via concrete test blocks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoegh, Kyle, E-mail: hoeg0021@umn.edu; Khazanovich, Lev, E-mail: hoeg0021@umn.edu; Ferraro, Chris

    2015-03-31

    Oak Ridge National Laboratory (ORNL) comparatively evaluated the ability of a number of NDE techniques to generate an image of the volume of 6.5′ X 5.0′ X 10″ concrete specimens fabricated at the Florida Department of Transportation (FDOT) NDE Validation Facility in Gainesville, Florida. These test blocks were fabricated to test the ability of various NDE methods to characterize various placements and sizes of rebar as well as simulated cracking and non-consolidation flaws. The first version of the ultrasonic linear array device, MIRA, was one of seven NDE systems used to characterize the specimens. This paper deals with the ability of this equipment to determine subsurface characteristics such as reinforcing steel relative size, concrete thickness, irregularities, and inclusions using Kirchhoff-based migration techniques. The ability of individual synthetic aperture focusing technique (SAFT) B-scan cross sections resulting from self-contained scans is compared with various processing, analysis, and interpretation methods using the various features fabricated in the specimens for validation. The performance is detailed, especially with respect to the limitations and implications for evaluation of thicker, more heavily reinforced concrete structures.

  11. Planar near-field scanning for compact range bistatic radar cross-section measurement. Thesis Final Report

    NASA Technical Reports Server (NTRS)

    Tuhela-Reuning, S. R.; Walton, E. K.

    1991-01-01

    The design, construction, and testing of a low cost, planar scanning system to be used in a compact range environment for bistatic radar cross-section (bistatic RCS) measurement data are discussed. This scanning system is similar to structures used for measuring near-field antenna patterns. A synthetic aperture technique is used for plane wave reception. System testing entailed comparison of measured and theoretical bistatic RCS of a sphere and a right circular cylinder. Bistatic scattering analysis of the ogival target support, target and pedestal interactions, and compact range room was necessary to determine measurement validity.

  12. Cross-validation pitfalls when selecting and assessing regression and classification models.

    PubMed

    Krstajic, Damjan; Buturovic, Ljubomir J; Leahy, David E; Thomas, Simon

    2014-03-29

    We address the problem of selecting and assessing classification and regression models using cross-validation. Current state-of-the-art methods can yield models with high variance, rendering them unsuitable for a number of practical applications including QSAR. In this paper we describe and evaluate best practices which improve reliability and increase confidence in selected models. A key operational component of the proposed methods is cloud computing which enables routine use of previously infeasible approaches. We describe in detail an algorithm for repeated grid-search V-fold cross-validation for parameter tuning in classification and regression, and we define a repeated nested cross-validation algorithm for model assessment. As regards variable selection and parameter tuning we define two algorithms (repeated grid-search cross-validation and double cross-validation), and provide arguments for using the repeated grid-search in the general case. We show results of our algorithms on seven QSAR datasets. The variation of the prediction performance, which is the result of choosing different splits of the dataset in V-fold cross-validation, needs to be taken into account when selecting and assessing classification and regression models. We demonstrate the importance of repeating cross-validation when selecting an optimal model, as well as the importance of repeating nested cross-validation when assessing a prediction error.
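The core of the repeated V-fold idea can be sketched in plain Python. This is a toy stand-in, not the authors' cloud-based implementation; the majority-class "model" below is purely illustrative:

```python
import random
import statistics

def repeated_vfold_cv(X, y, fit, score, v=5, repeats=10, seed=0):
    """Repeated V-fold cross-validation: re-split the data several times
    and pool the per-fold scores, so the performance estimate is less
    sensitive to any single choice of folds."""
    rng = random.Random(seed)
    scores = []
    for _ in range(repeats):
        idx = list(range(len(X)))
        rng.shuffle(idx)                 # a fresh random partition per repeat
        for f in range(v):
            test = set(idx[f::v])        # every v-th shuffled index
            train = [i for i in idx if i not in test]
            model = fit([X[i] for i in train], [y[i] for i in train])
            scores.append(score(model,
                                [X[i] for i in test],
                                [y[i] for i in test]))
    return statistics.mean(scores), statistics.stdev(scores)
```

The standard deviation of the pooled fold scores makes visible exactly the split-to-split variation the abstract warns about; a grid search would call this routine once per candidate parameter setting.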

  13. Near infrared spectroscopy for prediction of antioxidant compounds in the honey.

    PubMed

    Escuredo, Olga; Seijo, M Carmen; Salvador, Javier; González-Martín, M Inmaculada

    2013-12-15

    The selection of antioxidant variables in honey is considered for the first time using the near infrared (NIR) spectroscopic technique. A total of 60 honey samples were used to develop the calibration models using the modified partial least squares (MPLS) regression method, and 15 samples were used for external validation. Calibration models on the honey matrix for the estimation of phenols, flavonoids, vitamin C, antioxidant capacity (DPPH), oxidation index and copper using near infrared (NIR) spectroscopy were satisfactorily obtained. These models were optimised by cross-validation, and the best model was evaluated according to the multiple correlation coefficient (RSQ), standard error of cross-validation (SECV), ratio performance deviation (RPD) and root mean standard error (RMSE) in the prediction set. These statistics suggested that the equations developed could be used for rapid determination of antioxidant compounds in honey. This work shows that near infrared spectroscopy with chemometric techniques can be considered a rapid tool for the nondestructive measurement of antioxidant constituents such as phenols, flavonoids, vitamin C and copper, as well as the antioxidant capacity of honey.
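The SECV statistic used above can be illustrated on a deliberately simplified case: a one-variable least-squares calibration evaluated by leave-one-out cross-validation. This is a stand-in sketch; the paper's actual models are MPLS regressions on full NIR spectra:

```python
import math

def loo_secv(x, y):
    """Standard error of cross-validation (SECV) for a univariate
    least-squares calibration line, estimated by leave-one-out CV:
    refit the line without each sample, predict it, and pool the errors."""
    n = len(x)
    sq_err = 0.0
    for k in range(n):
        tx = [v for i, v in enumerate(x) if i != k]
        ty = [v for i, v in enumerate(y) if i != k]
        mx = sum(tx) / len(tx)
        my = sum(ty) / len(ty)
        slope = (sum((a - mx) * (b - my) for a, b in zip(tx, ty))
                 / sum((a - mx) ** 2 for a in tx))
        intercept = my - slope * mx
        sq_err += (y[k] - (intercept + slope * x[k])) ** 2  # held-out error
    return math.sqrt(sq_err / n)
```

A perfectly linear reference/response pair gives SECV of zero; measurement noise in the reference values shows up directly as a nonzero SECV, which is why the lowest SECV is used to pick the model.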

  14. Application of Cross-Correlation Green's Function Along With FDTD for Fast Computation of Envelope Correlation Coefficient Over Wideband for MIMO Antennas

    NASA Astrophysics Data System (ADS)

    Sarkar, Debdeep; Srivastava, Kumar Vaibhav

    2017-02-01

    In this paper, the concept of cross-correlation Green's functions (CGF) is used in conjunction with the finite difference time domain (FDTD) technique to calculate the envelope correlation coefficient (ECC) of an arbitrary MIMO antenna system over a wide frequency band. Both frequency-domain (FD) and time-domain (TD) post-processing techniques are proposed for use with this FDTD-CGF scheme. The FDTD-CGF time-domain (FDTD-CGF-TD) scheme utilizes time-domain signal processing methods and exhibits a significant reduction in ECC computation time compared to the FDTD-CGF frequency-domain (FDTD-CGF-FD) scheme for high frequency-resolution requirements. The proposed FDTD-CGF based schemes can be applied for accurate and fast prediction of the wideband ECC response, instead of the conventional scattering-parameter based techniques, which have several limitations. Numerical examples of the proposed FDTD-CGF techniques are provided for two-element MIMO systems involving thin-wire half-wavelength dipoles in parallel side-by-side as well as orthogonal arrangements. The results obtained from the FDTD-CGF techniques are compared with results from the commercial electromagnetic solver Ansys HFSS to verify the validity of the proposed approach.
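For reference, the conventional scattering-parameter estimate of ECC that the abstract contrasts against is the well-known closed-form expression below. It is valid only for lossless antennas, which is one of the limitations alluded to; this is background context, not the FDTD-CGF method itself:

```python
def ecc_from_s(s11, s12, s21, s22):
    """Envelope correlation coefficient of a two-port MIMO antenna from
    its S-parameters (complex values at one frequency). Valid only under
    the lossless-antenna assumption behind the closed-form expression."""
    num = abs(s11.conjugate() * s12 + s21.conjugate() * s22) ** 2
    den = ((1 - abs(s11) ** 2 - abs(s21) ** 2)
           * (1 - abs(s12) ** 2 - abs(s22) ** 2))
    return num / den
```

Evaluating this at every frequency point of interest is what a wideband scattering-parameter approach amounts to; a field-based method such as FDTD-CGF avoids the lossless assumption.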

  15. Chronic subdural hematoma: Surgical management and outcome in 986 cases: A classification and regression tree approach

    PubMed Central

    Rovlias, Aristedis; Theodoropoulos, Spyridon; Papoutsakis, Dimitrios

    2015-01-01

    Background: Chronic subdural hematoma (CSDH) is one of the most common clinical entities in daily neurosurgical practice and carries a most favorable prognosis. However, because of the advanced age and medical problems of patients, surgical therapy is frequently associated with various complications. This study evaluated the clinical features, radiological findings, and neurological outcome in a large series of patients with CSDH. Methods: A classification and regression tree (CART) technique was employed in the analysis of data from 986 patients who were operated on at Asclepeion General Hospital of Athens from January 1986 to December 2011. Burr hole evacuation with closed-system drainage has been the operative technique of first choice at our institution for 29 consecutive years. A total of 27 prognostic factors were examined to predict the outcome at 3 months postoperatively. Results: Our results indicated that neurological status on admission was the best predictor of outcome. With regard to the other data, age, brain atrophy, thickness and density of hematoma, subdural accumulation of air, and antiplatelet and anticoagulant therapy were found to correlate significantly with prognosis. The overall cross-validated predictive accuracy of the CART model was 85.34%, with a cross-validated relative error of 0.326. Conclusions: Methodologically, the CART technique is quite different from the more commonly used methods, with the primary benefit of illustrating the important prognostic variables as related to outcome. Since the ideal therapy for the treatment of CSDH is still under debate, this technique may prove useful in developing new therapeutic strategies and approaches for patients with CSDH. PMID:26257985

  16. General upper bound on single-event upset rate. [due to ionizing radiation in orbiting vehicle avionics

    NASA Technical Reports Server (NTRS)

    Chlouber, Dean; O'Neill, Pat; Pollock, Jim

    1990-01-01

    A technique of predicting an upper bound on the rate at which single-event upsets due to ionizing radiation occur in semiconducting memory cells is described. The upper bound on the upset rate, which depends on the high-energy particle environment in earth orbit and accelerator cross-section data, is given by the product of an upper-bound linear energy-transfer spectrum and the mean cross section of the memory cell. Plots of the spectrum are given for low-inclination and polar orbits. An alternative expression for the exact upset rate is also presented. Both methods rely only on experimentally obtained cross-section data and are valid for sensitive bit regions having arbitrary shape.

  17. Measurement of 58Fe (p , n)58Co reaction cross-section within the proton energy range of 3.38 to 19.63 MeV

    NASA Astrophysics Data System (ADS)

    Ghosh, Reetuparna; Badwar, Sylvia; Lawriniang, Bioletty; Jyrwa, Betylda; Naik, Haldhara; Naik, Yeshwant; Suryanarayana, Saraswatula Venkata; Ganesan, Srinivasan

    2017-08-01

    The 58Fe (p , n)58Co reaction cross-section within Giant Dipole Resonance (GDR) region i.e. from 3.38 to 19.63 MeV was measured by stacked-foil activation and off-line γ-ray spectrometric technique using the BARC-TIFR Pelletron facility at Mumbai. The present data were compared with the existing literature data and found to be in good agreement. The 58Fe (p , n)58Co reaction cross-section as a function of proton energy was also theoretically calculated by using the computer code TALYS-1.8 and found to be in good agreement, which shows the validity of the TALYS-1.8 program.

  18. Evaluating the statistical performance of less applied algorithms in classification of worldview-3 imagery data in an urbanized landscape

    NASA Astrophysics Data System (ADS)

    Ranaie, Mehrdad; Soffianian, Alireza; Pourmanafi, Saeid; Mirghaffari, Noorollah; Tarkesh, Mostafa

    2018-03-01

    In the recent decade, analyzing remotely sensed imagery has become one of the most common and widely used procedures in environmental studies, in which supervised image classification techniques play a central role. Hence, using a high-resolution Worldview-3 image over a mixed urbanized landscape in Iran, three less commonly applied image classification methods, Bagged CART, the stochastic gradient boosting model and a neural network with feature extraction, were tested and compared with two prevalent methods: random forest and a support vector machine with a linear kernel. To do so, each method was run ten times, and three validation techniques were used to estimate the accuracy statistics: cross-validation, independent validation and validation with the total training data. Moreover, the statistical significance of differences between the classification methods was assessed using ANOVA and Tukey tests. In general, the results showed that random forest, with a marginal difference compared to Bagged CART and the stochastic gradient boosting model, is the best-performing method, although based on independent validation there was no significant difference between the performances of the classification methods. It should finally be noted that the neural network with feature extraction and the linear support vector machine had better processing speed than the others.

  19. High-grading bias: subtle problems with assessing power of selected subsets of loci for population assignment.

    PubMed

    Waples, Robin S

    2010-07-01

    Recognition of the importance of cross-validation ('any technique or instance of assessing how the results of a statistical analysis will generalize to an independent dataset'; Wiktionary, en.wiktionary.org) is one reason that the U.S. Securities and Exchange Commission requires all investment products to carry some variation of the disclaimer, 'Past performance is no guarantee of future results.' Even a cursory examination of financial behaviour, however, demonstrates that this warning is regularly ignored, even by those who understand what an independent dataset is. In the natural sciences, an analogue to predicting future returns for an investment strategy is predicting power of a particular algorithm to perform with new data. Once again, the key to developing an unbiased assessment of future performance is through testing with independent data--that is, data that were in no way involved in developing the method in the first place. A 'gold-standard' approach to cross-validation is to divide the data into two parts, one used to develop the algorithm, the other used to test its performance. Because this approach substantially reduces the sample size that can be used in constructing the algorithm, researchers often try other variations of cross-validation to accomplish the same ends. As illustrated by Anderson in this issue of Molecular Ecology Resources, however, not all attempts at cross-validation produce the desired result. Anderson used simulated data to evaluate performance of several software programs designed to identify subsets of loci that can be effective for assigning individuals to population of origin based on multilocus genetic data. Such programs are likely to become increasingly popular as researchers seek ways to streamline routine analyses by focusing on small sets of loci that contain most of the desired signal. 
Anderson found that although some of the programs made an attempt at cross-validation, all failed to meet the 'gold standard' of using truly independent data and therefore produced overly optimistic assessments of power of the selected set of loci--a phenomenon known as 'high grading bias.'
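High-grading bias is easy to reproduce on pure noise: selecting the most discriminative "loci" on the full dataset and then cross-validating looks far better than repeating the selection inside each training fold. The toy sketch below uses a nearest-centroid classifier on random data; all names, sizes and the classifier choice are illustrative, not Anderson's simulation design:

```python
import random

def top_features(X, y, m=5):
    """Pick the m features whose class means differ the most
    (a stand-in for 'selecting the most informative loci')."""
    def sep(j):
        a = [row[j] for row, c in zip(X, y) if c == 0]
        b = [row[j] for row, c in zip(X, y) if c == 1]
        return abs(sum(a) / len(a) - sum(b) / len(b))
    return sorted(range(len(X[0])), key=sep, reverse=True)[:m]

def cv_accuracy(X, y, select, k=5):
    """k-fold CV of a nearest-centroid classifier; `select` chooses
    feature indices from whatever data it is handed."""
    idx = list(range(len(X)))
    correct = 0
    for f in range(k):
        test = set(idx[f::k])
        train = [i for i in idx if i not in test]
        feats = select([X[i] for i in train], [y[i] for i in train])
        cents = {c: [sum(X[i][j] for i in train if y[i] == c) /
                     sum(1 for i in train if y[i] == c) for j in feats]
                 for c in (0, 1)}
        for i in test:
            d = {c: sum((X[i][j] - cj) ** 2
                        for j, cj in zip(feats, cents[c])) for c in (0, 1)}
            correct += int(min(d, key=d.get) == y[i])
    return correct / len(X)

rng = random.Random(1)
X = [[rng.gauss(0, 1) for _ in range(200)] for _ in range(40)]  # pure noise
y = [i % 2 for i in range(40)]                                  # arbitrary labels

picked_on_everything = top_features(X, y)            # selection sees ALL data
naive = cv_accuracy(X, y, lambda X_, y_: picked_on_everything)  # high-graded
honest = cv_accuracy(X, y, top_features)   # selection redone inside each fold
```

With selection done on the full dataset, the cross-validated accuracy typically lands well above the roughly 0.5 chance level reported by the honest, fold-internal selection, even though the labels carry no signal at all: the test samples already influenced which features were kept.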

  20. The R package "sperrorest" : Parallelized spatial error estimation and variable importance assessment for geospatial machine learning

    NASA Astrophysics Data System (ADS)

    Schratz, Patrick; Herrmann, Tobias; Brenning, Alexander

    2017-04-01

    Computational and statistical prediction methods such as the support vector machine have gained popularity in remote-sensing applications in recent years and are often compared to more traditional approaches like maximum-likelihood classification. However, the accuracy assessment of such predictive models in a spatial context needs to account for the presence of spatial autocorrelation in geospatial data by using spatial cross-validation and bootstrap strategies instead of their more widely used non-spatial equivalents. The R package sperrorest by A. Brenning [IEEE International Geoscience and Remote Sensing Symposium, 1, 374 (2012)] provides a generic interface for performing (spatial) cross-validation of any statistical or machine-learning technique available in R. Since spatial statistical models as well as flexible machine-learning algorithms can be computationally expensive, parallel computing strategies are required to perform cross-validation efficiently. The most recent major release of sperrorest therefore comes with two new features (aside from improved documentation): The first one is the parallelized version of sperrorest(), parsperrorest(). This function features two parallel modes to greatly speed up cross-validation runs. Both parallel modes are platform independent and provide progress information. par.mode = 1 relies on the pbapply package and internally calls (depending on the platform) parallel::mclapply() or parallel::parApply(). While forking is used on Unix systems, Windows systems use a cluster approach for parallel execution. par.mode = 2 uses the foreach package to perform parallelization. This method uses a different way of cluster parallelization than the parallel package does. In summary, the robustness of parsperrorest() is increased with the implementation of two independent parallel modes. A new way of partitioning the data in sperrorest is provided by partition.factor.cv().
This function gives the user the possibility to perform cross-validation at the level of some grouping structure. As an example, in remote sensing of agricultural land uses, pixels from the same field contain nearly identical information and will thus be jointly placed in either the test set or the training set. Other spatial sampling and resampling strategies are already available and can be extended by the user.
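The grouping-factor partition has a simple core, sketched below as a hypothetical Python helper (not the R package's API): every sample sharing a group label goes to the same fold, so a group never straddles the train/test boundary.

```python
def group_folds(groups):
    """Leave-one-group-out partitions: return one fold of sample indices
    per distinct group label (e.g. one fold per agricultural field), so
    correlated same-group samples are never split across train and test."""
    return [[i for i, g in enumerate(groups) if g == level]
            for level in sorted(set(groups))]
```

Cross-validation then iterates over these folds, holding each group out in turn; this is the non-spatial analogue of holding out spatially contiguous blocks.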

  1. The Naïve Overfitting Index Selection (NOIS): A new method to optimize model complexity for hyperspectral data

    NASA Astrophysics Data System (ADS)

    Rocha, Alby D.; Groen, Thomas A.; Skidmore, Andrew K.; Darvishzadeh, Roshanak; Willemen, Louise

    2017-11-01

    The growing number of narrow spectral bands in hyperspectral remote sensing improves the capacity to describe and predict biological processes in ecosystems. But it also poses a challenge for fitting empirical models based on such high-dimensional data, which often contain correlated and noisy predictors. As sample sizes for training and validating empirical models do not seem to be increasing at the same rate, overfitting has become a serious concern. Overly complex models lead to overfitting by capturing more than the underlying relationship, and also through fitting random noise in the data. Many regression techniques claim to overcome these problems by using different strategies to constrain complexity, such as limiting the number of terms in the model, creating latent variables or shrinking parameter coefficients. This paper proposes a new method, named Naïve Overfitting Index Selection (NOIS), which makes use of artificially generated spectra to quantify relative model overfitting and to select an optimal model complexity supported by the data. The robustness of this new method is assessed by comparing it to traditional model selection based on cross-validation. The optimal model complexity is determined for seven different regression techniques, such as partial least squares regression, support vector machine, artificial neural network and tree-based regressions, using five hyperspectral datasets. The NOIS method selects less complex models, which present accuracies similar to the cross-validation method. The NOIS method reduces the chance of overfitting, thereby avoiding models that present accurate predictions that are only valid for the data used and too complex to make inferences about the underlying process.

  2. Methodological issues in volumetric magnetic resonance imaging of the brain in the Edinburgh High Risk Project.

    PubMed

    Whalley, H C; Kestelman, J N; Rimmington, J E; Kelso, A; Abukmeil, S S; Best, J J; Johnstone, E C; Lawrie, S M

    1999-07-30

    The Edinburgh High Risk Project is a longitudinal study of brain structure (and function) in subjects at high genetic risk of developing schizophrenia in the next 5-10 years. In this article we describe the methods of volumetric analysis of structural magnetic resonance images used in the study. We also consider potential sources of error in these methods: the validity of our image analysis techniques; inter- and intra-rater reliability; possible positional variation; and the thresholding criteria used in separating brain from cerebrospinal fluid (CSF). Investigation with a phantom test object (with imaging characteristics similar to those of the brain) provided evidence for the validity of our image acquisition and analysis techniques. Both inter- and intra-rater reliability were found to be good for whole brain measures but less so for smaller regions. There were no statistically significant differences in positioning across the three study groups (patients with schizophrenia, high risk subjects and normal volunteers). A new technique for thresholding MRI scans longitudinally is described (the 'rescale' method) and compared with our established method (thresholding by eye). Few differences between the two techniques were seen at 3- and 6-month follow-up. These findings demonstrate the validity and reliability of the structural MRI analysis techniques used in the Edinburgh High Risk Project, and highlight methodological issues of general concern in cross-sectional and longitudinal studies of brain structure in healthy control subjects and neuropsychiatric populations.

  3. Transport methods and interactions for space radiations

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Townsend, Lawrence W.; Schimmerling, Walter S.; Khandelwal, Govind S.; Khan, Ferdous S.; Nealy, John E.; Cucinotta, Francis A.; Simonsen, Lisa C.; Shinn, Judy L.; Norbury, John W.

    1991-01-01

    A review of the program in space radiation protection at the Langley Research Center is given. The relevant Boltzmann equations are given with a discussion of approximation procedures for space applications. The interaction coefficients are related to solution of the many-body Schroedinger equation with nuclear and electromagnetic forces. Various solution techniques are discussed to obtain relevant interaction cross sections with extensive comparison with experiments. Solution techniques for the Boltzmann equations are discussed in detail. Transport computer code validation is discussed through analytical benchmarking, comparison with other codes, comparison with laboratory experiments and measurements in space. Applications to lunar and Mars missions are discussed.

  4. Automated Cross-Sectional Measurement Method of Intracranial Dural Venous Sinuses.

    PubMed

    Lublinsky, S; Friedman, A; Kesler, A; Zur, D; Anconina, R; Shelef, I

    2016-03-01

    MRV is an important blood vessel imaging and diagnostic tool for the evaluation of stenosis, occlusions, or aneurysms. However, an accurate image-processing tool for vessel comparison is unavailable. The purpose of this study was to develop and test an automated technique for vessel cross-sectional analysis. An algorithm for vessel cross-sectional analysis was developed that included 7 main steps: 1) image registration, 2) masking, 3) segmentation, 4) skeletonization, 5) cross-sectional planes, 6) clustering, and 7) cross-sectional analysis. Phantom models were used to validate the technique. The method was also tested on a control subject and a patient with idiopathic intracranial hypertension (4 large sinuses tested: right and left transverse sinuses, superior sagittal sinus, and straight sinus). The cross-sectional area and shape measurements were evaluated before and after lumbar puncture in patients with idiopathic intracranial hypertension. The vessel-analysis algorithm had a high degree of stability, with <3% of cross-sections manually corrected. All investigated principal cranial blood sinuses had a significant cross-sectional area increase after lumbar puncture (P ≤ .05). The average triangularity of the transverse sinuses was increased, and the mean circularity of the sinuses was decreased by 6% ± 12% after lumbar puncture. Comparison of phantom and real data showed that all computed errors were <1 voxel unit, which confirmed that the method provided a very accurate solution. In this article, we present a novel automated imaging method for cross-sectional vessel analysis. The method can provide efficient quantitative detection of abnormalities in the dural sinuses.

  5. Scoring and staging systems using cox linear regression modeling and recursive partitioning.

    PubMed

    Lee, J W; Um, S H; Lee, J B; Mun, J; Cho, H

    2006-01-01

    Scoring and staging systems are used to determine the order and class of data according to predictors. Systems used for medical data, such as the Child-Turcotte-Pugh scoring and staging systems for ordering and classifying patients with liver disease, are often derived strictly from physicians' experience and intuition. We construct objective and data-based scoring/staging systems using statistical methods. We consider Cox linear regression modeling and recursive partitioning techniques for censored survival data. In particular, to obtain a target number of stages we propose cross-validation and amalgamation algorithms. We also propose an algorithm for constructing scoring and staging systems by integrating local Cox linear regression models into recursive partitioning, so that we can retain the merits of both methods such as superior predictive accuracy, ease of use, and detection of interactions between predictors. The staging system construction algorithms are compared by cross-validation evaluation of real data. The data-based cross-validation comparison shows that Cox linear regression modeling is somewhat better than recursive partitioning when there are only continuous predictors, while recursive partitioning is better when there are significant categorical predictors. The proposed local Cox linear recursive partitioning has better predictive accuracy than Cox linear modeling and simple recursive partitioning. This study indicates that integrating local linear modeling into recursive partitioning can significantly improve prediction accuracy in constructing scoring and staging systems.

  6. An empirical assessment of validation practices for molecular classifiers

    PubMed Central

    Castaldi, Peter J.; Dahabreh, Issa J.

    2011-01-01

    Proposed molecular classifiers may be overfit to idiosyncrasies of noisy genomic and proteomic data. Cross-validation methods are often used to obtain estimates of classification accuracy, but both simulations and case studies suggest that, when inappropriate methods are used, bias may ensue. Bias can be bypassed and generalizability can be tested by external (independent) validation. We evaluated 35 studies that have reported on external validation of a molecular classifier. We extracted information on study design and methodological features, and compared the performance of molecular classifiers in internal cross-validation versus external validation for 28 studies where both had been performed. We demonstrate that the majority of studies pursued cross-validation practices that are likely to overestimate classifier performance. Most studies were markedly underpowered to detect a 20% decrease in sensitivity or specificity between internal cross-validation and external validation [median power was 36% (IQR, 21–61%) and 29% (IQR, 15–65%), respectively]. The median reported classification performance for sensitivity and specificity was 94% and 98%, respectively, in cross-validation and 88% and 81% for independent validation. The relative diagnostic odds ratio was 3.26 (95% CI 2.04–5.21) for cross-validation versus independent validation. Finally, we reviewed all studies (n = 758) which cited those in our study sample, and identified only one instance of additional subsequent independent validation of these classifiers. In conclusion, these results document that many cross-validation practices employed in the literature are potentially biased and genuine progress in this field will require adoption of routine external validation of molecular classifiers, preferably in much larger studies than in current practice. PMID:21300697
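The diagnostic odds ratio used in that comparison is a standard one-number summary of sensitivity and specificity; a quick sketch follows. Note the 3.26 figure above is the study's paired, per-classifier estimate, not what plugging in the two sets of medians yields:

```python
def diagnostic_odds_ratio(sensitivity, specificity):
    """DOR: the odds of a positive test in the diseased group divided by
    the odds of a positive test in the non-diseased group. A value of 1
    means the classifier is uninformative."""
    return ((sensitivity / (1.0 - sensitivity))
            * (specificity / (1.0 - specificity)))

dor_cv = diagnostic_odds_ratio(0.94, 0.98)   # median internal cross-validation
dor_ext = diagnostic_odds_ratio(0.88, 0.81)  # median external validation
```

The gap between the two values illustrates, on the reported medians, how sharply apparent performance can drop once a classifier is confronted with truly independent data.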

  7. Sensitivity-Uncertainty Techniques for Nuclear Criticality Safety

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B.; Rising, Michael Evan; Alwin, Jennifer Louise

    2017-08-07

    The sensitivity and uncertainty analysis course will introduce students to k-eff sensitivity data and cross-section uncertainty data, how k-eff sensitivity and uncertainty data are generated, and how they can be used. Discussion will include how sensitivity/uncertainty data can be used to select applicable critical experiments, to quantify a defensible margin to cover validation gaps and weaknesses, and in the development of upper subcritical limits.

  8. Detection and quantification of adulteration in sandalwood oil through near infrared spectroscopy.

    PubMed

    Kuriakose, Saji; Thankappan, Xavier; Joe, Hubert; Venkataraman, Venkateswaran

    2010-10-01

    The confirmation of authenticity of essential oils and the detection of adulteration are problems of increasing importance in the perfume, pharmaceutical, flavor and fragrance industries. This is especially true for 'value added' products like sandalwood oil. A methodical study is conducted here to demonstrate the potential of Near Infrared (NIR) spectroscopy, along with multivariate calibration models such as principal component regression (PCR) and partial least squares regression (PLSR), as a rapid analytical technique for the qualitative and quantitative determination of adulterants in sandalwood oil. After suitable pre-processing of the raw NIR spectral data, the models are built using cross-validation. The lowest Root Mean Square Error of Cross-Validation and Calibration (RMSECV and RMSEC, % v/v) are used as a decision-support criterion to fix the optimal number of factors. The coefficient of determination (R(2)) and the Root Mean Square Error of Prediction (RMSEP, % v/v) in the prediction sets are used as the evaluation parameters (R(2) = 0.9999 and RMSEP = 0.01355). The overall result leads to the conclusion that NIR spectroscopy with chemometric techniques could be successfully used as a rapid, simple and non-destructive method for the detection of adulterants, even 1% of low-grade oils, in high-quality sandalwood oil.
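The RMSECV criterion used here for picking the number of latent factors can be sketched with a minimal principal component regression on synthetic two-factor "mixture" data (illustrative only; the study used PCR/PLSR on real NIR spectra): leave-one-out RMSECV is computed per factor count and its minimum fixes the model size.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 30, 8
T = rng.normal(size=(n, 2))                        # two latent constituent factors
X = T @ rng.normal(size=(2, p)) + 0.05 * rng.normal(size=(n, p))   # "spectra"
y = T @ np.array([2.0, -1.0]) + 0.05 * rng.normal(size=n)          # "concentration"

def pcr_predict(Xtr, ytr, Xte, m):
    mu, ym = Xtr.mean(0), ytr.mean()
    Vt = np.linalg.svd(Xtr - mu, full_matrices=False)[2][:m]   # first m loadings
    beta = np.linalg.lstsq((Xtr - mu) @ Vt.T, ytr - ym, rcond=None)[0]
    return (Xte - mu) @ Vt.T @ beta + ym

def rmsecv(m):
    sq = []
    for i in range(n):
        tr = np.arange(n) != i
        sq.append((pcr_predict(X[tr], y[tr], X[[i]], m)[0] - y[i]) ** 2)
    return float(np.sqrt(np.mean(sq)))

curve = [rmsecv(m) for m in range(1, p + 1)]   # RMSECV per number of factors
best = int(np.argmin(curve)) + 1               # factor count with lowest RMSECV
print(best, curve)
```

With two true latent factors, the RMSECV curve drops sharply from one to two factors and flattens after, which is exactly the behavior the criterion exploits.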

  9. Using deep learning for detecting gender in adult chest radiographs

    NASA Astrophysics Data System (ADS)

    Xue, Zhiyun; Antani, Sameer; Long, L. Rodney; Thoma, George R.

    2018-03-01

    In this paper, we present a method for automatically identifying the gender of an imaged person from their frontal chest x-ray image. Our work is motivated by the need to determine missing gender information in some datasets. The proposed method employs convolutional neural network (CNN) based deep learning and transfer learning to overcome the challenge of developing handcrafted features from limited data. Specifically, the method consists of four main steps: pre-processing, CNN feature extraction, feature selection, and classification. The method is tested on a combined dataset obtained from several sources with varying acquisition quality, with different pre-processing steps applied to each. For feature extraction, we tested and compared four CNN architectures, viz., AlexNet, VggNet, GoogLeNet, and ResNet. We applied a feature selection technique, since the feature length is larger than the number of images. Two popular classifiers, SVM and Random Forest, are used and compared. We evaluated the classification performance by cross-validation, using seven performance measures. The best performer is the VggNet-16 feature extractor with the SVM classifier, with an accuracy of 86.6% and an ROC area of 0.932 under 5-fold cross-validation. We also discuss several misclassified cases and describe future work for performance improvement.

  10. A closer look at cross-validation for assessing the accuracy of gene regulatory networks and models.

    PubMed

    Tabe-Bordbar, Shayan; Emad, Amin; Zhao, Sihai Dave; Sinha, Saurabh

    2018-04-26

    Cross-validation (CV) is a technique for assessing the generalizability of a model to unseen data. It relies on assumptions that may not be satisfied when studying genomics datasets. For example, random CV (RCV) assumes that a randomly selected set of samples, the test set, represents unseen data well. This assumption does not hold when samples are obtained from different experimental conditions and the goal is to learn regulatory relationships among the genes that generalize beyond the observed conditions. In this study, we investigated how the CV procedure affects the assessment of supervised learning methods used to learn gene regulatory networks. We compared the performance of a regression-based method for gene expression prediction estimated using RCV with that estimated using a clustering-based CV (CCV) procedure. Our analysis illustrates that RCV can produce over-optimistic estimates of the model's generalizability compared to CCV. Next, we defined the 'distinctness' of the test set from the training set and showed that this measure is predictive of the performance of the regression method. Finally, we introduced a simulated annealing method to construct partitions with gradually increasing distinctness and showed that the performance of different gene expression prediction methods can be better evaluated using this method.
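The RCV-versus-CCV contrast can be illustrated with a toy sketch (synthetic data with a hypothetical condition-specific shift; the paper's actual CCV clusters real expression profiles): random folds mix all conditions into training, so the error estimate looks rosy, while holding out an entire condition exposes the failure to generalize.

```python
import numpy as np

rng = np.random.default_rng(2)
cond = np.repeat([0, 1, 2], 20)                 # three experimental conditions
x = rng.normal(size=60)
offset = np.array([0.0, 2.0, -2.0])[cond]       # condition-specific expression shift
y = 1.5 * x + offset + 0.1 * rng.normal(size=60)

def fold_mse(test_sets):
    errs = []
    for te in test_sets:
        tr = np.setdiff1d(np.arange(60), te)
        a, b = np.polyfit(x[tr], y[tr], 1)      # simple regression-based predictor
        errs.append(float(np.mean((a * x[te] + b - y[te]) ** 2)))
    return float(np.mean(errs))

random_folds = np.array_split(rng.permutation(60), 5)     # random CV (RCV)
cond_folds = [np.where(cond == c)[0] for c in range(3)]   # hold out a whole condition
rcv_mse, ccv_mse = fold_mse(random_folds), fold_mse(cond_folds)
print(rcv_mse, ccv_mse)   # condition-wise error exceeds the random-CV estimate
```

The gap between the two numbers is the over-optimism of RCV when test samples must come from unobserved conditions.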

  11. Differential evolution algorithm-based kernel parameter selection for Fukunaga-Koontz Transform subspaces construction

    NASA Astrophysics Data System (ADS)

    Binol, Hamidullah; Bal, Abdullah; Cukur, Huseyin

    2015-10-01

    The performance of kernel-based techniques depends on the selection of kernel parameters; suitable parameter selection is therefore an important problem for many kernel-based techniques. This article presents a novel technique for learning the kernel parameters in a kernel Fukunaga-Koontz Transform (KFKT) based classifier. The proposed approach determines appropriate values of the kernel parameters by optimizing an objective function constructed from the discrimination ability of KFKT. For this purpose we have utilized the differential evolution algorithm (DEA). The new technique avoids disadvantages of the traditional cross-validation method, such as its high time consumption, and can be applied to any type of data. Experiments on target detection in hyperspectral images verify the effectiveness of the proposed method.

  12. Alternative methods to evaluate trial level surrogacy.

    PubMed

    Abrahantes, Josè Cortiñas; Shkedy, Ziv; Molenberghs, Geert

    2008-01-01

    The evaluation and validation of surrogate endpoints have been extensively studied in the last decade. Prentice [1] and Freedman, Graubard and Schatzkin [2] laid the foundations for the evaluation of surrogate endpoints in randomized clinical trials. Later, Buyse et al. [5] proposed a meta-analytic methodology, producing different methods for different settings, which was further studied by Alonso and Molenberghs [9] in their unifying approach based on information theory. In this article, we focus on trial-level surrogacy and propose alternative procedures, which do not pre-specify the type of association, to evaluate this surrogacy measure. A promising cross-validation-based correction is investigated, as well as the construction of confidence intervals for this measure. To avoid assumptions about the type of relationship between the treatment effects and its distribution, a collection of alternative methods based on regression trees, bagging, random forests, and support vector machines, combined with bootstrap-based confidence intervals and, should one wish, a cross-validation-based correction, is proposed and applied. We apply the various strategies to data from three clinical studies: in ophthalmology, in advanced colorectal cancer, and in schizophrenia. The results obtained for the three case studies are compared; they indicate that random forest and bagging models produce larger estimated values for the surrogacy measure, which are generally more stable and have narrower confidence intervals than linear regression and support vector regression. For the advanced colorectal cancer studies, we even found that the trial-level surrogacy is considerably different from what has been reported. In general, the alternative methods are more computationally demanding; in particular, the calculation of the confidence intervals requires more computational time than the delta-method counterpart.
Three conclusions follow. First, more flexible modeling techniques can be used, allowing for other types of association. Second, when no cross-validation-based correction is applied, overly optimistic trial-level surrogacy estimates are found; thus cross-validation is highly recommended. Third, the delta method for calculating confidence intervals is not recommended, since it relies on assumptions valid only in very large samples and may produce range-violating limits; we recommend bootstrap methods instead. The information-theoretic approach produces results comparable to the bagging and random forest approaches when the cross-validation correction is applied. Even in cases where a linear model might be a good option, bagging methods perform well, with narrower confidence intervals.

  13. [Orofacial alterations and surface electromyography in neurodevelopmental disorders].

    PubMed

    Rosell-Clari, V

    2017-02-24

    Surface electromyography has become a widely used technique for measuring the activity of different muscle groups. Although the reliability and validity of the technique are debated, an important body of scientific literature supports its use. The aim is to present, through a case study, the two basic uses of surface electromyography: measuring orofacial muscular activity, and using it as biofeedback to modulate that muscular activity. The case is a 10-year-old girl with a dolichocephalic, prognathic facial profile, anterior open bite and bilateral cross bite, bilateral Angle class II occlusion, and atypical swallowing with lingual interposition. The MioTool Face system (Miotec, Suite 1.0 software), which supports up to 8-channel bipolar surface electromyography, was used. Surface electrodes were placed on the orofacial musculature, and the results were measured and visualized with the Miograph and Biotrainer software. The results confirm those obtained through clinical exploration of the patient and support the use of these measurements for the estimation and validation of mechanical models of the masticatory and swallowing system. Electromyographic biofeedback is shown to be an effective technique for self-controlling the force exerted by key muscle groups during primary activities such as chewing and swallowing.

  14. Improving lung cancer prognosis assessment by incorporating synthetic minority oversampling technique and score fusion method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yan, Shiju; Qian, Wei; Guan, Yubao

    2016-06-15

    Purpose: This study aims to investigate the potential to improve lung cancer recurrence risk prediction for stage I NSCLC patients by integrating oversampling, feature selection, and score fusion techniques, and to develop an optimal prediction model. Methods: A dataset involving 94 early stage lung cancer patients was retrospectively assembled, which includes CT images, nine clinical and biological (CB) markers, and the outcome of 3-yr disease-free survival (DFS) after surgery. Among the 94 patients, 74 remained disease-free and 20 had cancer recurrence. Applying a computer-aided detection scheme, tumors were segmented from the CT images and 35 quantitative image (QI) features were initially computed. Two normalized Gaussian radial basis function network (RBFN) based classifiers were built on the QI features and the CB markers separately. To improve prediction performance, the authors applied a synthetic minority oversampling technique (SMOTE) and a BestFirst-based feature selection method to optimize the classifiers, and also tested fusion methods to combine the QI- and CB-based prediction results. Results: Using a leave-one-case-out cross-validation method, the computed areas under the receiver operating characteristic curve (AUCs) were 0.716 ± 0.071 and 0.642 ± 0.061 for the QI- and CB-based classifiers, respectively. By fusing the scores generated by the two classifiers, the AUC significantly increased to 0.859 ± 0.052 (p < 0.05), with an overall prediction accuracy of 89.4%. Conclusions: This study demonstrated the feasibility of improving prediction performance by integrating SMOTE, feature selection, and score fusion. Combining QI features and CB markers and performing SMOTE prior to feature selection in classifier training enabled the RBFN-based classifier to yield improved prediction accuracy.
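The core of SMOTE is generating synthetic minority cases by interpolating between existing minority cases. A simplified sketch (pure NumPy, synthetic data mimicking the 74-vs-20 imbalance; true SMOTE interpolates toward one of the k nearest minority neighbors, whereas this version picks a random minority partner) shows the idea:

```python
import numpy as np

rng = np.random.default_rng(5)

def oversample(X_min, n_new):
    # each synthetic case lies on the segment joining two minority cases
    out = np.empty((n_new, X_min.shape[1]))
    for t in range(n_new):
        i, j = rng.choice(len(X_min), size=2, replace=False)
        out[t] = X_min[i] + rng.uniform() * (X_min[j] - X_min[i])
    return out

X_min = rng.normal(loc=2.0, size=(20, 3))   # 20 recurrence cases, 3 features
X_syn = oversample(X_min, 54)               # balance against 74 DFS cases
print(X_syn.shape)                          # (54, 3)
```

To avoid the leakage discussed elsewhere in these records, such oversampling must be applied only to the training folds inside cross-validation, never to the held-out cases.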

  15. Design of Novel Chemotherapeutic Agents Targeting Checkpoint Kinase 1 Using 3D-QSAR Modeling and Molecular Docking Methods.

    PubMed

    Balupuri, Anand; Balasubramanian, Pavithra K; Cho, Seung J

    2016-01-01

    Checkpoint kinase 1 (Chk1) has emerged as a potential therapeutic target for design and development of novel anticancer drugs. Herein, we have performed three-dimensional quantitative structure-activity relationship (3D-QSAR) and molecular docking analyses on a series of diazacarbazoles to design potent Chk1 inhibitors. 3D-QSAR models were developed using comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA) techniques. Docking studies were performed using AutoDock. The best CoMFA and CoMSIA models exhibited cross-validated correlation coefficient (q2) values of 0.631 and 0.585, and non-cross-validated correlation coefficient (r2) values of 0.933 and 0.900, respectively. CoMFA and CoMSIA models showed reasonable external predictabilities (r2 pred) of 0.672 and 0.513, respectively. A satisfactory performance in the various internal and external validation techniques indicated the reliability and robustness of the best model. Docking studies were performed to explore the binding mode of inhibitors inside the active site of Chk1. Molecular docking revealed that hydrogen bond interactions with Lys38, Glu85 and Cys87 are essential for Chk1 inhibitory activity. The binding interaction patterns observed during docking studies were complementary to 3D-QSAR results. Information obtained from the contour map analysis was utilized to design novel potent Chk1 inhibitors. Their activities and binding affinities were predicted using the derived model and docking studies. Designed inhibitors were proposed as potential candidates for experimental synthesis.
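The q2/r2 pair reported for the CoMFA and CoMSIA models can be computed with a minimal leave-one-out sketch (one hypothetical descriptor and synthetic activities, not the paper's 3D fields): r2 comes from the full-data fit, q2 from 1 - PRESS/SS with each compound predicted by a model that never saw it.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 25
descriptor = rng.uniform(0, 10, size=n)            # hypothetical molecular descriptor
activity = 0.8 * descriptor + rng.normal(size=n)   # hypothetical activity values

# non-cross-validated r^2 from a full-data least-squares fit
coef = np.polyfit(descriptor, activity, 1)
ss_res = float(np.sum((activity - np.polyval(coef, descriptor)) ** 2))
ss_tot = float(np.sum((activity - activity.mean()) ** 2))
r2 = 1.0 - ss_res / ss_tot

# cross-validated q^2 = 1 - PRESS / ss_tot, PRESS from leave-one-out refits
press = 0.0
for i in range(n):
    tr = np.arange(n) != i
    c = np.polyfit(descriptor[tr], activity[tr], 1)
    press += float((np.polyval(c, descriptor[i]) - activity[i]) ** 2)
q2 = 1.0 - press / ss_tot

print(r2, q2)
```

For a least-squares model q2 never exceeds r2, which mirrors the q2 < r2 pattern (0.631 vs 0.933, 0.585 vs 0.900) reported for the two models.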

  16. In silico prediction of Tetrahymena pyriformis toxicity for diverse industrial chemicals with substructure pattern recognition and machine learning methods.

    PubMed

    Cheng, Feixiong; Shen, Jie; Yu, Yue; Li, Weihua; Liu, Guixia; Lee, Philip W; Tang, Yun

    2011-03-01

    There is an increasing need for rapid safety assessment of chemicals by both industry and regulatory agencies throughout the world. In silico techniques are practical alternatives in environmental hazard assessment, especially for addressing the persistence, bioaccumulation and toxicity potentials of organic chemicals. Tetrahymena pyriformis toxicity is often used as a toxicity endpoint. In this study, 1571 diverse unique chemicals were collected from the literature, composing the largest diverse dataset for T. pyriformis toxicity. Classification models of T. pyriformis toxicity were developed with substructure pattern recognition and different machine learning methods, including support vector machine (SVM), C4.5 decision tree, k-nearest neighbors and random forest. The results of a 5-fold cross-validation showed that the SVM method performed better than the other algorithms. The overall predictive accuracy of the SVM classification model with a radial basis function kernel was 92.2% for the 5-fold cross-validation and 92.6% for the external validation set. Furthermore, several representative substructure patterns characterizing T. pyriformis toxicity were identified via information gain analysis.

  17. Alzheimer's Disease Assessment: A Review and Illustrations Focusing on Item Response Theory Techniques.

    PubMed

    Balsis, Steve; Choudhury, Tabina K; Geraci, Lisa; Benge, Jared F; Patrick, Christopher J

    2018-04-01

    Alzheimer's disease (AD) affects neurological, cognitive, and behavioral processes. Thus, to accurately assess this disease, researchers and clinicians need to combine and incorporate data across these domains. This presents not only distinct methodological and statistical challenges but also unique opportunities for the development and advancement of psychometric techniques. In this article, we describe relatively recent research using item response theory (IRT) that has been used to make progress in assessing the disease across its various symptomatic and pathological manifestations. We focus on applications of IRT to improve scoring, test development (including cross-validation and adaptation), and linking and calibration. We conclude by describing potential future multidimensional applications of IRT techniques that may improve the precision with which AD is measured.

  18. Empirical performance of interpolation techniques in risk-neutral density (RND) estimation

    NASA Astrophysics Data System (ADS)

    Bahaludin, H.; Abdullah, M. H.

    2017-03-01

    The objective of this study is to evaluate the empirical performance of interpolation techniques in risk-neutral density (RND) estimation. First, the empirical performance is evaluated using statistical analysis based on the implied mean and implied variance of the RND. Second, the interpolation performance is measured by pricing error. We propose using the leave-one-out cross-validation (LOOCV) pricing error for interpolation selection. The statistical analyses indicate that there are statistical differences between the interpolation techniques: second-order polynomial, fourth-order polynomial and smoothing spline. The LOOCV pricing errors show that fourth-order polynomial interpolation provides the best fit to option prices, with the lowest pricing error.
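The LOOCV pricing-error criterion can be sketched in a few lines (synthetic strike/price data, not the study's option chains): each quote is held out in turn, the interpolant of a given order is refit on the rest, and the root-mean-square held-out error ranks the candidate orders.

```python
import numpy as np

# strikes and synthetic call-like prices forming a smooth convex curve
K = np.linspace(80.0, 120.0, 15)
price = np.maximum(100.0 - K, 0.0) + 4.0 * np.exp(-((K - 100.0) / 15.0) ** 2)

def loocv_rmse(degree):
    sq = []
    for i in range(len(K)):
        tr = np.arange(len(K)) != i
        coef = np.polyfit(K[tr], price[tr], degree)     # refit without quote i
        sq.append((np.polyval(coef, K[i]) - price[i]) ** 2)
    return float(np.sqrt(np.mean(sq)))

e2, e4 = loocv_rmse(2), loocv_rmse(4)   # second- vs fourth-order polynomial
print(e2, e4)
```

On this synthetic curve the fourth-order fit attains the lower LOOCV pricing error, the same ranking the study reports for its option data.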

  19. Selection of regularization parameter in total variation image restoration.

    PubMed

    Liao, Haiyong; Li, Fang; Ng, Michael K

    2009-11-01

    We consider and study total variation (TV) image restoration. In the literature there are several regularization parameter selection methods for Tikhonov regularization problems (e.g., the discrepancy principle and the generalized cross-validation method). However, to our knowledge, these selection methods have not been applied to TV regularization problems. The main aim of this paper is to develop a fast TV image restoration method with an automatic regularization parameter selection scheme to restore blurred and noisy images. The method exploits the generalized cross-validation (GCV) technique to determine inexpensively how much regularization to use in each restoration step. By updating the regularization parameter in each iteration, the restored image can be obtained. Our experimental results for different kinds of noise show that the visual quality and SNRs of images restored by the proposed method are promising. We also demonstrate that the method is efficient, as it can restore images of size 256 x 256 in approximately 20 s in the MATLAB computing environment.
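GCV is easiest to see in the Tikhonov (ridge) setting the abstract mentions, where the regularized solution is linear in the data; the paper applies the same criterion step-by-step inside a TV iteration. A minimal sketch on synthetic data: GCV(lambda) = n * ||(I - H)y||^2 / trace(I - H)^2 is evaluated on a lambda grid and its minimizer is selected, with no noise-level input needed.

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 50, 10
X = rng.normal(size=(n, p))
y = X @ rng.normal(size=p) + 0.5 * rng.normal(size=n)

def gcv(lam):
    # regularized hat matrix H maps the data to the fitted values
    H = X @ np.linalg.solve(X.T @ X + lam * np.eye(p), X.T)
    resid = y - H @ y
    return float(n * (resid @ resid) / np.trace(np.eye(n) - H) ** 2)

lams = 10.0 ** np.arange(-4.0, 4.0)        # candidate regularization parameters
scores = [gcv(l) for l in lams]
best_lam = float(lams[int(np.argmin(scores))])
print(best_lam)
```

In large problems the trace is estimated stochastically rather than formed explicitly, which is what keeps the per-iteration selection inexpensive.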

  20. Tuning support vector machines for minimax and Neyman-Pearson classification.

    PubMed

    Davenport, Mark A; Baraniuk, Richard G; Scott, Clayton D

    2010-10-01

    This paper studies the training of support vector machine (SVM) classifiers with respect to the minimax and Neyman-Pearson criteria. In principle, these criteria can be optimized in a straightforward way using a cost-sensitive SVM. In practice, however, because these criteria require especially accurate error estimation, standard techniques for tuning SVM parameters, such as cross-validation, can lead to poor classifier performance. To address this issue, we first prove that the usual cost-sensitive SVM, here called the 2C-SVM, is equivalent to another formulation called the 2nu-SVM. We then exploit a characterization of the 2nu-SVM parameter space to develop a simple yet powerful approach to error estimation based on smoothing. In an extensive experimental study, we demonstrate that smoothing significantly improves the accuracy of cross-validation error estimates, leading to dramatic performance gains. Furthermore, we propose coordinate descent strategies that offer significant gains in computational efficiency, with little to no loss in performance.

  1. Utilization of advanced calibration techniques in stochastic rock fall analysis of quarry slopes

    NASA Astrophysics Data System (ADS)

    Preh, Alexander; Ahmadabadi, Morteza; Kolenprat, Bernd

    2016-04-01

    In order to study rock fall dynamics, a research project was conducted by the Vienna University of Technology and the Austrian Central Labour Inspectorate (Federal Ministry of Labour, Social Affairs and Consumer Protection). A part of this project comprised 277 full-scale drop tests at three different quarries in Austria, recording key parameters of the rock fall trajectories. The tests involved a total of 277 boulders ranging from 0.18 to 1.8 m in diameter and from 0.009 to 8.1 Mg in mass. The geology of these sites included strong rock of igneous, metamorphic and volcanic types. In this paper the results of the tests are used for calibration and validation of a new stochastic computer model. It is demonstrated that the error of the model (i.e. the difference between observed and simulated results) has a lognormal distribution. Selecting two parameters, advanced calibration techniques including the Markov chain Monte Carlo technique, maximum likelihood and root mean square error (RMSE) are utilized to minimize the error. Validation of the model based on cross-validation reveals that, in general, reasonable stochastic approximations of the rock fall trajectories are obtained in all dimensions, including runout, bounce heights and velocities. The approximations are compared to the measured data in terms of median, 95% and maximum values. The results of the comparisons indicate that approximate first-order predictions, using a single set of input parameters, are possible and can be used to aid practical hazard and risk assessment.

  2. E&V (Evaluation and Validation) Reference Manual, Version 1.0.

    DTIC Science & Technology

    1988-07-01

    The Reference Manual provides general reference information, extracted from indexes and cross-references (Chapter 4). It allows a reader to arrive at E&V techniques through many different paths and provides a means to extract useful information along the way. Comments may be submitted electronically (preferred) to szymansk@ajpo.sei.cmu.edu or by regular mail to Mr. Raymond Szymanski, AFWAL/AAAF, Wright-Patterson AFB, OH 45433-6543.

  3. Efficient strategies for leave-one-out cross validation for genomic best linear unbiased prediction.

    PubMed

    Cheng, Hao; Garrick, Dorian J; Fernando, Rohan L

    2017-01-01

    A random multiple-regression model that simultaneously fits all allele substitution effects for additive markers or haplotypes as uncorrelated random effects was proposed for Best Linear Unbiased Prediction using whole-genome data. Leave-one-out cross-validation can be used to quantify the predictive ability of a statistical model. Naive application of leave-one-out cross-validation is computationally intensive because the training and validation analyses need to be repeated n times, once for each observation. Efficient leave-one-out cross-validation strategies are presented here, requiring little more effort than a single analysis. The efficient strategies are 786 times faster than the naive application for a simulated dataset with 1,000 observations and 10,000 markers, and 99 times faster with 1,000 observations and 100 markers. These efficiencies relative to the naive approach using the same model will increase with the number of observations.
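The kind of shortcut that makes leave-one-out cheap is exact for any linear smoother: the held-out residual equals the full-fit residual divided by (1 - h_ii), where h_ii is the leverage from the hat matrix. A sketch for ordinary least squares (synthetic data; the paper derives the analogous identities for the GBLUP mixed model) verifies the identity against the naive n-refit loop:

```python
import numpy as np

rng = np.random.default_rng(4)
n, p = 30, 5
X = rng.normal(size=(n, p))
y = X @ rng.normal(size=p) + rng.normal(size=n)

# naive LOOCV: refit the model n times, once per held-out observation
naive = np.empty(n)
for i in range(n):
    tr = np.arange(n) != i
    b = np.linalg.lstsq(X[tr], y[tr], rcond=None)[0]
    naive[i] = y[i] - X[i] @ b

# efficient LOOCV: one fit, then e_i / (1 - h_ii) from the hat matrix
H = X @ np.linalg.solve(X.T @ X, X.T)
fast = (y - H @ y) / (1.0 - np.diag(H))

max_gap = float(np.max(np.abs(naive - fast)))
print(max_gap)   # essentially zero: the two sets of LOO residuals coincide
```

One factorization thus replaces n refits, which is the source of the order-of-magnitude speedups the abstract reports.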

  4. Crossing Fibers Detection with an Analytical High Order Tensor Decomposition

    PubMed Central

    Megherbi, T.; Kachouane, M.; Oulebsir-Boumghar, F.; Deriche, R.

    2014-01-01

    Diffusion magnetic resonance imaging (dMRI) is the only technique to probe in vivo and noninvasively the fiber structure of human brain white matter. Detecting the crossing of neuronal fibers remains an exciting challenge with an important impact on tractography. In this work, we tackle this challenging problem and propose an original and efficient technique to extract all crossing fibers from diffusion signals. To this end, we start by estimating, from the dMRI signal, the so-called Cartesian tensor fiber orientation distribution (CT-FOD) function, whose maxima correspond exactly to the orientations of the fibers. The fourth order symmetric positive definite tensor that represents the CT-FOD is then analytically decomposed via the application of a new theoretical approach, and this decomposition is used to accurately extract all the fiber orientations. Our proposed high order tensor decomposition based approach is minimal and allows recovering all crossing fibers without any a priori information on the total number of fibers. Various experiments performed on noisy synthetic data, on phantom diffusion data, and on human brain data validate our approach and clearly demonstrate that it is efficient, robust to noise and performs favorably in terms of angular resolution and accuracy when compared to some classical and state-of-the-art approaches. PMID:25246940

  5. Computing discharge using the index velocity method

    USGS Publications Warehouse

    Levesque, Victor A.; Oberg, Kevin A.

    2012-01-01

    Application of the index velocity method for computing continuous records of discharge has become increasingly common, especially since the introduction of low-cost acoustic Doppler velocity meters (ADVMs) in 1997. Presently (2011), the index velocity method is being used to compute discharge records for approximately 470 gaging stations operated and maintained by the U.S. Geological Survey. The purpose of this report is to document and describe techniques for computing discharge records using the index velocity method. Computing discharge using the index velocity method differs from the traditional stage-discharge method by separating velocity and area into two ratings—the index velocity rating and the stage-area rating. The outputs from each of these ratings, mean channel velocity (V) and cross-sectional area (A), are then multiplied together to compute a discharge. For the index velocity method, V is a function of such parameters as streamwise velocity, stage, cross-stream velocity, and velocity head, and A is a function of stage and cross-section shape. The index velocity method can be used at locations where stage-discharge methods are used, but it is especially appropriate when more than one specific discharge can be measured for a specific stage. After the ADVM is selected, installed, and configured, the stage-area rating and the index velocity rating must be developed. A standard cross section is identified and surveyed in order to develop the stage-area rating. The standard cross section should be surveyed every year for the first 3 years of operation and thereafter at a lesser frequency, depending on the susceptibility of the cross section to change. Periodic measurements of discharge are used to calibrate and validate the index rating for the range of conditions experienced at the gaging station. Data from discharge measurements, ADVMs, and stage sensors are compiled for index-rating analysis. 
Index ratings are developed by means of regression techniques in which the mean cross-sectional velocity for the standard section is related to the measured index velocity. Most ratings are simple-linear regressions, but more complex ratings may be necessary in some cases. Once the rating is established, validation measurements should be made periodically. Over time, validation measurements may provide additional definition to the rating or result in the creation of a new rating. The computation of discharge is the last step in the index velocity method, and in some ways it is the most straightforward step. This step differs little from the steps used to compute discharge records for stage-discharge gaging stations. The ratings are entered into database software used for records computation, and continuous records of discharge are computed.
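The simple-linear index rating described above can be sketched in a few lines (the velocity pairs and cross-sectional area below are illustrative numbers, not USGS data): a regression relates measured index velocity to mean channel velocity, and discharge is the rated velocity times the stage-area-rated area.

```python
import numpy as np

# index velocities from the ADVM paired with mean channel velocities derived
# from concurrent discharge measurements (hypothetical calibration pairs, m/s)
v_index = np.array([0.20, 0.45, 0.70, 0.95, 1.20])
v_mean  = np.array([0.26, 0.55, 0.83, 1.14, 1.41])

slope, intercept = np.polyfit(v_index, v_mean, 1)   # simple-linear index rating

def discharge(v_idx, area):
    V = slope * v_idx + intercept   # rated mean cross-sectional velocity
    return V * area                 # Q = V * A

q = discharge(0.80, 150.0)          # 150 m^2 from the stage-area rating
print(round(q, 1))
```

Periodic validation measurements would be compared against `discharge(...)` predictions and, if they drift, used to refit the rating.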

  6. Validation of RNAi Silencing Efficiency Using Gene Array Data shows 18.5% Failure Rate across 429 Independent Experiments.

    PubMed

    Munkácsy, Gyöngyi; Sztupinszki, Zsófia; Herman, Péter; Bán, Bence; Pénzváltó, Zsófia; Szarvas, Nóra; Győrffy, Balázs

    2016-09-27

    No independent cross-validation of the success rate of studies utilizing small interfering RNA (siRNA) for gene silencing has been completed before. To assess the influence of experimental parameters like cell line, transfection technique, validation method, and type of control, these must be validated in a large set of studies. We utilized gene chip data published for siRNA experiments to assess the success rate and to compare the methods used in these experiments. We searched NCBI GEO for samples with whole-transcriptome analysis before and after gene silencing and evaluated the efficiency for the target and off-target genes using the array-based expression data. The Wilcoxon signed-rank test was used to assess silencing efficacy, and Kruskal-Wallis tests and Spearman rank correlation were used to evaluate study parameters. Altogether 1,643 samples representing 429 experiments published in 207 studies were evaluated. The fold change (FC) of down-regulation of the target gene was above 0.7 in 18.5% and above 0.5 in 38.7% of experiments. Silencing efficiency was lowest in MCF7 and highest in SW480 cells (FC = 0.59 and FC = 0.30, respectively, P = 9.3E-06). Studies utilizing Western blot for validation performed better than those with quantitative polymerase chain reaction (qPCR) or microarray (FC = 0.43, FC = 0.47, and FC = 0.55, respectively, P = 2.8E-04). There was no correlation between type of control, transfection method, publication year, and silencing efficiency. Although gene silencing is a robust feature successfully cross-validated in the majority of experiments, efficiency remained insufficient in a significant proportion of studies. Selection of the cell line model and validation method had the highest influence on silencing proficiency.

  7. Prostate tissue characterization/classification in 144 patient population using wavelet and higher order spectra features from transrectal ultrasound images.

    PubMed

    Pareek, Gyan; Acharya, U Rajendra; Sree, S Vinitha; Swapna, G; Yantri, Ratna; Martis, Roshan Joy; Saba, Luca; Krishnamurthi, Ganapathy; Mallarini, Giorgio; El-Baz, Ayman; Al Ekish, Shadi; Beland, Michael; Suri, Jasjit S

    2013-12-01

    In this work, we have proposed an on-line computer-aided diagnostic system called "UroImage" that classifies a Transrectal Ultrasound (TRUS) image into cancerous or non-cancerous with the help of non-linear Higher Order Spectra (HOS) features and Discrete Wavelet Transform (DWT) coefficients. The UroImage system consists of an on-line system where five significant features (one DWT-based feature and four HOS-based features) are extracted from the test image. These on-line features are transformed by the classifier parameters obtained using the training dataset to determine the class. We trained and tested six classifiers. The dataset used for evaluation had 144 TRUS images which were split into training and testing sets. Three-fold and ten-fold cross-validation protocols were adopted for training and estimating the accuracy of the classifiers. The ground truth used for training was obtained using the biopsy results. Among the six classifiers, using the 10-fold cross-validation technique, the Support Vector Machine and Fuzzy Sugeno classifiers presented the best classification accuracy of 97.9% with equally high values for sensitivity, specificity and positive predictive value. Our proposed automated system, which achieved more than 95% values for all the performance measures, can be an adjunct tool to provide an initial diagnosis for the identification of patients with prostate cancer. The technique, however, inherits the limitations of 2D ultrasound-guided biopsy, and we intend to improve it by using 3D TRUS images in the future.
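The 10-fold protocol used above can be sketched generically. The nearest-centroid classifier and the synthetic two-class "feature" data below are stand-ins (the paper used SVM and Fuzzy Sugeno classifiers on DWT/HOS features from 144 images); only the cross-validation mechanics are the point:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in for 144 five-feature samples, two well-separated classes
X = np.vstack([rng.normal(0, 1, (72, 5)), rng.normal(3, 1, (72, 5))])
y = np.array([0] * 72 + [1] * 72)

def cross_val_accuracy(X, y, k=10, seed=0):
    """Estimate accuracy by k-fold cross-validation with a nearest-centroid classifier."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))
    folds = np.array_split(idx, k)
    accs = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        # Train on the k-1 remaining folds: one centroid per class
        centroids = np.array([X[train][y[train] == c].mean(axis=0) for c in (0, 1)])
        # Predict the held-out fold by nearest centroid
        d = np.linalg.norm(X[test][:, None, :] - centroids[None], axis=2)
        pred = d.argmin(axis=1)
        accs.append((pred == y[test]).mean())
    return float(np.mean(accs))

acc = cross_val_accuracy(X, y)
```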

  8. The cross-cultural equivalence of participation instruments: a systematic review.

    PubMed

    Stevelink, S A M; van Brakel, W H

    2013-07-01

    Concepts such as health-related quality of life, disability and participation may differ across cultures. Consequently, when assessing such a concept using a measure developed elsewhere, it is important to test its cultural equivalence. Previous research suggested a lack of cultural equivalence testing in several areas of measurement. This paper reviews the process of cross-cultural equivalence testing of instruments to measure participation in society. An existing cultural equivalence framework was adapted and used to assess participation instruments on five categories of equivalence: conceptual, item, semantic, measurement and operational equivalence. For each category, several aspects were rated, resulting in an overall category rating of 'minimal/none', 'partial' or 'extensive'. The best possible overall study rating was five 'extensive' ratings. Articles were included if the instruments focussed explicitly on measuring 'participation' and were theoretically grounded in the ICIDH(-2) or ICF. Cross-validation articles were only included if it concerned an adaptation of an instrument developed in a high or middle-income country to a low-income country or vice versa. Eight cross-cultural validation studies were included in which five participation instruments were tested (Impact on Participation and Autonomy, London Handicap Scale, Perceived Impact and Problem Profile, Craig Handicap Assessment Reporting Technique, Participation Scale). Of these eight studies, only three received at least two 'extensive' ratings for the different categories of equivalence. The majority of the cultural equivalence ratings given were 'partial' and 'minimal/none'. The majority of the 'none/minimal' ratings were given for item and measurement equivalence. The cross-cultural equivalence testing of the participation instruments included leaves much to be desired. A detailed checklist is proposed for designing a cross-validation study. 
Once a study has been conducted, the checklist can be used to ensure comprehensive reporting of the validation (equivalence) testing process and its results. • Participation instruments are often used in a different cultural setting than initial developed for. • The conceptualization of participation may vary across cultures. Therefore, cultural equivalence – the extent to which an instrument is equally suitable for use in two or more cultures – is an important concept to address. • This review showed that the process of cultural equivalence testing of the included participation instruments was often addressed insufficiently. • Clinicians should be aware that application of participations instruments in a different culture than initially developed for needs prior testing of cultural validity in the next context.

  9. Field assessment of noncontact stream gauging using portable surface velocity radars (SVR)

    NASA Astrophysics Data System (ADS)

    Welber, Matilde; Le Coz, Jérôme; Laronne, Jonathan B.; Zolezzi, Guido; Zamler, Daniel; Dramais, Guillaume; Hauet, Alexandre; Salvaro, Martino

    2016-02-01

    The applicability of a portable, commercially available surface velocity radar (SVR) for noncontact stream gauging was evaluated through a series of field-scale experiments carried out in a variety of sites and deployment conditions. Comparisons with various concurrent techniques showed acceptable agreement with velocity profiles, with larger uncertainties close to the banks. In addition to discharge error sources shared with intrusive velocity-area techniques, SVR discharge estimates are affected by flood-induced changes in the bed profile and by the selection of a depth-averaged to surface velocity ratio, or velocity coefficient (α). Cross-sectional averaged velocity coefficients showed smaller fluctuations and closer agreement with theoretical values than those computed on individual verticals, especially in channels with high relative roughness. Our findings confirm that α = 0.85 is a valid default value, though site-specific calibration is preferred to avoid underestimation of discharge in very smooth channels (relative roughness ≈ 0.001) and overestimation in very rough channels (relative roughness > 0.05). Theoretically derived and site-calibrated values of α also give accurate SVR-based discharge estimates (within 10%) for low and intermediate roughness flows (relative roughness 0.001 to 0.05). Moreover, discharge uncertainty does not exceed 10% even for a limited number of SVR positions along the cross section (particularly advantageous to gauge unsteady flood flows and very large floods), thereby extending the range of validity of rating curves.
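In code, the role of the velocity coefficient α in converting SVR surface velocities to discharge via the velocity-area method is simply a scaling of each subsection's flow; the cross-section numbers below are invented for illustration:

```python
import numpy as np

alpha = 0.85  # default depth-averaged / surface-velocity ratio from the study
# Hypothetical cross-section subsections: surface velocity (m/s), depth (m), width (m)
v_surf = np.array([0.8, 1.2, 1.4, 1.1, 0.7])
depth  = np.array([0.5, 1.0, 1.3, 1.0, 0.4])
width  = np.array([2.0, 2.0, 2.0, 2.0, 2.0])

# Velocity-area discharge with radar surface velocities scaled by alpha
Q = float(np.sum(alpha * v_surf * depth * width))  # m^3/s
```

A site-calibrated α would replace the 0.85 default; the abstract notes this matters most at the roughness extremes.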

  10. Protocol Design Challenges in the Detection of Awareness in Aware Subjects Using EEG Signals.

    PubMed

    Henriques, J; Gabriel, D; Grigoryeva, L; Haffen, E; Moulin, T; Aubry, R; Pazart, L; Ortega, J-P

    2016-10-01

    Recent studies have evidenced serious difficulties in detecting covert awareness with electroencephalography-based techniques, both in unresponsive patients and in healthy control subjects. This work reproduces the protocol design of two recent mental imagery studies with a larger group comprising 20 healthy volunteers. The main goal is to assess whether modifications in the signal extraction techniques, training-testing/cross-validation routines, and hypotheses evoked in the statistical analysis can provide solutions to the serious difficulties documented in the literature. The lack of robustness in the results argues for a further search for alternative protocols more suitable for machine learning classification and for better performing signal treatment techniques. Specific recommendations are made using the findings in this work.

  11. Comparative Study on the Different Testing Techniques in Tree Classification for Detecting the Learning Motivation

    NASA Astrophysics Data System (ADS)

    Juliane, C.; Arman, A. A.; Sastramihardja, H. S.; Supriana, I.

    2017-03-01

    Motivation to learn is a requirement for success in a learning process and needs to be maintained properly. This study aims to measure learning motivation, especially in the process of electronic learning (e-learning). A data mining approach was chosen as the research method. For the testing process, a comparative study of the accuracy of different testing techniques was conducted, involving Cross Validation and Percentage Split. The best accuracy, 92.19%, was generated by the J48 algorithm with the percentage split technique. This study provides an overview of how to detect the presence of learning motivation in the context of e-learning. It is expected to be a useful contribution to education and to alert teachers to the students they need to motivate.
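The two testing techniques compared in the study can be reproduced generically with scikit-learn. The decision tree below stands in for J48 (Weka's C4.5 implementation), and the iris dataset is only a placeholder for the study's e-learning data:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# Percentage-split estimate (66% train / 34% test, Weka's default split)
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, train_size=0.66, random_state=0, stratify=y)
split_acc = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr).score(X_te, y_te)

# 10-fold cross-validation estimate for the same learner
cv_acc = cross_val_score(DecisionTreeClassifier(random_state=0), X, y, cv=10).mean()
```

The two estimates usually differ slightly; cross-validation uses all of the data for testing, at the cost of k model fits.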

  12. Phase-Based Adaptive Estimation of Magnitude-Squared Coherence Between Turbofan Internal Sensors and Far-Field Microphone Signals

    NASA Technical Reports Server (NTRS)

    Miles, Jeffrey Hilton

    2015-01-01

    A cross-power spectrum phase based adaptive technique is discussed which iteratively determines the time delay between two coherent digitized signals. The adaptive delay algorithm belongs to a class of algorithms that identify a minimum of a pattern matching function. The algorithm uses a gradient technique to find the value of the adaptive delay that minimizes a cost function based in part on the slope of a linear function that fits the measured cross power spectrum phase and in part on the standard error of the curve fit. This procedure is applied to data from a Honeywell TECH977 static-engine test. Data were obtained using a combustor probe, two turbine exit probes, and far-field microphones. Signals from this instrumentation are used to estimate the post-combustion residence time in the combustor. Comparison with previous studies of the post-combustion residence time validates this approach. In addition, the procedure removes the bias due to misalignment of signals in the calculation of coherence, which is a first step in applying array processing methods to the magnitude-squared coherence data. The procedure also provides an estimate of the cross-spectrum phase offset.
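The core idea, recovering a time delay from the slope of the cross-power-spectrum phase, can be sketched with a noise-free synthetic pair; the paper's gradient-based adaptive cost function is replaced here by a plain least-squares fit to the unwrapped phase:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1024
true_delay = 7  # samples
x = rng.normal(size=n)
y = np.roll(x, true_delay)  # y lags x by 7 samples (circular shift)

# Cross-power spectrum: phase is -2*pi*f*delay for a pure delay
X, Y = np.fft.rfft(x), np.fft.rfft(y)
cross = np.conj(X) * Y
phase = np.unwrap(np.angle(cross))
freqs = np.arange(len(cross)) / n  # cycles/sample

# Least-squares slope of phase vs. frequency over the lower half of the band
m = len(freqs) // 2
slope = np.polyfit(freqs[:m], phase[:m], 1)[0]
delay_est = -slope / (2 * np.pi)  # samples
```

With real, partially coherent signals the phase is noisy, which is why the paper weights the fit by its standard error and iterates.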

  13. Lessons Learned from a Cross-Model Validation between a Discrete Event Simulation Model and a Cohort State-Transition Model for Personalized Breast Cancer Treatment.

    PubMed

    Jahn, Beate; Rochau, Ursula; Kurzthaler, Christina; Paulden, Mike; Kluibenschädl, Martina; Arvandi, Marjan; Kühne, Felicitas; Goehler, Alexander; Krahn, Murray D; Siebert, Uwe

    2016-04-01

    Breast cancer is the most common malignancy among women in developed countries. We developed a model (the Oncotyrol breast cancer outcomes model) to evaluate the cost-effectiveness of a 21-gene assay when used in combination with Adjuvant! Online to support personalized decisions about the use of adjuvant chemotherapy. The goal of this study was to perform a cross-model validation. The Oncotyrol model evaluates the 21-gene assay by simulating a hypothetical cohort of 50-year-old women over a lifetime horizon using discrete event simulation. Primary model outcomes were life-years, quality-adjusted life-years (QALYs), costs, and incremental cost-effectiveness ratios (ICERs). We followed the International Society for Pharmacoeconomics and Outcomes Research-Society for Medical Decision Making (ISPOR-SMDM) best practice recommendations for validation and compared modeling results of the Oncotyrol model with the state-transition model developed by the Toronto Health Economics and Technology Assessment (THETA) Collaborative. Both models were populated with Canadian THETA model parameters, and outputs were compared. The differences between the models varied among the different validation end points. The smallest relative differences were in costs, and the greatest were in QALYs. All relative differences were less than 1.2%. The cost-effectiveness plane showed that small differences in the model structure can lead to different sets of nondominated test-treatment strategies with different efficiency frontiers. We faced several challenges: distinguishing between differences in outcomes due to different modeling techniques and initial coding errors, defining meaningful differences, and selecting measures and statistics for comparison (means, distributions, multivariate outcomes). Cross-model validation was crucial to identify and correct coding errors and to explain differences in model outcomes. 
In our comparison, small differences in either QALYs or costs led to changes in ICERs because of changes in the set of dominated and nondominated strategies.

  14. Double Cross-Validation in Multiple Regression: A Method of Estimating the Stability of Results.

    ERIC Educational Resources Information Center

    Rowell, R. Kevin

    In multiple regression analysis, where resulting predictive equation effectiveness is subject to shrinkage, it is especially important to evaluate result replicability. Double cross-validation is an empirical method by which an estimate of invariance or stability can be obtained from research data. A procedure for double cross-validation is…
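A sketch of double cross-validation for multiple regression, on invented data: fit an ordinary-least-squares equation on each half-sample, predict the held-out half, and take the two cross-correlations as stability estimates:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200
X = rng.normal(size=(n, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.5, size=n)

half = n // 2
A, B = np.arange(half), np.arange(half, n)  # random order, so a simple split suffices

def fit_predict(train, test):
    """OLS with intercept fit on `train`, predictions for `test`."""
    Xt = np.column_stack([np.ones(len(train)), X[train]])
    beta, *_ = np.linalg.lstsq(Xt, y[train], rcond=None)
    Xs = np.column_stack([np.ones(len(test)), X[test]])
    return Xs @ beta

# Cross-validity correlations: equation from A applied to B, and vice versa
r_ab = np.corrcoef(fit_predict(A, B), y[B])[0, 1]
r_ba = np.corrcoef(fit_predict(B, A), y[A])[0, 1]
```

If both cross-correlations are high and close to each other, the regression weights replicate; a large drop relative to the full-sample multiple R indicates shrinkage.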

  15. Signal processing and neural network toolbox and its application to failure diagnosis and prognosis

    NASA Astrophysics Data System (ADS)

    Tu, Fang; Wen, Fang; Willett, Peter K.; Pattipati, Krishna R.; Jordan, Eric H.

    2001-07-01

    Many systems are comprised of components equipped with self-testing capability; however, if the system is complex, involves feedback, and the self-testing itself may occasionally be faulty, tracing faults to a single cause or to multiple causes is difficult. Moreover, many sensors are incapable of reliable decision-making on their own. In such cases, a signal processing front-end that can match inference needs will be very helpful. The work is concerned with providing an object-oriented simulation environment for signal processing and neural network-based fault diagnosis and prognosis. In the toolbox, we implemented a wide range of spectral and statistical manipulation methods such as filters, harmonic analyzers, transient detectors, and multi-resolution decomposition to extract features for failure events from data collected by data sensors. Then we evaluated multiple learning paradigms for general classification, diagnosis and prognosis. The network models evaluated include Restricted Coulomb Energy (RCE) Neural Network, Learning Vector Quantization (LVQ), Decision Trees (C4.5), Fuzzy Adaptive Resonance Theory (FuzzyArtmap), Linear Discriminant Rule (LDR), Quadratic Discriminant Rule (QDR), Radial Basis Functions (RBF), Multiple Layer Perceptrons (MLP) and Single Layer Perceptrons (SLP). Validation techniques, such as N-fold cross-validation and bootstrap techniques, were employed for evaluating the robustness of the network models. The trained networks were evaluated for their performance using test data on the basis of percent error rates obtained via cross-validation, time efficiency, and generalization ability to unseen faults. Finally, the usage of neural networks for the prediction of residual life of turbine blades with thermal barrier coatings is described and the results are shown. The neural network toolbox has also been applied to fault diagnosis in mixed-signal circuits.
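Bootstrap evaluation of a classifier's error rate, one of the validation techniques mentioned, can be sketched as follows; the per-sample correctness indicators are synthetic stand-ins for a real model's test predictions:

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical per-sample correctness of some classifier on 200 test cases (~90% accurate)
correct = rng.random(200) < 0.9

# Resample the test cases with replacement to get an accuracy distribution
boot = [correct[rng.integers(0, len(correct), len(correct))].mean()
        for _ in range(1000)]
lo, hi = np.percentile(boot, [2.5, 97.5])  # 95% percentile interval for accuracy
```

The width of the interval conveys how much of the reported accuracy is sampling noise, which is what robustness checks of this kind are after.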

  16. Cross-Validating Chinese Language Mental Health Recovery Measures in Hong Kong

    ERIC Educational Resources Information Center

    Bola, John; Chan, Tiffany Hill Ching; Chen, Eric HY; Ng, Roger

    2016-01-01

    Objectives: Promoting recovery in mental health services is hampered by a shortage of reliable and valid measures, particularly in Hong Kong. We seek to cross validate two Chinese language measures of recovery and one of recovery-promoting environments. Method: A cross-sectional survey of people recovering from early episode psychosis (n = 121)…

  17. Quantitative determination and classification of energy drinks using near-infrared spectroscopy.

    PubMed

    Rácz, Anita; Héberger, Károly; Fodor, Marietta

    2016-09-01

    Almost a hundred commercially available energy drink samples from Hungary, Slovakia, and Greece were collected for the quantitative determination of their caffeine and sugar content with FT-NIR spectroscopy and high-performance liquid chromatography (HPLC). Calibration models were built with partial least-squares regression (PLSR). An HPLC-UV method was used to measure the reference values for caffeine content, while sugar contents were measured with the Schoorl method. Both the nominal sugar content (as indicated on the cans) and the measured sugar concentration were used as references. Although the Schoorl method has larger error and bias, appropriate models could be developed using both references. The validation of the models was based on sevenfold cross-validation and external validation. FT-NIR analysis is a good candidate to replace the HPLC-UV method, because it is much cheaper than any chromatographic method and also more time-efficient. The combination of FT-NIR with multidimensional chemometric techniques like PLSR can be a good option for the detection of low caffeine concentrations in energy drinks. Moreover, three types of energy drinks that contain (i) taurine, (ii) arginine, and (iii) neither of these two components were classified correctly using principal component analysis and linear discriminant analysis. Such classifications are important for the detection of adulterated samples and for quality control. In this case, more than a hundred samples were used for the evaluation. The classification was validated with cross-validation and several randomization tests (X-scrambling).

  18. Post-partum depression in Kinshasa, Democratic Republic of Congo: validation of a concept using a mixed-methods cross-cultural approach.

    PubMed

    Bass, Judith K; Ryder, Robert W; Lammers, Marie-Christine; Mukaba, Thibaut N; Bolton, Paul A

    2008-12-01

    To determine if a post-partum depression syndrome exists among mothers in Kinshasa, Democratic Republic of Congo, by adapting and validating standard screening instruments. Using qualitative interviewing techniques, we interviewed a convenience sample of 80 women living in a large peri-urban community to better understand local conceptions of mental illness. We used this information to adapt two standard depression screeners, the Edinburgh Post-partum Depression Scale and the Hopkins Symptom Checklist. In a subsequent quantitative study, we identified another 133 women with and without the local depression syndrome and used this information to validate the adapted screening instruments. Based on the qualitative data, we found a local syndrome that closely approximates the Western model of major depressive disorder. The women we interviewed, representative of the local populace, considered this an important syndrome among new mothers because it negatively affects women and their young children. Women (n = 41) identified as suffering from this syndrome had statistically significantly higher depression severity scores on both adapted screeners than women identified as not having this syndrome (n = 20; P < 0.0001). When it is unclear or unknown if Western models of psychopathology are appropriate for use in the local context, these models must be validated to ensure cross-cultural applicability. Using a mixed-methods approach we found a local syndrome similar to depression and validated instruments to screen for this disorder. As the importance of compromised mental health in developing world populations becomes recognized, the methods described in this report will be useful more widely.

  19. Single-station monitoring of volcanoes using seismic ambient noise

    NASA Astrophysics Data System (ADS)

    De Plaen, Raphael S. M.; Lecocq, Thomas; Caudron, Corentin; Ferrazzini, Valérie; Francis, Olivier

    2016-08-01

    Seismic ambient noise cross correlation is increasingly used to monitor volcanic activity. However, this method is usually limited to volcanoes equipped with large and dense networks of broadband stations. The single-station approach may provide a powerful and reliable alternative to the classical "cross-station" approach when measuring variation of seismic velocities. We implemented it on the Piton de la Fournaise in Reunion Island, a very active volcano with a remarkable multidisciplinary continuous monitoring. Over the past decade, this volcano has been increasingly studied using the traditional cross-correlation technique and therefore represents a unique laboratory to validate our approach. Our results, tested on stations located up to 3.5 km from the eruptive site, performed as well as the classical approach to detect the volcanic eruption in the 1-2 Hz frequency band. This opens new perspectives to successfully forecast volcanic activity at volcanoes equipped with a single three-component seismometer.

  20. Detrended fluctuation analysis for major depressive disorder.

    PubMed

    Mumtaz, Wajid; Malik, Aamir Saeed; Ali, Syed Saad Azhar; Yasin, Mohd Azhar Mohd; Amin, Hafeezullah

    2015-01-01

    The clinical utility of electroencephalography (EEG) based diagnostic studies is less clear for major depressive disorder (MDD). In this paper, a novel machine learning (ML) scheme is presented to discriminate MDD patients from healthy controls. The proposed method inherently involves feature extraction, selection, classification and validation. The EEG data acquisition involved eyes closed (EC) and eyes open (EO) conditions. At the feature extraction stage, detrended fluctuation analysis (DFA) was performed on the EEG data to obtain scaling exponents. The DFA was performed to analyze the presence or absence of long-range temporal correlations (LRTC) in the recorded EEG data. The scaling exponents were used as input features to our proposed system. At the feature selection stage, 3 different techniques were used for comparison purposes. A logistic regression (LR) classifier was employed. The method was validated by 10-fold cross-validation. As a result, we observed the effect of 3 different reference montages on the computed features. The results show that the DFA analysis performed better on LE data compared with the IR and AR data. In addition, during Wilcoxon ranking, the AR performed better than LE and IR. Based on the results, it was concluded that the DFA provides useful information to discriminate MDD patients and, with further validation, can be employed in clinics for the diagnosis of MDD.
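A minimal DFA sketch with first-order detrending, recovering the scaling exponent for white noise (for which α ≈ 0.5 is expected); the EEG-specific preprocessing and montages of the study are not reproduced:

```python
import numpy as np

def dfa_exponent(x, scales=(4, 8, 16, 32, 64)):
    """Scaling exponent from detrended fluctuation analysis (linear detrending)."""
    profile = np.cumsum(x - np.mean(x))  # integrate the mean-centered series
    F = []
    for s in scales:
        n_win = len(profile) // s
        sq = []
        for i in range(n_win):
            seg = profile[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)  # local linear trend
            sq.append(np.mean((seg - trend) ** 2))
        F.append(np.sqrt(np.mean(sq)))  # RMS fluctuation at scale s
    # Slope of log F(s) vs. log s is the scaling exponent alpha
    return float(np.polyfit(np.log(scales), np.log(F), 1)[0])

x = np.random.default_rng(0).normal(size=4096)  # white noise: alpha ~ 0.5
alpha_hat = dfa_exponent(x)
```

Exponents near 0.5 indicate no long-range temporal correlations; values approaching 1 indicate persistent LRTC of the kind the study looks for in EEG.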

  1. Computationally efficient confidence intervals for cross-validated area under the ROC curve estimates.

    PubMed

    LeDell, Erin; Petersen, Maya; van der Laan, Mark

    In binary classification problems, the area under the ROC curve (AUC) is commonly used to evaluate the performance of a prediction model. Often, it is combined with cross-validation in order to assess how the results will generalize to an independent data set. In order to evaluate the quality of an estimate for cross-validated AUC, we obtain an estimate of its variance. For massive data sets, the process of generating a single performance estimate can be computationally expensive. Additionally, when using a complex prediction method, the process of cross-validating a predictive model on even a relatively small data set can still require a large amount of computation time. Thus, in many practical settings, the bootstrap is a computationally intractable approach to variance estimation. As an alternative to the bootstrap, we demonstrate a computationally efficient influence curve based approach to obtaining a variance estimate for cross-validated AUC.
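A hedged sketch of a rank-free pairwise AUC with an influence-curve-based standard error; this follows one common form of the AUC influence function and may differ in detail from the estimator developed in the paper:

```python
import numpy as np

def auc_and_se(scores, labels):
    """Mann-Whitney AUC and an influence-curve standard error (one common form)."""
    s1, s0 = scores[labels == 1], scores[labels == 0]
    n = len(scores)
    p1 = len(s1) / n
    # Pairwise comparisons, counting ties as half wins
    wins = (s1[:, None] > s0[None, :]) + 0.5 * (s1[:, None] == s0[None, :])
    theta = wins.mean()
    # Empirical influence curve: each observation's contribution to the estimate
    ic = np.empty(n)
    for i in range(n):
        if labels[i] == 1:
            ic[i] = ((s0 < scores[i]).mean()
                     + 0.5 * (s0 == scores[i]).mean() - theta) / p1
        else:
            ic[i] = ((s1 > scores[i]).mean()
                     + 0.5 * (s1 == scores[i]).mean() - theta) / (1 - p1)
    se = float(np.sqrt(np.mean(ic ** 2) / n))
    return float(theta), se

rng = np.random.default_rng(0)
scores = np.concatenate([rng.normal(0, 1, 60), rng.normal(1, 1, 60)])
labels = np.array([0] * 60 + [1] * 60)
theta, se = auc_and_se(scores, labels)  # point estimate and standard error
```

Unlike the bootstrap, this requires only one pass over the data, which is the computational advantage the abstract emphasizes; for cross-validated AUC the same idea is applied fold by fold.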

  2. Empirical Performance of Cross-Validation With Oracle Methods in a Genomics Context.

    PubMed

    Martinez, Josue G; Carroll, Raymond J; Müller, Samuel; Sampson, Joshua N; Chatterjee, Nilanjan

    2011-11-01

    When employing model selection methods with oracle properties such as the smoothly clipped absolute deviation (SCAD) and the Adaptive Lasso, it is typical to estimate the smoothing parameter by m-fold cross-validation, for example, m = 10. In problems where the true regression function is sparse and the signals large, such cross-validation typically works well. However, in regression modeling of genomic studies involving Single Nucleotide Polymorphisms (SNP), the true regression functions, while thought to be sparse, do not have large signals. We demonstrate empirically that in such problems, the number of selected variables using SCAD and the Adaptive Lasso, with 10-fold cross-validation, is a random variable that has considerable and surprising variation. Similar remarks apply to non-oracle methods such as the Lasso. Our study strongly questions the suitability of performing only a single run of m-fold cross-validation with any oracle method, and not just the SCAD and Adaptive Lasso.
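The variability the authors report can be illustrated with scikit-learn's `LassoCV` (standing in for the oracle methods, since the abstract notes similar remarks apply to the Lasso); the data are synthetic, with sparse, weak signals:

```python
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.model_selection import KFold

rng = np.random.default_rng(0)
n, p = 80, 40
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:4] = 0.5                      # sparse, weak signals (the hard regime)
y = X @ beta + rng.normal(size=n)

# Re-run 10-fold CV tuning with different fold assignments
counts = []
for seed in range(5):
    cv = KFold(10, shuffle=True, random_state=seed)
    model = LassoCV(cv=cv).fit(X, y)
    counts.append(int(np.sum(model.coef_ != 0)))
```

Inspecting `counts` across seeds typically shows the selected-model size changing with nothing but the random fold assignment, which is the abstract's warning against a single CV run.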

  3. Computationally efficient confidence intervals for cross-validated area under the ROC curve estimates

    PubMed Central

    Petersen, Maya; van der Laan, Mark

    2015-01-01

    In binary classification problems, the area under the ROC curve (AUC) is commonly used to evaluate the performance of a prediction model. Often, it is combined with cross-validation in order to assess how the results will generalize to an independent data set. In order to evaluate the quality of an estimate for cross-validated AUC, we obtain an estimate of its variance. For massive data sets, the process of generating a single performance estimate can be computationally expensive. Additionally, when using a complex prediction method, the process of cross-validating a predictive model on even a relatively small data set can still require a large amount of computation time. Thus, in many practical settings, the bootstrap is a computationally intractable approach to variance estimation. As an alternative to the bootstrap, we demonstrate a computationally efficient influence curve based approach to obtaining a variance estimate for cross-validated AUC. PMID:26279737

  4. Crossing trend analysis methodology and application for Turkish rainfall records

    NASA Astrophysics Data System (ADS)

    Şen, Zekâi

    2018-01-01

    Trend analyses are necessary tools for depicting a possible general increase or decrease in a given time series. There are many versions of trend identification methodologies, such as the Mann-Kendall trend test, Spearman's tau, Sen's slope, regression line, and Şen's innovative trend analysis. The literature has many papers about the use, pros and cons, and comparisons of these methodologies. In this paper, a completely new approach is proposed based on the crossing properties of a time series. It is suggested that the suitable trend through the centroid of the given time series should have the maximum number of crossings (total number of up-crossings or down-crossings). This approach is applicable whether the time series has a dependent or independent structure, and it does not depend on the type of the probability distribution function. The validity of this method is demonstrated through an extensive Monte Carlo simulation technique and a comparison with other existing trend identification methodologies. The methodology is applied to a set of annual daily extreme rainfall time series from different parts of Turkey, which have a physically independent structure.
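Counting up-crossings of a candidate trend line, the quantity maximized in the proposed approach, is straightforward; here the level is simply the series mean (its centroid), as a minimal illustration:

```python
import numpy as np

def up_crossings(x, level):
    """Number of up-crossings of `level`: transitions from at-or-below to above."""
    above = np.asarray(x) > level
    return int(np.sum(~above[:-1] & above[1:]))

x = np.array([0.0, 2.0, 0.0, 2.0, 0.0, 2.0])
n_up = up_crossings(x, x.mean())
```

In the full method, candidate trend slopes through the centroid would each be scored this way, and the slope with the maximum crossing count retained.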

  5. On the prompt identification of traces of explosives

    NASA Astrophysics Data System (ADS)

    Trobajo, M. T.; López-Cabeceira, M. M.; Carriegos, M. V.; Díez-Machío, H.

    2014-12-01

    Some recent results in the use of Raman spectroscopy for the recognition of explosives are reviewed. An experimental study using a spectral database has been developed. In order to simulate a more realistic situation, both blank substances and explosive substances were considered in this research. Statistical classification techniques were applied. Estimates of prediction error were obtained by cross-validation methods. These results can be applied in airport security systems in order to prevent acts of terror (through the detection of explosive/flammable substances).

  6. Broadband computation of the scattering coefficients of infinite arbitrary cylinders.

    PubMed

    Blanchard, Cédric; Guizal, Brahim; Felbacq, Didier

    2012-07-01

    We employ a time-domain method to compute the near field on a contour enclosing infinitely long cylinders of arbitrary cross section and constitution. We therefore recover the cylindrical Hankel coefficients of the expansion of the field outside the circumscribed circle of the structure. The recovered coefficients enable the wideband analysis of complex systems, e.g., the determination of the radar cross section becomes straightforward. The prescription for constructing such a numerical tool is provided in great detail. The method is validated by computing the scattering coefficients for a homogeneous circular cylinder illuminated by a plane wave, a problem for which an analytical solution exists. Finally, some radiation properties of an optical antenna are examined by employing the proposed technique.

  7. Cross-cultural examination of measurement invariance of the Beck Depression Inventory-II.

    PubMed

    Dere, Jessica; Watters, Carolyn A; Yu, Stephanie Chee-Min; Bagby, R Michael; Ryder, Andrew G; Harkness, Kate L

    2015-03-01

    Given substantial rates of major depressive disorder among college and university students, as well as the growing cultural diversity on many campuses, establishing the cross-cultural validity of relevant assessment tools is important. In the current investigation, we examined the Beck Depression Inventory-Second Edition (BDI-II; Beck, Steer, & Brown, 1996) among Chinese-heritage (n = 933) and European-heritage (n = 933) undergraduates in North America. The investigation integrated 3 distinct lines of inquiry: (a) the literature on cultural variation in depressive symptom reporting between people of Chinese and Western heritage; (b) recent developments regarding the factor structure of the BDI-II; and (c) the application of advanced statistical techniques to the issue of cross-cultural measurement invariance. A bifactor model was found to represent the optimal factor structure of the BDI-II. Multigroup confirmatory factor analysis showed that the BDI-II had strong measurement invariance across both culture and gender. In group comparisons with latent and observed variables, Chinese-heritage students scored higher than European-heritage students on cognitive symptoms of depression. This finding deviates from the commonly held view that those of Chinese heritage somatize depression. These findings hold implications for the study and use of the BDI-II, highlight the value of advanced statistical techniques such as multigroup confirmatory factor analysis, and offer methodological lessons for cross-cultural psychopathology research more broadly.

  8. Cross-Validation of easyCBM Reading Cut Scores in Washington: 2009-2010. Technical Report #1109

    ERIC Educational Resources Information Center

    Irvin, P. Shawn; Park, Bitnara Jasmine; Anderson, Daniel; Alonzo, Julie; Tindal, Gerald

    2011-01-01

    This technical report presents results from a cross-validation study designed to identify optimal cut scores when using easyCBM[R] reading tests in Washington state. The cross-validation study analyzes data from the 2009-2010 academic year for easyCBM[R] reading measures. A sample of approximately 900 students per grade, randomly split into two…

  9. Criterion for evaluating the predictive ability of nonlinear regression models without cross-validation.

    PubMed

    Kaneko, Hiromasa; Funatsu, Kimito

    2013-09-23

    We propose predictive performance criteria for nonlinear regression models without cross-validation. The proposed criteria are the determination coefficient and the root-mean-square error for the midpoints between k-nearest-neighbor data points. These criteria can be used to evaluate predictive ability after the regression models are updated, whereas cross-validation cannot be performed in such a situation. The proposed method is effective and helpful in handling big data when cross-validation cannot be applied. By analyzing data from numerical simulations and quantitative structural relationships, we confirm that the proposed criteria enable the predictive ability of the nonlinear regression models to be appropriately quantified.
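A sketch of the midpoint idea on invented data: evaluate a fitted model at midpoints between nearest-neighbour pairs, using the average of the two observed responses as the reference value. Details of the published criterion may differ; the closed-form "model" below is a stand-in for a trained nonlinear regressor:

```python
import numpy as np

rng = np.random.default_rng(4)
X = rng.uniform(-2, 2, size=(150, 2))
y = X[:, 0] ** 2 + X[:, 1] + rng.normal(scale=0.1, size=150)

# Stand-in for a fitted nonlinear model's predictor (assumed, for illustration)
predict = lambda Z: Z[:, 0] ** 2 + Z[:, 1]

# Midpoint between each sample and its nearest neighbour in feature space
d = np.linalg.norm(X[:, None] - X[None], axis=2)
np.fill_diagonal(d, np.inf)
nn = d.argmin(axis=1)
X_mid = (X + X[nn]) / 2
y_mid = (y + y[nn]) / 2          # interpolated reference response at the midpoint

# Criterion values computed without any cross-validation
resid = y_mid - predict(X_mid)
rmse = float(np.sqrt(np.mean(resid ** 2)))
r2 = float(1 - np.mean(resid ** 2) / np.var(y_mid))
```

Because the midpoints were never training points, a model that merely memorizes the data scores poorly here, which is what lets the criterion track predictive ability without held-out folds.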

  10. Combining NMR spectral and structural data to form models of polychlorinated dibenzodioxins, dibenzofurans, and biphenyls binding to the AhR

    NASA Astrophysics Data System (ADS)

    Beger, Richard D.; Buzatu, Dan A.; Wilkes, Jon G.

    2002-10-01

    A three-dimensional quantitative spectrometric data-activity relationship (3D-QSDAR) modeling technique which uses NMR spectral and structural information that is combined in a 3D-connectivity matrix has been developed. A 3D-connectivity matrix was built by displaying all possible assigned carbon NMR chemical shifts, carbon-to-carbon connections, and distances between the carbons. Two-dimensional 13C-13C COSY and 2D slices from the distance dimension of the 3D-connectivity matrix were used to produce a relationship among the 2D spectral patterns for polychlorinated dibenzofurans, dibenzodioxins, and biphenyls (PCDFs, PCDDs, and PCBs respectively) binding to the aryl hydrocarbon receptor (AhR). We refer to this technique as comparative structural connectivity spectral analysis (CoSCoSA) modeling. All CoSCoSA models were developed using forward multiple linear regression analysis of the predicted 13C NMR structure-connectivity spectral bins. A CoSCoSA model for 26 PCDFs had an explained variance (r²) of 0.93 and an average leave-four-out cross-validated variance (q4²) of 0.89. A CoSCoSA model for 14 PCDDs produced an r² of 0.90 and an average leave-two-out cross-validated variance (q2²) of 0.79. One CoSCoSA model for 12 PCBs gave an r² of 0.91 and an average q2² of 0.80. Another CoSCoSA model for all 52 compounds had an r² of 0.85 and an average q4² of 0.52. Major benefits of CoSCoSA modeling include ease of development since the technique does not use molecular docking routines.
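
    The leave-k-out q² statistics reported above follow the generic recipe below (a sketch with plain least squares standing in for the forward-MLR-on-spectral-bins models; the disjoint-group partitioning scheme is an assumption):

```python
import numpy as np

def q2_leave_k_out(X, y, k=2, seed=0):
    """Leave-k-out cross-validated variance: hold out disjoint groups of k
    samples, predict each group from a least-squares fit on the rest, and
    return q^2 = 1 - PRESS / total sum of squares."""
    n = len(y)
    Xb = np.column_stack([np.ones(n), X])       # design matrix with intercept
    order = np.random.default_rng(seed).permutation(n)
    press = 0.0
    for start in range(0, n, k):
        held = order[start:start + k]
        keep = np.ones(n, dtype=bool)
        keep[held] = False
        beta, *_ = np.linalg.lstsq(Xb[keep], y[keep], rcond=None)
        press += float(np.sum((y[held] - Xb[held] @ beta) ** 2))
    return 1.0 - press / float(np.sum((y - y.mean()) ** 2))

# toy check: a noise-free linear relation is predicted almost perfectly
x = np.arange(10.0)
q2 = q2_leave_k_out(x, 3.0 * x + 1.0, k=2)
```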

  11. A general Bayesian image reconstruction algorithm with entropy prior: Preliminary application to HST data

    NASA Astrophysics Data System (ADS)

    Nunez, Jorge; Llacer, Jorge

    1993-10-01

    This paper describes a general Bayesian iterative algorithm with entropy prior for image reconstruction. It solves the cases of both pure Poisson data and Poisson data with Gaussian readout noise. The algorithm maintains positivity of the solution; it includes case-specific prior information (default map) and flatfield corrections; it removes background and can be accelerated to be faster than the Richardson-Lucy algorithm. In order to determine the hyperparameter that balances the entropy and likelihood terms in the Bayesian approach, we have used a likelihood cross-validation technique. Cross-validation is more robust than other methods because it is less demanding in terms of the knowledge of exact data characteristics and of the point-spread function. We have used the algorithm to reconstruct successfully images obtained in different space- and ground-based imaging situations. It has been possible to recover most of the original intended capabilities of the Hubble Space Telescope (HST) wide field and planetary camera (WFPC) and faint object camera (FOC) from images obtained in their present state. Semireal simulations for the future wide field planetary camera 2 show that even after the repair of the spherical aberration problem, image reconstruction can play a key role in improving the resolution of the cameras, well beyond the design of the Hubble instruments. We also show that ground-based images can be reconstructed successfully with the algorithm. A technique which consists of dividing the CCD observations into two frames, with one-half the exposure time each, emerges as a recommended procedure for the utilization of the described algorithms. We have compared our technique with two commonly used reconstruction algorithms: the Richardson-Lucy and the Cambridge maximum entropy algorithms.
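
    The two-frame procedure, with the hyperparameter chosen by cross-validated likelihood, can be caricatured in one dimension (entirely a toy sketch, not the reconstruction pipeline: a Gaussian smoothing width plays the role of the hyperparameter, and the Poisson log-likelihood of the held-out frame scores each candidate):

```python
import numpy as np

rng = np.random.default_rng(3)
truth = 50.0 * np.exp(-0.5 * ((np.arange(100) - 50) / 8.0) ** 2) + 5.0
frame_a = rng.poisson(truth / 2)   # two half-exposure frames, as the paper
frame_b = rng.poisson(truth / 2)   # recommends for CCD observations

def smooth(counts, width):
    """Stand-in 'reconstruction': Gaussian smoothing with the given width."""
    kernel = np.exp(-0.5 * (np.arange(-25, 26) / width) ** 2)
    return np.convolve(counts, kernel / kernel.sum(), mode="same")

def poisson_loglik(lam, k):
    """Poisson log-likelihood of counts k under rates lam (log k! dropped)."""
    lam = np.clip(lam, 1e-9, None)
    return float(np.sum(k * np.log(lam) - lam))

# score each hyperparameter by how well a model fit to frame A
# predicts the independent frame B, then keep the best one
widths = [0.5, 2, 4, 8, 16, 32]
scores = {w: poisson_loglik(smooth(frame_a, w), frame_b) for w in widths}
best = max(scores, key=scores.get)
```

    An unsmoothed model (width 0.5) fits the noise of frame A and predicts frame B poorly, which is exactly the overfitting that cross-validation penalizes.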

  12. Ensemble classification of individual Pinus crowns from multispectral satellite imagery and airborne LiDAR

    NASA Astrophysics Data System (ADS)

    Kukunda, Collins B.; Duque-Lazo, Joaquín; González-Ferreiro, Eduardo; Thaden, Hauke; Kleinn, Christoph

    2018-03-01

    Distinguishing tree species is relevant in many contexts of remote sensing assisted forest inventory. Accurate tree species maps support management and conservation planning, pest and disease control and biomass estimation. This study evaluated the performance of applying ensemble techniques with the goal of automatically distinguishing Pinus sylvestris L. and Pinus uncinata Mill. Ex Mirb within a 1.3 km2 mountainous area in Barcelonnette (France). Three modelling schemes were examined, based on: (1) high-density LiDAR data (160 returns m-2), (2) Worldview-2 multispectral imagery, and (3) Worldview-2 and LiDAR in combination. Variables related to the crown structure and height of individual trees were extracted from the normalized LiDAR point cloud at individual-tree level, after performing individual tree crown (ITC) delineation. Vegetation indices and the Haralick texture indices were derived from Worldview-2 images and served as independent spectral variables. Selection of the best predictor subset was done after a comparison of three variable selection procedures: (1) Random Forests with cross validation (AUCRFcv), (2) Akaike Information Criterion (AIC) and (3) Bayesian Information Criterion (BIC). To classify the species, 9 regression techniques were combined using ensemble models. Predictions were evaluated using cross validation and an independent dataset. Integration of datasets and models improved individual tree species classification (True Skills Statistic, TSS; from 0.67 to 0.81) over individual techniques and maintained strong predictive power (Relative Operating Characteristic, ROC = 0.91). Assemblage of regression models and integration of the datasets provided more reliable species distribution maps and associated tree-scale mapping uncertainties. Our study highlights the potential of model and data assemblage at improving species classifications needed in present-day forest planning and management.
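
    The True Skill Statistic quoted above is computed from the confusion matrix; a minimal sketch (the counts are invented):

```python
def true_skill_statistic(tp, fn, fp, tn):
    """TSS = sensitivity + specificity - 1; ranges from -1 to 1,
    with 0 meaning no skill beyond chance."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity + specificity - 1.0

# hypothetical confusion-matrix counts for a two-species classification
tss = true_skill_statistic(tp=40, fn=10, fp=5, tn=45)   # 0.8 + 0.9 - 1 = 0.7
```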

  13. A cross-validation Delphi method approach to the diagnosis and treatment of personality disorders in older adults.

    PubMed

    Rosowsky, Erlene; Young, Alexander S; Malloy, Mary C; van Alphen, S P J; Ellison, James M

    2018-03-01

    The Delphi method is a consensus-building technique using expert opinion to formulate a shared framework for understanding a topic with limited empirical support. This cross-validation study replicates one completed in the Netherlands and Belgium, and explores US experts' views on the diagnosis and treatment of older adults with personality disorders (PD). Twenty-one geriatric PD experts participated in a Delphi survey addressing diagnosis and treatment of older adults with PD. The European survey was translated and administered electronically. First-round consensus was reached for 16 out of 18 items relevant to diagnosis and specific mental health programs for personality disorders in older adults. Experts agreed on the usefulness of establishing criteria for specific types of treatments. The majority of psychologists did not initially agree on the usefulness of pharmacotherapy. Expert consensus was reached following two subsequent rounds after clarification addressing medication use. Study results suggest consensus among experts regarding psychosocial treatments. Limited acceptance amongst US psychologists of the suitability of pharmacotherapy for late-life PDs contrasted with the views expressed by experts surveyed in the Netherlands and Belgium studies.

  14. Ultrabroadband photonic internet: safety aspects

    NASA Astrophysics Data System (ADS)

    Kalicki, Arkadiusz; Romaniuk, Ryszard

    2008-11-01

    Web applications have become the most popular medium on the Internet. Their popularity and the ease of use of web application frameworks, combined with careless development, result in a high number of vulnerabilities and attacks. Several types of attacks are possible because of improper input validation. SQL injection is the ability to execute arbitrary SQL queries in a database through an existing application. Cross-site scripting is a vulnerability that allows malicious web users to inject code into web pages viewed by other users. Cross-Site Request Forgery (CSRF) is an attack that tricks the victim into loading a page that contains a malicious request. Web spam in blogs is a further problem. There are several techniques to mitigate these attacks. Most important are strong web application design, correct input validation, defined data types for each field, and parameterized statements in SQL queries. Server hardening with a firewall, modern security policy systems, and safe web framework interpreter configuration are essential. It is advisable to maintain a proper security level on the client side, keep software updated, and install personal web firewalls or IDS/IPS systems. Good habits include logging out from services just after finishing work and using a separate web browser for the most important sites, such as e-banking.
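
    Of the mitigations listed, parameterized statements are the simplest to demonstrate; a small sqlite3 sketch (the table and the payload are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

payload = "alice' OR '1'='1"   # classic injection attempt

# unsafe: string concatenation lets the payload rewrite the WHERE clause,
# so the condition becomes  name = 'alice' OR '1'='1'  and matches every row
unsafe = conn.execute(
    "SELECT COUNT(*) FROM users WHERE name = '" + payload + "'"
).fetchone()[0]

# safe: the ? placeholder binds the payload as a literal value,
# which matches no row because no user is literally named "alice' OR '1'='1"
safe = conn.execute(
    "SELECT COUNT(*) FROM users WHERE name = ?", (payload,)
).fetchone()[0]
```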

  15. Cross-validation to select Bayesian hierarchical models in phylogenetics.

    PubMed

    Duchêne, Sebastián; Duchêne, David A; Di Giallonardo, Francesca; Eden, John-Sebastian; Geoghegan, Jemma L; Holt, Kathryn E; Ho, Simon Y W; Holmes, Edward C

    2016-05-26

    Recent developments in Bayesian phylogenetic models have increased the range of inferences that can be drawn from molecular sequence data. Accordingly, model selection has become an important component of phylogenetic analysis. Methods of model selection generally consider the likelihood of the data under the model in question. In the context of Bayesian phylogenetics, the most common approach involves estimating the marginal likelihood, which is typically done by integrating the likelihood across model parameters, weighted by the prior. Although this method is accurate, it is sensitive to the presence of improper priors. We explored an alternative approach based on cross-validation that is widely used in evolutionary analysis. This involves comparing models according to their predictive performance. We analysed simulated data and a range of viral and bacterial data sets using a cross-validation approach to compare a variety of molecular clock and demographic models. Our results show that cross-validation can be effective in distinguishing between strict- and relaxed-clock models and in identifying demographic models that allow growth in population size over time. In most of our empirical data analyses, the model selected using cross-validation was able to match that selected using marginal-likelihood estimation. The accuracy of cross-validation appears to improve with longer sequence data, particularly when distinguishing between relaxed-clock models. Cross-validation is a useful method for Bayesian phylogenetic model selection. This method can be readily implemented even when considering complex models where selecting an appropriate prior for all parameters may be difficult.
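
    The predictive-performance idea behind this kind of model selection can be sketched generically (k-fold CV over polynomial regressions stands in for the molecular clock and demographic models; this is not the authors' procedure):

```python
import numpy as np

def cv_score(x, y, degree, folds=5, seed=0):
    """Mean held-out squared error of a polynomial model under k-fold CV;
    the model with the lowest score has the best predictive performance."""
    order = np.random.default_rng(seed).permutation(len(y))
    errs = []
    for f in range(folds):
        test = order[f::folds]
        train = np.setdiff1d(order, test)
        coef = np.polyfit(x[train], y[train], degree)
        errs.append(np.mean((np.polyval(coef, x[test]) - y[test]) ** 2))
    return float(np.mean(errs))

# synthetic data whose true generating model is linear
rng = np.random.default_rng(1)
x = np.linspace(-1.0, 1.0, 80)
y = 1.0 + 2.0 * x + rng.normal(0.0, 0.1, x.size)

# compare an underfit, a correct, and an overfit candidate
scores = {d: cv_score(x, y, d) for d in (0, 1, 6)}
```

    Cross-validation favours the linear model: the constant model pays a large bias penalty on the held-out folds.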

  16. Simultaneous estimation of cross-validation errors in least squares collocation applied for statistical testing and evaluation of the noise variance components

    NASA Astrophysics Data System (ADS)

    Behnabian, Behzad; Mashhadi Hossainali, Masoud; Malekzadeh, Ahad

    2018-02-01

    The cross-validation technique is a popular method to assess and improve the quality of prediction by least squares collocation (LSC). We present a formula for direct estimation of the vector of cross-validation errors (CVEs) in LSC which is much faster than element-wise CVE computation. We show that a quadratic form of CVEs follows a Chi-squared distribution. Furthermore, an a posteriori noise variance factor is derived from the quadratic form of CVEs. In order to detect blunders in the observations, the estimated standardized CVE is proposed as the test statistic, which can be applied when noise variances are known or unknown. We use LSC together with the methods proposed in this research for interpolation of crustal subsidence in the northern coast of the Gulf of Mexico. The results show that after detecting and removing outliers, the root mean square (RMS) of CVEs and the estimated noise standard deviation are reduced by about 51% and 59%, respectively. In addition, the RMS of LSC prediction error at data points and the RMS of estimated noise of observations are decreased by 39% and 67%, respectively. However, the RMS of LSC prediction error on a regular grid of interpolation points covering the area is reduced by only about 4%, which is a consequence of the sparse distribution of data points for this case study. The influence of gross errors on LSC prediction results is also investigated by lower cutoff CVEs. After elimination of outliers, the RMS of this type of error is also reduced by 19.5% for a 5 km vicinity radius. We propose a method using standardized CVEs for classification of the dataset into three groups with presumed different noise variances. The noise variance components for each of the groups are estimated using the restricted maximum-likelihood method via the Fisher scoring technique. Finally, LSC assessment measures were computed for the estimated heterogeneous noise variance model and compared with those of the homogeneous model. The advantage of the proposed method is the reduction in estimated noise levels for those groups with the fewer number of noisy data points.
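
    The flavour of direct CVE computation has a well-known ordinary least-squares analogue (shown here as a sketch of the idea, not the LSC derivation): a single fit plus the hat matrix reproduces every leave-one-out error, with no refitting.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 40
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0]) + rng.normal(0.0, 0.5, n)

# one global fit: hat matrix and ordinary residuals
H = X @ np.linalg.inv(X.T @ X) @ X.T
e = y - H @ y
cve_fast = e / (1.0 - np.diag(H))   # all leave-one-out errors at once

# brute-force check: actually refit with each point deleted
cve_slow = np.empty(n)
for i in range(n):
    keep = np.ones(n, dtype=bool)
    keep[i] = False
    b, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
    cve_slow[i] = y[i] - X[i] @ b
```

    The one-line formula and the n refits agree to machine precision, which is the kind of speed-up the direct CVE vector provides.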

  17. A Real-Time Earthquake Precursor Detection Technique Using TEC from a GPS Network

    NASA Astrophysics Data System (ADS)

    Alp Akyol, Ali; Arikan, Feza; Arikan, Orhan

    2016-07-01

    Anomalies have been observed in the ionospheric electron density distribution prior to strong earthquakes. However, most reported results were obtained by retrospective analysis after the earthquakes occurred, so their implementation in practice is highly problematic. Recently, a novel earthquake precursor detection technique based on spatio-temporal analysis of Total Electron Content (TEC) data obtained from the Turkish National Permanent GPS Network (TNPGN) was developed by the IONOLAB group (www.ionolab.org). In the present study, the detection technique is implemented in a causal setup over the available data set in a test phase, enabling real-time operation. The performance of the developed earthquake prediction technique is evaluated using 10-fold cross validation over the data obtained in 2011. Among the 23 earthquakes with magnitudes higher than 5, the developed technique detects precursors of 14 earthquakes while producing 8 false alarms. This study is supported by TUBITAK 115E915 and Joint TUBITAK 114E092 and AS CR 14/001 projects.
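
    For reference, the reported counts translate into simple rates (arithmetic only; "precision" is a label of ours for the detections-versus-total-alarms ratio, not a term from the abstract):

```python
detected, total_events, false_alarms = 14, 23, 8

sensitivity = detected / total_events              # fraction of M>5 events caught
precision = detected / (detected + false_alarms)   # true detections among all alarms
```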

  18. Screening for postdeployment conditions: development and cross-validation of an embedded validity scale in the neurobehavioral symptom inventory.

    PubMed

    Vanderploeg, Rodney D; Cooper, Douglas B; Belanger, Heather G; Donnell, Alison J; Kennedy, Jan E; Hopewell, Clifford A; Scott, Steven G

    2014-01-01

    To develop and cross-validate internal validity scales for the Neurobehavioral Symptom Inventory (NSI). Four existing data sets were used: (1) outpatient clinical traumatic brain injury (TBI)/neurorehabilitation database from a military site (n = 403), (2) National Department of Veterans Affairs TBI evaluation database (n = 48 175), (3) Florida National Guard nonclinical TBI survey database (n = 3098), and (4) a cross-validation outpatient clinical TBI/neurorehabilitation database combined across 2 military medical centers (n = 206). Secondary analysis of existing cohort data to develop (study 1) and cross-validate (study 2) internal validity scales for the NSI. The NSI, Mild Brain Injury Atypical Symptoms, and Personality Assessment Inventory scores. Study 1: Three NSI validity scales were developed, composed of 5 unusual items (Negative Impression Management [NIM5]), 6 low-frequency items (LOW6), and the combination of 10 nonoverlapping items (Validity-10). Cut scores maximizing sensitivity and specificity on these measures were determined, using a Mild Brain Injury Atypical Symptoms score of 8 or more as the criterion for invalidity. Study 2: The same validity scale cut scores again resulted in the highest classification accuracy and optimal balance between sensitivity and specificity in the cross-validation sample, using a Personality Assessment Inventory Negative Impression Management scale with a T score of 75 or higher as the criterion for invalidity. The NSI is widely used in the Department of Defense and Veterans Affairs as a symptom-severity assessment following TBI, but is subject to symptom overreporting or exaggeration. This study developed embedded NSI validity scales to facilitate the detection of invalid response styles. The NSI Validity-10 scale appears to hold considerable promise for validity assessment when the NSI is used as a population-screening tool.
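
    Cut-score selection of the kind described, balancing sensitivity against specificity, can be sketched with Youden's J statistic (the scores and criterion flags below are invented):

```python
import numpy as np

def best_cut_score(scores, invalid, cuts):
    """Pick the cut score maximizing Youden's J = sensitivity + specificity - 1,
    treating the boolean `invalid` vector as the external criterion."""
    best, best_j = None, -1.0
    for c in cuts:
        flagged = scores >= c
        sens = np.mean(flagged[invalid])     # flagged among truly invalid
        spec = np.mean(~flagged[~invalid])   # unflagged among truly valid
        j = sens + spec - 1.0
        if j > best_j:
            best, best_j = c, j
    return best, best_j

scores = np.array([1, 2, 2, 3, 8, 9, 10, 1, 2, 9])
invalid = np.array([False, False, False, False, True,
                    True, True, False, False, True])
cut, j = best_cut_score(scores, invalid, range(0, 11))
```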

  19. Study of photon correlation techniques for processing of laser velocimeter signals

    NASA Technical Reports Server (NTRS)

    Mayo, W. T., Jr.

    1977-01-01

    The objective was to provide the theory and a system design for a new type of photon counting processor for low-level dual-scatter laser velocimeter (LV) signals, capable of both first-order measurements of mean flow and turbulence intensity and second-order time statistics: cross correlation, auto correlation, and related spectra. A general Poisson process model for low-level LV signals and noise, valid from the photon-resolved regime all the way to the limiting case of nonstationary Gaussian noise, was used. Computer simulation algorithms and higher-order statistical moment analysis of Poisson processes were derived and applied to the analysis of photon correlation techniques. A system design using a unique dual correlate-and-subtract frequency discriminator technique is postulated and analyzed. Expectation analysis indicates that the objective measurements are feasible.

  20. The scale invariant generator technique for quantifying anisotropic scale invariance

    NASA Astrophysics Data System (ADS)

    Lewis, G. M.; Lovejoy, S.; Schertzer, D.; Pecknold, S.

    1999-11-01

    Scale invariance is rapidly becoming a new paradigm for geophysics. However, little attention has been paid to the anisotropy that is invariably present in geophysical fields in the form of differential stratification and rotation, texture and morphology. In order to account for scaling anisotropy, the formalism of generalized scale invariance (GSI) was developed. Until now there has existed only a single fairly ad hoc GSI analysis technique valid for studying differential rotation. In this paper, we use a two-dimensional representation of the linear approximation to generalized scale invariance, to obtain a much improved technique for quantifying anisotropic scale invariance called the scale invariant generator technique (SIG). The accuracy of the technique is tested using anisotropic multifractal simulations and error estimates are provided for the geophysically relevant range of parameters. It is found that the technique yields reasonable estimates for simulations with a diversity of anisotropic and statistical characteristics. The scale invariant generator technique can profitably be applied to the scale invariant study of vertical/horizontal and space/time cross-sections of geophysical fields as well as to the study of the texture/morphology of fields.

  1. Multivariate reference technique for quantitative analysis of fiber-optic tissue Raman spectroscopy.

    PubMed

    Bergholt, Mads Sylvest; Duraipandian, Shiyamala; Zheng, Wei; Huang, Zhiwei

    2013-12-03

    We report a novel method making use of multivariate reference signals of fused silica and sapphire Raman signals generated from a ball-lens fiber-optic Raman probe for quantitative analysis of in vivo tissue Raman measurements in real time. Partial least-squares (PLS) regression modeling is applied to extract the characteristic internal reference Raman signals (e.g., shoulder of the prominent fused silica boson peak (~130 cm(-1)); distinct sapphire ball-lens peaks (380, 417, 646, and 751 cm(-1))) from the ball-lens fiber-optic Raman probe for quantitative analysis of fiber-optic Raman spectroscopy. To evaluate the analytical value of this novel multivariate reference technique, a rapid Raman spectroscopy system coupled with a ball-lens fiber-optic Raman probe is used for in vivo oral tissue Raman measurements (n = 25 subjects) under 785 nm laser excitation powers ranging from 5 to 65 mW. An accurate linear relationship (R(2) = 0.981) with a root-mean-square error of cross validation (RMSECV) of 2.5 mW can be obtained for predicting the laser excitation power changes based on a leave-one-subject-out cross-validation, which is superior to the normal univariate reference method (RMSE = 6.2 mW). A root-mean-square error of prediction (RMSEP) of 2.4 mW (R(2) = 0.985) can also be achieved for laser power prediction in real time when we applied the multivariate method independently on the five new subjects (n = 166 spectra). We further apply the multivariate reference technique for quantitative analysis of gelatin tissue phantoms that gives rise to an RMSEP of ~2.0% (R(2) = 0.998) independent of laser excitation power variations. 
    This work demonstrates that the multivariate reference technique can be advantageously used to monitor and correct for variations in laser excitation power and fiber coupling efficiency in situ, standardizing the tissue Raman intensity to realize quantitative analysis of tissue Raman measurements in vivo, which is particularly appealing in challenging Raman endoscopic applications.
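
    The leave-one-subject-out RMSECV used above follows the pattern below (a sketch with plain least squares standing in for PLS regression; the data are invented):

```python
import numpy as np

def rmsecv_loso(X, y, subject):
    """Leave-one-subject-out RMSECV: all spectra from one subject are held
    out together, a least-squares model is fit on the rest, and the error
    is pooled over every held-out prediction."""
    Xb = np.column_stack([np.ones(len(y)), X])
    errs = []
    for s in np.unique(subject):
        test = subject == s
        beta, *_ = np.linalg.lstsq(Xb[~test], y[~test], rcond=None)
        errs.extend(y[test] - Xb[test] @ beta)
    return float(np.sqrt(np.mean(np.square(errs))))

# toy check: 3 subjects with 4 measurements each, noise-free linear response
x = np.arange(12.0)
subject = np.repeat(np.arange(3), 4)
rmsecv = rmsecv_loso(x, 5.0 + 0.5 * x, subject)
```

    Grouping by subject, rather than by individual spectrum, prevents spectra from the same subject leaking between training and test folds.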

  2. Cross-Validation of easyCBM Reading Cut Scores in Oregon: 2009-2010. Technical Report #1108

    ERIC Educational Resources Information Center

    Park, Bitnara Jasmine; Irvin, P. Shawn; Anderson, Daniel; Alonzo, Julie; Tindal, Gerald

    2011-01-01

    This technical report presents results from a cross-validation study designed to identify optimal cut scores when using easyCBM[R] reading tests in Oregon. The cross-validation study analyzes data from the 2009-2010 academic year for easyCBM[R] reading measures. A sample of approximately 2,000 students per grade, randomly split into two groups of…

  3. Predictions of the pathological response to neoadjuvant chemotherapy in patients with primary breast cancer using a data mining technique.

    PubMed

    Takada, M; Sugimoto, M; Ohno, S; Kuroi, K; Sato, N; Bando, H; Masuda, N; Iwata, H; Kondo, M; Sasano, H; Chow, L W C; Inamoto, T; Naito, Y; Tomita, M; Toi, M

    2012-07-01

    Nomogram, a standard technique that utilizes multiple characteristics to predict efficacy of treatment and the likelihood of a specific status of an individual patient, has been used for prediction of response to neoadjuvant chemotherapy (NAC) in breast cancer patients. The aim of this study was to develop a novel computational technique to predict the pathological complete response (pCR) to NAC in primary breast cancer patients. A mathematical model using alternating decision trees, an epigone of the decision tree, was developed using 28 clinicopathological variables that were retrospectively collected from patients treated with NAC (n = 150), and validated using an independent dataset from a randomized controlled trial (n = 173). The model selected 15 variables to predict the pCR, yielding area under the receiver operating characteristics curve (AUC) values of 0.766 (95% confidence interval (CI) 0.671-0.861, P value < 0.0001) in cross-validation using the training dataset and 0.787 (95% CI 0.716-0.858, P value < 0.0001) in the validation dataset. Among the three subtypes of breast cancer, the luminal subgroup showed the best discrimination (AUC = 0.779, 95% CI 0.641-0.917, P value = 0.0059). The developed model (AUC = 0.805, 95% CI 0.716-0.894, P value < 0.0001) outperformed multivariate logistic regression (AUC = 0.754, 95% CI 0.651-0.858, P value = 0.00019) on the validation dataset without missing values (n = 127). Several analyses, e.g. bootstrap analysis, revealed that the developed model was insensitive to missing values and also tolerant of distribution bias among the datasets. Our model based on clinicopathological variables showed high predictive ability for pCR. This model might improve the prediction of the response to NAC in primary breast cancer patients.
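
    The AUC figures quoted are instances of the rank statistic below (a generic definition sketch, not the study's code):

```python
import numpy as np

def auc(scores, labels):
    """Probability that a randomly chosen positive case receives a higher
    score than a randomly chosen negative case (ties count one half)."""
    pos = scores[labels]
    neg = scores[~labels]
    wins = sum(np.sum(p > neg) + 0.5 * np.sum(p == neg) for p in pos)
    return float(wins) / (len(pos) * len(neg))

# toy check: perfect ranking gives AUC = 1; one tied pair lowers it
perfect = auc(np.array([0.9, 0.8, 0.3, 0.2]),
              np.array([True, True, False, False]))
tied = auc(np.array([0.9, 0.3, 0.3, 0.2]),
           np.array([True, True, False, False]))
```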

  4. Using thermochronology to validate a balanced cross section along the Karnali River, far-western Nepal

    NASA Astrophysics Data System (ADS)

    Battistella, C.; Robinson, D.; McQuarrie, N.; Ghoshal, S.

    2017-12-01

    Multiple valid balanced cross sections can be produced from mapped surface and subsurface data. By integrating low-temperature thermochronologic data, we are better able to predict subsurface geometries. Existing valid balanced cross sections for far-western Nepal are few (Robinson et al., 2006) and do not incorporate thermochronologic data because the data did not exist. Data published since then along the Simikot cross section on the Karnali River include muscovite Ar, zircon U-Th/He, and apatite fission track. We present new mapping and a new valid balanced cross section that takes into account the new field data as well as the limitations that thermochronologic data place on the kinematics of the cross section. Additional constraints include new geomorphology data acquired since 2006 that indicate areas of increased vertical uplift, which point to locations of buried ramps in the Main Himalayan thrust and guide the locations of Lesser Himalayan ramps in the balanced cross section. Future work will include flexural modeling, new low-temperature thermochronometric data, and 2-D thermokinematic models from sequentially forward-modeled balanced cross sections in far-western Nepal.

  5. On comprehensive recovery of an aftershock sequence with cross correlation

    NASA Astrophysics Data System (ADS)

    Kitov, I.; Bobrov, D.; Coyne, J.; Turyomurugyendo, G.

    2012-04-01

    We have introduced cross correlation between seismic waveforms as a technique for signal detection and automatic event building at the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty Organization. The intuition behind signal detection is simple - small and mid-sized seismic events close in space should produce similar signals at the same seismic stations. Equivalently, these signals have to be characterized by a high cross correlation coefficient. For array stations with many individual sensors distributed over a large area, signals from events at distances beyond, say, 50 km, are subject to destructive interference when cross correlated due to changing time delays between various channels. Thus, any cross correlation coefficient above some predefined threshold can be considered as a signature of a valid signal. With a dense grid of master events (spacing between adjacent masters between 20 km and 50 km corresponds to the statistically estimated correlation distance) with high quality (signal-to-noise ratio above 10) template waveforms at primary array stations of the International Monitoring System one can detect signals from and then build natural and manmade seismic events close to the master ones. The use of cross correlation allows detecting smaller signals (sometimes below noise level) than provided by the current IDC detecting techniques. As a result it is possible to automatically build from 50% to 100% more valid seismic events than included in the Reviewed Event Bulletin (REB). We have developed a tentative pipeline for automatic processing at the IDC. It includes three major stages. Firstly, we calculate cross correlation coefficient for a given master and continuous waveforms at the same stations and carry out signal detection as based on the statistical behavior of signal-to-noise ratio of the cross correlation coefficient. 
Secondly, a thorough screening is performed for all obtained signals using f-k analysis and F-statistics as applied to the cross-correlation traces at individual channels of all included array stations. Thirdly, local (i.e. confined to the correlation distance around the master event) association of origin times of all qualified signals is fulfilled. These origin times are calculated from the arrival times of these signals, which are reduced to origin times by the travel times from the master event. An aftershock sequence of a mid-size earthquake is an ideal case to test cross correlation techniques for automatic event building. All events should be close to the mainshock and occur within several days. Here we analyse the aftershock sequence of an earthquake in the North Atlantic Ocean with mb(IDC)=4.79. The REB includes 38 events at distances less than 150 km from the mainshock. Our ultimate goal is to exercise the complete iterative procedure to find all possible aftershocks. We start with the mainshock and recover ten aftershocks with the largest number of stations to produce an initial set of master events with the highest quality templates. Then we find all aftershocks in the REB and many additional events, which were not originally found by the IDC. Using all events found after the first iteration as master events we find new events, which are also used in the next iteration. The iterative process stops when no new events can be found. In that sense the final set of aftershocks obtained with cross correlation is a comprehensive one.
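
    The core detection step, a normalized cross-correlation of a master-event template against a continuous trace, can be sketched as follows (a toy with a sinusoidal template and synthetic noise; all sizes and amplitudes are invented):

```python
import numpy as np

def sliding_cc(template, trace):
    """Normalized cross-correlation coefficient of a template against every
    window of a continuous trace; values near 1 flag candidate signals."""
    m = len(template)
    t = template - template.mean()
    tn = np.linalg.norm(t)
    out = np.empty(len(trace) - m + 1)
    for i in range(len(out)):
        w = trace[i:i + m] - trace[i:i + m].mean()
        denom = tn * np.linalg.norm(w)
        out[i] = (t @ w) / denom if denom > 0 else 0.0
    return out

rng = np.random.default_rng(2)
template = np.sin(np.linspace(0, 6 * np.pi, 60))
trace = rng.normal(0.0, 0.2, 400)
trace[150:210] += template          # bury a matching low-amplitude "signal"

cc = sliding_cc(template, trace)
detection = int(np.argmax(cc))      # peak of the correlation trace
```

    Because the statistic is normalized, the buried signal stands out even though its amplitude is comparable to the noise, which is the key advantage over plain energy detectors.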

  6. Synovial Fluid Response to Extensional Flow: Effects of Dilution and Intermolecular Interactions

    PubMed Central

    Haward, Simon J.

    2014-01-01

    In this study, a microfluidic cross-slot device is used to examine the extensional flow response of diluted porcine synovial fluid (PSF) samples using flow-induced birefringence (FIB) measurements. The PSF sample is diluted to 10×, 20×, and 30× its original mass in a phosphate-buffered saline and its FIB response measured as a function of the strain rate at the stagnation point of the cross-slots. Equivalent experiments are also carried out using trypsin-treated PSF (t-PSF) in which the protein content is digested away using an enzyme. The results show that, at the synovial fluid concentrations tested, the protein content plays a negligible role in either the fluid's bulk shear or extensional flow behaviour. This helps support the validity of the analysis of synovial fluid HA content, either by microfluidic or by other techniques where the synovial fluid is first diluted, and suggests that the HA and protein content in synovial fluid must be higher than a certain minimum threshold concentration before HA-protein or protein-protein interactions become significant. However, a systematic shift in the FIB response as the PSF and t-PSF samples are progressively diluted indicates that HA-HA interactions remain significant at the concentrations tested. These interactions influence FIB-derived macromolecular parameters such as the relaxation time and the molecular weight distribution and therefore must be minimized for the best validity of this method as an analytical technique, in which non-interaction between molecules is assumed. PMID:24651529

  7. Synovial fluid response to extensional flow: effects of dilution and intermolecular interactions.

    PubMed

    Haward, Simon J

    2014-01-01

    In this study, a microfluidic cross-slot device is used to examine the extensional flow response of diluted porcine synovial fluid (PSF) samples using flow-induced birefringence (FIB) measurements. The PSF sample is diluted to 10×, 20×, and 30× its original mass in a phosphate-buffered saline and its FIB response measured as a function of the strain rate at the stagnation point of the cross-slots. Equivalent experiments are also carried out using trypsin-treated PSF (t-PSF) in which the protein content is digested away using an enzyme. The results show that, at the synovial fluid concentrations tested, the protein content plays a negligible role in either the fluid's bulk shear or extensional flow behaviour. This helps support the validity of the analysis of synovial fluid HA content, either by microfluidic or by other techniques where the synovial fluid is first diluted, and suggests that the HA and protein content in synovial fluid must be higher than a certain minimum threshold concentration before HA-protein or protein-protein interactions become significant. However, a systematic shift in the FIB response as the PSF and t-PSF samples are progressively diluted indicates that HA-HA interactions remain significant at the concentrations tested. These interactions influence FIB-derived macromolecular parameters such as the relaxation time and the molecular weight distribution and therefore must be minimized for the best validity of this method as an analytical technique, in which non-interaction between molecules is assumed.

  8. Indicators of Ecological Change

    DTIC Science & Technology

    2005-03-01

    ...cross-validation procedure. The cross-validation analysis determines the percentage of observations correctly classified. In essence, a cross-...

  9. Empirical Performance of Cross-Validation With Oracle Methods in a Genomics Context

    PubMed Central

    Martinez, Josue G.; Carroll, Raymond J.; Müller, Samuel; Sampson, Joshua N.; Chatterjee, Nilanjan

    2012-01-01

    When employing model selection methods with oracle properties such as the smoothly clipped absolute deviation (SCAD) and the Adaptive Lasso, it is typical to estimate the smoothing parameter by m-fold cross-validation, for example, m = 10. In problems where the true regression function is sparse and the signals are large, such cross-validation typically works well. However, in regression modeling of genomic studies involving Single Nucleotide Polymorphisms (SNPs), the true regression functions, while thought to be sparse, do not have large signals. We demonstrate empirically that in such problems, the number of selected variables using SCAD and the Adaptive Lasso, with 10-fold cross-validation, is a random variable that has considerable and surprising variation. Similar remarks apply to non-oracle methods such as the Lasso. Our study strongly questions the suitability of performing only a single run of m-fold cross-validation with any oracle method, and not just the SCAD and Adaptive Lasso. PMID:22347720
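
    The m-fold partitioning step that drives this variability can be sketched in a few lines (an illustrative Python helper, not the authors' code); the initial shuffle is precisely where the run-to-run randomness enters:

```python
import random

def k_fold_splits(n, k, seed=None):
    """Yield (train_idx, test_idx) pairs for k-fold cross-validation.

    The shuffle at the start is the source of run-to-run randomness:
    a different seed gives different folds, and hence potentially a
    different tuning-parameter (and variable-set) choice downstream.
    """
    rng = random.Random(seed)
    idx = list(range(n))
    rng.shuffle(idx)
    folds = [idx[i::k] for i in range(k)]        # k disjoint folds
    for i, test in enumerate(folds):
        train = [j for fold in folds[:i] + folds[i + 1:] for j in fold]
        yield train, test
```

    Re-running model selection on splits produced with a different seed can select a different variable set, which is the variation the abstract reports.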

  10. Expert system verification and validation study

    NASA Technical Reports Server (NTRS)

    French, Scott W.; Hamilton, David

    1992-01-01

    Five workshops on verification and validation (V&V) of expert systems (ES) were taught during this recent period of performance. Two key activities, previously performed under this contract, supported these recent workshops: (1) a survey of the state of the practice of V&V of ES and (2) development of workshop material and the first class. The first activity involved performing an extensive survey of ES developers in order to answer several questions regarding the state of the practice in V&V of ES. These questions related to the amount and type of V&V done and how successful this V&V was. The next key activity involved developing an intensive hands-on workshop in V&V of ES. This activity involved surveying a large number of V&V techniques, conventional as well as ES-specific ones. In addition to explaining the techniques, we showed how each technique could be applied to a sample problem. References were included in the workshop material and cross-referenced to techniques, so that students would know where to find additional information about each technique. In addition to teaching specific techniques, we included an extensive amount of material on V&V concepts and how to develop a V&V plan for an ES project. We felt this material was necessary so that developers would be prepared to develop an orderly and structured approach to V&V; that is, they would have a process that supported the use of the specific techniques. Finally, to provide hands-on experience, we developed a set of case study exercises. These exercises provided an opportunity for the students to apply all the material (concepts, techniques, and planning material) to a realistic problem.

  11. Cross-Validation of Predictor Equations for Armor Crewman Performance

    DTIC Science & Technology

    1980-01-01

    Technical Report 447: CROSS-VALIDATION OF PREDICTOR EQUATIONS FOR ARMOR CREWMAN PERFORMANCE. Anthony J. Maitland, Newell K. Eaton, and Janet F. Neft...

  12. Simultaneous Determination of Octinoxate, Oxybenzone, and Octocrylene in a Sunscreen Formulation Using Validated Spectrophotometric and Chemometric Methods.

    PubMed

    Abdel-Ghany, Maha F; Abdel-Aziz, Omar; Ayad, Miriam F; Mikawy, Neven N

    2015-01-01

    Accurate, reliable, and sensitive spectrophotometric and chemometric methods were developed for simultaneous determination of octinoxate (OMC), oxybenzone (OXY), and octocrylene (OCR) in a sunscreen formulation without prior separation steps, including derivative ratio spectra zero crossing (DRSZ), double divisor ratio spectra derivative (DDRD), mean centering ratio spectra (MCR), and partial least squares (PLS-2). With the DRSZ technique, the UV filters could be determined in the ranges of 0.5-13.0, 0.3-9.0, and 0.5-9.0 μg/mL at 265.2, 246.6, and 261.8 nm, respectively. By utilizing the DDRD technique, UV filters could be determined in the above ranges at 237.8, 241.0, and 254.2 nm, respectively. With the MCR technique, the UV filters could be determined in the above ranges at 381.7, 383.2, and 355.6 nm, respectively. The PLS-2 technique successfully quantified the examined UV filters in the ranges of 0.5-9.3, 0.3-7.1, and 0.5-6.9 μg/mL, respectively. All the methods were validated according to the International Conference on Harmonization guidelines and successfully applied to determine the UV filters in pure form, laboratory-prepared mixtures, and a sunscreen formulation. The obtained results were statistically compared with reference and reported methods of analysis for OXY, OMC, and OCR, and there were no significant differences with respect to accuracy and precision of the adopted techniques.

  13. Statistical validation of normal tissue complication probability models.

    PubMed

    Xu, Cheng-Jian; van der Schaaf, Arjen; Van't Veld, Aart A; Langendijk, Johannes A; Schilstra, Cornelis

    2012-09-01

    To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use. Copyright © 2012 Elsevier Inc. All rights reserved.
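
    A permutation test of model performance, as recommended above, can be sketched as follows (a generic Python illustration; the paper's NTCP models use LASSO with likelihood and AUC metrics, for which an arbitrary metric function is substituted here):

```python
import random

def permutation_test(y, pred, metric, n_perm=999, seed=0):
    """One-sided permutation p-value for a model performance score.

    The observed metric(y, pred) is compared with the null distribution
    obtained by re-scoring the same predictions against randomly
    permuted outcome labels.
    """
    rng = random.Random(seed)
    observed = metric(y, pred)
    y_perm = list(y)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(y_perm)
        if metric(y_perm, pred) >= observed:
            hits += 1
    # Add-one correction keeps the p-value away from an impossible zero.
    return observed, (hits + 1) / (n_perm + 1)

def accuracy(y_true, y_pred):
    """Fraction of matching labels, as a stand-in performance metric."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
```

    A small p-value indicates that the model scores better than chance relabelings, which is the statistical significance of model performance the abstract refers to.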

  14. Burn-injured tissue detection for debridement surgery through the combination of non-invasive optical imaging techniques.

    PubMed

    Heredia-Juesas, Juan; Thatcher, Jeffrey E; Lu, Yang; Squiers, John J; King, Darlene; Fan, Wensheng; DiMaio, J Michael; Martinez-Lorenzo, Jose A

    2018-04-01

    The process of burn debridement is a challenging technique requiring significant skill to identify the regions that need excision and their appropriate excision depths. In order to assist surgeons, a machine learning tool is being developed to provide a quantitative assessment of burn-injured tissue. This paper presents three non-invasive optical imaging techniques capable of distinguishing four kinds of tissue (healthy skin, viable wound bed, shallow burn, and deep burn) during serial burn debridement in a porcine model. All combinations of these three techniques have been studied through a k-fold cross-validation method. In terms of global performance, the combination of all three techniques significantly improves the classification accuracy with respect to just one technique, from 0.42 up to more than 0.76. Furthermore, a non-linear spatial filter based on the mode of a small neighborhood has been applied as a post-processing technique in order to improve the performance of the classification. Using this technique, the global accuracy reaches a value close to 0.78 and, for some particular tissues and combinations of techniques, the accuracy improves by 13%.
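
    The mode-based spatial filter described above admits a compact sketch (illustrative Python; the neighborhood radius and tie-breaking behaviour are assumptions, not taken from the paper):

```python
from collections import Counter

def mode_filter(grid, radius=1):
    """Replace each cell's class label with the most common label in its
    (2*radius+1)^2 neighborhood, clipped at the grid edges. This acts as
    a majority-vote smoother that removes isolated misclassified cells."""
    h, w = len(grid), len(grid[0])
    out = [[None] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            votes = Counter()
            for di in range(-radius, radius + 1):
                for dj in range(-radius, radius + 1):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < h and 0 <= nj < w:
                        votes[grid[ni][nj]] += 1
            out[i][j] = votes.most_common(1)[0][0]
    return out
```

    For example, a lone mislabeled pixel surrounded by a uniform class is voted back to the majority label, which is how such filtering can lift per-pixel classification accuracy.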

  15. Cross-Validation of the Africentrism Scale.

    ERIC Educational Resources Information Center

    Kwate, Naa Oyo A.

    2003-01-01

    Cross-validated the Africentrism Scale, investigating the relationship between Africentrism and demographic variables in a diverse sample of individuals of African descent. Results indicated that the scale demonstrated solid internal consistency and convergent validity. Age and education related to Africentrism, with younger and less educated…

  16. Methods for Multiloop Identification of Visual and Neuromuscular Pilot Responses.

    PubMed

    Olivari, Mario; Nieuwenhuizen, Frank M; Venrooij, Joost; Bülthoff, Heinrich H; Pollini, Lorenzo

    2015-12-01

    In this paper, identification methods are proposed to estimate the neuromuscular and visual responses of a multiloop pilot model. A conventional and widely used technique for simultaneous identification of the neuromuscular and visual systems makes use of cross-spectral density estimates. This paper shows that this technique requires a specific noninterference hypothesis, often implicitly assumed, that may be difficult to meet during actual experimental designs. A mathematical justification of the necessity of the noninterference hypothesis is given. Furthermore, two methods are proposed that do not share the same limitations. The first method is based on autoregressive models with exogenous inputs, whereas the second one combines cross-spectral estimators with interpolation in the frequency domain. The two identification methods are validated by offline simulations and contrasted with the classic method. The results reveal that the classic method fails when the noninterference hypothesis is not fulfilled; on the contrary, the two proposed techniques give reliable estimates. Finally, the three identification methods are applied to experimental data from a closed-loop control task with pilots. The two proposed techniques give comparable estimates, different from those obtained by the classic method. The differences match those found with the simulations. Thus, the two identification methods provide a good alternative to the classic method and make it possible to simultaneously estimate a human's neuromuscular and visual responses in cases where the classic method fails.

  17. Epileptic seizure detection in EEG signal using machine learning techniques.

    PubMed

    Jaiswal, Abeg Kumar; Banka, Haider

    2018-03-01

    Epilepsy is a well-known nervous system disorder characterized by seizures. Electroencephalograms (EEGs), which capture brain neural activity, can be used to detect epilepsy. Traditional methods for analyzing an EEG signal for epileptic seizure detection are time-consuming. Recently, several automated seizure detection frameworks using machine learning techniques have been proposed to replace these traditional methods. The two basic steps involved in machine learning are feature extraction and classification. Feature extraction reduces the input pattern space by keeping informative features, and the classifier assigns the appropriate class label. In this paper, we propose two effective approaches involving subpattern-based PCA (SpPCA) and cross-subpattern correlation-based PCA (SubXPCA) with a Support Vector Machine (SVM) for automated seizure detection in EEG signals. Feature extraction was performed using SpPCA and SubXPCA. Both techniques explore the subpattern correlation of EEG signals, which helps in the decision-making process. An SVM is used for classification of seizure and non-seizure EEG signals. The SVM was trained with a radial basis kernel. All the experiments were carried out on the benchmark epilepsy EEG dataset. The entire dataset consists of 500 EEG signals recorded under different scenarios. Seven different experimental cases for classification were conducted. The classification accuracy was evaluated using tenfold cross-validation. The classification results of the proposed approaches were compared with the results of some existing techniques proposed in the literature to establish the claim.
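
    The tenfold cross-validation protocol behind such accuracy figures can be illustrated as follows (Python sketch; a 1-nearest-neighbour classifier stands in for the paper's SVM so the example stays self-contained):

```python
import random

def nn_predict(train_X, train_y, x):
    """Predict the label of x from its single nearest training point
    (squared Euclidean distance)."""
    best = min(range(len(train_X)),
               key=lambda i: sum((a - b) ** 2 for a, b in zip(train_X[i], x)))
    return train_y[best]

def cv_accuracy(X, y, k=10, seed=0):
    """Estimate classification accuracy by k-fold cross-validation:
    each fold is held out once while the remaining folds train the
    classifier, and correct predictions are pooled over all folds."""
    rng = random.Random(seed)
    idx = list(range(len(X)))
    rng.shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    correct = 0
    for i, test in enumerate(folds):
        train = [j for fold in folds[:i] + folds[i + 1:] for j in fold]
        train_X = [X[j] for j in train]
        train_y = [y[j] for j in train]
        correct += sum(nn_predict(train_X, train_y, X[j]) == y[j] for j in test)
    return correct / len(X)
```

    Swapping `nn_predict` for any trained classifier (such as an RBF-kernel SVM) leaves the evaluation protocol unchanged.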

  18. Noninvasive imaging of bone microarchitecture

    PubMed Central

    Patsch, Janina M.; Burghardt, Andrew J.; Kazakia, Galateia; Majumdar, Sharmila

    2015-01-01

    The noninvasive quantification of peripheral compartment-specific bone microarchitecture is feasible with high-resolution peripheral quantitative computed tomography (HR-pQCT) and high-resolution magnetic resonance imaging (HR-MRI). In addition to classic morphometric indices, both techniques provide a suitable basis for virtual biomechanical testing using finite element (FE) analyses. Methodical limitations, morphometric parameter definition, and motion artifacts have to be considered to achieve optimal data interpretation from imaging studies. With increasing availability of in vivo high-resolution bone imaging techniques, special emphasis should be put on quality control including multicenter, cross-site validations. Importantly, conclusions from interventional studies investigating the effects of antiosteoporotic drugs on bone microarchitecture should be drawn with care, ideally involving imaging scientists, translational researchers, and clinicians. PMID:22172043

  19. Developing the Polish Educational Needs Assessment Tool (Pol-ENAT) in rheumatoid arthritis and systemic sclerosis: a cross-cultural validation study using Rasch analysis.

    PubMed

    Sierakowska, Matylda; Sierakowski, Stanisław; Sierakowska, Justyna; Horton, Mike; Ndosi, Mwidimi

    2015-03-01

    To undertake cross-cultural adaptation and validation of the educational needs assessment tool (ENAT) for use with people with rheumatoid arthritis (RA) and systemic sclerosis (SSc) in Poland. The study involved two main phases: (1) cross-cultural adaptation of the ENAT from English into Polish and (2) cross-cultural validation of the Polish Educational Needs Assessment Tool (Pol-ENAT). The first phase followed an established process of cross-cultural adaptation of self-report measures. The second phase involved completion of the Pol-ENAT by patients and subjecting the data to Rasch analysis to assess the construct validity, unidimensionality, internal consistency and cross-cultural invariance. Adequate conceptual equivalence was achieved following the adaptation process. The dataset for validation comprised a total of 278 patients, 237 (85.3%) of whom were female. In each disease group (145 RA and 133 SSc), the 7 domains of the Pol-ENAT were found to fit the Rasch model: χ²(df = 14) = 16.953, p = 0.259 for RA and χ²(df = 14) = 8.132, p = 0.882 for SSc. Internal consistency of the Pol-ENAT was high (patient separation index = 0.85 and 0.89 for SSc and RA, respectively), and unidimensionality was confirmed. Cross-cultural differential item functioning (DIF) was detected in some subscales, and DIF-adjusted conversion tables were calibrated to enable cross-cultural comparison of data between Poland and the UK. Using a standard process of cross-cultural adaptation, conceptual equivalence was achieved between the original (UK) ENAT and the adapted Pol-ENAT. Fit to the Rasch model confirmed that the construct validity, unidimensionality and internal consistency of the ENAT have been preserved.

  20. Cross-cultural adaptation of the Oral Health Impact Profile (OHIP) for the Malaysian adult population.

    PubMed

    Saub, R; Locker, D; Allison, P; Disman, M

    2007-09-01

    The aim of this project was to develop an oral health-related quality of life measure for the Malaysian adult population aged 18 and above by cross-cultural adaptation of the Oral Health Impact Profile (OHIP). The adaptation of the OHIP was based on the framework proposed by Herdman et al. (1998). The OHIP was translated into the Malay language using a forward-backward translation technique. Thirty-six patients were interviewed to assess the conceptual equivalence and relevancy of each item. Based on the translation process and interview results, a Malaysian version of the OHIP questionnaire was produced that contained 45 items. It was designated the OHIP(M). This questionnaire was pre-tested on 20 patients to assess its face validity. A short 14-item version of the questionnaire was completed by 171 patients to assess the suitability of the Likert-type response format. Field-testing was conducted in order to assess the suitability of two modes of administration (mail and interview) and to establish the psychometric properties of the adapted measure. The pre-testing revealed that the OHIP(M) has good face validity. It was found that the five-point frequency Likert scale could be used for the Malaysian population. The OHIP(M) was reliable: the scale Cronbach's alpha was 0.95 and the ICC value for test-retest reliability was 0.79. Three out of four construct validity hypotheses tested were confirmed. The OHIP(M) works as well as the English version and was found to be reliable and valid regardless of the mode of administration. However, this study only provides initial evidence for the reliability and validity of the measure. Further study is recommended to collect more evidence to support these results.

  1. An Efficient Data Partitioning to Improve Classification Performance While Keeping Parameters Interpretable

    PubMed Central

    Korjus, Kristjan; Hebart, Martin N.; Vicente, Raul

    2016-01-01

    Supervised machine learning methods typically require splitting data into multiple chunks for training, validating, and finally testing classifiers. For finding the best parameters of a classifier, training and validation are usually carried out with cross-validation. This is followed by application of the classifier with optimized parameters to a separate test set for estimating the classifier’s generalization performance. With limited data, this separation of test data creates a difficult trade-off between having more statistical power in estimating generalization performance versus choosing better parameters and fitting a better model. We propose a novel approach that we term “cross-validation and cross-testing,” which improves this trade-off by re-using test data without biasing classifier performance. The novel approach is validated using simulated data and electrophysiological recordings in humans and rodents. The results demonstrate that the approach has a higher probability of discovering significant results than the standard approach of cross-validation and testing, while maintaining the nominal alpha level. In contrast to nested cross-validation, which is maximally efficient in re-using data, the proposed approach additionally maintains the interpretability of individual parameters. Taken together, we suggest an addition to currently used machine learning approaches which may be particularly useful in cases where model weights do not require interpretation, but parameters do. PMID:27564393

  2. An Efficient Data Partitioning to Improve Classification Performance While Keeping Parameters Interpretable.

    PubMed

    Korjus, Kristjan; Hebart, Martin N; Vicente, Raul

    2016-01-01

    Supervised machine learning methods typically require splitting data into multiple chunks for training, validating, and finally testing classifiers. For finding the best parameters of a classifier, training and validation are usually carried out with cross-validation. This is followed by application of the classifier with optimized parameters to a separate test set for estimating the classifier's generalization performance. With limited data, this separation of test data creates a difficult trade-off between having more statistical power in estimating generalization performance versus choosing better parameters and fitting a better model. We propose a novel approach that we term "cross-validation and cross-testing," which improves this trade-off by re-using test data without biasing classifier performance. The novel approach is validated using simulated data and electrophysiological recordings in humans and rodents. The results demonstrate that the approach has a higher probability of discovering significant results than the standard approach of cross-validation and testing, while maintaining the nominal alpha level. In contrast to nested cross-validation, which is maximally efficient in re-using data, the proposed approach additionally maintains the interpretability of individual parameters. Taken together, we suggest an addition to currently used machine learning approaches which may be particularly useful in cases where model weights do not require interpretation, but parameters do.

  3. Methods to compute reliabilities for genomic predictions of feed intake

    USDA-ARS?s Scientific Manuscript database

    For new traits without historical reference data, cross-validation is often the preferred method to validate reliability (REL). Time truncation is less useful because few animals gain substantial REL after the truncation point. Accurate cross-validation requires separating genomic gain from pedigree...

  4. Measurement of 100- and 290-MeV/A Carbon Incident Neutron Production Cross Sections for Carbon, Nitrogen and Oxygen

    NASA Astrophysics Data System (ADS)

    Shigyo, N.; Uozumi, U.; Uehara, H.; Nishizawa, T.; Mizuno, T.; Takamiya, M.; Hashiguchi, T.; Satoh, D.; Sanami, T.; Koba, Y.; Takada, M.; Matsufuji, N.

    2014-05-01

    Neutron double-differential cross sections from carbon ions incident on carbon, nitrogen and oxygen targets have been measured for neutron energies down to 0.6 MeV over a wide angular range from 15° to 90°, with 100- and 290-MeV/A incident energies, at the Heavy Ion Medical Accelerator in Chiba (HIMAC), National Institute of Radiological Sciences. Two sizes of NE213 scintillators were used as neutron detectors in order to cover neutron energies from below 1 MeV up to several hundred MeV. The neutron energy was measured by the time-of-flight technique between the beam pickup detector and an NE213 scintillator. Using the experimental data, the validity of the calculation results of the PHITS code was examined.

  5. FAST TRACK COMMUNICATION Critical exponents of domain walls in the two-dimensional Potts model

    NASA Astrophysics Data System (ADS)

    Dubail, Jérôme; Lykke Jacobsen, Jesper; Saleur, Hubert

    2010-12-01

    We address the geometrical critical behavior of the two-dimensional Q-state Potts model in terms of the spin clusters (i.e. connected domains where the spin takes a constant value). These clusters are different from the usual Fortuin-Kasteleyn clusters, and are separated by domain walls that can cross and branch. We develop a transfer matrix technique enabling the formulation and numerical study of spin clusters even when Q is not an integer. We further identify geometrically the crossing events which give rise to conformal correlation functions. This leads to an infinite series of fundamental critical exponents h_{ℓ1−ℓ2, 2ℓ1}, valid for 0 ≤ Q ≤ 4, that describe the insertion of ℓ1 thin and ℓ2 thick domain walls.

  6. On-Demand Associative Cross-Language Information Retrieval

    NASA Astrophysics Data System (ADS)

    Geraldo, André Pinto; Moreira, Viviane P.; Gonçalves, Marcos A.

    This paper proposes the use of algorithms for mining association rules as an approach for Cross-Language Information Retrieval. These algorithms have been widely used to analyse market basket data. The idea is to map the problem of finding associations between sales items to the problem of finding term translations over a parallel corpus. The proposal was validated by means of experiments using queries in two distinct languages: Portuguese and Finnish to retrieve documents in English. The results show that the performance of our proposed approach is comparable to the performance of the monolingual baseline and to query translation via machine translation, even though these systems employ more complex Natural Language Processing techniques. The combination between machine translation and our approach yielded the best results, even outperforming the monolingual baseline.

  7. Cross-validation of a dementia screening test in a heterogeneous population.

    PubMed

    Ritchie, K A; Hallerman, E F

    1989-09-01

    Recognition of the increasing importance of early dementia screening for both research and clinical purposes has led to the development of numerous screening instruments. The most promising of these are based on neuropsychological measures which are able to focus on very specific cognitive functions. Of these tests, the Iowa screening test is of particular interest to researchers and clinicians working with heterogeneous populations or wishing to make cross-cultural comparisons, as it is relatively culture-fair and does not assume literacy. A preliminary study of the performance of the Iowa test in an Israeli sample of diverse ethnic origins and low education level suggests it to be a very sensitive measure even in such groups. The study also demonstrates the inadvisability of adopting item weights derived by multivariate statistical techniques from another population.

  8. Single station monitoring of volcanoes using seismic ambient noise

    NASA Astrophysics Data System (ADS)

    De Plaen, R. S.; Lecocq, T.; Caudron, C.; Ferrazzini, V.; Francis, O.

    2016-12-01

    During volcanic eruptions, magma transport causes gas release, pressure perturbations and fracturing in the plumbing system. The subsequent surface deformation can be detected using geodetic techniques, while the deep mechanical processes associated with magma pressurization and/or migration, and their spatial-temporal evolution, can be monitored through volcanic seismicity. However, these techniques respectively suffer from limited sensitivity to deep changes and from a temporal distribution too short-term to expose early aseismic processes such as magma pressurization. Seismic ambient noise cross-correlation exploits the multiple scattering of seismic vibrations by heterogeneities in the crust: cross-correlating these diffuse wavefields retrieves the Green's function for surface waves between two stations. Seismic velocity changes are then typically measured from the cross-correlation functions, with applications to volcanoes, large-magnitude earthquakes in the far field and smaller-magnitude earthquakes at smaller distances. This technique is increasingly used as a non-destructive way to continuously monitor small seismic velocity changes (~0.1%) associated with volcanic activity, although it is usually limited to volcanoes equipped with large and dense networks of broadband stations. The single-station approach may provide a powerful and reliable alternative to the classical "cross-station" approach when measuring variations of seismic velocities. We implemented it on the Piton de la Fournaise in Reunion Island, a very active volcano with remarkable multi-disciplinary continuous monitoring. Over the past decade, this volcano was increasingly studied using the traditional cross-station approach and therefore represents a unique laboratory to validate our approach. Our results, tested on stations located up to 3.5 km from the eruptive site, performed as well as the classical approach in detecting the volcanic eruption in the 1-2 Hz frequency band. This opens new perspectives for successfully forecasting volcanic activity at volcanoes equipped with a single 3-component seismometer.
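
    The core operation behind this monitoring approach, cross-correlating two recordings and locating the lag of maximum coherence, can be sketched as follows (illustrative Python; real ambient-noise workflows add preprocessing such as spectral whitening and long-term stacking):

```python
def cross_correlation(a, b):
    """Discrete cross-correlation c[lag] = sum_i a[i] * b[i + lag] of two
    equal-length signals, returned as {lag: value}. Under this sign
    convention, the lag of the maximum gives the delay of b relative
    to a, which is the travel-time shift tracked in noise monitoring."""
    n = len(a)
    return {lag: sum(a[i] * b[i + lag]
                     for i in range(max(0, -lag), min(n, n - lag)))
            for lag in range(-(n - 1), n)}
```

    In practice, tiny changes in the lag of the correlation peak between repeated measurements are interpreted as relative seismic velocity changes.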

  9. A Novel Hybrid Intelligent Indoor Location Method for Mobile Devices by Zones Using Wi-Fi Signals

    PubMed Central

    Castañón–Puga, Manuel; Salazar, Abby Stephanie; Aguilar, Leocundo; Gaxiola-Pacheco, Carelia; Licea, Guillermo

    2015-01-01

    The increasing use of mobile devices in indoor spaces brings challenges to location methods. This work presents a hybrid intelligent method based on data mining and Type-2 fuzzy logic to locate mobile devices in an indoor space by zones using Wi-Fi signals from selected access points (APs). This approach takes advantage of wireless local area networks (WLANs) over other types of architectures and implements the complete method in a mobile application using the developed tools. The proposed approach is validated by experimental data obtained from case studies and the cross-validation technique. For the purpose of generating the fuzzy rules that conform to the Takagi–Sugeno fuzzy system structure, a semi-supervised data mining technique called subtractive clustering is used. This algorithm finds cluster centers from the radius map given by the collected signals from APs. Measurements of Wi-Fi signals can be noisy due to several factors mentioned in this work, so the proposed method uses Type-2 fuzzy logic to model and deal with such uncertain information. PMID:26633417

  10. A Novel Hybrid Intelligent Indoor Location Method for Mobile Devices by Zones Using Wi-Fi Signals.

    PubMed

    Castañón-Puga, Manuel; Salazar, Abby Stephanie; Aguilar, Leocundo; Gaxiola-Pacheco, Carelia; Licea, Guillermo

    2015-12-02

    The increasing use of mobile devices in indoor spaces brings challenges to location methods. This work presents a hybrid intelligent method based on data mining and Type-2 fuzzy logic to locate mobile devices in an indoor space by zones using Wi-Fi signals from selected access points (APs). This approach takes advantage of wireless local area networks (WLANs) over other types of architectures and implements the complete method in a mobile application using the developed tools. The proposed approach is validated by experimental data obtained from case studies and the cross-validation technique. For the purpose of generating the fuzzy rules that conform to the Takagi-Sugeno fuzzy system structure, a semi-supervised data mining technique called subtractive clustering is used. This algorithm finds cluster centers from the radius map given by the collected signals from APs. Measurements of Wi-Fi signals can be noisy due to several factors mentioned in this work, so the proposed method uses Type-2 fuzzy logic to model and deal with such uncertain information.
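
    A minimal sketch of the subtractive clustering step described above (Python; the radii ra and rb and the fixed number of centres are illustrative assumptions, whereas full implementations use accept/reject potential thresholds instead of a fixed count):

```python
import math

def dist2(p, q):
    """Squared Euclidean distance between two points."""
    return sum((a - b) ** 2 for a, b in zip(p, q))

def subtractive_centers(points, n_centers, ra=1.0, rb=1.5):
    """Pick cluster centres by subtractive clustering: each point's
    potential is a sum of Gaussian contributions from all points; after
    selecting the highest-potential point as a centre, potentials near
    it are suppressed before the next pick."""
    alpha = 4.0 / ra ** 2        # sharpness of the potential kernel
    beta = 4.0 / rb ** 2         # sharpness of the suppression kernel
    pot = [sum(math.exp(-alpha * dist2(p, q)) for q in points)
           for p in points]
    centers = []
    for _ in range(n_centers):
        k = max(range(len(points)), key=lambda i: pot[i])
        c, pc = points[k], pot[k]
        centers.append(c)
        pot = [pot[i] - pc * math.exp(-beta * dist2(points[i], c))
               for i in range(len(points))]
    return centers
```

    Each selected centre can then seed one Takagi-Sugeno fuzzy rule, which is how such clustering is typically used for rule generation.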

  11. Developing a validation for environmental sustainability

    NASA Astrophysics Data System (ADS)

    Adewale, Bamgbade Jibril; Mohammed, Kamaruddeen Ahmed; Nawi, Mohd Nasrun Mohd; Aziz, Zulkifli

    2016-08-01

    One of the agendas for addressing environmental protection in construction is to reduce impacts and make construction activities more sustainable. This important consideration has generated several research interests within the construction industry, especially considering the damaging effects of construction on the ecosystem, such as various forms of environmental pollution, resource depletion and biodiversity loss on a global scale. Using the Partial Least Squares-Structural Equation Modeling technique, this study validates the environmental sustainability (ES) construct in the context of large construction firms in Malaysia. A cross-sectional survey was carried out in which data were collected from Malaysian large construction firms using a structured questionnaire. Results of this study revealed that business innovativeness and new technology are important in determining the environmental sustainability (ES) of Malaysian construction firms. The study also established an adequate level of internal consistency reliability, convergent validity and discriminant validity for each of its constructs. Based on these results, it is suggested that the indicators for the organisational innovativeness dimensions (business innovativeness and new technology) are useful for measuring these constructs in order to study construction firms' tendency to adopt environmental sustainability (ES) in their project execution.

  12. Issues in cross-cultural validity: example from the adaptation, reliability, and validity testing of a Turkish version of the Stanford Health Assessment Questionnaire.

    PubMed

    Küçükdeveci, Ayse A; Sahin, Hülya; Ataman, Sebnem; Griffiths, Bridget; Tennant, Alan

    2004-02-15

    Guidelines have been established for cross-cultural adaptation of outcome measures. However, invariance across cultures must also be demonstrated through analysis of Differential Item Functioning (DIF). This is tested in the context of a Turkish adaptation of the Health Assessment Questionnaire (HAQ). Internal construct validity of the adapted HAQ is assessed by Rasch analysis; reliability, by internal consistency and the intraclass correlation coefficient; external construct validity, by association with impairments and American College of Rheumatology functional stages. Cross-cultural validity is tested through DIF by comparison with data from the UK version of the HAQ. The adapted version of the HAQ demonstrated good internal construct validity through fit of the data to the Rasch model (mean item fit 0.205; SD 0.998). Reliability was excellent (alpha = 0.97) and external construct validity was confirmed by expected associations. DIF for culture was found in only 1 item. Cross-cultural validity was found to be sufficient for use in international studies between the UK and Turkey. Future adaptation of instruments should include analysis of DIF at the field testing stage in the adaptation process.

  13. Noninvasive Classification of Hepatic Fibrosis Based on Texture Parameters From Double Contrast-Enhanced Magnetic Resonance Images

    PubMed Central

    Bahl, Gautam; Cruite, Irene; Wolfson, Tanya; Gamst, Anthony C.; Collins, Julie M.; Chavez, Alyssa D.; Barakat, Fatma; Hassanein, Tarek; Sirlin, Claude B.

    2016-01-01

    Purpose To demonstrate a proof of concept that quantitative texture feature analysis of double contrast-enhanced magnetic resonance imaging (MRI) can classify fibrosis noninvasively, using histology as a reference standard. Materials and Methods A Health Insurance Portability and Accountability Act (HIPAA)-compliant Institutional Review Board (IRB)-approved retrospective study of 68 patients with diffuse liver disease was performed at a tertiary liver center. All patients underwent double contrast-enhanced MRI, with histopathology-based staging of fibrosis obtained within 12 months of imaging. The MaZda software program was used to compute 279 texture parameters for each image. A statistical regularization technique, generalized linear model (GLM)-path, was used to develop a model based on texture features for dichotomous classification of fibrosis category (F ≤2 vs. F ≥3) of the 68 patients, with histology as the reference standard. The model's performance was assessed and cross-validated. There was no additional validation performed on an independent cohort. Results Cross-validated sensitivity, specificity, and total accuracy of the texture feature model in classifying fibrosis were 91.9%, 83.9%, and 88.2%, respectively. Conclusion This study shows proof of concept that accurate, noninvasive classification of liver fibrosis is possible by applying quantitative texture analysis to double contrast-enhanced MRI. Further studies are needed in independent cohorts of subjects. PMID:22851409
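
    The three reported figures (cross-validated sensitivity, specificity, and total accuracy) can be recovered from pooled test-fold predictions with a simple confusion-matrix summary; the labels below are made up for illustration:

```python
def sens_spec_acc(y_true, y_pred, positive=1):
    """Confusion-matrix summary over pooled cross-validated predictions."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p != positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    return tp / (tp + fn), tn / (tn + fp), (tp + tn) / len(y_true)

# hypothetical labels: 1 = advanced fibrosis (F >= 3), 0 = F <= 2
y_true = [1] * 10 + [0] * 10
y_pred = [1] * 9 + [0] + [0] * 8 + [1] * 2
print(sens_spec_acc(y_true, y_pred))   # → (0.9, 0.8, 0.85)
```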

  14. The Cross Validation of the Attitudes toward Mainstreaming Scale (ATMS).

    ERIC Educational Resources Information Center

    Berryman, Joan D.; Neal, W. R. Jr.

    1980-01-01

    Reliability and factorial validity of the Attitudes Toward Mainstreaming Scale was supported in a cross-validation study with teachers. Three factors emerged: learning capability, general mainstreaming, and traditional limiting disabilities. Factor intercorrelations varied from .42 to .55; correlations between total scores and individual factors…

  15. Setting up a Rayleigh Scattering Based Flow Measuring System in a Large Nozzle Testing Facility

    NASA Technical Reports Server (NTRS)

    Panda, Jayanta; Gomez, Carlos R.

    2002-01-01

    A molecular Rayleigh scattering based air density measurement system has been built in a large nozzle testing facility at NASA Glenn Research Center. The technique depends on the light scattering by gas molecules present in air; no artificial seeding is required. Light from a single mode, continuous wave laser was transmitted to the nozzle facility by optical fiber, and light scattered by gas molecules, at various points along the laser beam, is collected and measured by photon-counting electronics. By placing the laser beam and collection optics on synchronized traversing units, the point measurement technique is made effective for surveying density variation over a cross-section of the nozzle plume. Various difficulties associated with dust particles, stray light, high noise level and vibration are discussed. Finally, a limited amount of data from an underexpanded jet are presented and compared with expected variations to validate the technique.

  16. Spatial resolution enhancement of terrestrial features using deconvolved SSM/I microwave brightness temperatures

    NASA Technical Reports Server (NTRS)

    Farrar, Michael R.; Smith, Eric A.

    1992-01-01

    A method for enhancing the 19, 22, and 37 GHz measurements of the SSM/I (Special Sensor Microwave/Imager) to the spatial resolution and sampling density of the high resolution 85-GHz channel is presented. An objective technique for specifying the tuning parameter, which balances the tradeoff between resolution and noise, is developed in terms of maximizing cross-channel correlations. Various validation procedures are performed to demonstrate the effectiveness of the method, which hopefully will provide researchers with a valuable tool in multispectral applications of satellite radiometer data.

  17. Non-iterative characterization of few-cycle laser pulses using flat-top gates.

    PubMed

    Selm, Romedi; Krauss, Günther; Leitenstorfer, Alfred; Zumbusch, Andreas

    2012-03-12

    We demonstrate a method for broadband laser pulse characterization based on a spectrally resolved cross-correlation with a narrowband flat-top gate pulse. Excellent phase-matching by collinear excitation in a microscope focus is exploited by degenerate four-wave mixing in a microscope slide. Direct group delay extraction of an octave spanning spectrum which is generated in a highly nonlinear fiber allows for spectral phase retrieval. The validity of the technique is supported by the comparison with an independent second-harmonic fringe-resolved autocorrelation measurement for an 11 fs laser pulse.

  18. An adaptive deep learning approach for PPG-based identification.

    PubMed

    Jindal, V; Birjandtalab, J; Pouyan, M Baran; Nourani, M

    2016-08-01

    Wearable biosensors have become increasingly popular in healthcare due to their capabilities for low-cost, long-term biosignal monitoring. This paper presents a novel two-stage technique to offer biometric identification using these biosensors through Deep Belief Networks and Restricted Boltzmann Machines. Our identification approach improves robustness in current monitoring procedures within clinical, e-health and fitness environments using photoplethysmography (PPG) signals through deep learning classification models. The approach is tested on the TROIKA dataset using 10-fold cross validation and achieves an accuracy of 96.1%.
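
    A generic 10-fold cross-validation loop of the kind reported here can be sketched as follows; the nearest-centroid learner and toy 1-D data are hypothetical stand-ins for the paper's deep-learning models:

```python
import random

def k_fold_cv(X, y, fit, k=10, seed=0):
    """Shuffle once, carve k disjoint test folds, average test accuracy."""
    idx = list(range(len(X)))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    accs = []
    for f, test in enumerate(folds):
        train = [i for g, fold in enumerate(folds) if g != f for i in fold]
        model = fit([X[i] for i in train], [y[i] for i in train])
        accs.append(sum(model(X[i]) == y[i] for i in test) / len(test))
    return sum(accs) / k

def nearest_centroid(X, y):
    """Toy stand-in learner: classify to the closest class mean."""
    cents = {c: sum(x for x, t in zip(X, y) if t == c) / y.count(c)
             for c in set(y)}
    return lambda x: min(cents, key=lambda c: abs(x - cents[c]))

# two well-separated hypothetical classes
X = [0.1 * i for i in range(10)] + [10 + 0.1 * i for i in range(10)]
y = [0] * 10 + [1] * 10
print(k_fold_cv(X, y, nearest_centroid))   # → 1.0
```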

  19. HClass: Automatic classification tool for health pathologies using artificial intelligence techniques.

    PubMed

    Garcia-Chimeno, Yolanda; Garcia-Zapirain, Begonya

    2015-01-01

    The classification of subjects' pathologies enables rigour to be applied to the treatment of certain pathologies, as doctors on occasion juggle so many variables that they can end up confusing some illnesses with others. Thanks to machine learning techniques applied to a health-record database, it is possible to perform an automatic classification using our algorithm, hClass. hClass provides non-linear classification of a supervised, non-supervised or semi-supervised type. The machine is configured using other techniques such as validation of the set to be classified (cross-validation), feature reduction (PCA) and committees for assessing the various classifiers. The tool is easy to use: the sample matrix and the features one wishes to classify, the number of iterations and the subjects to be used to train the machine are introduced as inputs. As a result, the success rate is shown either for a single classifier or for a committee if one has been formed. A 90% success rate is obtained with the AdaBoost classifier and 89.7% with a committee (comprising three classifiers) when PCA is applied. The tool can be expanded to allow the user to fully characterise the classifiers by adjusting them to each classification use.
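
    The committee idea (several trained classifiers voting on each subject) can be sketched independently of the underlying learners; the three single-threshold classifiers below are hypothetical stand-ins:

```python
from collections import Counter

def committee(classifiers):
    """Majority-vote committee over already-trained classifier callables."""
    def predict(x):
        return Counter(clf(x) for clf in classifiers).most_common(1)[0][0]
    return predict

# three hypothetical threshold classifiers on a single feature
clfs = [lambda x: int(x > 4), lambda x: int(x > 5), lambda x: int(x > 6)]
vote = committee(clfs)
print([vote(v) for v in (3, 5.5, 8)])   # → [0, 1, 1]
```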

  20. Improving machine learning reproducibility in genetic association studies with proportional instance cross validation (PICV).

    PubMed

    Piette, Elizabeth R; Moore, Jason H

    2018-01-01

    Machine learning methods and conventions are increasingly employed for the analysis of large, complex biomedical data sets, including genome-wide association studies (GWAS). Reproducibility of machine learning analyses of GWAS can be hampered by biological and statistical factors, particularly so for the investigation of non-additive genetic interactions. Application of traditional cross validation to a GWAS data set may result in poor consistency between the training and testing data set splits due to an imbalance of the interaction genotypes relative to the data as a whole. We propose a new cross validation method, proportional instance cross validation (PICV), that preserves the original distribution of an independent variable when splitting the data set into training and testing partitions. We apply PICV to simulated GWAS data with epistatic interactions of varying minor allele frequencies and prevalences and compare performance to that of a traditional cross validation procedure in which individuals are randomly allocated to training and testing partitions. Sensitivity and positive predictive value are significantly improved across all tested scenarios for PICV compared to traditional cross validation. We also apply PICV to GWAS data from a study of primary open-angle glaucoma to investigate a previously-reported interaction, which fails to significantly replicate; PICV however improves the consistency of testing and training results. Application of traditional machine learning procedures to biomedical data may require modifications to better suit intrinsic characteristics of the data, such as the potential for highly imbalanced genotype distributions in the case of epistasis detection. The reproducibility of genetic interaction findings can be improved by considering this variable imbalance in cross validation implementation, such as with PICV. This approach may be extended to problems in other domains in which imbalanced variable distributions are a concern.
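
    The proportional-splitting idea behind PICV (keep each label's share identical in both partitions) can be sketched minimally; the genotype labels and split fraction below are hypothetical:

```python
import random
from collections import Counter

def proportional_split(y, test_frac=0.25, seed=0):
    """Split indices so that every class keeps its original proportion
    in both partitions -- the stratified-splitting idea behind PICV."""
    rng = random.Random(seed)
    by_class = {}
    for i, label in enumerate(y):
        by_class.setdefault(label, []).append(i)
    train, test = [], []
    for label, idx in by_class.items():
        rng.shuffle(idx)                      # random *within* each class
        cut = round(len(idx) * test_frac)
        test.extend(idx[:cut])
        train.extend(idx[cut:])
    return sorted(train), sorted(test)

# imbalanced hypothetical genotype labels: 90% "AA", 10% "aa"
y = ["AA"] * 90 + ["aa"] * 10
train, test = proportional_split(y, test_frac=0.2)
print(Counter(y[i] for i in test))   # → Counter({'AA': 18, 'aa': 2})
```

A plain random 20% holdout could easily capture 0 or 4 of the ten "aa" carriers; the proportional split always takes exactly 2.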

  1. Compensation of the impact of low-cost manufacturing techniques in the design of E-plane multiport waveguide junctions

    NASA Astrophysics Data System (ADS)

    San-Blas, A. A.; Roca, J. M.; Cogollos, S.; Morro, J. V.; Boria, V. E.; Gimeno, B.

    2016-06-01

    In this work, a full-wave tool for the accurate analysis and design of compensated E-plane multiport junctions is proposed. The implemented tool is capable of evaluating the undesired effects related to the use of low-cost manufacturing techniques, which are mostly due to the introduction of rounded corners in the cross section of the rectangular waveguides of the device. The obtained results show that, although stringent mechanical constraints are imposed, it is possible to compensate for the impact of the cited low-cost manufacturing techniques by redesigning the matching elements considered in the original device. Several new designs concerning a great variety of E-plane components (such as right-angled bends, T-junctions and magic-Ts) are presented, and useful design guidelines are provided. The implemented tool, which is mainly based on the boundary integral-resonant mode expansion technique, has been successfully validated by comparing the obtained results to simulated data provided by commercial software based on the finite element method.

  2. Cross-platform comparison of nucleic acid hybridization: toward quantitative reference standards.

    PubMed

    Halvorsen, Ken; Agris, Paul F

    2014-11-15

    Measuring interactions between biological molecules is vitally important to both basic and applied research as well as development of pharmaceuticals. Although a wide and growing range of techniques is available to measure various kinetic and thermodynamic properties of interacting biomolecules, it can be difficult to compare data across techniques of different laboratories and personnel or even across different instruments using the same technique. Here we evaluate relevant biological interactions based on complementary DNA and RNA oligonucleotides that could be used as reference standards for many experimental systems. We measured thermodynamics of duplex formation using isothermal titration calorimetry, differential scanning calorimetry, and ultraviolet-visible (UV-vis) monitored denaturation/renaturation. These standards can be used to validate results, compare data from disparate techniques, act as a teaching tool for laboratory classes, or potentially to calibrate instruments. The RNA and DNA standards have many attractive features, including low cost, high purity, easily measurable concentrations, and minimal handling concerns, making them ideal for use as a reference material. Copyright © 2014 Elsevier Inc. All rights reserved.

  3. Cross-platform comparison of nucleic acid hybridization: toward quantitative reference standards

    PubMed Central

    Halvorsen, Ken; Agris, Paul F.

    2014-01-01

    Measuring interactions between biological molecules is vitally important to both basic and applied research, as well as development of pharmaceuticals. While a wide and growing range of techniques is available to measure various kinetic and thermodynamic properties of interacting biomolecules, it can be difficult to compare data across techniques of different laboratories and personnel, or even across different instruments using the same technique. Here we evaluate relevant biological interactions based on complementary DNA and RNA oligonucleotides that could be used as reference standards for many experimental systems. We measured thermodynamics of duplex formation using Isothermal Titration Calorimetry, Differential Scanning Calorimetry, and UV-Vis monitored denaturation/renaturation. These standards can be used to validate results, compare data from disparate techniques, act as a teaching tool for laboratory classes, or potentially to calibrate instruments. The RNA and DNA standards have many attractive features, including low cost, high purity, easily measurable concentrations, and minimal handling concerns, making them ideal for use as a reference material. PMID:25124363

  4. Evaluating the sources of water to wells: Three techniques for metamodeling of a groundwater flow model

    USGS Publications Warehouse

    Fienen, Michael N.; Nolan, Bernard T.; Feinstein, Daniel T.

    2016-01-01

    For decision support, the insights and predictive power of numerical process models can be hampered by the insufficient expertise and computational resources required to evaluate system response to new stresses. An alternative is to emulate the process model with a statistical “metamodel.” Built on a dataset of collocated numerical model inputs and outputs, a groundwater flow model was emulated using a Bayesian network, an artificial neural network, and a gradient boosted regression tree. The response of interest was surface water depletion, expressed as the source of water to wells. The results have application for managing the allocation of groundwater. Each technique was tuned using cross validation and further evaluated using a held-out dataset. A numerical MODFLOW-USG model of the Lake Michigan Basin, USA, was used for the evaluation. The performance and interpretability of each technique were compared, pointing to the advantages of each. The metamodel can extend to unmodeled areas.

  5. Comparative forensic soil analysis of New Jersey state parks using a combination of simple techniques with multivariate statistics.

    PubMed

    Bonetti, Jennifer; Quarino, Lawrence

    2014-05-01

    This study has shown that the combination of simple techniques with the use of multivariate statistics offers the potential for the comparative analysis of soil samples. Five samples were obtained from each of twelve state parks across New Jersey in both the summer and fall seasons. Each sample was examined using particle-size distribution, pH analysis in both water and 1 M CaCl2 , and a loss on ignition technique. Data from each of the techniques were combined, and principal component analysis (PCA) and canonical discriminant analysis (CDA) were used for multivariate data transformation. Samples from different locations could be visually differentiated from one another using these multivariate plots. Hold-one-out cross-validation analysis showed error rates as low as 3.33%. Ten blind study samples were analyzed resulting in no misclassifications using Mahalanobis distance calculations and visual examinations of multivariate plots. Seasonal variation was minimal between corresponding samples, suggesting potential success in forensic applications. © 2014 American Academy of Forensic Sciences.
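
    The Mahalanobis-distance classification used for the blind samples can be sketched for 2-D features with the closed-form 2x2 covariance inverse; the (pH, % loss-on-ignition) values and park names below are invented for illustration:

```python
def mean2(pts):
    n = len(pts)
    return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)

def cov2(pts, mu):
    """Sample covariance of 2-D points, returned as (var_x, cov_xy, var_y)."""
    n = len(pts) - 1
    a = sum((p[0] - mu[0]) ** 2 for p in pts) / n
    c = sum((p[1] - mu[1]) ** 2 for p in pts) / n
    b = sum((p[0] - mu[0]) * (p[1] - mu[1]) for p in pts) / n
    return a, b, c

def mahalanobis2(x, mu, cov):
    """Squared Mahalanobis distance via the closed-form 2x2 inverse."""
    a, b, c = cov
    dx, dy = x[0] - mu[0], x[1] - mu[1]
    return (c * dx * dx - 2 * b * dx * dy + a * dy * dy) / (a * c - b * b)

def classify(x, groups):
    """Assign a blind sample to the group with the smallest distance."""
    stats = {g: (mean2(p), cov2(p, mean2(p))) for g, p in groups.items()}
    return min(stats, key=lambda g: mahalanobis2(x, *stats[g]))

# hypothetical (pH, % loss-on-ignition) measurements for two locations
groups = {"park_A": [(5.0, 10.0), (5.1, 10.8), (4.9, 9.4), (5.2, 10.2)],
          "park_B": [(7.0, 3.0), (7.2, 3.9), (6.9, 2.4), (7.1, 3.1)]}
print(classify((5.05, 10.1), groups))   # → park_A
```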

  6. Exact Analysis of Squared Cross-Validity Coefficient in Predictive Regression Models

    ERIC Educational Resources Information Center

    Shieh, Gwowen

    2009-01-01

    In regression analysis, the notion of population validity is of theoretical interest for describing the usefulness of the underlying regression model, whereas the presumably more important concept of population cross-validity represents the predictive effectiveness for the regression equation in future research. It appears that the inference…

  7. Exploring discrepancies between quantitative validation results and the geomorphic plausibility of statistical landslide susceptibility maps

    NASA Astrophysics Data System (ADS)

    Steger, Stefan; Brenning, Alexander; Bell, Rainer; Petschko, Helene; Glade, Thomas

    2016-06-01

    Empirical models are frequently applied to produce landslide susceptibility maps for large areas. Subsequent quantitative validation results are routinely used as the primary criteria to infer the validity and applicability of the final maps or to select one of several models. This study hypothesizes that such direct deductions can be misleading. The main objective was to explore discrepancies between the predictive performance of a landslide susceptibility model and the geomorphic plausibility of the resulting landslide susceptibility maps, with particular emphasis on the influence of incomplete landslide inventories on modelling and validation results. The study was conducted within the Flysch Zone of Lower Austria (1,354 km²), which is known to be highly susceptible to landslides of the slide-type movement. Sixteen susceptibility models were generated by applying two statistical classifiers (logistic regression and generalized additive model) and two machine learning techniques (random forest and support vector machine) separately for two landslide inventories of differing completeness and two predictor sets. The results were validated quantitatively by estimating the area under the receiver operating characteristic curve (AUROC) with single-holdout and spatial cross-validation techniques. The heuristic evaluation of the geomorphic plausibility of the final results was supported by the findings of an exploratory data analysis, an estimation of odds ratios and an evaluation of the spatial structure of the final maps. The results showed that maps generated from different inventories, classifiers and predictors looked different, while holdout validation revealed similarly high predictive performances. Spatial cross-validation proved useful for exposing spatially varying inconsistencies in the modelling results, while additionally providing evidence of slightly overfitted machine learning-based models. However, the highest predictive performances were obtained for maps that explicitly expressed geomorphically implausible relationships, indicating that the predictive performance of a model can be misleading when a predictor systematically relates to a spatially consistent bias in the inventory. Furthermore, we observed that random forest-based maps displayed spatial artifacts. The most plausible susceptibility map of the study area showed smooth prediction surfaces, while the underlying model revealed a high predictive capability and was generated with an accurate landslide inventory and predictors that did not directly describe a bias. However, none of the presented models was found to be completely unbiased. This study showed that high predictive performances cannot be equated with high plausibility and applicability of the resulting landslide susceptibility maps. We suggest that greater emphasis be placed on identifying confounding factors and biases in landslide inventories. A joint discussion in the field, between modelers and decision makers, of the spatial pattern of the final susceptibility maps might increase their acceptance and applicability.
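
    Spatial cross-validation differs from a random holdout in that test samples come from spatial blocks that are withheld as a whole, so the test data are spatially separated from the training data. A minimal grid-blocking sketch (coordinates hypothetical):

```python
def spatial_folds(coords, nx=2, ny=2):
    """Partition the study area into an nx-by-ny grid; each occupied
    block becomes one test fold, keeping test samples spatially
    disjoint from training samples (unlike a random holdout)."""
    xmin = min(c[0] for c in coords); xmax = max(c[0] for c in coords)
    ymin = min(c[1] for c in coords); ymax = max(c[1] for c in coords)
    blocks = {}
    for i, (x, y) in enumerate(coords):
        bx = min(int((x - xmin) / (xmax - xmin) * nx), nx - 1)
        by = min(int((y - ymin) / (ymax - ymin) * ny), ny - 1)
        blocks.setdefault((bx, by), []).append(i)
    return list(blocks.values())

# hypothetical landslide-sample coordinates in four spatial clusters
coords = [(0, 0), (1, 1), (9, 0), (10, 1), (0, 9), (1, 10), (9, 9), (10, 10)]
print(spatial_folds(coords))   # → [[0, 1], [2, 3], [4, 5], [6, 7]]
```

Each fold is then held out in turn and an AUROC computed on it, exactly as in ordinary k-fold validation but with spatial blocks instead of random subsets.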

  8. Design of a Competency Evaluation Model for Clinical Nursing Practicum, Based on Standardized Language Systems: Psychometric Validation Study.

    PubMed

    Iglesias-Parra, Maria Rosa; García-Guerrero, Alfonso; García-Mayor, Silvia; Kaknani-Uttumchandani, Shakira; León-Campos, Álvaro; Morales-Asencio, José Miguel

    2015-07-01

    To develop an evaluation system of clinical competencies for the practicum of nursing students based on the Nursing Interventions Classification (NIC). Psychometric validation study: the first two phases addressed definition and content validation, and the third phase consisted of a cross-sectional study analyzing reliability. The study population was undergraduate nursing students and clinical tutors. Through the Delphi technique, 26 competencies and 91 interventions were isolated. Cronbach's α was 0.96. Factor analysis yielded 18 factors that explained 68.82% of the variance. Overall inter-item correlation was 0.26, and item-total correlation ranged between 0.19 and 0.66. A competency system for the nursing practicum, structured on the NIC, is a reliable method for assessing and evaluating clinical competencies. Further evaluations in other contexts are needed. The availability of standardized language systems in the nursing discipline provides an ideal framework for developing nursing curricula.
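
    The Cronbach's α reported here follows the standard item-variance formula, α = k/(k-1) · (1 - Σ var(item) / var(total)). A small sketch with invented scores:

```python
def cronbach_alpha(items):
    """items[i][j] = score of respondent j on item i.
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k, n = len(items), len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[j] for item in items) for j in range(n)]
    return k / (k - 1) * (1 - sum(var(i) for i in items) / var(totals))

# three perfectly consistent hypothetical items give alpha close to 1.0;
# real questionnaire data (like the 0.96 above) would fall below that
print(cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]]))
```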

  9. CFD Simulation and Experimental Validation of Fluid Flow and Particle Transport in a Model of Alveolated Airways

    PubMed Central

    Ma, Baoshun; Ruwet, Vincent; Corieri, Patricia; Theunissen, Raf; Riethmuller, Michel; Darquenne, Chantal

    2009-01-01

    Accurate modeling of air flow and aerosol transport in the alveolated airways is essential for quantitative predictions of pulmonary aerosol deposition. However, experimental validation of such modeling studies has been scarce. The objective of this study is to validate CFD predictions of flow field and particle trajectory with experiments within a scaled-up model of alveolated airways. Steady flow (Re = 0.13) of silicone oil was captured by particle image velocimetry (PIV), and the trajectories of 0.5 mm and 1.2 mm spherical iron beads (representing 0.7 to 14.6 μm aerosol in vivo) were obtained by particle tracking velocimetry (PTV). At twelve selected cross sections, the velocity profiles obtained by CFD matched well with those by PIV (within 1.7% on average). The CFD predicted trajectories also matched well with PTV experiments. These results showed that air flow and aerosol transport in models of human alveolated airways can be simulated by CFD techniques with reasonable accuracy. PMID:20161301

  10. CFD Simulation and Experimental Validation of Fluid Flow and Particle Transport in a Model of Alveolated Airways.

    PubMed

    Ma, Baoshun; Ruwet, Vincent; Corieri, Patricia; Theunissen, Raf; Riethmuller, Michel; Darquenne, Chantal

    2009-05-01

    Accurate modeling of air flow and aerosol transport in the alveolated airways is essential for quantitative predictions of pulmonary aerosol deposition. However, experimental validation of such modeling studies has been scarce. The objective of this study is to validate CFD predictions of flow field and particle trajectory with experiments within a scaled-up model of alveolated airways. Steady flow (Re = 0.13) of silicone oil was captured by particle image velocimetry (PIV), and the trajectories of 0.5 mm and 1.2 mm spherical iron beads (representing 0.7 to 14.6 μm aerosol in vivo) were obtained by particle tracking velocimetry (PTV). At twelve selected cross sections, the velocity profiles obtained by CFD matched well with those by PIV (within 1.7% on average). The CFD predicted trajectories also matched well with PTV experiments. These results showed that air flow and aerosol transport in models of human alveolated airways can be simulated by CFD techniques with reasonable accuracy.

  11. Infinite hidden conditional random fields for human behavior analysis.

    PubMed

    Bousmalis, Konstantinos; Zafeiriou, Stefanos; Morency, Louis-Philippe; Pantic, Maja

    2013-01-01

    Hidden conditional random fields (HCRFs) are discriminative latent variable models that have been shown to successfully learn the hidden structure of a given classification problem (provided an appropriate validation of the number of hidden states). In this brief, we present the infinite HCRF (iHCRF), which is a nonparametric model based on hierarchical Dirichlet processes and is capable of automatically learning the optimal number of hidden states for a classification task. We show how we learn the model hyperparameters with an effective Markov-chain Monte Carlo sampling technique, and we explain the process that underlies our iHCRF model with the Restaurant Franchise Rating Agencies analogy. We show that the iHCRF is able to converge to a correct number of represented hidden states, and outperforms the best finite HCRFs--chosen via cross-validation--for the difficult tasks of recognizing instances of agreement, disagreement, and pain. Moreover, the iHCRF manages to achieve this performance in significantly less total training, validation, and testing time.

  12. Hydrostatic weighing without head submersion in morbidly obese females.

    PubMed

    Evans, P E; Israel, R G; Flickinger, E G; O'Brien, K F; Donnelly, J E

    1989-08-01

    This study tests the validity of hydrostatic weighing without head submersion (HWNS) for determining the body density (Db) of morbidly obese (MO) females. Eighty MO females who were able to perform traditional hydrostatic weighing at residual volume (HW) underwent four counterbalanced trials for each procedure (HW and HWNS) to determine Db. Residual volume was determined by oxygen dilution. Twenty subjects were randomly excluded from the experimental group (EG) and assigned to a cross-validation group (CV). Simple linear regression was performed on EG data (n = 60, mean age = 36.8 y, mean % fat = 50.1) to predict Db from HWNS (Db = 0.569563 [Db HWNS] + 0.408621, SEE = 0.0066). Comparison of the predicted and actual Db for the CV group yielded r = 0.69, SEE = 0.0066, E statistic = 0.0067, mean difference = 0.0013 kg/L. The SEE and E statistic for body fat were 3.31 and 3.39, respectively. The mean difference for percent fat was 0.66%. Results indicate that HWNS is a valid technique for assessing body composition in MO females.
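
    The prediction equation and SEE above are standard simple-regression outputs; a sketch of how both are computed (the density pairs below are hypothetical, not the study's data):

```python
def fit_line(x, y):
    """Least-squares intercept/slope for y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

def see(x, y, a, b):
    """Standard error of estimate, with n - 2 degrees of freedom."""
    ss = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    return (ss / (len(x) - 2)) ** 0.5

# hypothetical HWNS vs. residual-volume body densities (kg/L)
x = [0.995, 1.000, 1.005, 1.010, 1.015]
y = [0.975, 0.978, 0.980, 0.983, 0.986]
a, b = fit_line(x, y)
print(round(a, 3), round(b, 3), round(see(x, y, a, b), 5))
```

With the study's published coefficients, a new subject's Db would be predicted as 0.569563 * (Db HWNS) + 0.408621, and the SEE of 0.0066 kg/L bounds the typical prediction error.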

  13. Model selection and assessment for multi-species occupancy models

    USGS Publications Warehouse

    Broms, Kristin M.; Hooten, Mevin B.; Fitzpatrick, Ryan M.

    2016-01-01

    While multi-species occupancy models (MSOMs) are emerging as a popular method for analyzing biodiversity data, formal checking and validation approaches for this class of models have lagged behind. Concurrent with the rise in application of MSOMs among ecologists, a quiet regime shift is occurring in Bayesian statistics where predictive model comparison approaches are experiencing a resurgence. Unlike single-species occupancy models that use integrated likelihoods, MSOMs are usually couched in a Bayesian framework and contain multiple levels. Standard model checking and selection methods are often unreliable in this setting and there is only limited guidance in the ecological literature for this class of models. We examined several different contemporary Bayesian hierarchical approaches for checking and validating MSOMs and applied these methods to a freshwater aquatic study system in Colorado, USA, to better understand the diversity and distributions of plains fishes. Our findings indicated distinct differences among model selection approaches, with cross-validation techniques performing the best in terms of prediction.

  14. Effects of Data Quality on the Characterization of Aerosol Properties from Multiple Sensors

    NASA Technical Reports Server (NTRS)

    Petrenko, Maksym; Ichoku, Charles; Leptoukh, Gregory

    2011-01-01

    Cross-comparison of aerosol properties between ground-based and spaceborne measurements is an important validation technique that helps to investigate the uncertainties of aerosol products acquired using spaceborne sensors. However, it has been shown that even minor differences in the cross-characterization procedure may significantly impact the results of such validation. Of particular consideration is the quality assurance/quality control (QA/QC) information - auxiliary data indicating a "confidence" level (e.g., Bad, Fair, Good, Excellent, etc.) conferred by the retrieval algorithms on the produced data. Depending on the treatment of available QA/QC information, a cross-characterization procedure has the potential of filtering out invalid data points, such as uncertain or erroneous retrievals, which tend to reduce the credibility of such comparisons. However, under certain circumstances, even high QA/QC values may not fully guarantee the quality of the data. For example, retrievals in proximity of a cloud might be particularly perplexing for an aerosol retrieval algorithm, resulting in invalid data that, nonetheless, could be assigned a high QA/QC confidence. In this presentation, we will study the effects of several QA/QC parameters on the cross-characterization of aerosol properties between the data acquired by multiple spaceborne sensors. We will utilize the Multi-sensor Aerosol Products Sampling System (MAPSS), which provides a consistent platform for multi-sensor comparison, including collocation with measurements acquired by the ground-based Aerosol Robotic Network (AERONET). The multi-sensor spaceborne data analyzed include those acquired by the Terra-MODIS, Aqua-MODIS, Terra-MISR, Aura-OMI, PARASOL-POLDER, and Calipso-CALIOP satellite instruments.

  15. Guidelines To Validate Control of Cross-Contamination during Washing of Fresh-Cut Leafy Vegetables.

    PubMed

    Gombas, D; Luo, Y; Brennan, J; Shergill, G; Petran, R; Walsh, R; Hau, H; Khurana, K; Zomorodi, B; Rosen, J; Varley, R; Deng, K

    2017-02-01

    The U.S. Food and Drug Administration requires food processors to implement and validate processes that will result in significantly minimizing or preventing the occurrence of hazards that are reasonably foreseeable in food production. During production of fresh-cut leafy vegetables, microbial contamination that may be present on the product can spread throughout the production batch when the product is washed, thus increasing the risk of illnesses. The use of antimicrobials in the wash water is a critical step in preventing such water-mediated cross-contamination; however, many factors can affect antimicrobial efficacy in the production of fresh-cut leafy vegetables, and the procedures for validating this key preventive control have not been articulated. Producers may consider three options for validating antimicrobial washing as a preventive control for cross-contamination. Option 1 involves the use of a surrogate for the microbial hazard and the demonstration that cross-contamination is prevented by the antimicrobial wash. Option 2 involves the use of antimicrobial sensors and the demonstration that a critical antimicrobial level is maintained during worst-case operating conditions. Option 3 validates the placement of the sensors in the processing equipment with the demonstration that a critical antimicrobial level is maintained at all locations, regardless of operating conditions. These validation options developed for fresh-cut leafy vegetables may serve as examples for validating processes that prevent cross-contamination during washing of other fresh produce commodities.

  16. FY 2016 Status Report on the Modeling of the M8 Calibration Series using MAMMOTH

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, Benjamin Allen; Ortensi, Javier; DeHart, Mark David

    2016-09-01

    This report provides a summary of the progress made towards validating the multi-physics reactor analysis application MAMMOTH using data from measurements performed at the Transient Reactor Test facility, TREAT. The work completed consists of a series of comparisons of TREAT element types (standard and control rod assemblies) in small geometries as well as slotted mini-cores to reference Monte Carlo simulations to ascertain the accuracy of cross section preparation techniques. After the successful completion of these smaller problems, a full core model of the half slotted core used in the M8 Calibration series was assembled. Full core MAMMOTH simulations were compared to Serpent reference calculations to assess the cross section preparation process for this larger configuration. As part of the validation process the M8 Calibration series included a steady state wire irradiation experiment and coupling factors for the experiment region. The shape of the power distribution obtained from the MAMMOTH simulation shows excellent agreement with the experiment. Larger differences were encountered in the calculation of the coupling factors, but there is also great uncertainty on how the experimental values were obtained. Future work will focus on resolving some of these differences.

  17. Evaluation of gene expression classification studies: factors associated with classification performance.

    PubMed

    Novianti, Putri W; Roes, Kit C B; Eijkemans, Marinus J C

    2014-01-01

    Classification methods used in microarray studies for gene expression are diverse in the way they deal with the underlying complexity of the data, as well as in the technique used to build the classification model. The MAQC II study on cancer classification problems has found that performance was affected by factors such as the classification algorithm, cross validation method, number of genes, and gene selection method. In this paper, we study the hypothesis that the disease under study significantly determines which method is optimal, and that additionally sample size, class imbalance, type of medical question (diagnostic, prognostic or treatment response), and microarray platform are potentially influential. A systematic literature review was used to extract the information from 48 published articles on non-cancer microarray classification studies. The impact of the various factors on the reported classification accuracy was analyzed through random-intercept logistic regression. The type of medical question and method of cross validation dominated the explained variation in accuracy among studies, followed by disease category and microarray platform. In total, 42% of the between study variation was explained by all the study specific and problem specific factors that we studied together.

  18. Tools based on multivariate statistical analysis for classification of soil and groundwater in Apulian agricultural sites.

    PubMed

    Ielpo, Pierina; Leardi, Riccardo; Pappagallo, Giuseppe; Uricchio, Vito Felice

    2017-06-01

    In this paper, the results obtained from multivariate statistical techniques such as PCA (Principal component analysis) and LDA (Linear discriminant analysis) applied to a wide soil data set are presented. The results have been compared with those obtained on a groundwater data set, whose samples were collected together with the soil samples, within the project "Improvement of the Regional Agro-meteorological Monitoring Network (2004-2007)". LDA, applied to the soil data, made it possible to distinguish the geographical origin of a sample between two macroareas, Bari and Foggia provinces vs Brindisi, Lecce and Taranto provinces, with a percentage of correct predictions in cross validation of 87%. In the case of the groundwater data set, the best classification was obtained when the samples were grouped into three macroareas: Foggia province, Bari province, and Brindisi, Lecce and Taranto provinces, reaching a percentage of correct predictions in cross validation of 84%. The obtained information can be very useful in supporting soil and water resource management, such as reducing water consumption and reducing energy and chemical (nutrient and pesticide) inputs in agriculture.
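
    The kind of analysis described above can be sketched with scikit-learn: LDA with cross-validated percent-correct classification. The two-group data below are synthetic stand-ins for the soil variables, not the study's data:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Two synthetic "macroareas", 60 samples each, over 5 stand-in soil variables
X = np.vstack([rng.normal(0.0, 1.0, size=(60, 5)),
               rng.normal(1.5, 1.0, size=(60, 5))])
y = np.repeat([0, 1], 60)

scores = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5)
pct_correct = 100.0 * scores.mean()   # analogous to the paper's percent correct in CV
```

    The cross-validated accuracy plays the same role as the 87% and 84% figures reported above: it estimates how well the discriminant rule generalizes to unseen samples.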

  19. [Gaussian process regression and its application in near-infrared spectroscopy analysis].

    PubMed

    Feng, Ai-Ming; Fang, Li-Min; Lin, Min

    2011-06-01

    Gaussian process (GP) regression is applied in the present paper as a chemometric method to explore the complicated relationship between near infrared (NIR) spectra and ingredients. After the outliers were detected by the Monte Carlo cross validation (MCCV) method and removed from the dataset, different preprocessing methods, such as multiplicative scatter correction (MSC), smoothing and derivatives, were tried for the best performance of the models. Furthermore, uninformative variable elimination (UVE) was introduced as a variable selection technique and the characteristic wavelengths obtained were further employed as input for modeling. A public dataset with 80 NIR spectra of corn was introduced as an example for evaluating the new algorithm. The optimal models for oil, starch and protein were obtained by the GP regression method. The performance of the final models was evaluated according to the root mean square error of calibration (RMSEC), root mean square error of cross-validation (RMSECV), root mean square error of prediction (RMSEP) and correlation coefficient (r). The models show good calibration ability with r values above 0.99 and the prediction ability is also satisfactory with r values higher than 0.96. The overall results demonstrate that the GP algorithm is an effective chemometric method and is promising for NIR analysis.
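
    GP regression with RMSECV and r evaluation can be illustrated with scikit-learn on a synthetic one-variable example; the corn NIR data and the exact kernel are not specified in the abstract, so both are assumptions here:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(1)
X = rng.uniform(0, 10, size=(80, 1))                  # stand-in for 80 spectra
y = np.sin(X).ravel() + rng.normal(0, 0.05, size=80)  # stand-in ingredient values

kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=1e-3)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
y_cv = cross_val_predict(gp, X, y, cv=5)              # held-out fold predictions
rmsecv = np.sqrt(np.mean((y - y_cv) ** 2))            # root mean square error of CV
r = np.corrcoef(y, y_cv)[0, 1]                        # correlation coefficient
```

    RMSECV and r computed on the cross-validated predictions mirror the figures of merit the abstract reports for the final models.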

  20. Reliable Digit Span: A Systematic Review and Cross-Validation Study

    ERIC Educational Resources Information Center

    Schroeder, Ryan W.; Twumasi-Ankrah, Philip; Baade, Lyle E.; Marshall, Paul S.

    2012-01-01

    Reliable Digit Span (RDS) is a heavily researched symptom validity test with a recent literature review yielding more than 20 studies ranging in dates from 1994 to 2011. Unfortunately, limitations within some of the research minimize clinical generalizability. This systematic review and cross-validation study was conducted to address these…

  1. Validation of Medical Tourism Service Quality Questionnaire (MTSQQ) for Iranian Hospitals.

    PubMed

    Qolipour, Mohammad; Torabipour, Amin; Khiavi, Farzad Faraji; Malehi, Amal Saki

    2017-03-01

    Assessing service quality is one of the basic requirements for developing the medical tourism industry. There is no valid and reliable tool to measure the service quality of medical tourism. This study aimed to determine the reliability and validity of a Persian version of the medical tourism service quality questionnaire for Iranian hospitals. To validate the medical tourism service quality questionnaire (MTSQQ), a cross-sectional study was conducted on 250 Iraqi patients referred to hospitals in Ahvaz (Iran) in 2015. To design the questionnaire and determine its content validity, the Delphi technique (3 rounds) with the participation of 20 medical tourism experts was used. Construct validity of the questionnaire was assessed through exploratory and confirmatory factor analysis. Reliability was assessed using Cronbach's alpha coefficient. Data were analyzed by Excel 2007, SPSS version 18, and LISREL 8.0 software. The content validity of the questionnaire with CVI=0.775 was confirmed. According to exploratory factor analysis, the MTSQQ included 31 items and 8 dimensions (tangibility, reliability, responsiveness, assurance, empathy, exchange and travel facilities, technical and infrastructure facilities, and safety and security). Construct validity of the questionnaire was confirmed, based on the goodness-of-fit indices of the model (RMSEA=0.032, CFI=0.98, GFI=0.88). Cronbach's alpha coefficient was 0.837 and 0.919 for the expectation and perception questionnaires, respectively. The results of the study showed that the medical tourism SERVQUAL questionnaire with 31 items and 8 dimensions was a valid and reliable tool to measure the service quality of medical tourism in Iranian hospitals.
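
    Cronbach's alpha, used above to assess reliability, has a closed form that is easy to compute directly; the score matrix below is invented for illustration:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# 4 hypothetical respondents x 3 items on a 1-5 scale
scores = np.array([[4, 5, 4], [2, 3, 2], [5, 5, 4], [3, 4, 3]])
alpha = cronbach_alpha(scores)   # approx. 0.975 for this matrix
```

    Values near 0.84 and 0.92, as reported above, are conventionally read as good to excellent internal consistency.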

  2. Determination of free Zn2+ concentration in synthetic and natural samples with AGNES (Absence of Gradients and Nernstian Equilibrium Stripping) and DMT (Donnan Membrane Technique).

    PubMed

    Chito, Diana; Weng, Liping; Galceran, Josep; Companys, Encarnació; Puy, Jaume; van Riemsdijk, Willem H; van Leeuwen, Herman P

    2012-04-01

    The determination of free Zn(2+) ion concentration is key to the study of environmental systems like river water and soils, due to its impact on bioavailability and toxicity. AGNES (Absence of Gradients and Nernstian Equilibrium Stripping) and DMT (Donnan Membrane Technique) are emerging techniques suited for the determination of free heavy metal concentrations, especially in the case of Zn(2+), for which there is no commercial ion-selective electrode. In this work, both techniques have been applied to synthetic samples (containing Zn and NTA) and natural samples (Rhine river water and soils), showing good agreement. pH fluctuations in DMT and the N(2)/CO(2) purging system used in AGNES did not considerably affect the measurements in Rhine river water and soil samples. Results of in situ DMT measurements of Rhine river water are comparable to those of AGNES in the lab. The comparison performed in this work provides a cross-validation of both techniques. Copyright © 2012 Elsevier B.V. All rights reserved.

  3. Technique for measurement of characteristic impedance and propagation constant for porous materials

    NASA Astrophysics Data System (ADS)

    Jung, Ki Won; Atchley, Anthony A.

    2005-09-01

    Knowledge of acoustic properties such as characteristic impedance and complex propagation constant is useful to characterize the acoustic behavior of porous materials. Song and Bolton's four-microphone method [J. Acoust. Soc. Am. 107, 1131-1152 (2000)] is one of the most widely employed techniques. In this method two microphones are used to determine the complex pressure amplitudes on each side of a sample. Muehleisen and Beamer [J. Acoust. Soc. Am. 117, 536-544 (2005)] improved upon the four-microphone method by interchanging microphones to reduce errors due to uncertainties in microphone response. In this paper, a multiple microphone technique is investigated to reconstruct the pressure field inside an impedance tube. Measurements of the acoustic properties of a material having square cross-section pores are used to check the validity of the technique. The values of characteristic impedance and complex propagation constant extracted from the reconstruction agree well with predicted values. Furthermore, this technique is used in investigating the acoustic properties of reticulated vitreous carbon (RVC) in the range of 250-1100 Hz.

  4. A method for the measurement of dispersion curves of circumferential guided waves radiating from curved shells: experimental validation and application to a femoral neck mimicking phantom

    NASA Astrophysics Data System (ADS)

    Nauleau, Pierre; Minonzio, Jean-Gabriel; Chekroun, Mathieu; Cassereau, Didier; Laugier, Pascal; Prada, Claire; Grimal, Quentin

    2016-07-01

    Our long-term goal is to develop an ultrasonic method to characterize the thickness, stiffness and porosity of the cortical shell of the femoral neck, which could enhance hip fracture risk prediction. To this purpose, we proposed to adapt a technique based on the measurement of guided waves. We previously evidenced the feasibility of measuring circumferential guided waves in a bone-mimicking phantom of a circular cross-section of even thickness. The goal of this study is to investigate the impact of the complex geometry of the femoral neck on the measurement of guided waves. Two phantoms of an elliptical cross-section and one phantom of a realistic cross-section were investigated. A 128-element array was used to record the inter-element response matrix of these waveguides. This experiment was simulated using a custom-made hybrid code. The response matrices were analyzed using a technique based on the physics of wave propagation. This method yields portions of dispersion curves of the waveguides which were compared to reference dispersion curves. For the elliptical phantoms, three portions of dispersion curves were determined with a good agreement between experiment, simulation and theory. The method was thus validated. The characteristic dimensions of the shell were found to influence the identification of the circumferential wave signals. The method was then applied to the signals backscattered by the superior half of constant thickness of the realistic phantom. A cut-off frequency and some portions of modes were measured, with a good agreement with the theoretical curves of a plate waveguide. We also observed that the method cannot be applied directly to the signals backscattered by the lower half of varying thicknesses of the phantom. The proposed approach could then be considered to evaluate the properties of the superior part of the femoral neck, which is known to be a clinically relevant site.

  5. Validity of clinical outcome measures to evaluate ankle range of motion during the weight-bearing lunge test.

    PubMed

    Hall, Emily A; Docherty, Carrie L

    2017-07-01

    To determine the concurrent validity of standard clinical outcome measures compared with a laboratory outcome measure during the weight-bearing lunge test (WBLT). Cross-sectional study. Fifty participants performed the WBLT to determine dorsiflexion ROM using four different measurement techniques: dorsiflexion angle with a digital inclinometer at 15cm distal to the tibial tuberosity (°), dorsiflexion angle with the inclinometer at the tibial tuberosity (°), maximum lunge distance (cm), and dorsiflexion angle using a 2D motion capture system (°). Outcome measures were recorded concurrently during each trial. To establish concurrent validity, Pearson product-moment correlation coefficients (r) were computed, comparing each dependent variable to the 2D motion capture analysis (identified as the reference standard). A higher correlation indicates stronger concurrent validity. There was a high correlation between each measurement technique and the reference standard. Specifically, the correlation between the inclinometer placement at 15cm below the tibial tuberosity (44.9°±5.5°) and the motion capture angle (27.0°±6.0°) was r=0.76 (p=0.001), between the inclinometer placement at the tibial tuberosity (39.0°±4.6°) and the motion capture angle was r=0.71 (p=0.001), and between the distance-from-the-wall clinical measure (10.3±3.0cm) and the motion capture angle was r=0.74 (p=0.001). This study determined that the clinical measures used during the WBLT have a high correlation with the reference standard for assessing dorsiflexion range of motion. Therefore, obtaining maximum lunge distance and inclinometer angles are both valid assessments during the weight-bearing lunge test. Copyright © 2016 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.
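
    Concurrent validity here reduces to a Pearson product-moment correlation against the reference standard; a minimal sketch with hypothetical dorsiflexion data (not the study's measurements):

```python
import numpy as np

def concurrent_validity(clinical, reference):
    """Pearson product-moment correlation between a clinical measure and
    the reference standard (the 2D motion capture angle in the study)."""
    return np.corrcoef(clinical, reference)[0, 1]

# Hypothetical paired measurements in degrees, invented for illustration
inclinometer = np.array([40.2, 42.5, 44.9, 47.1, 50.3, 38.7, 45.5])
motion_capture = np.array([23.1, 25.0, 27.2, 28.9, 31.5, 21.8, 27.9])
r = concurrent_validity(inclinometer, motion_capture)
```

    Note that a high r (as in the study's 0.71-0.76 range) indicates the measures rank and scale participants consistently even though their absolute angles differ.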

  6. Characterizing the functional MRI response using Tikhonov regularization.

    PubMed

    Vakorin, Vasily A; Borowsky, Ron; Sarty, Gordon E

    2007-09-20

    The problem of evaluating an averaged functional magnetic resonance imaging (fMRI) response for repeated block design experiments was considered within a semiparametric regression model with autocorrelated residuals. We applied functional data analysis (FDA) techniques that use a least-squares fitting of B-spline expansions with Tikhonov regularization. To deal with the noise autocorrelation, we proposed a regularization parameter selection method based on the idea of combining temporal smoothing with residual whitening. A criterion based on a generalized χ²-test of the residuals for white noise was compared with a generalized cross-validation scheme. We evaluated and compared the performance of the two criteria, based on their effect on the quality of the fMRI response. We found that the regularization parameter can be tuned to improve the noise autocorrelation structure, but the whitening criterion provides too much smoothing when compared with the cross-validation criterion. The ultimate goal of the proposed smoothing techniques is to facilitate the extraction of temporal features in the hemodynamic response for further analysis. In particular, these FDA methods allow us to compute derivatives and integrals of the fMRI signal so that fMRI data may be correlated with behavioral and physiological models. For example, positive and negative hemodynamic responses may be easily and robustly identified on the basis of the first derivative at an early time point in the response. Ultimately, these methods allow us to verify previously reported correlations between the hemodynamic response and the behavioral measures of accuracy and reaction time, showing the potential to recover new information from fMRI data. Copyright © 2007 John Wiley & Sons, Ltd.
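
    Choosing a Tikhonov regularization parameter by generalized cross-validation can be illustrated with a discrete smoother. The sketch below uses a Whittaker-type second-difference penalty, one common instance of Tikhonov regularization, rather than the paper's B-spline formulation:

```python
import numpy as np

def whittaker_gcv(y, lambdas):
    """Tikhonov-regularized smoothing x = (I + lam*D'D)^-1 y with the
    regularization parameter chosen by generalized cross-validation:
    GCV(lam) = n * ||y - H y||^2 / (n - tr(H))^2, minimized over a grid."""
    n = len(y)
    D = np.diff(np.eye(n), 2, axis=0)                # second-difference operator
    best = None
    for lam in lambdas:
        H = np.linalg.inv(np.eye(n) + lam * D.T @ D)  # smoother ("hat") matrix
        resid = y - H @ y
        gcv = (n * resid @ resid) / (n - np.trace(H)) ** 2
        if best is None or gcv < best[0]:
            best = (gcv, lam, H @ y)
    return best[1], best[2]

rng = np.random.default_rng(0)
x = np.linspace(0, 2 * np.pi, 100)
y = np.sin(x) + rng.normal(0, 0.3, size=100)          # noisy "response"
lam, smoothed = whittaker_gcv(y, [0.1, 1.0, 10.0, 100.0])
```

    GCV approximates leave-one-out error without refitting, which is why it is a standard alternative to residual-whitening criteria for tuning the smoothing level.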

  7. Automated detection of a preseizure state based on a decrease in synchronization in intracranial electroencephalogram recordings from epilepsy patients

    NASA Astrophysics Data System (ADS)

    Mormann, Florian; Andrzejak, Ralph G.; Kreuz, Thomas; Rieke, Christoph; David, Peter; Elger, Christian E.; Lehnertz, Klaus

    2003-02-01

    The question whether information extracted from the electroencephalogram (EEG) of epilepsy patients can be used for the prediction of seizures has recently attracted much attention. Several studies have reported evidence for the existence of a preseizure state that can be detected using different measures derived from the theory of dynamical systems. Most of these studies, however, have neglected to sufficiently investigate the specificity of the observed effects or suffer from other methodological shortcomings. In this paper we present an automated technique for the detection of a preseizure state from EEG recordings using two different measures for synchronization between recording sites, namely, the mean phase coherence as a measure for phase synchronization and the maximum linear cross correlation as a measure for lag synchronization. Based on the observation of characteristic drops in synchronization prior to seizure onset, we used this phenomenon for the characterization of a preseizure state and its distinction from the remaining seizure-free interval. After optimizing our technique on a group of 10 patients with temporal lobe epilepsy we obtained a successful detection of a preseizure state prior to 12 out of 14 analyzed seizures for both measures at a very high specificity as tested on recordings from the seizure-free interval. After checking for in-sample overtraining via cross validation, we applied a surrogate test to validate the observed predictability. Based on our results, we discuss the differences of the two synchronization measures in terms of the dynamics underlying seizure generation in focal epilepsies.
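
    The mean phase coherence used above can be computed from instantaneous phases obtained via the analytic signal; a minimal NumPy sketch (EEG preprocessing, filtering, and windowing details are omitted):

```python
import numpy as np

def analytic_signal(x):
    """FFT-based analytic signal (equivalent to a Hilbert-transform construction)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

def mean_phase_coherence(x, y):
    """R = |mean(exp(i*(phi_x - phi_y)))|: 1 = perfect phase locking, ~0 = none."""
    dphi = np.angle(analytic_signal(x)) - np.angle(analytic_signal(y))
    return np.abs(np.mean(np.exp(1j * dphi)))

# Two sinusoids with a constant phase offset are (almost) perfectly phase-locked
t = np.linspace(0, 20 * np.pi, 2048)
R_locked = mean_phase_coherence(np.sin(t), np.sin(t + 0.5))
```

    A characteristic preseizure drop in R between channel pairs is the phenomenon the authors exploit for detection.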

  8. Predicting discharge mortality after acute ischemic stroke using balanced data.

    PubMed

    Ho, King Chung; Speier, William; El-Saden, Suzie; Liebeskind, David S; Saver, Jeffery L; Bui, Alex A T; Arnold, Corey W

    2014-01-01

    Several models have been developed to predict stroke outcomes (e.g., stroke mortality, patient dependence, etc.) in recent decades. However, there is little discussion regarding the problem of between-class imbalance in stroke datasets, which leads to prediction bias and decreased performance. In this paper, we demonstrate the use of the Synthetic Minority Over-sampling Technique to overcome such problems. We also compare state-of-the-art machine learning methods and construct a six-variable support vector machine (SVM) model to predict stroke mortality at discharge. Finally, we discuss how the identification of a reduced feature set allowed us to identify additional cases in our research database for validation testing. Our classifier achieved a c-statistic of 0.865 on the cross-validated dataset, demonstrating good classification performance using a reduced set of variables.
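
    The Synthetic Minority Over-sampling Technique creates new minority-class samples by interpolating between nearest neighbours; a minimal SMOTE-style sketch in NumPy (a hypothetical helper, not the reference implementation in the imbalanced-learn library):

```python
import numpy as np

def smote_like(X_min, n_new, k=5, rng=None):
    """Minimal SMOTE-style sketch: synthesize minority samples by
    interpolating between a random minority sample and one of its
    k nearest neighbours."""
    rng = np.random.default_rng(rng)
    X_min = np.asarray(X_min, dtype=float)
    synth = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))
        dists = np.linalg.norm(X_min - X_min[i], axis=1)
        neighbours = np.argsort(dists)[1:k + 1]   # skip the point itself
        j = rng.choice(neighbours)
        # random point on the segment between the sample and its neighbour
        synth.append(X_min[i] + rng.random() * (X_min[j] - X_min[i]))
    return np.array(synth)

# Ten hypothetical minority-class points in a 2-D feature space
X_min = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0],
                  [0.5, 0.5], [0.2, 0.8], [0.8, 0.2], [0.3, 0.3],
                  [0.7, 0.7], [0.1, 0.6]])
X_synth = smote_like(X_min, n_new=20, k=3, rng=0)
```

    Balancing the classes this way, rather than duplicating minority cases, gives the downstream SVM a smoother minority-class region to learn from.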

  9. Comparative study between derivative spectrophotometry and multivariate calibration as analytical tools applied for the simultaneous quantitation of Amlodipine, Valsartan and Hydrochlorothiazide

    NASA Astrophysics Data System (ADS)

    Darwish, Hany W.; Hassan, Said A.; Salem, Maissa Y.; El-Zeany, Badr A.

    2013-09-01

    Four simple, accurate and specific methods were developed and validated for the simultaneous estimation of Amlodipine (AML), Valsartan (VAL) and Hydrochlorothiazide (HCT) in commercial tablets. The derivative spectrophotometric methods include the Derivative Ratio Zero Crossing (DRZC) and Double Divisor Ratio Spectra-Derivative Spectrophotometry (DDRS-DS) methods, while the multivariate calibrations used are Principal Component Regression (PCR) and Partial Least Squares (PLS). The proposed methods were applied successfully in the determination of the drugs in laboratory-prepared mixtures and in commercial pharmaceutical preparations. The validity of the proposed methods was assessed using the standard addition technique. The linearity of the proposed methods was investigated in the ranges of 2-32, 4-44 and 2-20 μg/mL for AML, VAL and HCT, respectively.

  10. Novel Breast Imaging and Machine Learning: Predicting Breast Lesion Malignancy at Cone-Beam CT Using Machine Learning Techniques.

    PubMed

    Uhlig, Johannes; Uhlig, Annemarie; Kunze, Meike; Beissbarth, Tim; Fischer, Uwe; Lotz, Joachim; Wienbeck, Susanne

    2018-05-24

    The purpose of this study is to evaluate the diagnostic performance of machine learning techniques for malignancy prediction at breast cone-beam CT (CBCT) and to compare them to human readers. Five machine learning techniques, including random forests, back propagation neural networks (BPN), extreme learning machines, support vector machines, and K-nearest neighbors, were used to train diagnostic models on a clinical breast CBCT dataset with internal validation by repeated 10-fold cross-validation. Two independent blinded human readers with profound experience in breast imaging and breast CBCT analyzed the same CBCT dataset. Diagnostic performance was compared using AUC, sensitivity, and specificity. The clinical dataset comprised 35 patients (American College of Radiology density type C and D breasts) with 81 suspicious breast lesions examined with contrast-enhanced breast CBCT. Forty-five lesions were histopathologically proven to be malignant. Among the machine learning techniques, BPNs provided the best diagnostic performance, with AUC of 0.91, sensitivity of 0.85, and specificity of 0.82. The diagnostic performance of the human readers was AUC of 0.84, sensitivity of 0.89, and specificity of 0.72 for reader 1 and AUC of 0.72, sensitivity of 0.71, and specificity of 0.67 for reader 2. AUC was significantly higher for BPN when compared with both reader 1 (p = 0.01) and reader 2 (p < 0.001). Machine learning techniques provide a high and robust diagnostic performance in the prediction of malignancy in breast lesions identified at CBCT. BPNs showed the best diagnostic performance, surpassing human readers in terms of AUC and specificity.
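
    The AUC values compared above are equivalent to the Mann-Whitney probability that a randomly chosen positive case is scored higher than a randomly chosen negative one, which can be computed directly:

```python
import numpy as np

def auc(scores, labels):
    """AUC via the Mann-Whitney statistic: the probability that a randomly
    chosen positive case is scored above a randomly chosen negative case
    (ties contribute 0.5)."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels)
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

print(auc([0.9, 0.8, 0.3, 0.1], [1, 1, 0, 0]))  # perfect separation -> 1.0
```

    Unlike sensitivity and specificity, AUC is threshold-free, which is why the study reports it alongside the fixed-threshold measures when ranking the classifiers against the human readers.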

  11. LANDSAT-D MSS/TM tuned orbital jitter analysis model LDS900

    NASA Technical Reports Server (NTRS)

    Pollak, T. E.

    1981-01-01

    The final LANDSAT-D orbital dynamic math model (LSD900), composed of all test-validated substructures, was used to evaluate the jitter response of the MSS/TM experiments. A dynamic forced response analysis was performed at both the MSS and TM locations on all structural modes considered (through 200 Hz). The analysis determined the roll angular response of the MSS/TM experiments to impulsive excitation generated by component operation. Cross-axis and cross-experiment responses were also calculated. The excitations were analytically represented by seven- and nine-term Fourier series approximations, for the MSS and TM experiments respectively, which enabled linear harmonic solution techniques to be applied to the response calculations. Single worst-case jitter was estimated by variations of the eigenvalue spectrum of model LSD900. The probability of any worst-case mode occurrence was investigated.

  12. Parametric bicubic spline and CAD tools for complex targets shape modelling in physical optics radar cross section prediction

    NASA Astrophysics Data System (ADS)

    Delogu, A.; Furini, F.

    1991-09-01

    Increasing interest in radar cross section (RCS) reduction is placing new demands on theoretical, computational, and graphical techniques for calculating the scattering properties of complex targets. In particular, computer codes capable of predicting the RCS of an entire aircraft at high frequency and of achieving RCS control with modest structural changes are becoming of paramount importance in stealth design. A computer code that evaluates the RCS of arbitrarily shaped metallic objects generated by computer-aided design (CAD), and its validation against measurements carried out using ALENIA RCS test facilities, are presented. The code, based on the physical optics method, is characterized by an efficient integration algorithm with error control, in order to contain the computation time within acceptable limits, and by an accurate parametric representation of the target surface in terms of bicubic splines.

  13. Multivariate Bias Correction Procedures for Improving Water Quality Predictions from the SWAT Model

    NASA Astrophysics Data System (ADS)

    Arumugam, S.; Libera, D.

    2017-12-01

    Water quality observations are usually not available on a continuous basis for longer than 1-2 years at a time over a decadal period, given the labor requirements, which makes calibrating and validating mechanistic models difficult. Further, any physical model's predictions inherently have bias (i.e., under/over estimation) and require post-simulation techniques to preserve the long-term mean monthly attributes. This study suggests a multivariate bias-correction technique and compares it with a common technique for improving the performance of the SWAT model in predicting daily streamflow and TN loads across the Southeast based on split-sample validation. The approach is a dimension-reduction technique, canonical correlation analysis (CCA), that regresses the observed multivariate attributes on the SWAT-simulated values. The common approach is a regression-based technique that uses ordinary least squares to adjust model values. The observed cross-correlation between loadings and streamflow is better preserved when using canonical correlation, while individual biases are simultaneously reduced. Additionally, canonical correlation analysis does a better job of preserving the observed joint likelihood of streamflow and loadings. These procedures were applied to 3 watersheds chosen from the Water Quality Network in the Southeast Region; specifically, watersheds with sufficiently large drainage areas and numbers of observed data points. The performance of the two approaches is compared for the observed period and over a multi-decadal period using loading estimates from the USGS LOADEST model. Lastly, the CCA technique is applied in a forecasting sense by using 1-month-ahead forecasts of P & T from ECHAM4.5 as forcings in the SWAT model. Skill in using the SWAT model for forecasting loadings and streamflow at the monthly and seasonal timescales is also discussed.

  14. Hospital survey on patient safety culture: psychometric analysis on a Scottish sample.

    PubMed

    Sarac, Cakil; Flin, Rhona; Mearns, Kathryn; Jackson, Jeanette

    2011-10-01

    To investigate the psychometric properties of the Hospital Survey on Patient Safety Culture on a Scottish NHS data set. The data were collected from 1969 clinical staff (estimated 22% response rate) from one acute hospital from each of seven Scottish Health boards. Using a split-half validation technique, the data were randomly split; an exploratory factor analysis was conducted on the calibration data set, and confirmatory factor analyses were conducted on the validation data set to investigate and check the original US model fit in a Scottish sample. Following the split-half validation technique, exploratory factor analysis results showed a 10-factor optimal measurement model. The confirmatory factor analyses were then performed to compare the model fit of two competing models (10-factor alternative model vs 12-factor original model). A Satorra-Bentler scaled χ² difference test demonstrated that the original 12-factor model performed significantly better in the Scottish sample. Furthermore, reliability analyses of each component yielded satisfactory results. The mean scores on the climate dimensions in the Scottish sample were comparable with those found in other European countries. This study provided evidence that the original 12-factor structure of the Hospital Survey on Patient Safety Culture scale has been replicated in this Scottish sample. Therefore, no modifications are required to the original 12-factor model, which is suggested for use, since it would allow researchers the possibility of cross-national comparisons.

  15. Screening for Psychosocial Distress amongst War-Affected Children: Cross-Cultural Construct Validity of the CPDS

    ERIC Educational Resources Information Center

    Jordans, M. J. D.; Komproe, I. H.; Tol, W. A.; De Jong, J. T. V. M.

    2009-01-01

    Background: Large-scale psychosocial interventions in complex emergencies call for a screening procedure to identify individuals at risk. To date there are no screening instruments that are developed within low- and middle-income countries and validated for that purpose. The present study assesses the cross-cultural validity of the brief,…

  16. A technique for inferring zonal irregularity drift from single-station GNSS measurements of intensity (S4) and phase (σφ) scintillations

    NASA Astrophysics Data System (ADS)

    Carrano, Charles S.; Groves, Keith M.; Rino, Charles L.; Doherty, Patricia H.

    2016-08-01

    The zonal drift of ionospheric irregularities at low latitudes is most commonly measured by cross-correlating observations of a scintillating satellite signal made with a pair of closely spaced antennas. The Air Force Research Laboratory-Scintillation Network Decision Aid (AFRL-SCINDA) network operates a small number of very high frequency (VHF) spaced-receiver systems at low latitudes for this purpose. A far greater number of Global Navigation Satellite System (GNSS) scintillation monitors are operated by the AFRL-SCINDA network (25-30) and the Low-Latitude Ionospheric Sensor Network (35-50), but the receivers are too widely separated from each other for cross-correlation techniques to be effective. In this paper, we present an alternative approach that leverages the weak scatter scintillation theory to infer the zonal irregularity drift from single-station GNSS measurements of S4, σφ, and the propagation geometry. Unlike the spaced-receiver technique, this approach requires assumptions regarding the height of the scattering layer (which introduces a bias in the drift estimates) and the spectral index of the irregularities (which affects the spread of the drift estimates about the mean). Nevertheless, theory and experiment suggest that the ratio of σφ to S4 is less sensitive to these parameters than it is to the zonal drift. We validate the technique using VHF spaced-receiver measurements of zonal irregularity drift obtained from the AFRL-SCINDA network. While the spaced-receiver technique remains the preferred way to monitor the drift when closely spaced antenna pairs are available, our technique provides a new opportunity to monitor zonal irregularity drift using regional or global networks of widely separated GNSS scintillation monitors.

  17. Comparison between genetic parameters of cheese yield and nutrient recovery or whey loss traits measured from individual model cheese-making methods or predicted from unprocessed bovine milk samples using Fourier-transform infrared spectroscopy.

    PubMed

    Bittante, G; Ferragina, A; Cipolat-Gotet, C; Cecchinato, A

    2014-10-01

Cheese yield is an important technological trait in the dairy industry. The aim of this study was to infer the genetic parameters of some cheese yield-related traits predicted using Fourier-transform infrared (FTIR) spectral analysis and compare the results with those obtained using an individual model cheese-producing procedure. A total of 1,264 model cheeses were produced using 1,500-mL milk samples collected from individual Brown Swiss cows, and individual measurements were taken for 10 traits: 3 cheese yield traits (fresh curd, curd total solids, and curd water as a percent of the weight of the processed milk), 4 milk nutrient recovery traits (fat, protein, total solids, and energy of the curd as a percent of the same nutrient in the processed milk), and 3 daily cheese production traits per cow (fresh curd, total solids, and water weight of the curd). Each unprocessed milk sample was analyzed using a MilkoScan FT6000 (Foss, Hillerød, Denmark) over the spectral range from 5,000 to 900 cm(-1). The FTIR spectrum-based prediction models for the previously mentioned traits were developed using modified partial least-square regression. Cross-validation of the whole data set yielded coefficients of determination between the predicted and measured values of 0.65 to 0.95 for all traits, except for the recovery of fat (0.41). A 3-fold external validation was also used, in which the available data were partitioned into 2 subsets: a training set (one-third of the herds) and a testing set (two-thirds). The training set was used to develop calibration equations, whereas the testing subsets were used for external validation of the calibration equations and to estimate the heritabilities and genetic correlations of the measured and FTIR-predicted phenotypes. 
The coefficients of determination between the predicted and measured values obtained from cross-validation of the training sets were very similar to those obtained from the whole data set, but the coefficients of determination obtained from the external validation sets were much lower for all traits (0.30 to 0.73), and particularly for fat recovery (0.05 to 0.18). For each testing subset, the (co)variance components for the measured and FTIR-predicted phenotypes were estimated using bivariate Bayesian analyses and linear models. The intraherd heritabilities for the predicted traits obtained from our internal cross-validation using the whole data set ranged from 0.085 for daily yield of curd solids to 0.576 for protein recovery, and were similar to those obtained from the measured traits (0.079 to 0.586, respectively). The heritabilities estimated from the testing data set used for external validation were more variable but similar (on average) to the corresponding values obtained from the whole data set. Moreover, the genetic correlations between the predicted and measured traits were high in general (0.791 to 0.996), and they were always higher than the corresponding phenotypic correlations (0.383 to 0.995), especially for the external validation subset. In conclusion, we herein report that application of the cross-validation technique to the whole data set tended to overestimate the predictive ability of FTIR spectra, to give more precise phenotypic predictions than the calibrations obtained using smaller data sets, and to yield genetic correlations similar to those obtained from the measured traits. Collectively, our findings indicate that FTIR predictions have the potential to be used as indicator traits for the rapid and inexpensive selection of dairy populations for improvement of cheese yield, milk nutrient recovery in curd, and daily cheese production per cow. 
Copyright © 2014 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
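
    The herd-based external validation described above can be sketched in a few lines. In this illustrative snippet the data, the linear "calibration," and the herd effect are hypothetical stand-ins for the FTIR prediction models; the point is only that calibrating on one-third of the herds and validating on the held-out two-thirds gives a more honest (lower) R2 than evaluating on the calibration herds themselves:

```python
import random

random.seed(0)
# Hypothetical records: (herd_id, spectral_feature, measured_trait); all values illustrative.
records = []
for herd in range(9):
    for _ in range(20):
        x = random.uniform(0, 10)
        records.append((herd, x, 2.0 * x + 0.5 * herd + random.gauss(0, 1.0)))

def fit_line(data):
    """Ordinary least squares for y = a*x + b."""
    n = len(data)
    mx = sum(x for _, x, _ in data) / n
    my = sum(y for _, _, y in data) / n
    a = sum((x - mx) * (y - my) for _, x, y in data) / sum((x - mx) ** 2 for _, x, _ in data)
    return a, my - a * mx

def r_squared(model, data):
    a, b = model
    my = sum(y for _, _, y in data) / len(data)
    ss_res = sum((y - (a * x + b)) ** 2 for _, x, y in data)
    ss_tot = sum((y - my) ** 2 for _, _, y in data)
    return 1 - ss_res / ss_tot

# External validation: calibrate on one-third of the herds, validate on the
# remaining two-thirds, so no herd contributes to both calibration and validation.
herds = sorted({h for h, _, _ in records})
train = [r for r in records if r[0] in herds[:3]]
test = [r for r in records if r[0] not in herds[:3]]
model = fit_line(train)
r2_train = r_squared(model, train)
r2_test = r_squared(model, test)
print(f"R2 on calibration herds: {r2_train:.2f}, on held-out herds: {r2_test:.2f}")
```

    Because the herd effect is unmodeled, the held-out herds expose a loss of fit that pooled cross-validation would hide.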

  18. Genomic prediction using different estimation methodology, blending and cross-validation techniques for growth traits and visual scores in Hereford and Braford cattle.

    PubMed

    Campos, G S; Reimann, F A; Cardoso, L L; Ferreira, C E R; Junqueira, V S; Schmidt, P I; Braccini Neto, J; Yokoo, M J I; Sollero, B P; Boligon, A A; Cardoso, F F

    2018-05-07

The objective of the present study was to evaluate the accuracy and bias of direct and blended genomic predictions using different methods and cross-validation techniques for growth traits (weight and weight gains) and visual scores (conformation, precocity, muscling and size) obtained at weaning and at yearling in Hereford and Braford breeds. Phenotypic data contained 126,290 animals belonging to the Delta G Connection genetic improvement program, and a set of 3,545 animals genotyped with the 50K chip and 131 sires with the 777K. After quality control, 41,045 markers remained for all animals. An animal model was used to estimate (co)variance components and to predict breeding values, which were later used to calculate the deregressed estimated breeding values (DEBV). Animals with genotype and phenotype for the traits studied were divided into four or five groups by random and k-means clustering cross-validation strategies. The accuracies of the direct genomic values (DGV) were of moderate to high magnitude for traits measured at weaning and at yearling, ranging from 0.19 to 0.45 for k-means and 0.23 to 0.78 for random clustering among all traits. The greatest gain in relation to the pedigree BLUP (PBLUP) was 9.5% with the BayesB method with both the k-means and the random clustering. Blended genomic value accuracies ranged from 0.19 to 0.56 for k-means and from 0.21 to 0.82 for random clustering. The analyses using the historical pedigree and phenotypes contributed additional information to the calculation of the GEBV, and, in general, the largest gains were for the single-step (ssGBLUP) method in bivariate analyses, with a mean increase of 43.00% among all traits measured at weaning and of 46.27% for those evaluated at yearling. The accuracy values for the marker effects estimation methods were lower for k-means clustering, indicating that the training set's relationship to the selection candidates is a major factor affecting the accuracy of genomic predictions. 
The gains in accuracy obtained with genomic blending methods, mainly ssGBLUP in bivariate analyses, indicate that genomic predictions should be used as a tool to improve genetic gains in relation to the traditional PBLUP selection.
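
    The contrast between random and k-means clustering cross-validation can be illustrated with a simplified stand-in: instead of running k-means on a genomic relationship matrix, the sketch below groups animals by (hypothetical) sire family. Random folds split relatives across the training/validation boundary, inflating apparent accuracy; family-based folds keep relatives together, mimicking the stricter k-means scenario:

```python
import random
from collections import defaultdict

random.seed(1)
# Hypothetical animals keyed by sire family (10 half-sib families of 20).
animals = [{"id": i, "sire": i % 10} for i in range(200)]

# Random clustering: shuffle and deal animals into 5 folds regardless of pedigree.
shuffled = animals[:]
random.shuffle(shuffled)
random_folds = [shuffled[i::5] for i in range(5)]

# k-means-like clustering (simplified): assign whole families to folds, so close
# relatives never span the training/validation boundary.
families = defaultdict(list)
for a in animals:
    families[a["sire"]].append(a)
family_folds = [[] for _ in range(5)]
for k, fam in enumerate(families.values()):
    family_folds[k % 5].extend(fam)

def cross_fold_relatives(folds):
    """Count half-sib pairs split across different folds."""
    fold_of = {a["id"]: f for f, fold in enumerate(folds) for a in fold}
    pairs = 0
    for fam in families.values():
        for i in range(len(fam)):
            for j in range(i + 1, len(fam)):
                if fold_of[fam[i]["id"]] != fold_of[fam[j]["id"]]:
                    pairs += 1
    return pairs

print("split half-sib pairs, random folds:", cross_fold_relatives(random_folds))
print("split half-sib pairs, family folds:", cross_fold_relatives(family_folds))  # 0 by construction
```

    The family-based folds leave zero relative pairs spanning folds, which is why accuracies estimated that way are lower but closer to what selection candidates will actually experience.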

  19. Combined chamber-tower approach: Using eddy covariance measurements to cross-validate carbon fluxes modeled from manual chamber campaigns

    NASA Astrophysics Data System (ADS)

    Brümmer, C.; Moffat, A. M.; Huth, V.; Augustin, J.; Herbst, M.; Kutsch, W. L.

    2016-12-01

    Manual carbon dioxide flux measurements with closed chambers at scheduled campaigns are a versatile method to study management effects at small scales in multiple-plot experiments. The eddy covariance technique has the advantage of quasi-continuous measurements but requires large homogeneous areas of a few hectares. To evaluate the uncertainties associated with interpolating from individual campaigns to the whole vegetation period, we installed both techniques at an agricultural site in Northern Germany. The presented comparison covers two cropping seasons, winter oilseed rape in 2012/13 and winter wheat in 2013/14. Modeling half-hourly carbon fluxes from campaigns is commonly performed based on non-linear regressions for the light response and respiration. The daily averages of net CO2 modeled from chamber data deviated from eddy covariance measurements in the range of ± 5 g C m-2 day-1. To understand the observed differences and to disentangle the effects, we performed four additional setups (expert versus default settings of the non-linear regressions based algorithm, purely empirical modeling with artificial neural networks versus non-linear regressions, cross-validating using eddy covariance measurements as campaign fluxes, weekly versus monthly scheduling of campaigns) to model the half-hourly carbon fluxes for the whole vegetation period. The good agreement of the seasonal course of net CO2 at plot and field scale for our agricultural site demonstrates that both techniques are robust and yield consistent results at seasonal time scale even for a managed ecosystem with high temporal dynamics in the fluxes. This allows combining the respective advantages of factorial experiments at plot scale with dense time series data at field scale. Furthermore, the information from the quasi-continuous eddy covariance measurements can be used to derive vegetation proxies to support the interpolation of carbon fluxes in-between the manual chamber campaigns.
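
    The non-linear light-response regression used to interpolate chamber campaigns can be sketched as follows. The Michaelis-Menten-type model and all parameter values are illustrative, and a coarse grid search stands in for a proper non-linear least-squares solver:

```python
import random

random.seed(2)

# Rectangular-hyperbola light response: NEE = -(alpha*PAR*GPmax)/(alpha*PAR + GPmax) + Reco
def nee_model(par, alpha, gpmax, reco):
    return -(alpha * par * gpmax) / (alpha * par + gpmax) + reco

# Hypothetical campaign data; "true" parameters alpha=0.05, GPmax=20, Reco=5.
data = [(par, nee_model(par, 0.05, 20.0, 5.0) + random.gauss(0, 0.3))
        for par in range(0, 2001, 50)]

def sse(alpha, gpmax, reco):
    return sum((nee - nee_model(par, alpha, gpmax, reco)) ** 2 for par, nee in data)

# Coarse grid search over (alpha, GPmax, Reco) in place of a real NLS optimizer.
best = min(
    ((a / 1000, g, r / 2)
     for a in range(10, 101, 5)
     for g in range(5, 41, 5)
     for r in range(0, 21)),
    key=lambda p: sse(*p),
)
print("fitted (alpha, GPmax, Reco):", best)
```

    Once fitted per campaign, such a curve is driven with continuous PAR data to produce the half-hourly fluxes that the abstract compares against eddy covariance.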

  20. Cross-cultural validation of instruments measuring health beliefs about colorectal cancer screening among Korean Americans.

    PubMed

    Lee, Shin-Young; Lee, Eunice E

    2015-02-01

    The purpose of this study was to report the instrument modification and validation processes to make existing health belief model scales culturally appropriate for Korean Americans (KAs) regarding colorectal cancer (CRC) screening utilization. Instrument translation, individual interviews using cognitive interviewing, and expert reviews were conducted during the instrument modification phase, and a pilot test and a cross-sectional survey were conducted during the instrument validation phase. Data analyses of the cross-sectional survey included internal consistency and construct validity using exploratory and confirmatory factor analysis. The main issues identified during the instrument modification phase were (a) cultural and linguistic translation issues and (b) newly developed items reflecting Korean cultural barriers. Cross-sectional survey analyses during the instrument validation phase revealed that all scales demonstrate good internal consistency reliability (Cronbach's alpha=.72~.88). Exploratory factor analysis showed that susceptibility and severity loaded on the same factor, which may indicate a threat variable. Items with low factor loadings in the confirmatory factor analysis may relate to (a) lack of knowledge about fecal occult blood testing and (b) multiple dimensions of the subscales. Methodological, sequential processes of instrument modification and validation, including translation, individual interviews, expert reviews, pilot testing and a cross-sectional survey, were provided in this study. The findings indicate that existing instruments need to be examined for CRC screening research involving KAs.
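
    The internal consistency reliability reported above is Cronbach's alpha, which is straightforward to compute directly from item scores; the 4-item scale below is purely hypothetical:

```python
# Cronbach's alpha = (k/(k-1)) * (1 - sum(item variances) / variance(total score))
def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """items: one list of respondent scores per item (rows are items)."""
    k = len(items)
    totals = [sum(col[i] for col in items) for i in range(len(items[0]))]
    return k / (k - 1) * (1 - sum(variance(c) for c in items) / variance(totals))

# Hypothetical 4-item scale answered by 6 respondents.
scale = [
    [3, 4, 2, 5, 4, 3],
    [3, 5, 2, 4, 4, 2],
    [4, 4, 1, 5, 3, 3],
    [2, 4, 2, 5, 5, 3],
]
print(f"Cronbach's alpha = {cronbach_alpha(scale):.2f}")
```

    Values in the .72-.88 range reported for the modified scales indicate acceptable-to-good internal consistency by common rules of thumb.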

  1. Sediment transport patterns in the San Francisco Bay Coastal System from cross-validation of bedform asymmetry and modeled residual flux

    USGS Publications Warehouse

    Barnard, Patrick L.; Erikson, Li H.; Elias, Edwin P.L.; Dartnell, Peter

    2013-01-01

    The morphology of ~ 45,000 bedforms from 13 multibeam bathymetry surveys was used as a proxy for identifying net bedload sediment transport directions and pathways throughout the San Francisco Bay estuary and adjacent outer coast. The spatially-averaged shape asymmetry of the bedforms reveals distinct pathways of ebb and flood transport. Additionally, the region-wide, ebb-oriented asymmetry of 5% suggests net seaward-directed transport within the estuarine-coastal system, with significant seaward asymmetry at the mouth of San Francisco Bay (11%), through the northern reaches of the Bay (7-8%), and among the largest bedforms (21% for λ > 50 m). This general indication for the net transport of sand to the open coast strongly suggests that anthropogenic removal of sediment from the estuary, particularly along clearly defined seaward transport pathways, will limit the supply of sand to chronically eroding, open-coast beaches. The bedform asymmetry measurements significantly agree (up to ~ 76%) with modeled annual residual transport directions derived from a hydrodynamically-calibrated numerical model, and the orientation of adjacent, flow-sculpted seafloor features such as mega-flute structures, providing a comprehensive validation of the technique. The methods described in this paper to determine well-defined, cross-validated sediment transport pathways can be applied to estuarine-coastal systems globally where bedforms are present. The results can inform and improve regional sediment management practices to more efficiently utilize often limited sediment resources and mitigate current and future sediment supply-related impacts.
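
    One common definition of bedform asymmetry uses the horizontal lengths of the stoss (a) and lee (b) sides, A = (a - b)/(a + b), with the sign pointing in the transport direction; averaging it over many bedforms gives the kind of net-asymmetry percentage cited above. The measurements below are hypothetical:

```python
# Hypothetical (stoss, lee) lengths in metres for four bedforms.
bedforms = [(12.0, 8.0), (10.0, 9.0), (15.0, 11.0), (9.5, 10.5)]

# Asymmetry index A = (a - b) / (a + b); positive = net transport toward the lee side.
indices = [(a - b) / (a + b) for a, b in bedforms]
net = sum(indices) / len(indices)
print(f"mean asymmetry = {net:+.3f}")
```

    A spatially averaged positive value, as in the ~5% ebb-oriented asymmetry reported for the Bay, is read as net seaward bedload transport.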

  2. Sediment transport patterns in the San Francisco Bay Coastal System from cross-validation of bedform asymmetry and modeled residual flux

    USGS Publications Warehouse

    Barnard, Patrick L.; Erikson, Li H.; Elias, Edwin P.L.; Dartnell, Peter; Barnard, P.L.; Jaffee, B.E.; Schoellhamer, D.H.

    2013-01-01

    The morphology of ~ 45,000 bedforms from 13 multibeam bathymetry surveys was used as a proxy for identifying net bedload sediment transport directions and pathways throughout the San Francisco Bay estuary and adjacent outer coast. The spatially-averaged shape asymmetry of the bedforms reveals distinct pathways of ebb and flood transport. Additionally, the region-wide, ebb-oriented asymmetry of 5% suggests net seaward-directed transport within the estuarine-coastal system, with significant seaward asymmetry at the mouth of San Francisco Bay (11%), through the northern reaches of the Bay (7–8%), and among the largest bedforms (21% for λ > 50 m). This general indication for the net transport of sand to the open coast strongly suggests that anthropogenic removal of sediment from the estuary, particularly along clearly defined seaward transport pathways, will limit the supply of sand to chronically eroding, open-coast beaches. The bedform asymmetry measurements significantly agree (up to ~ 76%) with modeled annual residual transport directions derived from a hydrodynamically-calibrated numerical model, and the orientation of adjacent, flow-sculpted seafloor features such as mega-flute structures, providing a comprehensive validation of the technique. The methods described in this paper to determine well-defined, cross-validated sediment transport pathways can be applied to estuarine-coastal systems globally where bedforms are present. The results can inform and improve regional sediment management practices to more efficiently utilize often limited sediment resources and mitigate current and future sediment supply-related impacts.

  3. Validity of the Male Depression Risk Scale in a representative Canadian sample: sensitivity and specificity in identifying men with recent suicide attempt.

    PubMed

    Rice, Simon M; Ogrodniczuk, John S; Kealy, David; Seidler, Zac E; Dhillon, Haryana M; Oliffe, John L

    2017-12-22

Clinical practice and the literature have supported the existence of a phenotypic sub-type of depression in men. While a number of self-report rating scales have been developed in order to empirically test the male depression construct, psychometric validation of these scales is limited. The aims were to confirm the psychometric properties of the multidimensional Male Depression Risk Scale (MDRS-22) and to develop clinical cut-off scores for the MDRS-22. Data were obtained from an online sample of 1000 Canadian men (median age (M) = 49.63, standard deviation (SD) = 14.60). Confirmatory factor analysis (CFA) was used to replicate the established six-factor model of the MDRS-22. Psychometric values of the MDRS subscales were comparable to those of the widely used Patient Health Questionnaire-9 (PHQ-9). CFA model fit indices indicated adequate model fit for the six-factor MDRS-22 model. ROC curve analysis indicated the MDRS-22 was effective for identifying those with a recent (previous four weeks) suicide attempt (area under curve (AUC) = 0.837). The MDRS-22 cut-off identified proportionally more (84.62%) cases of recent suicide attempt relative to the PHQ-9 moderate range (53.85%). The MDRS-22 is the first male-sensitive depression scale to be psychometrically validated using CFA techniques in independent and cross-national samples. Additional studies should identify differential item functioning and evaluate cross-cultural effects.
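
    The AUC and cut-off machinery behind such ROC analyses is compact enough to show directly. The scale scores below are hypothetical; the AUC is computed via the Mann-Whitney probability that a randomly chosen case outscores a randomly chosen control, and the cut-off maximizes Youden's J:

```python
# Hypothetical screening-scale scores; "cases" had a recent suicide attempt.
cases = [22, 30, 27, 35, 18, 29]
controls = [10, 14, 9, 21, 12, 16, 8, 19]

# AUC as the Mann-Whitney statistic: P(score_case > score_control), ties count 1/2.
pairs = [(c, k) for c in cases for k in controls]
auc = sum(1.0 if c > k else 0.5 if c == k else 0.0 for c, k in pairs) / len(pairs)

# Youden's J = sensitivity + specificity - 1, maximized over observed cut-offs.
def sens_spec(cut):
    sens = sum(c >= cut for c in cases) / len(cases)
    spec = sum(k < cut for k in controls) / len(controls)
    return sens, spec

best_cut = max(sorted(set(cases + controls)), key=lambda t: sum(sens_spec(t)) - 1)
print(f"AUC = {auc:.3f}, best cut-off = {best_cut}")
```

    An AUC of 0.837, as reported for the MDRS-22, means a randomly chosen recent attempter outscores a randomly chosen non-attempter about 84% of the time.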

  4. Cross-Cultural Validation of the Five-Factor Structure of Social Goals: A Filipino Investigation

    ERIC Educational Resources Information Center

    King, Ronnel B.; Watkins, David A.

    2012-01-01

    The aim of the present study was to test the cross-cultural validity of the five-factor structure of social goals that Dowson and McInerney proposed. Using both between-network and within-network approaches to construct validation, 1,147 Filipino high school students participated in the study. Confirmatory factor analysis indicated that the…

  5. FIRE: an SPSS program for variable selection in multiple linear regression analysis via the relative importance of predictors.

    PubMed

    Lorenzo-Seva, Urbano; Ferrando, Pere J

    2011-03-01

    We provide an SPSS program that implements currently recommended techniques and recent developments for selecting variables in multiple linear regression analysis via the relative importance of predictors. The approach consists of: (1) optimally splitting the data for cross-validation, (2) selecting the final set of predictors to be retained in the equation regression, and (3) assessing the behavior of the chosen model using standard indices and procedures. The SPSS syntax, a short manual, and data files related to this article are available as supplemental materials from brm.psychonomic-journals.org/content/supplemental.
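
    The split-then-select workflow can be illustrated with a crude relative-importance proxy. The snippet below is not FIRE's algorithm: it simply ranks predictors by squared zero-order correlation with the outcome on a training split and checks that the ranking replicates on the held-out split; all data are synthetic:

```python
import random

random.seed(3)
# Hypothetical data: y depends on x1 and x2 but not on the noise predictor x3.
rows = []
for _ in range(200):
    x1, x2, x3 = (random.gauss(0, 1) for _ in range(3))
    rows.append((x1, x2, x3, 2 * x1 + 1 * x2 + random.gauss(0, 0.5)))

half = len(rows) // 2
train, test = rows[:half], rows[half:]

def corr(xs, ys):
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sx = sum((a - mx) ** 2 for a in xs) ** 0.5
    sy = sum((b - my) ** 2 for b in ys) ** 0.5
    return cov / (sx * sy)

def rank_predictors(split):
    y = [r[3] for r in split]
    importance = [corr([r[j] for r in split], y) ** 2 for j in range(3)]
    return sorted(range(3), key=lambda j: -importance[j])

ranked = rank_predictors(train)
ranked_holdout = rank_predictors(test)
print("train ranking:", ranked, " holdout ranking:", ranked_holdout)
```

    Agreement between the two rankings is the cross-validation check: a predictor that looks important only in the training half would be flagged as unstable.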

  6. Reconstruction of spatio-temporal temperature from sparse historical records using robust probabilistic principal component regression

    USGS Publications Warehouse

    Tipton, John; Hooten, Mevin B.; Goring, Simon

    2017-01-01

    Scientific records of temperature and precipitation have been kept for several hundred years, but for many areas, only a shorter record exists. To understand climate change, there is a need for rigorous statistical reconstructions of the paleoclimate using proxy data. Paleoclimate proxy data are often sparse, noisy, indirect measurements of the climate process of interest, making each proxy uniquely challenging to model statistically. We reconstruct spatially explicit temperature surfaces from sparse and noisy measurements recorded at historical United States military forts and other observer stations from 1820 to 1894. One common method for reconstructing the paleoclimate from proxy data is principal component regression (PCR). With PCR, one learns a statistical relationship between the paleoclimate proxy data and a set of climate observations that are used as patterns for potential reconstruction scenarios. We explore PCR in a Bayesian hierarchical framework, extending classical PCR in a variety of ways. First, we model the latent principal components probabilistically, accounting for measurement error in the observational data. Next, we extend our method to better accommodate outliers that occur in the proxy data. Finally, we explore alternatives to the truncation of lower-order principal components using different regularization techniques. One fundamental challenge in paleoclimate reconstruction efforts is the lack of out-of-sample data for predictive validation. Cross-validation is of potential value, but is computationally expensive and potentially sensitive to outliers in sparse data scenarios. To overcome the limitations that a lack of out-of-sample records presents, we test our methods using a simulation study, applying proper scoring rules including a computationally efficient approximation to leave-one-out cross-validation using the log score to validate model performance. 
The result of our analysis is a spatially explicit reconstruction of spatio-temporal temperature from a very sparse historical record.
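
    The expense of leave-one-out cross-validation, and the appeal of efficient shortcuts, can be seen in miniature with ordinary least squares (the paper's approximation concerns Bayesian log scores, but the idea is analogous). For OLS the leave-one-out residual is available in closed form from a single fit, e_loo = e / (1 - h), where h is the leverage; the sketch below checks the shortcut against brute-force refitting:

```python
import random

random.seed(4)
xs = [random.uniform(0, 10) for _ in range(30)]
ys = [1.5 * x + 2 + random.gauss(0, 1) for x in xs]

def fit(xs, ys):
    """OLS slope and intercept for y = b*x + a."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx
    return b, my - b * mx

# Brute force: refit n times, leaving one point out each time.
brute = []
for i in range(len(xs)):
    b, a = fit(xs[:i] + xs[i + 1:], ys[:i] + ys[i + 1:])
    brute.append(ys[i] - (b * xs[i] + a))

# Shortcut: one fit plus leverages h_i = 1/n + (x_i - mean)^2 / Sxx; exact for OLS.
b, a = fit(xs, ys)
n, mx = len(xs), sum(xs) / len(xs)
sxx = sum((x - mx) ** 2 for x in xs)
fast = [(y - (b * x + a)) / (1 - (1 / n + (x - mx) ** 2 / sxx))
        for x, y in zip(xs, ys)]

print(max(abs(u - v) for u, v in zip(brute, fast)))  # ~0 up to rounding
```

    One fit replaces n refits; the approximate leave-one-out log score used in the paper buys the same kind of saving for Bayesian models, where exact refitting would be far costlier.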

  7. Adaptation and cross-cultural validation of the United States Primary Care Assessment Tool (expanded version) for use in South Africa

    PubMed Central

    Sayed, Abdul-Rauf; le Grange, Cynthia; Bhagwan, Susheela; Manga, Nayna

    2015-01-01

Background Measuring primary care is important for health sector reform. The Primary Care Assessment Tool (PCAT) measures performance of elements essential for cost-effective care. Following minor adaptations prior to use in Cape Town in 2011, a few findings indicated a need to improve the content and cross-cultural validity for wider use in South Africa (SA). Aim This study aimed to validate the United States of America-developed PCAT before it was used in a baseline measure of primary care performance prior to major reform. Setting Public sector primary care clinics, users, practitioners and managers in urban and rural districts in the Western Cape Province. Methods Face value evaluation of item phrasing and a combination of Delphi and Nominal Group Technique (NGT) methods with an expert panel and user focus group were used to obtain consensus on content relevant to SA. Original and new domains and items with ≥ 70% agreement were included in the South African version – ZA PCAT. Results All original PCAT domains achieved consensus on inclusion. One new domain, the primary healthcare (PHC) team, was added. Three of 95 original items achieved < 70% agreement, that is, consensus to exclude them as not relevant to SA; 19 new items were added. A few items needed minor rephrasing with local healthcare jargon. The demographic section was adapted to local socio-economic conditions. The adult PCAT was translated into isiXhosa and Afrikaans. Conclusion The PCAT is a valid measure of primary care performance in SA. The PHC team domain is an important addition, given its emphasis in PHC re-engineering. A combination of Delphi and NGT methods succeeded in obtaining consensus on a multi-domain, multi-item instrument in a resource-constrained environment. PMID:26245610

  8. Adaptation and cross-cultural validation of the United States Primary Care Assessment Tool (expanded version) for use in South Africa.

    PubMed

    Bresick, Graham; Sayed, Abdul-Rauf; le Grange, Cynthia; Bhagwan, Susheela; Manga, Nayna

    2015-06-19

Measuring primary care is important for health sector reform. The Primary Care Assessment Tool (PCAT) measures performance of elements essential for cost-effective care. Following minor adaptations prior to use in Cape Town in 2011, a few findings indicated a need to improve the content and cross-cultural validity for wider use in South Africa (SA). This study aimed to validate the United States of America-developed PCAT before it was used in a baseline measure of primary care performance prior to major reform. Public sector primary care clinics, users, practitioners and managers in urban and rural districts in the Western Cape Province. Face value evaluation of item phrasing and a combination of Delphi and Nominal Group Technique (NGT) methods with an expert panel and user focus group were used to obtain consensus on content relevant to SA. Original and new domains and items with ≥ 70% agreement were included in the South African version – ZA PCAT. All original PCAT domains achieved consensus on inclusion. One new domain, the primary healthcare (PHC) team, was added. Three of 95 original items achieved < 70% agreement, that is, consensus to exclude them as not relevant to SA; 19 new items were added. A few items needed minor rephrasing with local healthcare jargon. The demographic section was adapted to local socio-economic conditions. The adult PCAT was translated into isiXhosa and Afrikaans. The PCAT is a valid measure of primary care performance in SA. The PHC team domain is an important addition, given its emphasis in PHC re-engineering. A combination of Delphi and NGT methods succeeded in obtaining consensus on a multi-domain, multi-item instrument in a resource-constrained environment.

  9. An instrument to assess subjective task value beliefs regarding the decision to pursue postgraduate training.

    PubMed

    Hagemeier, Nicholas E; Murawski, Matthew M

    2014-02-12

    To develop and validate an instrument to assess subjective ratings of the perceived value of various postgraduate training paths followed using expectancy-value as a theoretical framework; and to explore differences in value beliefs across type of postgraduate training pursued and type of pharmacy training completed prior to postgraduate training. A survey instrument was developed to sample 4 theoretical domains of subjective task value: intrinsic value, attainment value, utility value, and perceived cost. Retrospective self-report methodology was employed to examine respondents' (N=1,148) subjective task value beliefs specific to their highest level of postgraduate training completed. Exploratory and confirmatory factor analytic techniques were used to evaluate and validate value belief constructs. Intrinsic, attainment, utility, cost, and financial value constructs resulted from exploratory factor analysis. Cross-validation resulted in a 26-item instrument that demonstrated good model fit. Differences in value beliefs were noted across type of postgraduate training pursued and pharmacy training characteristics. The Postgraduate Training Value Instrument demonstrated evidence of reliability and construct validity. The survey instrument can be used to assess value beliefs regarding multiple postgraduate training options in pharmacy and potentially inform targeted recruiting of individuals to those paths best matching their own value beliefs.

  10. Analytical method development of nifedipine and its degradants binary mixture using high performance liquid chromatography through a quality by design approach

    NASA Astrophysics Data System (ADS)

    Choiri, S.; Ainurofiq, A.; Ratri, R.; Zulmi, M. U.

    2018-03-01

Nifedipine (NIF) is a photo-labile drug that degrades easily when exposed to sunlight. This research aimed to develop an analytical method using high-performance liquid chromatography, implementing a quality by design approach to obtain an effective, efficient, and validated analytical method for NIF and its degradants. A 2(2) full factorial design with a center point to test for curvature was applied to optimize the analytical conditions for NIF and its degradants. Mobile phase composition (MPC) and flow rate (FR) were the factors evaluated against the system suitability parameters. The selected condition was validated by cross-validation using a leave-one-out technique. Alteration of the MPC significantly affected retention time, while an increase in FR reduced the tailing factor. In addition, the interaction of both factors increased the theoretical plates and the resolution of NIF and its degradants. The selected analytical condition for NIF and its degradants was validated over the range of 1 – 16 µg/mL, showing good linearity, precision and accuracy, and was efficient, with an analysis time within 10 min.
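
    Effect estimation in a two-level full factorial design with a center point reduces to signed averages over the coded corner runs. The responses below are hypothetical; MPC and FR are coded to -1/+1, and the center-point run checks for curvature:

```python
# 2^2 full factorial with a centre point: (coded MPC, coded FR, response).
runs = [(-1, -1, 1.8), (1, -1, 2.9), (-1, 1, 1.4), (1, 1, 3.1), (0, 0, 2.3)]

# Main effects and interaction from the four corner runs (coded -1/+1).
corners = [r for r in runs if r[0] != 0]
effect_mpc = sum(m * y for m, f, y in corners) / 2
effect_fr = sum(f * y for m, f, y in corners) / 2
interaction = sum(m * f * y for m, f, y in corners) / 2

# Curvature check: average corner response vs the centre-point response.
curvature = sum(y for _, _, y in corners) / 4 - runs[-1][2]
print(effect_mpc, effect_fr, interaction, curvature)
```

    A curvature term near zero supports the linear-plus-interaction model; a large one signals that the response surface bends between the factor levels.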

  11. Detection of brain tumor margins using optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Juarez-Chambi, Ronald M.; Kut, Carmen; Rico-Jimenez, Jesus; Campos-Delgado, Daniel U.; Quinones-Hinojosa, Alfredo; Li, Xingde; Jo, Javier

    2018-02-01

In brain cancer surgery, it is critical to achieve extensive resection without compromising adjacent healthy, noncancerous regions. Various technological advances have made major contributions in imaging, including intraoperative magnetic resonance imaging (MRI) and computed tomography (CT). However, these technologies have pros and cons in providing quantitative, real-time and three-dimensional (3D) continuous guidance in brain cancer detection. Optical Coherence Tomography (OCT) is a non-invasive, label-free, cost-effective technique capable of imaging tissue in three dimensions and in real time. The purpose of this study is to reliably and efficiently discriminate between non-cancer and cancer-infiltrated brain regions using OCT images. To this end, we applied a mathematical model for quantitative evaluation known as the Blind End-Member and Abundances Extraction (BEAE) method, a constrained optimization technique which extracts spatial information from volumetric OCT images. Using this novel method, we are able to discriminate between cancerous and non-cancerous tissues, using logistic regression as a classifier for automatic brain tumor margin detection. With this technique, we achieve excellent performance in an extensive cross-validation of the training dataset (sensitivity 92.91% and specificity 98.15%) and again on an independent, blinded validation dataset (sensitivity 92.91% and specificity 86.36%). In summary, BEAE is well suited to differentiating brain tissue, which could support surgical guidance during tissue resection.
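
    The sensitivity and specificity figures quoted here come straight from a confusion matrix over the classifier's calls. The labels and counts below are illustrative (chosen to roughly mirror the validation-set figures), not the study's data:

```python
# Hypothetical per-sample labels (1 = cancer-infiltrated) and classifier predictions.
truth = [1] * 50 + [0] * 50
pred = [1] * 46 + [0] * 4 + [0] * 43 + [1] * 7

tp = sum(t == 1 and p == 1 for t, p in zip(truth, pred))
tn = sum(t == 0 and p == 0 for t, p in zip(truth, pred))
fn = sum(t == 1 and p == 0 for t, p in zip(truth, pred))
fp = sum(t == 0 and p == 1 for t, p in zip(truth, pred))

sensitivity = tp / (tp + fn)  # fraction of cancer samples correctly flagged
specificity = tn / (tn + fp)  # fraction of non-cancer samples correctly cleared
print(f"sensitivity={sensitivity:.2%} specificity={specificity:.2%}")
```

    For margin detection, sensitivity bounds how much infiltrated tissue is missed, while specificity bounds how much healthy tissue would be needlessly resected.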

  12. Detection of brain tumor margins using optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Juarez-Chambi, Ronald M.; Kut, Carmen; Rico-Jimenez, Jesus; Campos-Delgado, Daniel U.; Quinones-Hinojosa, Alfredo; Li, Xingde; Jo, Javier

    2018-02-01

In brain cancer surgery, it is critical to achieve extensive resection without compromising adjacent healthy, non-cancerous regions. Various technological advances have made major contributions in imaging, including intraoperative magnetic resonance imaging (MRI) and computed tomography (CT). However, these technologies have pros and cons in providing quantitative, real-time and three-dimensional (3D) continuous guidance in brain cancer detection. Optical Coherence Tomography (OCT) is a non-invasive, label-free, cost-effective technique capable of imaging tissue in three dimensions and in real time. The purpose of this study is to reliably and efficiently discriminate between non-cancer and cancer-infiltrated brain regions using OCT images. To this end, we applied a mathematical model for quantitative evaluation known as the Blind End-Member and Abundances Extraction (BEAE) method, a constrained optimization technique which extracts spatial information from volumetric OCT images. Using this novel method, we are able to discriminate between cancerous and non-cancerous tissues, using logistic regression as a classifier for automatic brain tumor margin detection. With this technique, we achieve excellent performance in an extensive cross-validation of the training dataset (sensitivity 92.91% and specificity 98.15%) and again on an independent, blinded validation dataset (sensitivity 92.91% and specificity 86.36%). In summary, BEAE is well suited to differentiating brain tissue, which could support surgical guidance during tissue resection.

  13. Simultaneous Determination of Metamizole, Thiamin and Pyridoxin Using UV-Spectroscopy in Combination with Multivariate Calibration

    PubMed Central

    Chotimah, Chusnul; Sudjadi; Riyanto, Sugeng; Rohman, Abdul

    2015-01-01

Purpose: Analysis of drugs in multicomponent systems is officially carried out using chromatographic techniques; however, these are laborious and involve sophisticated instrumentation. Therefore, UV-VIS spectrophotometry coupled with multivariate calibration by partial least squares (PLS) was developed for the quantitative analysis of metamizole, thiamin and pyridoxin in the presence of cyanocobalamine without any separation step. Methods: The calibration model was prepared by developing a series of sample mixtures containing these drugs in known proportions, together with a separate set of validation samples. Cross-validation of the calibration samples using the leave-one-out technique was used to identify the smaller set of components that provided the greatest predictive ability. The calibration model was evaluated on the coefficient of determination (R2) and the root mean square error of calibration (RMSEC). Results: The coefficient of determination (R2) for the relationship between actual and predicted values for all studied drugs was higher than 0.99, indicating good accuracy. The RMSEC values obtained were relatively low, indicating good precision. The accuracy and precision of the developed method showed no significant difference from those obtained by the official HPLC method. Conclusion: The developed method (UV-VIS spectrophotometry in combination with PLS) was successfully used for the analysis of metamizole, thiamin and pyridoxin in tablet dosage form. PMID:26819934
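
    The leave-one-out step used to choose the PLS model can be shown in simplified form with a one-wavelength calibration line standing in for the multivariate PLS model; the absorbance/concentration pairs are hypothetical:

```python
# Hypothetical calibration data: absorbance at one wavelength vs concentration.
absorb = [0.10, 0.21, 0.29, 0.42, 0.50, 0.61, 0.69, 0.82]
conc = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]

def fit(xs, ys):
    """OLS slope and intercept for y = b*x + a."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return b, my - b * mx

# Leave-one-out: each sample is predicted from a model fitted without it.
press = 0.0
for i in range(len(conc)):
    b, a = fit(absorb[:i] + absorb[i + 1:], conc[:i] + conc[i + 1:])
    press += (conc[i] - (b * absorb[i] + a)) ** 2
rmsecv = (press / len(conc)) ** 0.5
print(f"RMSECV = {rmsecv:.3f}")
```

    In the PLS setting, this RMSECV is recomputed for each candidate number of latent components, and the smallest set giving low error is retained, which is the selection step the abstract describes.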

  14. Model-based and Model-free Machine Learning Techniques for Diagnostic Prediction and Classification of Clinical Outcomes in Parkinson's Disease.

    PubMed

    Gao, Chao; Sun, Hanbo; Wang, Tuo; Tang, Ming; Bohnen, Nicolaas I; Müller, Martijn L T M; Herman, Talia; Giladi, Nir; Kalinin, Alexandr; Spino, Cathie; Dauer, William; Hausdorff, Jeffrey M; Dinov, Ivo D

    2018-05-08

    In this study, we apply a multidisciplinary approach to investigate falls in Parkinson's disease (PD) patients using clinical, demographic and neuroimaging data from two independent initiatives (University of Michigan and Tel Aviv Sourasky Medical Center). Using machine learning techniques, we construct predictive models to discriminate fallers and non-fallers. Through controlled feature selection, we identified the most salient predictors of patient falls, including gait speed, Hoehn and Yahr stage, and postural instability and gait difficulty-related measurements. The model-based and model-free analytical methods we employed included logistic regression, random forests, support vector machines, and XGBoost. The reliability of the forecasts was assessed by internal statistical (5-fold) cross validation as well as by external out-of-bag validation. Four specific challenges were addressed in the study: Challenge 1, develop a protocol for harmonizing and aggregating complex, multisource, and multi-site Parkinson's disease data; Challenge 2, identify salient predictive features associated with specific clinical traits, e.g., patient falls; Challenge 3, forecast patient falls and evaluate the classification performance; and Challenge 4, predict tremor dominance (TD) vs. postural instability and gait difficulty (PIGD). Our findings suggest that, compared to other approaches, model-free machine learning techniques provide more reliable forecasting of falls in Parkinson's patients, with a classification accuracy of about 70-80%.
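    The internal 5-fold cross-validation described above can be sketched in Python; a nearest-mean classifier on synthetic one-dimensional data stands in for the study's models (logistic regression, random forests, SVM, XGBoost), and the data are invented:

    ```python
    import random

    random.seed(0)
    # Synthetic data: "fallers" cluster near 2.0, "non-fallers" near 0.0
    data = [(random.gauss(0.0, 0.5), 0) for _ in range(50)] + \
           [(random.gauss(2.0, 0.5), 1) for _ in range(50)]
    random.shuffle(data)

    def nearest_mean_predict(train, x):
        """Assign x to the class whose training mean is closest."""
        means = {}
        for label in (0, 1):
            vals = [v for v, y in train if y == label]
            means[label] = sum(vals) / len(vals)
        return min(means, key=lambda lab: abs(x - means[lab]))

    k = 5
    fold = len(data) // k
    accs = []
    for i in range(k):
        test = data[i * fold:(i + 1) * fold]
        train = data[:i * fold] + data[(i + 1) * fold:]
        hits = sum(1 for x, y in test if nearest_mean_predict(train, x) == y)
        accs.append(hits / len(test))
    print(sum(accs) / k)  # mean 5-fold accuracy
    ```

    Each observation is held out exactly once, so the averaged accuracy estimates out-of-sample performance rather than training fit.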

  15. Development of Wind Speed Retrieval from Cross-Polarization Chinese Gaofen-3 Synthetic Aperture Radar in Typhoons

    PubMed Central

    Yuan, Xinzhe; Sun, Jian; Zhou, Wei; Zhang, Qingjun

    2018-01-01

    The purpose of our work is to determine the feasibility and effectiveness of retrieving sea surface wind speeds from C-band cross-polarization (herein vertical-horizontal, VH) Chinese Gaofen-3 (GF-3) SAR images in typhoons. In this study, we collected three GF-3 SAR images acquired in Global Observation (GLO) and Wide ScanSAR (WSC) mode during the summer of 2017 over the China Sea, covering the typhoons Noru, Doksuri and Talim. These images were collocated with wind simulations on a 0.12° grid from a numerical model, the Regional Assimilation and Prediction System-Typhoon model (GRAPES-TYM). Recent research shows that GRAPES-TYM performs well for typhoon simulation in the China Sea. Based on this dataset, the dependence of the normalized radar cross section (NRCS) of VH-polarization GF-3 SAR on wind speed and radar incidence angle was investigated, after which an empirical algorithm for wind speed retrieval from VH-polarization GF-3 SAR was tuned. An additional four VH-polarization GF-3 SAR images in three typhoons, Noru, Hato and Talim, were used to validate the proposed algorithm. SAR-derived winds were compared with WindSat measurements on a 0.25° grid with wind speeds up to 40 m/s, showing a root mean square error (RMSE) of 5.5 m/s; an improved RMSE of 5.1 m/s was achieved when the retrievals were validated against GRAPES-TYM winds. It is concluded that the proposed algorithm is a promising technique for strong wind retrieval from cross-polarization GF-3 SAR images without encountering the signal saturation problem. PMID:29385068
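    The validation step, comparing retrieved winds against reference winds by RMSE, can be sketched as follows; the wind speed values are illustrative, not GF-3 or WindSat data:

    ```python
    def rmse(retrieved, reference):
        """Root mean square error between paired measurements."""
        return (sum((a - b) ** 2 for a, b in zip(retrieved, reference))
                / len(retrieved)) ** 0.5

    sar_winds = [12.0, 18.5, 25.0, 31.0, 38.0]   # m/s, hypothetical retrievals
    ref_winds = [13.1, 17.0, 26.5, 29.0, 36.0]   # m/s, hypothetical references
    err = rmse(sar_winds, ref_winds)
    print(round(err, 2))  # → 1.66
    ```

    The study reports this statistic over collocated grid cells, once against WindSat (5.5 m/s) and once against GRAPES-TYM winds (5.1 m/s).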

  16. Cross-Correlation-Based Structural System Identification Using Unmanned Aerial Vehicles

    PubMed Central

    Yoon, Hyungchul; Hoskere, Vedhus; Park, Jong-Woong; Spencer, Billie F.

    2017-01-01

    Computer vision techniques have been employed to characterize dynamic properties of structures, as well as to capture structural motion for system identification purposes. All of these methods leverage image-processing techniques using a stationary camera. This requirement makes finding an effective location for camera installation difficult, because civil infrastructure (i.e., bridges, buildings, etc.) is often difficult to access, being constructed over rivers, roads, or other obstacles. This paper seeks to use video from Unmanned Aerial Vehicles (UAVs) to address this problem. As opposed to the traditional use of stationary cameras, the use of UAVs raises the issue of the camera itself moving; thus, the displacements of the structure obtained by processing UAV video are relative to the UAV camera. Some efforts have been reported to compensate for the camera motion, but they require certain assumptions that may be difficult to satisfy. This paper proposes a new method for structural system identification using the UAV video directly. Several challenges are addressed, including: (1) estimation of an appropriate scale factor; and (2) compensation for the rolling-shutter effect. Experiments were carried out to validate the proposed approach, and the results demonstrate its efficacy and significant potential. PMID:28891985
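    The displacement tracking underlying such vision-based system identification is typically a cross-correlation search: find the shift that best aligns a template with a search signal. A minimal one-dimensional sketch with synthetic signals (not the paper's algorithm verbatim):

    ```python
    def best_shift(template, search):
        """Return the offset of `template` inside `search` that maximizes
        normalized cross-correlation."""
        best, best_score = 0, float("-inf")
        for off in range(len(search) - len(template) + 1):
            window = search[off:off + len(template)]
            score = sum(a * b for a, b in zip(template, window))
            norm = (sum(a * a for a in template) *
                    sum(b * b for b in window)) ** 0.5
            if norm and score / norm > best_score:
                best_score, best = score / norm, off
        return best

    template = [0.0, 1.0, 2.0, 1.0, 0.0]                      # intensity bump
    search   = [0.0, 0.0, 0.0, 0.0, 1.0, 2.0, 1.0, 0.0, 0.0]  # bump shifted
    shift = best_shift(template, search)
    print(shift)  # → 3
    ```

    In 2-D video the same search runs over image patches frame by frame; the paper's contribution is making the resulting displacements usable when the camera itself moves.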

  17. Model assessment using a multi-metric ranking technique

    NASA Astrophysics Data System (ADS)

    Fitzpatrick, P. J.; Lau, Y.; Alaka, G.; Marks, F.

    2017-12-01

    Validation comparisons of multiple models present challenges when skill levels are similar, especially in regimes dominated by the climatological mean. Assessing skill separation requires advanced validation metrics that identify adeptness in extreme events, while maintaining simplicity for management decisions. Flexibility for operations is also an asset. This work postulates a weighted tally and consolidation technique which ranks results by multiple types of metrics. Metrics include absolute error, bias, acceptable absolute error percentages, outlier metrics, model efficiency, Pearson correlation, Kendall's tau, reliability index, and multiplicative gross error. Other metrics, such as root mean square difference and rank correlation, were also explored but removed because their information was generally duplicative of other metrics. While equal weights are applied here, the weights could be altered to favor preferred metrics. Two examples are shown comparing ocean models' currents and tropical cyclone products, including experimental products. The importance of using magnitude and direction for tropical cyclone track forecasts, instead of distance, along-track, and cross-track errors, is discussed. Tropical cyclone intensity and structure prediction are also assessed. Vector correlations are not included in the ranking process, but were found useful in an independent context and will be briefly reported.
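    The weighted tally-and-consolidation idea can be sketched as follows; the model names, metric values, and equal weights are invented for illustration:

    ```python
    def consolidated_ranks(scores, lower_is_better, weights=None):
        """scores: {model: {metric: value}}; returns models sorted best-first
        by the weighted sum of their per-metric ranks."""
        metrics = lower_is_better.keys()
        weights = weights or {m: 1.0 for m in metrics}
        tally = {model: 0.0 for model in scores}
        for m in metrics:
            ordered = sorted(scores, key=lambda mod: scores[mod][m],
                             reverse=not lower_is_better[m])
            for rank, model in enumerate(ordered, start=1):
                tally[model] += weights[m] * rank
        return sorted(tally, key=tally.get)

    scores = {
        "modelA": {"abs_error": 1.2, "correlation": 0.90},
        "modelB": {"abs_error": 0.8, "correlation": 0.95},
        "modelC": {"abs_error": 1.5, "correlation": 0.80},
    }
    lower_is_better = {"abs_error": True, "correlation": False}
    ranking = consolidated_ranks(scores, lower_is_better)
    print(ranking)  # → ['modelB', 'modelA', 'modelC']
    ```

    Because only ranks are tallied, metrics on very different scales (errors, correlations, indices) combine without normalization, which is part of the technique's appeal for management decisions.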

  18. Towards Optical Coherence Tomography-based elastographic evaluation of human cartilage.

    PubMed

    Nebelung, Sven; Brill, Nicolai; Müller, Felix; Tingart, Markus; Pufe, Thomas; Merhof, Dorit; Schmitt, Robert; Jahr, Holger; Truhn, Daniel

    2016-03-01

    Optical Coherence Tomography (OCT) is an imaging technique that allows the surface and subsurface evaluation of semitransparent tissues by generating microscopic cross-sectional images in real time, to millimetre depths and at micrometre resolutions. As the differentiation of cartilage degeneration remains diagnostically challenging for standard imaging modalities, an OCT- and MRI-compatible indentation device for the assessment of cartilage functional properties was developed and validated in the present study. After describing the system design and performing its comprehensive validation, macroscopically intact human cartilage samples (n=5) were indented under control of displacement (δ1=202µm; δ2=405µm; δ3=607µm; δ4=810µm) with simultaneous OCT imaging through a transparent indenter piston in direct contact with the sample; thus, 3-D OCT datasets from surface and subsurface areas were obtained. OCT-based evaluation of loading-induced changes included qualitative assessment of image morphology and signal characteristics. For inter-method cross-referencing, the device's compatibility with MRI as well as qualitative morphology changes under analogous indentation loading conditions were evaluated by a series of T2-weighted gradient echo sequences. Cartilage thickness measurements were performed using the needle-probe technique prior to OCT and MRI imaging, and subsequently referenced to sample thickness as determined by MRI and histology. Dynamic indentation testing was performed to determine Young's modulus for biomechanical reference purposes. Distinct differences in sample thickness as well as corresponding strains were found; however, no significant differences in cartilage thickness were found between the techniques used. Qualitative assessment of OCT and MRI images revealed either distinct or absent sample-specific patterns of morphological changes in relation to indentation loading. 
For OCT, the tissue area underneath the indenter piston could be qualitatively assessed and displayed in multiple reconstructions, while for MRI, T2 signal characteristics indicated the presence of water and related tissue pressurisation within the sample. In conclusion, the present indentation device has been developed, constructed and validated for qualitative assessment of human cartilage and its response to loading by OCT and MRI. Thereby, it may provide the basis for future quantitative approaches that measure loading-induced deformations within the tissue to generate maps of local tissue properties as well as investigate their relation to degeneration. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. Effect of Disinfectants on Preventing the Cross-Contamination of Pathogens in Fresh Produce Washing Water

    PubMed Central

    Banach, Jennifer L.; Sampers, Imca; Van Haute, Sam; van der Fels-Klerx, H.J. (Ine)

    2015-01-01

    The potential cross-contamination of pathogens between clean and contaminated produce in the washing tank is highly dependent on the water quality. Process wash water disinfectants are applied to maintain water quality during processing. This review examines the efficacy of process wash water disinfectants during produce processing with the aim of preventing cross-contamination of pathogens. Process wash water disinfection requires short contact times so that microorganisms are rapidly inactivated. Free chlorine, chlorine dioxide, ozone, and peracetic acid were considered suitable disinfectants. A disinfectant’s reactivity with organic matter determines the disinfectant residual, which is of paramount importance for microbial inactivation and should be monitored in situ. Furthermore, chemical and worker safety, and the legislative framework, will determine the suitability of a disinfection technique. Current research often focuses on produce decontamination and to a lesser extent on preventing cross-contamination. Further research on a sanitizer’s efficacy in the washing water is recommended at the laboratory scale, in particular with experimental designs reflecting industrial conditions. Validation at the industrial scale is warranted to better understand the overall effects of a sanitizer. PMID:26213953

  20. Differential modal Zernike wavefront sensor employing a computer-generated hologram: a proposal.

    PubMed

    Mishra, Sanjay K; Bhatt, Rahul; Mohan, Devendra; Gupta, Arun Kumar; Sharma, Anurag

    2009-11-20

    The process of Zernike mode detection with a Shack-Hartmann wavefront sensor is computationally extensive. A holographic modal wavefront sensor has therefore evolved to process the data optically by use of the concept of equal and opposite phase bias. Recently, a multiplexed computer-generated hologram (CGH) technique was developed in which the output is in the form of bright dots that specify the presence and strength of a specific Zernike mode. We propose a wavefront sensor using the concept of phase biasing in the latter technique such that the output is a pair of bright dots for each mode to be sensed. A normalized difference signal between the intensities of the two dots is proportional to the amplitude of the sensed Zernike mode. In our method the number of holograms to be multiplexed is decreased, thereby reducing the modal cross talk significantly. We validated the proposed method through simulation studies for several cases. The simulation results demonstrate simultaneous wavefront detection of lower-order Zernike modes with a resolution better than λ/50 for the wide measurement range of ±3.5λ, with much reduced cross talk at high speed.
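    The sensing principle, a normalized difference of two spot intensities proportional to the mode amplitude, can be sketched as follows; the idealized intensity split I± = 0.5 ± a/2 is an assumption for illustration, not the paper's derivation:

    ```python
    def normalized_difference(i_plus, i_minus):
        """Normalized difference between the two conjugate-bias spot intensities."""
        return (i_plus - i_minus) / (i_plus + i_minus)

    # Under the idealized split I± = 0.5 ± a/2, the normalized difference
    # recovers the mode amplitude a directly (small-amplitude regime).
    for a in (0.05, 0.10, 0.20):  # hypothetical Zernike mode amplitudes
        signal = normalized_difference(0.5 + a / 2, 0.5 - a / 2)
        print(round(signal, 3))
    ```

    Normalizing by the total intensity makes the readout insensitive to overall illumination level, which is why the pair of dots, rather than a single dot, is used per mode.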

  1. Power Enhancement in High Dimensional Cross-Sectional Tests

    PubMed Central

    Fan, Jianqing; Liao, Yuan; Yao, Jiawei

    2016-01-01

    We propose a novel technique to boost the power of testing a high-dimensional vector H0 : θ = 0 against sparse alternatives where the null hypothesis is violated by only a few components. Existing tests based on quadratic forms, such as the Wald statistic, often suffer from low power due to the accumulation of errors in estimating high-dimensional parameters. More powerful tests for sparse alternatives, such as thresholding and extreme-value tests, on the other hand, require either stringent conditions or the bootstrap to derive the null distribution, and often suffer from size distortions due to slow convergence. Based on a screening technique, we introduce a “power enhancement component”, which is zero under the null hypothesis with high probability but diverges quickly under sparse alternatives. The proposed test statistic combines the power enhancement component with an asymptotically pivotal statistic, and strengthens the power under sparse alternatives. The null distribution does not require stringent regularity conditions, and is completely determined by that of the pivotal statistic. As specific applications, the proposed methods are applied to testing factor pricing models and validating cross-sectional independence in panel data models. PMID:26778846

  2. Registration of High Angular Resolution Diffusion MRI Images Using 4th Order Tensors⋆

    PubMed Central

    Barmpoutis, Angelos; Vemuri, Baba C.; Forder, John R.

    2009-01-01

    Registration of Diffusion Weighted (DW)-MRI datasets has commonly been achieved to date in the literature by using either scalar or 2nd-order tensorial information. However, scalars and 2nd-order tensors fail to capture complex local tissue structures, such as fiber crossings, and therefore datasets containing fiber crossings cannot be registered accurately by using these techniques. In this paper we present a novel method for non-rigidly registering DW-MRI datasets that are represented by a field of 4th-order tensors. We use the Hellinger distance between the normalized 4th-order tensors represented as distributions in order to achieve this registration. The Hellinger distance is easy to compute, is scale and rotation invariant, and hence allows for comparison of the true shape of distributions. Furthermore, we propose a novel 4th-order tensor re-transformation operator, which plays an essential role in the registration procedure and shows significantly better performance compared to the re-orientation operator used in the literature for DTI registration. We validate and compare our technique with other existing scalar image and DTI registration methods using simulated diffusion MR data and real HARDI datasets. PMID:18051145
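    The Hellinger distance used here to compare normalized tensors-as-distributions can be sketched on simple discrete distributions; the distributions below are invented, not derived from diffusion data:

    ```python
    def hellinger(p, q):
        """Hellinger distance between two discrete distributions (each sums to 1)."""
        s = sum((pi ** 0.5 - qi ** 0.5) ** 2 for pi, qi in zip(p, q))
        return (0.5 * s) ** 0.5

    p = [0.25, 0.25, 0.25, 0.25]   # uniform
    q = [0.25, 0.25, 0.25, 0.25]   # identical to p
    r = [0.70, 0.10, 0.10, 0.10]   # peaked
    d_same = hellinger(p, q)
    d_diff = hellinger(p, r)
    print(d_same)             # → 0.0 (identical distributions)
    print(round(d_diff, 3))   # strictly between 0 and 1
    ```

    Because it depends only on the square roots of the (normalized) masses, the distance is bounded in [0, 1] and compares the shape of the distributions, which is the property the paper exploits for scale invariance.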

  3. Assessing cross-cultural validity of scales: a methodological review and illustrative example.

    PubMed

    Beckstead, Jason W; Yang, Chiu-Yueh; Lengacher, Cecile A

    2008-01-01

    In this article, we assessed the cross-cultural validity of the Women's Role Strain Inventory (WRSI), a multi-item instrument that assesses the degree of strain experienced by women who juggle the roles of working professional, student, wife and mother. Cross-cultural validity is evinced by demonstrating the measurement invariance of the WRSI. Measurement invariance is the extent to which items of multi-item scales function in the same way across different samples of respondents. We assessed measurement invariance by comparing a sample of working women in Taiwan with a similar sample from the United States. Structural equation models (SEMs) were employed to determine the invariance of the WRSI and to estimate the unique validity variance of its items. This article also provides nurse-researchers with the necessary underlying measurement theory and illustrates how SEMs may be applied to assess cross-cultural validity of instruments used in nursing research. Overall performance of the WRSI was acceptable but our analysis showed that some items did not display invariance properties across samples. Item analysis is presented and recommendations for improving the instrument are discussed.

  4. Isokinetic knee strength qualities as predictors of jumping performance in high-level volleyball athletes: multiple regression approach.

    PubMed

    Sattler, Tine; Sekulic, Damir; Spasic, Miodrag; Osmankac, Nedzad; Vicente João, Paulo; Dervisevic, Edvin; Hadzic, Vedran

    2016-01-01

    Previous investigations noted the potential importance of isokinetic strength in rapid muscular performances, such as jumping. This study aimed to identify the influence of isokinetic knee strength on specific jumping performance in volleyball. The secondary aim of the study was to evaluate the reliability and validity of two volleyball-specific jumping tests. The sample comprised 67 female (21.96±3.79 years; 68.26±8.52 kg; 174.43±6.85 cm) and 99 male (23.62±5.27 years; 84.83±10.37 kg; 189.01±7.21 cm) high-level volleyball players who competed in the 1st and 2nd National Division. Subjects were randomly divided into validation (N.=55 and 33 for males and females, respectively) and cross-validation subsamples (N.=54 and 34 for males and females, respectively). The set of predictors included isokinetic tests to evaluate the eccentric and concentric strength capacities of the knee extensors and flexors for the dominant and non-dominant leg. The main outcome measure for the isokinetic testing was peak torque (PT), which was later normalized for body mass and expressed as PT/kg. Block-jump and spike-jump performances were measured over three trials and observed as criteria. Forward stepwise multiple regressions were calculated for the validation subsamples and then cross-validated. Cross-validation included correlations and t-test differences between observed and predicted scores, and Bland-Altman plots. The jumping tests were found to be reliable (spike jump: ICC of 0.79 and 0.86; block jump: ICC of 0.86 and 0.90; for males and females, respectively), and their validity was confirmed by significant t-test differences between 1st and 2nd division players. Isokinetic variables were found to be significant predictors of jumping performance in females, but not among males. In females, the isokinetic knee measures were stronger and more valid predictors of the block jump (42% and 64% of the explained variance for the validation and cross-validation subsamples, respectively) than of the spike jump (39% and 34% of the explained variance for the validation and cross-validation subsamples, respectively). Differences between the prediction models calculated for males and females are mostly explained by gender-specific biomechanics of jumping. The study established the importance of isokinetic knee strength for volleyball jumping performance in female athletes. Further studies should evaluate the association between isokinetic ankle strength and volleyball-specific jumping performance. The results reinforce the need for cross-validation of prediction models in sport and exercise sciences.
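    The cross-validation checks described, correlating observed with predicted scores and testing their mean difference, can be sketched as follows; the jump scores are invented, and a Pearson correlation plus mean difference stands in for the full correlation/t-test/Bland-Altman workflow:

    ```python
    def pearson(xs, ys):
        """Pearson correlation coefficient between paired samples."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sum((x - mx) ** 2 for x in xs) ** 0.5
        sy = sum((y - my) ** 2 for y in ys) ** 0.5
        return cov / (sx * sy)

    observed  = [42.0, 45.5, 39.0, 50.0, 47.5]   # cm, hypothetical jump heights
    predicted = [41.0, 46.0, 40.5, 48.0, 48.5]   # from a hypothetical model
    r = pearson(observed, predicted)
    mean_diff = sum(o - p for o, p in zip(observed, predicted)) / len(observed)
    print(round(r, 3), round(mean_diff, 2))
    ```

    A high correlation with a mean difference near zero on the held-out cross-validation subsample indicates the regression generalizes beyond the sample it was fitted on.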

  5. A novel neural network based image reconstruction model with scale and rotation invariance for target identification and classification for Active millimetre wave imaging

    NASA Astrophysics Data System (ADS)

    Agarwal, Smriti; Bisht, Amit Singh; Singh, Dharmendra; Pathak, Nagendra Prasad

    2014-12-01

    Millimetre wave (MMW) imaging is gaining tremendous interest among researchers, with potential applications in security screening, standoff personal screening, automotive collision avoidance, and more. Current state-of-the-art imaging techniques, viz. microwave and X-ray imaging, suffer from lower resolution and harmful ionizing radiation, respectively. In contrast, MMW imaging operates at lower power and is non-ionizing, hence medically safe. Despite these favourable attributes, MMW imaging faces various challenges: it is still a relatively unexplored area and lacks a suitable imaging methodology for extracting complete target information. In view of these challenges, an MMW active imaging radar system at 60 GHz was designed for standoff imaging applications. A C-scan (horizontal and vertical scanning) methodology was developed that provides a cross-range resolution of 8.59 mm. The paper further details a suitable target identification and classification methodology. For identification of regular-shape targets, a mean-standard deviation based segmentation technique was formulated and validated using a different target shape. For classification, a probability density function based target material discrimination methodology was proposed and validated on a different dataset. Lastly, a novel artificial neural network based scale- and rotation-invariant image reconstruction methodology is proposed to counter distortions in the image caused by noise, rotation or scale variations. The designed neural network, once trained with sample images, automatically takes care of these deformations and successfully reconstructs the corrected image for the test targets. The techniques developed in this paper are tested and validated using four different regular shapes, viz. rectangle, square, triangle and circle.
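    The mean/standard-deviation segmentation idea can be sketched as a simple statistical threshold: flag pixels whose intensity deviates from the image mean by more than k standard deviations. The "image" below is a toy intensity list, not MMW data, and k is an assumed parameter:

    ```python
    def segment(pixels, k=1.0):
        """Mark pixels deviating from the mean by more than k standard deviations."""
        n = len(pixels)
        mean = sum(pixels) / n
        std = (sum((p - mean) ** 2 for p in pixels) / n) ** 0.5
        return [1 if abs(p - mean) > k * std else 0 for p in pixels]

    image = [10, 11, 9, 10, 50, 52, 10, 11]   # bright "target" at indices 4-5
    mask = segment(image)
    print(mask)  # → [0, 0, 0, 0, 1, 1, 0, 0]
    ```

    In the 2-D case the same statistics are computed over the image (or local windows), separating the bright target response from the background.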

  6. Measurement and computer simulation of antennas on ships and aircraft for results of operational reliability

    NASA Astrophysics Data System (ADS)

    Kubina, Stanley J.

    1989-09-01

    The review of the status of computational electromagnetics by Miller and the exposition by Burke of the developments in one of the more important computer codes applying the electric field integral equation method, the Numerical Electromagnetics Code (NEC), coupled with Molinet's summary of progress in techniques based on the Geometrical Theory of Diffraction (GTD), provide a clear perspective on the maturity of the modern discipline of computational electromagnetics and its potential. Audone's exposition of the application to the computation of radar cross-section (RCS) indicates the breadth of practical applications, and his exploitation of modern near-field measurement techniques reminds one of progress in the measurement discipline, which is essential to the validation or calibration of computational modeling methodology when applied to complex structures such as aircraft and ships. The latter monograph also presents some comparison results with computational models. Some of the results presented for scale-model and flight measurements show serious disagreements in the lobe structure that would require detailed examination. This also applies to the radiation patterns obtained by flight measurement compared with those obtained using wire-grid models and integral equation modeling methods. In the examples which follow, an attempt is made to match measurement results completely over the entire 2 to 30 MHz HF range for antennas on a large patrol aircraft. The problems of validating computer models of HF antennas on a helicopter, and of using computer models to generate radiation pattern information which cannot be obtained by measurement, are discussed. Also discussed is the use of NEC computer models to analyze top-side ship configurations for which measurement results are not available, so that only self-validation measures, or at best comparisons with an alternate GTD computer modeling technique, are possible.

  7. Validation of Medical Tourism Service Quality Questionnaire (MTSQQ) for Iranian Hospitals

    PubMed Central

    Qolipour, Mohammad; Torabipour, Amin; Khiavi, Farzad Faraji; Malehi, Amal Saki

    2017-01-01

    Introduction Assessing service quality is one of the basic requirements for developing the medical tourism industry, yet there is no valid and reliable tool to measure the service quality of medical tourism. This study aimed to determine the reliability and validity of a Persian version of a medical tourism service quality questionnaire for Iranian hospitals. Methods To validate the medical tourism service quality questionnaire (MTSQQ), a cross-sectional study was conducted on 250 Iraqi patients referred to hospitals in Ahvaz (Iran) in 2015. To design the questionnaire and determine its content validity, the Delphi technique (3 rounds) with the participation of 20 medical tourism experts was used. Construct validity of the questionnaire was assessed through exploratory and confirmatory factor analysis. Reliability was assessed using Cronbach’s alpha coefficient. Data were analyzed with Excel 2007, SPSS version 18, and Lisrel 8.0 software. Results The content validity of the questionnaire was confirmed with CVI=0.775. According to exploratory factor analysis, the MTSQQ included 31 items and 8 dimensions (tangibility, reliability, responsiveness, assurance, empathy, exchange and travel facilities, technical and infrastructure facilities, and safety and security). Construct validity of the questionnaire was confirmed based on the goodness-of-fit indices of the model (RMSEA=0.032, CFI=0.98, GFI=0.88). Cronbach’s alpha coefficient was 0.837 and 0.919 for the expectation and perception questionnaires, respectively. Conclusion The results showed that the medical tourism SERVQUAL questionnaire with 31 items and 8 dimensions was a valid and reliable tool to measure the service quality of medical tourism in Iranian hospitals. PMID:28461863
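    Cronbach's alpha, the internal-consistency coefficient reported above, can be sketched as follows; the respondent-by-item scores are invented for illustration:

    ```python
    def cronbach_alpha(items):
        """Cronbach's alpha; `items` is a list of per-item score lists
        (one list per item, one entry per respondent)."""
        k = len(items)
        n = len(items[0])

        def var(xs):  # sample variance (n - 1 denominator)
            m = sum(xs) / len(xs)
            return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

        item_vars = sum(var(it) for it in items)
        totals = [sum(items[i][j] for i in range(k)) for j in range(n)]
        return k / (k - 1) * (1 - item_vars / var(totals))

    # 3 items scored 1-5 by 5 hypothetical respondents (rows = items)
    items = [[4, 5, 3, 5, 4],
             [4, 4, 3, 5, 4],
             [5, 5, 2, 4, 4]]
    alpha = cronbach_alpha(items)
    print(round(alpha, 3))  # → 0.847
    ```

    Alpha rises when items covary strongly relative to their individual variances; values around 0.8-0.9, as reported in the study, are conventionally read as good internal consistency.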

  8. Validating a Spanish Version of the PIMRS: Application in National and Cross-National Research on Instructional Leadership

    ERIC Educational Resources Information Center

    Fromm, Germán; Hallinger, Philip; Volante, Paulo; Wang, Wen Chung

    2017-01-01

    The purposes of this study were to report on a systematic approach to validating a Spanish version of the Principal Instructional Management Rating Scale and then to apply the scale in a cross-national comparison of principal instructional leadership. The study yielded a validated Spanish language version of the PIMRS Teacher Form and offers a…

  9. Detailed Modeling and Analysis of the CPFM Dataset

    NASA Technical Reports Server (NTRS)

    Swartz, William H.; Lloyd, Steven A.; DeMajistre, Robert

    2004-01-01

    A quantitative understanding of photolysis rate coefficients (or "j-values") is essential to determining the photochemical reaction rates that define ozone loss and other crucial processes in the atmosphere. j-Values can be calculated with radiative transfer models, derived from actinic flux observations, or inferred from trace gas measurements. The principal objective of this study is to cross-validate j-values from the Composition and Photodissociative Flux Measurement (CPFM) instrument during the Photochemistry of Ozone Loss in the Arctic Region In Summer (POLARIS) and SAGE III Ozone Loss and Validation Experiment (SOLVE) field campaigns with model calculations and other measurements, and to use this detailed analysis to improve our ability to determine j-values. Another objective is to analyze the spectral flux from the CPFM (not just the j-values) and, using a multi-wavelength/multi-species spectral fitting technique, determine atmospheric composition.

  10. A multivariate regression model for detection of fumonisins content in maize from near infrared spectra.

    PubMed

    Giacomo, Della Riccia; Stefania, Del Zotto

    2013-12-15

    Fumonisins are mycotoxins produced by Fusarium species that commonly live in maize. Whereas the fungi damage plants, fumonisins cause disease both in cattle and in human beings. Legal limits set the tolerable daily intake of fumonisins with respect to several maize-based feeds and foods. Chemical techniques assure the most reliable and accurate measurements, but they are expensive and time consuming. A method based on near infrared spectroscopy and multivariate statistical regression is described as a simpler, cheaper and faster alternative. We apply Partial Least Squares with full cross validation. Two models are described, having high correlation of calibration (0.995, 0.998) and of validation (0.908, 0.909), respectively. The description of the observed phenomenon is accurate and overfitting is avoided. Screening of contaminated maize with respect to the European legal limit of 4 mg kg(-1) should be assured. Copyright © 2013 Elsevier Ltd. All rights reserved.

  11. Modeling brook trout presence and absence from landscape variables using four different analytical methods

    USGS Publications Warehouse

    Steen, Paul J.; Passino-Reader, Dora R.; Wiley, Michael J.

    2006-01-01

    As a part of the Great Lakes Regional Aquatic Gap Analysis Project, we evaluated methodologies for modeling associations between fish species and habitat characteristics at a landscape scale. To do this, we created brook trout Salvelinus fontinalis presence and absence models based on four different techniques: multiple linear regression, logistic regression, neural networks, and classification trees. The models were tested in two ways: by application to an independent validation database and cross-validation using the training data, and by visual comparison of statewide distribution maps with historically recorded occurrences from the Michigan Fish Atlas. Although differences in the accuracy of our models were slight, the logistic regression model predicted with the least error, followed by multiple regression, then classification trees, then the neural networks. These models will provide natural resource managers a way to identify habitats requiring protection for the conservation of fish species.

  12. Comparative study between derivative spectrophotometry and multivariate calibration as analytical tools applied for the simultaneous quantitation of Amlodipine, Valsartan and Hydrochlorothiazide.

    PubMed

    Darwish, Hany W; Hassan, Said A; Salem, Maissa Y; El-Zeany, Badr A

    2013-09-01

    Four simple, accurate and specific methods were developed and validated for the simultaneous estimation of Amlodipine (AML), Valsartan (VAL) and Hydrochlorothiazide (HCT) in commercial tablets. The derivative spectrophotometric methods include the Derivative Ratio Zero Crossing (DRZC) and Double Divisor Ratio Spectra-Derivative Spectrophotometry (DDRS-DS) methods, while the multivariate calibrations used are Principal Component Regression (PCR) and Partial Least Squares (PLS). The proposed methods were applied successfully to the determination of the drugs in laboratory-prepared mixtures and in commercial pharmaceutical preparations. The validity of the proposed methods was assessed using the standard addition technique. The linearity of the proposed methods is investigated in the range of 2-32, 4-44 and 2-20 μg/mL for AML, VAL and HCT, respectively. Copyright © 2013 Elsevier B.V. All rights reserved.

  13. Non-linear Analysis of Scalp EEG by Using Bispectra: The Effect of the Reference Choice

    PubMed Central

    Chella, Federico; D'Andrea, Antea; Basti, Alessio; Pizzella, Vittorio; Marzetti, Laura

    2017-01-01

    Bispectral analysis is a signal processing technique that makes it possible to capture the non-linear and non-Gaussian properties of EEG signals. It has found various applications in EEG research and clinical practice, including the assessment of anesthetic depth, the identification of epileptic seizures, and more recently, the evaluation of non-linear cross-frequency brain functional connectivity. However, the validity and reliability of the indices drawn from bispectral analysis of EEG signals are potentially biased by the use of a non-neutral EEG reference. The present study aims at investigating the effects of the reference choice on the analysis of the non-linear features of EEG signals through bicoherence, as well as on the estimation of cross-frequency EEG connectivity through two different non-linear measures, i.e., the cross-bicoherence and the antisymmetric cross-bicoherence. To this end, four commonly used reference schemes were considered: the vertex electrode (Cz), the digitally linked mastoids, the average reference, and the Reference Electrode Standardization Technique (REST). The reference effects were assessed both in simulations and in a real EEG experiment. The simulations allowed us to investigate: (i) the effects of electrode density on the performance of the above references in the estimation of bispectral measures; and (ii) the effects of head-model accuracy on the performance of the REST. For real data, the EEG signals recorded from 10 subjects during eyes-open resting state were examined, and the distortions induced by the reference choice in the patterns of alpha-beta bicoherence, cross-bicoherence, and antisymmetric cross-bicoherence were assessed. The results showed significant differences in the findings depending on the chosen reference, with the REST outperforming all the other references in approximating the ideal neutral reference.
In conclusion, this study highlights the importance of considering the effects of the reference choice in the interpretation and comparison of the results of bispectral analysis of scalp EEG. PMID:28559790

  14. Performance improvements in temperature reconstructions of 2-D tunable diode laser absorption spectroscopy (TDLAS)

    NASA Astrophysics Data System (ADS)

    Choi, Doo-Won; Jeon, Min-Gyu; Cho, Gyeong-Rae; Kamimoto, Takahiro; Deguchi, Yoshihiro; Doh, Deog-Hee

    2016-02-01

    Performance improvements were attained in the data reconstructions of two-dimensional tunable diode laser absorption spectroscopy (TDLAS). The Multiplicative Algebraic Reconstruction Technique (MART) algorithm was adopted for data reconstruction, applied to data from an experiment measuring the temperature and concentration fields of gas flows. The measurement theory is based upon the Beer-Lambert law, and the measurement system consists of a tunable laser, collimators, detectors, and an analyzer. Methane was used as the fuel for combustion with air in a Bunsen-type burner. The data used for the reconstruction are the optical signals of eight laser beams passing through a cross-section of the methane flame. The performance of the MART algorithm in data reconstruction was validated and compared with that obtained by the Algebraic Reconstruction Technique (ART) algorithm.
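
    The ART baseline mentioned above is the classic Kaczmarz row-projection scheme; MART replaces its additive correction with a multiplicative one. The sketch below shows the additive ART update on an assumed tiny 2x2 "field" with four ray sums, standing in for the eight-beam TDLAS geometry.

```python
# Minimal sketch of additive ART (Kaczmarz): cyclically project the estimate
# onto each ray equation A[i] @ x = b[i]. Geometry here is an assumed 2x2
# pixel grid with four rays (two rows, two columns), not the 8-beam setup.
import numpy as np

def art(A, b, n_iter=50, relax=1.0):
    """Iterate the additive ART update over all rays, n_iter sweeps."""
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        for i in range(A.shape[0]):
            a = A[i]
            x += relax * (b[i] - a @ x) / (a @ a) * a
    return x

# Pixel order: [p00, p01, p10, p11]; rays sum each row, then each column
A = np.array([[1, 1, 0, 0],
              [0, 0, 1, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1]], dtype=float)
x_true = np.array([1.0, 2.0, 3.0, 4.0])   # "true" absorbance field
b = A @ x_true                            # simulated path-integrated signals
x_rec = art(A, b)
```

The multiplicative variant (MART) would instead scale `x` by `(b[i] / (a @ x)) ** (relax * a)` elementwise, which enforces non-negativity, one reason it is preferred for absorption data.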

  15. Raman spectroscopic characterization of urine of normal and cervical cancer subjects

    NASA Astrophysics Data System (ADS)

    Pappu, Raja; Prakasarao, Aruna; Dornadula, Koteeswaran; Singaravelu, Ganesan

    2017-02-01

    Cervical cancer is the fourth most common malignancy in females worldwide; current diagnostic methods include biopsy, Pap smear, and colposcopy. To overcome their drawbacks, an alternative technique is required. Optical spectroscopy is an emerging approach in which the discrimination of normal and cancer subjects provides valuable diagnostic information in oncology at an early stage. Raman peaks in the spectra suggest interesting differences in various biomolecules. In this regard, non-invasive optical detection of cervical cancer from urine samples by Raman spectroscopy combined with an LDA diagnostic algorithm yielded an accuracy of 100% for both the original and cross-validated groups. As these results are promising, the analysis should be extended to a larger number of samples to explore the changes that occur at different stages during the development of cervical cancer.

  16. Dual-Polarization Ku-Band Compact Spaceborne Antenna Based on Dual-Reflectarray Optics.

    PubMed

    Tienda, Carolina; Encinar, Jose A; Barba, Mariano; Arrebola, Manuel

    2018-04-05

    This article demonstrates an accurate analysis technique for dual-reflectarray antennas that takes into account the angle of incidence of the impinging electric field on the main reflectarray cells. The reflected field on the sub- and main-reflectarray surfaces is computed using the Method of Moments in the spectral domain, assuming local periodicity. The sub-reflectarray is divided into groups of elements, and the field radiated by each group is used to compute the incident and reflected field on the main reflectarray cells. A 50-cm demonstrator in Ku-band that provides European coverage has been designed, manufactured and tested to validate the analysis technique. The measured radiation patterns match the simulations and fulfill the coverage requirements, achieving a cross-polar discrimination better than 25 dB in the frequency range 12.975-14.25 GHz.

  17. An evaluation of Bayesian techniques for controlling model complexity and selecting inputs in a neural network for short-term load forecasting.

    PubMed

    Hippert, Henrique S; Taylor, James W

    2010-04-01

    Artificial neural networks have frequently been proposed for electricity load forecasting because of their capabilities for the nonlinear modelling of large multivariate data sets. Modelling with neural networks is not an easy task though; two of the main challenges are defining the appropriate level of model complexity, and choosing the input variables. This paper evaluates techniques for automatic neural network modelling within a Bayesian framework, as applied to six samples containing daily load and weather data for four different countries. We analyse input selection as carried out by the Bayesian 'automatic relevance determination', and the usefulness of the Bayesian 'evidence' for the selection of the best structure (in terms of number of neurones), as compared to methods based on cross-validation. Copyright 2009 Elsevier Ltd. All rights reserved.

  18. Validity of Hansen-Roach cross sections in low-enriched uranium systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Busch, R.D.; O'Dell, R.D.

    Within the nuclear criticality safety community, the Hansen-Roach 16-group cross section set has been the "standard" for use in k_eff calculations over the past 30 years. Yet even with its widespread acceptance, there are still questions about its validity and adequacy, about the proper procedure for calculating the potential scattering cross section, σ_p, for uranium and plutonium, and about the concept of resonance self-shielding and its impact on cross sections. This paper attempts to address these questions. It provides a brief background on the Hansen-Roach cross sections. Next is presented a review of resonances in cross sections, self-shielding of these resonances, and the use of σ_p to characterize resonance self-shielding. Three prescriptions for calculating σ_p are given. Finally, results of several calculations of k_eff on low-enriched uranium systems are provided to confirm the validity of the Hansen-Roach cross sections when applied to such systems.

  19. Spectrally edited 2D 13C-13C NMR spectra without diagonal ridge for characterizing 13C-enriched low-temperature carbon materials

    NASA Astrophysics Data System (ADS)

    Johnson, Robert L.; Anderson, Jason M.; Shanks, Brent H.; Fang, Xiaowen; Hong, Mei; Schmidt-Rohr, Klaus

    2013-09-01

    Two robust combinations of spectral editing techniques with 2D 13C-13C NMR have been developed for characterizing the aromatic components of 13C-enriched low-temperature carbon materials. One method (exchange with protonated and nonprotonated spectral editing, EXPANSE) selects cross peaks of protonated and nearby nonprotonated carbons, while the other technique, dipolar-dephased double-quantum/single-quantum (DQ/SQ) NMR, selects signals of bonded nonprotonated carbons. Both spectra are free of a diagonal ridge, which has many advantages: Cross peaks on the diagonal or of small intensity can be detected, and residual spinning sidebands or truncation artifacts associated with the diagonal ridge are avoided. In the DQ/SQ experiment, dipolar dephasing of the double-quantum coherence removes protonated-carbon signals; this approach also eliminates the need for high-power proton decoupling. The initial magnetization is generated with minimal fluctuation by combining direct polarization, cross polarization, and equilibration by 13C spin diffusion. The dipolar-dephased DQ/SQ spectrum shows signals from all linkages between aromatic rings, including a distinctive peak from polycondensed aromatics. In EXPANSE NMR, signals of protonated carbons are selected in the first spectral dimension by short cross polarization combined with dipolar dephasing difference. This removes ambiguities of peak assignment to overlapping signals of nonprotonated and protonated aromatic carbons, e.g. near 125 ppm. Spin diffusion is enhanced by dipolar-assisted rotational resonance. Before detection, C-H dipolar dephasing by gated decoupling is applied, which selects signals of nonprotonated carbons. Thus, only cross peaks due to magnetization originating from protonated C and ending on nearby nonprotonated C are retained. Combined with the chemical shifts deduced from the cross-peak position, this double spectral editing defines the bonding environment of aromatic, COO, and C=O carbons, which is particularly useful for identifying furan and arene rings. The C=O carbons, whose chemical shifts vary strongly (between 212 and 165 ppm) and systematically depend on their two bonding partners, show particularly informative cross peaks, given that one bonding partner is defined by the other frequency coordinate of the cross peak. The new techniques and the information content of the resulting spectra are validated on sulfuric-acid treated low-temperature carbon materials and on products of the Maillard reaction. The crucial need for spectral editing for correct peak assignment is demonstrated in an example.

  20. Efficient computational model for classification of protein localization images using Extended Threshold Adjacency Statistics and Support Vector Machines.

    PubMed

    Tahir, Muhammad; Jan, Bismillah; Hayat, Maqsood; Shah, Shakir Ullah; Amin, Muhammad

    2018-04-01

    Discriminative and informative feature extraction is the core requirement for accurate and efficient classification of protein subcellular localization images, so that drug development can be more effective. The objective of this paper is to propose a novel modification of the Threshold Adjacency Statistics (TAS) technique that enhances its discriminative power and efficiency. In this connection, we utilized seven threshold ranges to produce seven distinct feature spaces, which are then used to train seven SVMs. The final prediction is obtained through a majority voting scheme. The proposed ETAS-SubLoc system is tested on two benchmark datasets using the 5-fold cross-validation technique. We observed that our novel utilization of the TAS technique improved the discriminative power of the classifier. The ETAS-SubLoc system achieved 99.2% accuracy, 99.3% sensitivity and 99.1% specificity for the Endogenous dataset, outperforming the classical Threshold Adjacency Statistics technique. Similarly, 91.8% accuracy, 96.3% sensitivity and 91.6% specificity were achieved for the Transfected dataset. Simulation results validated the effectiveness of ETAS-SubLoc, which provides superior prediction performance compared to the existing technique. The proposed methodology aims at supporting the pharmaceutical industry as well as the research community toward better drug design and innovation in the fields of bioinformatics and computational biology. The implementation code for replicating the experiments presented in this paper is available at: https://drive.google.com/file/d/0B7IyGPObWbSqRTRMcXI2bG5CZWs/view?usp=sharing. Copyright © 2018 Elsevier B.V. All rights reserved.
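
    The ensemble structure described above (one feature space per threshold, one SVM each, majority vote) can be sketched as follows. The thresholds, the binarization used as a stand-in for the TAS features, and the synthetic data are all assumptions for illustration, not the ETAS-SubLoc pipeline itself.

```python
# Sketch of a per-threshold SVM ensemble with majority voting. Each of seven
# thresholds (illustrative values, assumption) yields its own binary feature
# space; one SVM is trained per space and predictions are combined by vote.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=10, n_informative=4,
                           random_state=1)

thresholds = [-1.0, -0.5, 0.0, 0.5, 1.0, 1.5, 2.0]   # seven illustrative ranges
classifiers = []
for t in thresholds:
    Xt = (X > t).astype(float)        # one thresholded feature space per range
    classifiers.append((t, SVC().fit(Xt, y)))

# Majority vote over the seven per-threshold SVMs (binary labels 0/1)
votes = np.stack([clf.predict((X > t).astype(float)) for t, clf in classifiers])
y_pred = (votes.sum(axis=0) >= 4).astype(int)        # at least 4 of 7 votes
train_acc = (y_pred == y).mean()
```

In the paper this ensemble is then scored with 5-fold cross-validation; here only the training fit is computed to keep the sketch short.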

  1. Cross-cultural adaptation and validation of Persian Achilles tendon Total Rupture Score.

    PubMed

    Ansari, Noureddin Nakhostin; Naghdi, Soofia; Hasanvand, Sahar; Fakhari, Zahra; Kordi, Ramin; Nilsson-Helander, Katarina

    2016-04-01

    To cross-culturally adapt the Achilles tendon Total Rupture Score (ATRS) to the Persian language and to preliminarily evaluate the reliability and validity of the Persian ATRS. A cross-sectional and prospective cohort study was conducted to translate and cross-culturally adapt the ATRS into Persian (ATRS-Persian), following the steps described in published guidelines. Thirty patients with total Achilles tendon rupture and 30 healthy subjects participated in this study. The psychometric properties tested were floor/ceiling effects (responsiveness), internal consistency reliability, test-retest reliability, standard error of measurement (SEM), smallest detectable change (SDC), construct validity, and discriminant validity. Factor analysis was performed to determine the structure of the ATRS-Persian. There were no floor or ceiling effects, indicating adequate content validity and responsiveness. Internal consistency was high (Cronbach's α = 0.95). Item-total correlations exceeded the acceptable standard of 0.3 for all items (0.58-0.95). Test-retest reliability was excellent (ICC_agreement = 0.98). The SEM and SDC were 3.57 and 9.9, respectively. Construct validity was supported by significant correlations between the ATRS-Persian total score and the Persian Foot and Ankle Outcome Score (PFAOS) total score and PFAOS subscales (r = 0.55-0.83). The ATRS-Persian significantly discriminated between patients and healthy subjects. Exploratory factor analysis revealed one component. The ATRS was cross-culturally adapted to Persian and shown to be a reliable and valid instrument for measuring functional outcomes in Persian-speaking patients with Achilles tendon rupture. Level of evidence: II.
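
    The reliability indices reported above are linked by two standard formulas: SEM = SD * sqrt(1 - ICC) and SDC = 1.96 * sqrt(2) * SEM. Plugging in the abstract's ICC of 0.98 together with an assumed sample SD of about 25.2 points (the SD itself is not stated in the abstract) reproduces its SEM of ~3.57 and SDC of ~9.9.

```python
# How SEM and SDC follow from the test-retest ICC. The sample SD (25.2) is an
# assumed value chosen to be consistent with the abstract's reported figures.
import math

def sem(sd, icc):
    """Standard error of measurement from sample SD and test-retest ICC."""
    return sd * math.sqrt(1.0 - icc)

def sdc(sem_value, z=1.96):
    """Smallest detectable change at the 95% confidence level."""
    return z * math.sqrt(2.0) * sem_value

sem_atrs = sem(sd=25.2, icc=0.98)   # ~3.56, close to the reported 3.57
sdc_atrs = sdc(sem_atrs)            # ~9.88, close to the reported 9.9
```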

  2. Impact of Cross-Axis Structural Dynamics on Validation of Linear Models for Space Launch System

    NASA Technical Reports Server (NTRS)

    Pei, Jing; Derry, Stephen D.; Zhou, Zhiqiang; Newsom, Jerry R.

    2014-01-01

    A feasibility study was performed to examine the advisability of incorporating a set of Programmed Test Inputs (PTIs) during Space Launch System (SLS) vehicle flight. The intent of these inputs is to validate the preflight models for control system stability margins, aerodynamics, and structural dynamics. In October 2009, the Ares I-X program successfully carried out a series of PTI maneuvers which provided a significant amount of valuable data for post-flight analysis. The resulting data comparisons showed excellent agreement with the preflight linear models across the frequency spectrum of interest. However, unlike Ares I-X, the structural dynamics associated with the SLS boost phase configuration are far more complex and highly coupled in all three axes, which presents a challenge when applying a similar system identification technique to SLS. Preliminary simulation results show noticeable mismatches between the PTI validation and analytical linear models in the frequency range of the structural dynamics. An alternative approach was examined that demonstrates the potential for better overall characterization of the system frequency response as well as robustness of the control design.

  3. Classification based upon gene expression data: bias and precision of error rates.

    PubMed

    Wood, Ian A; Visscher, Peter M; Mengersen, Kerrie L

    2007-06-01

    Gene expression data offer a large number of potentially useful predictors for the classification of tissue samples into classes, such as diseased and non-diseased. The predictive error rate of classifiers can be estimated using methods such as cross-validation. We have investigated issues of interpretation and potential bias in the reporting of error rate estimates. The issues considered here are optimization and selection biases, sampling effects, measures of misclassification rate, baseline error rates, two-level external cross-validation and a novel proposal for detection of bias using the permutation mean. Reporting an optimal estimated error rate incurs an optimization bias. Downward bias of 3-5% was found in an existing study of classification based on gene expression data and may be endemic in similar studies. Using a simulated non-informative dataset and two example datasets from existing studies, we show how bias can be detected through the use of label permutations and avoided using two-level external cross-validation. Some studies avoid optimization bias by using single-level cross-validation and a test set, but error rates can be more accurately estimated via two-level cross-validation. In addition to estimating the simple overall error rate, we recommend reporting class error rates plus where possible the conditional risk incorporating prior class probabilities and a misclassification cost matrix. We also describe baseline error rates derived from three trivial classifiers which ignore the predictors. R code which implements two-level external cross-validation with the PAMR package, experiment code, dataset details and additional figures are freely available for non-commercial use from http://www.maths.qut.edu.au/profiles/wood/permr.jsp
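
    The two-level external cross-validation and the permutation check recommended above can be sketched together: an inner loop tunes a parameter, an outer loop estimates error on held-out folds, and repeating the procedure with permuted labels should return a chance-level error if the estimate is unbiased. The data, classifier, and parameter grid below are assumptions for illustration.

```python
# Sketch of two-level (nested) external cross-validation plus a label
# permutation as a bias check. Synthetic data stand in for gene expression.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=20, n_informative=5,
                           random_state=0)

# Level 1 (inner): parameter tuning inside each training fold
inner = GridSearchCV(SVC(), {"C": [0.1, 1, 10]}, cv=3)
# Level 2 (outer): error estimated only on data never seen during tuning
acc = cross_val_score(inner, X, y, cv=5).mean()

# Permutation check: with shuffled labels the same procedure should land
# near the 0.5 baseline if no optimization bias leaks into the estimate
rng = np.random.default_rng(0)
acc_perm = cross_val_score(inner, X, rng.permutation(y), cv=5).mean()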

  4. Agility performance in high-level junior basketball players: the predictive value of anthropometrics and power qualities.

    PubMed

    Sisic, Nedim; Jelicic, Mario; Pehar, Miran; Spasic, Miodrag; Sekulic, Damir

    2016-01-01

    In basketball, anthropometric status is an important factor in identifying and selecting talent, while agility is one of the most vital motor performances. The aim of this investigation was to evaluate the influence of anthropometric variables and power capacities on different preplanned agility performances. The participants were 92 high-level, junior-age basketball players (16-17 years of age; body height 187.6±8.72 cm, body mass 78.40±12.26 kg), randomly divided into validation and cross-validation subsamples. The predictor set consisted of 16 anthropometric variables and three tests of power capacities (Sargent jump, broad jump and medicine-ball throw). The criteria were three tests of agility: a T-shape test, a zig-zag test, and a test of running with a 180-degree turn (T180). Forward stepwise multiple regressions were calculated for the validation subsample and then cross-validated. Cross-validation included correlations between observed and predicted scores, dependent-samples t-tests between predicted and observed scores, and Bland-Altman plots. Analysis of variance identified centres as superior in most of the anthropometric indices and in the medicine-ball throw (all at P<0.05), with no significant between-position differences for the other studied motor performances. Multiple regression models originally calculated for the validation subsample were then cross-validated and confirmed for the zig-zag test (R of 0.71 and 0.72 for the validation and cross-validation subsamples, respectively). Anthropometrics were not strongly related to agility performance, but leg length was found to be negatively associated with performance in basketball-specific agility. Power capacities were confirmed to be an important factor in agility. The results highlight the importance of sport-specific tests when studying preplanned agility performance in basketball. Improvement in power capacities will probably result in improved agility in basketball athletes, while anthropometric indices should be used to identify athletes who can achieve superior agility performance.

  5. Validation of geometric measurements of the left atrium and pulmonary veins for analysis of reverse structural remodeling following ablation therapy

    NASA Astrophysics Data System (ADS)

    Rettmann, M. E.; Holmes, D. R., III; Gunawan, M. S.; Ge, X.; Karwoski, R. A.; Breen, J. F.; Packer, D. L.; Robb, R. A.

    2012-03-01

    Geometric analysis of the left atrium and pulmonary veins is important for studying reverse structural remodeling following cardiac ablation therapy. It has been shown that the left atrium decreases in volume and the pulmonary vein ostia decrease in diameter following ablation therapy. Most analysis techniques, however, require laborious manual tracing of image cross-sections. Pulmonary vein diameters are typically measured at the junction between the left atrium and pulmonary veins, called the pulmonary vein ostia, with manually drawn lines on volume renderings or on image cross-sections. In this work, we describe a technique for making semi-automatic measurements of the left atrium and pulmonary vein ostial diameters from high resolution CT scans and multi-phase datasets. The left atrium and pulmonary veins are segmented from a CT volume using a 3D volume approach and cut planes are interactively positioned to separate the pulmonary veins from the body of the left atrium. The cut plane is also used to compute the pulmonary vein ostial diameter. Validation experiments are presented which demonstrate the ability to repeatedly measure left atrial volume and pulmonary vein diameters from high resolution CT scans, as well as the feasibility of this approach for analyzing dynamic, multi-phase datasets. In the high resolution CT scans the left atrial volume measurements show high repeatability with approximately 4% intra-rater repeatability and 8% inter-rater repeatability. Intra- and inter-rater repeatability for pulmonary vein diameter measurements range from approximately 2 to 4 mm. For the multi-phase CT datasets, differences in left atrial volumes between a standard slice-by-slice approach and the proposed 3D volume approach are small, with percent differences on the order of 3% to 6%.

  6. Determination of the geographic origin of onions between three main production areas in Japan and other countries by mineral composition.

    PubMed

    Ariyama, Kaoru; Aoyama, Yoshinori; Mochizuki, Akashi; Homura, Yuji; Kadokura, Masashi; Yasui, Akemi

    2007-01-24

    Onions (Allium cepa L.) are produced in many countries and are one of the most popular vegetables in the world, leading to an enormous amount of international trade. It is therefore important that a scientific technique be developed for determining geographic origin as a means to detect fraudulent labeling. We have developed such a technique based on mineral analysis and linear discriminant analysis (LDA). The onion samples used in this study were from Hokkaido, Hyogo, and Saga, the primary onion-growing areas in Japan, and from countries that export onions to Japan (China, the United States, New Zealand, Thailand, Australia, and Chile). Of 309 samples, 108 were from Hokkaido, 52 were from Saga, 77 were from Hyogo, and 72 were from abroad. Fourteen elements (Na, Mg, P, Mn, Co, Ni, Cu, Zn, Rb, Sr, Mo, Cd, Cs, and Ba) in the samples were determined by flame atomic absorption spectrometry, inductively coupled plasma optical emission spectrometry, and inductively coupled plasma mass spectrometry. The models established by LDA were used to discriminate the geographic origin between Hokkaido and abroad, Hyogo and abroad, and Saga and abroad. Ten-fold cross-validations were conducted using these models. The discrimination accuracies obtained by cross-validation between Hokkaido and abroad were 100 and 86%, respectively. Those between Hyogo and abroad were 100 and 90%, respectively. Those between Saga and abroad were 98 and 90%, respectively. In addition, it was demonstrated that the elemental fingerprint a crop receives from a specific production area did not change easily with variations in fertilization, crop year, variety, soil type, and production year, provided appropriate elements were chosen.
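
    The origin-discrimination workflow above (LDA on element concentrations, scored by 10-fold cross-validation) can be sketched as follows. The two synthetic "origins" with shifted 14-element profiles are assumptions standing in for the real mineral data, so the accuracy is illustrative only.

```python
# Sketch: LDA on 14 "element concentrations" with 10-fold cross-validation.
# The two groups' mineral profiles are synthetic (assumption), mimicking a
# domestic-vs-imported contrast rather than the actual onion measurements.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n, n_elements = 80, 14
domestic = rng.normal(loc=1.0, scale=0.3, size=(n, n_elements))
imported = rng.normal(loc=1.4, scale=0.3, size=(n, n_elements))
X = np.vstack([domestic, imported])
y = np.array([0] * n + [1] * n)     # 0 = domestic, 1 = imported

lda = LinearDiscriminantAnalysis()
cv_acc = cross_val_score(lda, X, y, cv=10, scoring="accuracy").mean()
```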

  7. Identification of DNA-Binding Proteins Using Mixed Feature Representation Methods.

    PubMed

    Qu, Kaiyang; Han, Ke; Wu, Song; Wang, Guohua; Wei, Leyi

    2017-09-22

    DNA-binding proteins play vital roles in cellular processes, such as DNA packaging, replication, transcription, regulation, and other DNA-associated activities. The current main prediction approach is based on machine learning, and its accuracy mainly depends on the feature extraction method. Therefore, using an efficient feature representation method is important to enhance classification accuracy. However, existing feature representation methods cannot efficiently distinguish DNA-binding proteins from non-DNA-binding proteins. In this paper, a multi-feature representation method, which combines three feature representation methods, namely, K-Skip-N-Grams, Information theory, and Sequential and structural features (SSF), is used to represent the protein sequences and improve feature representation ability. The classifier is a support vector machine. The mixed-feature representation method is evaluated using 10-fold cross-validation and a test set. Feature vectors obtained from the combination of all three feature extractions show the best performance in 10-fold cross-validation, both without dimensionality reduction and with dimensionality reduction by max-relevance-max-distance. Moreover, the reduced mixed-feature method performs better than the non-reduced mixed-feature technique. The feature vectors combining SSF and K-Skip-N-Grams show the best performance on the test set. Among these methods, mixed features exhibit superiority over the single features.

  8. Animal models of human anxiety disorders: reappraisal from a developmental psychopathology vantage point.

    PubMed

    Lampis, Valentina; Maziade, Michel; Battaglia, Marco

    2011-05-01

    We are witnessing a tremendous expansion of strategies and techniques that derive from basic and preclinical science to study the fine genetic, epigenetic, and proteomic regulation of behavior in the laboratory animal. In this endeavor, animal models of psychiatric illness are becoming the almost exclusive domain of basic researchers, with lesser involvement of clinician researchers in their conceptual design and in the transfer of new paradigms into practice. From the side of human behavioral research, the growing interest in gene-environment interplay and the fostering of valid endophenotypes are among the few substantial innovations in the effort of linking common mental disorders to cutting-edge clinical research questions. We argue that it is time for cross-fertilization between these camps. In this article, we a) observe that the "translational divide" can, and should, be crossed by having investigators from both the basic and the clinical sides work together on simpler, valid "endophenotypes" of neurodevelopmental relevance; b) emphasize the importance of unambiguous physiological readouts, more than behavioral equivalents of human symptoms/syndromes, for animal research; and c) indicate and discuss how this could be fostered and implemented in a developmental framework of reference for some common anxiety disorders and ultimately lead to better animal models of human mental disorders.

  9. RBF kernel based support vector regression to estimate the blood volume and heart rate responses during hemodialysis.

    PubMed

    Javed, Faizan; Chan, Gregory S H; Savkin, Andrey V; Middleton, Paul M; Malouf, Philip; Steel, Elizabeth; Mackie, James; Lovell, Nigel H

    2009-01-01

    This paper uses non-linear support vector regression (SVR) to model the blood volume and heart rate (HR) responses in 9 hemodynamically stable kidney failure patients during hemodialysis. Using radial basis function (RBF) kernels, non-parametric models of the relative blood volume (RBV) change with time, as well as the percentage change in HR with respect to RBV, were obtained. The ε-insensitive loss function was used for SVR modeling. Selection of the design parameters, which include the capacity (C), the insensitivity region (ε) and the RBF kernel parameter (sigma), was made using a grid search approach, and the selected models were cross-validated using the average mean square error (AMSE) calculated from testing data based on a k-fold cross-validation technique. Linear regression was also applied to fit the curves, and the AMSE was calculated for comparison with SVR. For the model of RBV with time, SVR gave a lower AMSE for both training (AMSE=1.5) and testing data (AMSE=1.4) compared to linear regression (AMSE=1.8 and 1.5). SVR also provided a better fit for HR with RBV for both training and testing data (AMSE=15.8 and 16.4) compared to linear regression (AMSE=25.2 and 20.1).
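
    The model-selection scheme described above, ε-insensitive RBF SVR with its parameters chosen by grid search and scored by k-fold cross-validated MSE, can be sketched as follows. The curve is a synthetic RBV-versus-time stand-in (assumption), not patient data; scikit-learn's `gamma` plays the role of the kernel-width parameter.

```python
# Sketch: RBF-kernel SVR with C, epsilon and gamma chosen by grid search,
# scored by 5-fold cross-validated MSE. The "RBV vs time" curve is synthetic.
import numpy as np
from sklearn.model_selection import GridSearchCV, KFold
from sklearn.svm import SVR

rng = np.random.default_rng(0)
t = np.linspace(0, 4, 120)[:, None]                  # "time on dialysis"
rbv = 100 - 3.0 * t.ravel() + 0.3 * t.ravel() ** 2 + rng.normal(0, 0.2, 120)

grid = {"C": [1, 10, 100], "epsilon": [0.01, 0.1], "gamma": [0.1, 1.0]}
search = GridSearchCV(SVR(kernel="rbf"), grid,
                      cv=KFold(n_splits=5, shuffle=True, random_state=0),
                      scoring="neg_mean_squared_error")
search.fit(t, rbv)
amse = -search.best_score_     # average MSE over the held-out folds
```

Comparing `amse` against the same score for a plain linear fit mirrors the SVR-versus-linear-regression comparison reported in the abstract.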

  10. Computer-aided diagnosis system: a Bayesian hybrid classification method.

    PubMed

    Calle-Alonso, F; Pérez, C J; Arias-Nicolás, J P; Martín, J

    2013-10-01

    A novel method to classify multi-class biomedical objects is presented. The method is based on a hybrid approach which combines pairwise comparison, Bayesian regression and the k-nearest neighbor technique. It can be applied in a fully automatic way or in a relevance feedback framework. In the latter case, the information obtained from both an expert and the automatic classification is iteratively used to improve the results until a certain accuracy level is achieved; then the learning process is finished and new classifications can be performed automatically. The method has been applied in two biomedical contexts, following the same cross-validation schemes as in the original studies. The first one refers to cancer diagnosis, leading to an accuracy of 77.35% versus the 66.37% originally obtained. The second one considers the diagnosis of pathologies of the vertebral column, where the original method achieves accuracies ranging from 76.5% to 96.7%, and from 82.3% to 97.1%, in two different cross-validation schemes. Even with no supervision, the proposed method reaches 96.71% and 97.32% in these two cases. With a supervised framework, the achieved accuracy is 97.74%. Furthermore, all abnormal cases were correctly classified. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  11. Simultaneous and Continuous Estimation of Shoulder and Elbow Kinematics from Surface EMG Signals

    PubMed Central

    Zhang, Qin; Liu, Runfeng; Chen, Wenbin; Xiong, Caihua

    2017-01-01

    In this paper, we present a simultaneous and continuous kinematics estimation method for multiple DoFs across the shoulder and elbow joints. Although simultaneous and continuous kinematics estimation from surface electromyography (EMG) is a feasible way to achieve natural and intuitive human-machine interaction, few works have investigated multi-DoF estimation across the significant joints of the upper limb, the shoulder and elbow. This paper evaluates the feasibility of estimating 4-DoF kinematics at the shoulder and elbow during coordinated arm movements. Considering the potential applications of this method in exoskeletons, prosthetics and other arm rehabilitation techniques, the estimation performance is presented with different muscle activity decomposition and learning strategies. Principal component analysis (PCA) and independent component analysis (ICA) are respectively employed for EMG mode decomposition, with an artificial neural network (ANN) for learning the electromechanical association. Four joint angles across the shoulder and elbow are simultaneously and continuously estimated from EMG in four coordinated arm movements. Using ICA (PCA) and a single ANN, an average estimation accuracy of 91.12% (90.23%) is obtained in 70-s intra-cross validation and 87.00% (86.30%) is obtained in 2-min inter-cross validation. This result suggests that it is feasible and effective to use ICA (PCA) with a single ANN for multi-joint kinematics estimation under varying application conditions. PMID:28611573
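
    The decode pipeline above (component decomposition of multi-channel EMG, then a single ANN mapping components to several joint angles at once) can be sketched with the PCA variant. All signals below are synthetic assumptions: two latent "muscle drives" mixed into eight channels, and four angle trajectories; the fit quality is illustrative, not the paper's accuracy metric.

```python
# Sketch: PCA compresses multi-channel "EMG", and one ANN regresses several
# joint angles simultaneously (multi-output). Signals are synthetic.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 400)
latent = np.stack([np.sin(t), np.cos(t)], axis=1)              # 2 hidden drives
emg = latent @ rng.normal(size=(2, 8)) + rng.normal(0, 0.05, (400, 8))
angles = np.stack([np.sin(t), np.cos(t),
                   np.sin(2 * t), np.cos(2 * t)], axis=1)      # 4 "DoF" targets

components = PCA(n_components=2).fit_transform(emg)            # decomposition
ann = MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000, random_state=0)
ann.fit(components, angles)                                    # single ANN, 4 outputs
r2 = ann.score(components, angles)                             # training fit only
```

Replacing `PCA` with `FastICA` gives the ICA variant compared in the paper; the rest of the pipeline is unchanged.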

  12. A Simple Endoscopic Technique for Measuring the Cross-Sectional Area of the Upper Airway in a Rabbit Model.

    PubMed

    Wistermayer, Paul R; McIlwain, Wesley R; Ieronimakis, Nicholas; Rogers, Derek J

    2018-04-01

    Validate an accurate and reproducible method of measuring the cross-sectional area (CSA) of the upper airway. This is a prospective animal study done at a tertiary care medical treatment facility. Control images were obtained using endotracheal tubes of varying sizes. In vivo images were obtained from various timepoints of a concurrent study on subglottic stenosis. Using a 0° rod telescope, an instrument was placed at the level of interest, and a photo was obtained. Three independent and blinded raters then measured the CSA of the narrowest portion of the airway using open-source image analysis software. Each blinded rater measured the CSA of 79 photos. t testing to assess for accuracy showed no difference between measured and known CSAs of the control images (P = .86), with an average error of 1.5% (SD = 5.5%). All intraclass correlation (ICC) values for intrarater agreement showed excellent agreement (ICC > .75). Interrater reliability among all raters in control (ICC = .975; 95% CI, .817-.995) and in vivo (ICC = .846; 95% CI, .780-.896) images showed excellent agreement. We validate a simple, accurate, and reproducible method of measuring the CSA of the airway that can be used in a clinical or research setting.

  13. Computational Depth of Anesthesia via Multiple Vital Signs Based on Artificial Neural Networks.

    PubMed

    Sadrawi, Muammar; Fan, Shou-Zen; Abbod, Maysam F; Jen, Kuo-Kuang; Shieh, Jiann-Shing

    2015-01-01

    This study evaluated a depth of anesthesia (DoA) index using artificial neural networks (ANN) as the modeling technique. Data from 63 patients in total were used: 17 for modeling and 46 for testing. Empirical mode decomposition (EMD) is utilized to separate the electroencephalography (EEG) signal from noise. A sample entropy index is then extracted from the filtered EEG signal for every 5-second segment. It is combined with the mean values of other vital signs, that is, electromyography (EMG), heart rate (HR), pulse, systolic blood pressure (SBP), diastolic blood pressure (DBP), and signal quality index (SQI), as inputs for evaluating the DoA index. The scores of 5 doctors are averaged to obtain the output index. The mean absolute error (MAE) is utilized for performance evaluation, and 10-fold cross-validation is performed to assess how well the model generalizes. The ANN model is compared with the bispectral index (BIS). The results show that the ANN produces a lower MAE than BIS, and also a higher correlation coefficient than BIS on the 46-patient testing data. A sensitivity analysis combined with cross-validation further indicates that EMG is the most influential input parameter.
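
The validation step described above (MAE under 10-fold cross-validation of a neural-network regressor) can be sketched generically. The seven features and the averaged expert score below are synthetic stand-ins for the vital signs and the DoA index:

```python
# 10-fold cross-validated mean absolute error of a small neural network,
# the scoring setup described in the record (synthetic data).
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 7))     # e.g. entropy, EMG, HR, pulse, SBP, DBP, SQI
y = X @ rng.normal(size=7) + 0.1 * rng.normal(size=300)  # surrogate DoA index

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
mae = -cross_val_score(model, X, y, cv=10,
                       scoring="neg_mean_absolute_error").mean()
print(f"10-fold CV mean absolute error: {mae:.3f}")
```

scikit-learn maximizes scores, hence the negated-MAE scorer and the sign flip.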

  14. Computational Depth of Anesthesia via Multiple Vital Signs Based on Artificial Neural Networks

    PubMed Central

    Sadrawi, Muammar; Fan, Shou-Zen; Abbod, Maysam F.; Jen, Kuo-Kuang; Shieh, Jiann-Shing

    2015-01-01

    This study evaluated a depth of anesthesia (DoA) index using artificial neural networks (ANN) as the modeling technique. Data from 63 patients in total were used: 17 for modeling and 46 for testing. Empirical mode decomposition (EMD) is utilized to separate the electroencephalography (EEG) signal from noise. A sample entropy index is then extracted from the filtered EEG signal for every 5-second segment. It is combined with the mean values of other vital signs, that is, electromyography (EMG), heart rate (HR), pulse, systolic blood pressure (SBP), diastolic blood pressure (DBP), and signal quality index (SQI), as inputs for evaluating the DoA index. The scores of 5 doctors are averaged to obtain the output index. The mean absolute error (MAE) is utilized for performance evaluation, and 10-fold cross-validation is performed to assess how well the model generalizes. The ANN model is compared with the bispectral index (BIS). The results show that the ANN produces a lower MAE than BIS, and also a higher correlation coefficient than BIS on the 46-patient testing data. A sensitivity analysis combined with cross-validation further indicates that EMG is the most influential input parameter. PMID:26568957

  15. Measurement of process variables in solid-state fermentation of wheat straw using FT-NIR spectroscopy and synergy interval PLS algorithm

    NASA Astrophysics Data System (ADS)

    Jiang, Hui; Liu, Guohai; Mei, Congli; Yu, Shuang; Xiao, Xiahong; Ding, Yuhan

    2012-11-01

    The feasibility of rapid determination of the process variables (i.e. pH and moisture content) in solid-state fermentation (SSF) of wheat straw using Fourier transform near infrared (FT-NIR) spectroscopy was studied. The synergy interval partial least squares (siPLS) algorithm was implemented to calibrate the regression model. The number of PLS factors and the number of subintervals were optimized simultaneously by cross-validation. The performance of the prediction model was evaluated according to the root mean square error of cross-validation (RMSECV), the root mean square error of prediction (RMSEP) and the correlation coefficient (R). The measurement results of the optimal model were as follows: RMSECV = 0.0776, Rc = 0.9777, RMSEP = 0.0963, and Rp = 0.9686 for the pH model; RMSECV = 1.3544% w/w, Rc = 0.8871, RMSEP = 1.4946% w/w, and Rp = 0.8684 for the moisture content model. Finally, compared with classic PLS and iPLS models, the siPLS model revealed superior performance. The overall results demonstrate that FT-NIR spectroscopy combined with the siPLS algorithm can be used to measure process variables in solid-state fermentation of wheat straw, and the NIR spectroscopy technique has potential to be utilized in the SSF industry.

  16. In-line monitoring of extraction process of scutellarein from Erigeron breviscapus (vant.) Hand-Mazz based on qualitative and quantitative uses of near-infrared spectroscopy.

    PubMed

    Wu, Yongjiang; Jin, Ye; Ding, Haiying; Luan, Lianjun; Chen, Yong; Liu, Xuesong

    2011-09-01

    The application of near-infrared (NIR) spectroscopy for in-line monitoring of extraction process of scutellarein from Erigeron breviscapus (vant.) Hand-Mazz was investigated. For NIR measurements, two fiber optic probes designed to transmit NIR radiation through a 2 mm pathlength flow cell were utilized to collect spectra in real-time. High performance liquid chromatography (HPLC) was used as a reference method to determine scutellarein in extract solution. Partial least squares regression (PLSR) calibration model of Savitzky-Golay smoothing NIR spectra in the 5450-10,000 cm(-1) region gave satisfactory predictive results for scutellarein. The results showed that the correlation coefficients of calibration and cross validation were 0.9967 and 0.9811, respectively, and the root mean square error of calibration and cross validation were 0.044 and 0.105, respectively. Furthermore, both the moving block standard deviation (MBSD) method and conformity test were used to identify the end point of extraction process, providing real-time data and instant feedback about the extraction course. The results obtained in this study indicated that the NIR spectroscopy technique provides an efficient and environmentally friendly approach for fast determination of scutellarein and end point control of extraction process. Copyright © 2011 Elsevier B.V. All rights reserved.

  17. Multivariate Adaptive Regression Splines (Preprint)

    DTIC Science & Technology

    1990-08-01

    fold cross-validation would take about ten times as long, and MARS is not all that fast to begin with. Friedman has a number of examples showing... standardized mean squared error of prediction (MSEP), the generalized cross validation (GCV), and the number of selected terms (TERMS). In accordance with... and mi = 10 case were almost exclusively spurious cross-product terms and terms involving the nuisance variables x6 through x10. This large number of
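
The GCV criterion that MARS uses as a cheap surrogate for cross-validation inflates the residual mean square by an effective model-complexity penalty. A plain-Python sketch (the numbers are illustrative, not from the report):

```python
# Generalized cross-validation: GCV = MSE / (1 - C(M)/n)^2, where C(M) is
# the effective number of parameters of an M-term model and n the sample size.
import numpy as np

def gcv(y, y_hat, complexity):
    """GCV score for a fitted model with effective complexity C(M)."""
    y, y_hat = np.asarray(y), np.asarray(y_hat)
    n = len(y)
    mse = np.mean((y - y_hat) ** 2)
    return mse / (1.0 - complexity / n) ** 2

y = np.array([1.0, 2.0, 3.0, 4.0])
y_hat = np.array([1.1, 1.9, 3.2, 3.8])
score = gcv(y, y_hat, complexity=2)   # penalize a 2-term model -> 0.1
```

Larger models must reduce the raw MSE enough to overcome the shrinking denominator, which is how GCV approximates the effect of held-out validation without refitting.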

  18. Cross section of α-induced reactions on iridium isotopes obtained from thick target yield measurement for the astrophysical γ process

    NASA Astrophysics Data System (ADS)

    Szücs, T.; Kiss, G. G.; Gyürky, Gy.; Halász, Z.; Fülöp, Zs.; Rauscher, T.

    2018-01-01

    The stellar reaction rates of radiative α-capture reactions on heavy isotopes are of crucial importance for the γ process network calculations. These rates are usually derived from statistical model calculations, which need to be validated, but the experimental database is very scarce. This paper presents the results of α-induced reaction cross section measurements on iridium isotopes, carried out for the first time close to the astrophysically relevant energy region. Thick target yields of the 191Ir(α,γ)195Au, 191Ir(α,n)194Au, 193Ir(α,n)196mAu, 193Ir(α,n)196Au reactions have been measured with the activation technique between Eα = 13.4 MeV and 17 MeV. For the first time the thick target yield was determined with X-ray counting, which led to unprecedented sensitivity. From the measured thick target yields, reaction cross sections are derived and compared with statistical model calculations. The recently suggested energy-dependent modification of the α + nucleus optical potential gives a good description of the experimental data.

  19. Processing methods for photoacoustic Doppler flowmetry with a clinical ultrasound scanner

    NASA Astrophysics Data System (ADS)

    Bücking, Thore M.; van den Berg, Pim J.; Balabani, Stavroula; Steenbergen, Wiendelt; Beard, Paul C.; Brunker, Joanna

    2018-02-01

    Photoacoustic flowmetry (PAF) based on time-domain cross correlation of photoacoustic signals is a promising technique for deep tissue measurement of blood flow velocity. Signal processing has previously been developed for single element transducers. Here, the processing methods for acoustic resolution PAF using a clinical ultrasound transducer array are developed and validated using a 64-element transducer array with a -6 dB detection band of 11 to 17 MHz. Measurements were performed on a flow phantom consisting of a tube (580 μm inner diameter) perfused with human blood flowing at physiological speeds ranging from 3 to 25 mm/s. The processing pipeline comprised: image reconstruction, filtering, displacement detection, and masking. High-pass filtering and background subtraction were found to be key preprocessing steps to enable accurate flow velocity estimates, which were calculated using a cross-correlation based method. In addition, the regions of interest in the calculated velocity maps were defined using a masking approach based on the amplitude of the cross-correlation functions. These developments enabled blood flow measurements using a transducer array, bringing PAF one step closer to clinical applicability.
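
The core of time-domain cross-correlation flowmetry can be sketched in 1-D: the displacement of a scattering pattern between two successive signals is the lag that maximizes their cross-correlation, and velocity is displacement over inter-frame time. The signals, sample spacing and frame interval below are synthetic, illustrative values:

```python
# Recover a known displacement from the cross-correlation peak of two
# successive synthetic "A-lines", then convert it to a flow velocity.
import numpy as np

rng = np.random.default_rng(3)
signal = rng.normal(size=500)             # photoacoustic signal at time t
shift_samples = 7                         # true inter-frame displacement
later = np.roll(signal, shift_samples)    # signal at time t + dt

corr = np.correlate(later, signal, mode="full")
lag = np.argmax(corr) - (len(signal) - 1)  # lag of the correlation peak

dx = 20e-6                                # assumed 20 µm per sample
dt = 0.01                                 # assumed 10 ms between frames
velocity = lag * dx / dt                  # 14 mm/s for these numbers
print(f"displacement: {lag} samples, velocity: {velocity * 1e3:.1f} mm/s")
```

The resulting 14 mm/s sits inside the 3-25 mm/s physiological range probed in the phantom study.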

  20. Genomic selection across multiple breeding cycles in applied bread wheat breeding.

    PubMed

    Michel, Sebastian; Ametz, Christian; Gungor, Huseyin; Epure, Doru; Grausgruber, Heinrich; Löschenberger, Franziska; Buerstmayr, Hermann

    2016-06-01

    We evaluated genomic selection across five breeding cycles of bread wheat breeding. Bias of within-cycle cross-validation and methods for improving the prediction accuracy were assessed. The prospect of genomic selection has been frequently shown by cross-validation studies using the same genetic material across multiple environments, but studies investigating genomic selection across multiple breeding cycles in applied bread wheat breeding are lacking. We estimated the prediction accuracy of grain yield, protein content and protein yield of 659 inbred lines across five independent breeding cycles and assessed the bias of within-cycle cross-validation. We investigated the influence of outliers on the prediction accuracy and predicted protein yield by its components traits. A high average heritability was estimated for protein content, followed by grain yield and protein yield. The bias of the prediction accuracy using populations from individual cycles using fivefold cross-validation was accordingly substantial for protein yield (17-712 %) and less pronounced for protein content (8-86 %). Cross-validation using the cycles as folds aimed to avoid this bias and reached a maximum prediction accuracy of [Formula: see text] = 0.51 for protein content, [Formula: see text] = 0.38 for grain yield and [Formula: see text] = 0.16 for protein yield. Dropping outlier cycles increased the prediction accuracy of grain yield to [Formula: see text] = 0.41 as estimated by cross-validation, while dropping outlier environments did not have a significant effect on the prediction accuracy. Independent validation suggests, on the other hand, that careful consideration is necessary before an outlier correction is undertaken, which removes lines from the training population. Predicting protein yield by multiplying genomic estimated breeding values of grain yield and protein content raised the prediction accuracy to [Formula: see text] = 0.19 for this derived trait.
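
"Cross-validation using the cycles as folds" corresponds to leave-one-group-out validation: every line of a held-out cycle is predicted by a model trained only on the other cycles, which avoids the within-cycle bias described above. A sketch with synthetic marker data, using ridge regression as a stand-in genomic prediction model:

```python
# Leave-one-cycle-out cross-validation: each of 5 "breeding cycles" is held
# out in turn (synthetic marker matrix and phenotype).
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(4)
X = rng.binomial(2, 0.5, size=(250, 100)).astype(float)  # marker genotypes
y = X @ (0.1 * rng.normal(size=100)) + rng.normal(size=250)  # phenotype
cycle = np.repeat(np.arange(5), 50)                      # 5 breeding cycles

scores = cross_val_score(Ridge(alpha=10.0), X, y,
                         groups=cycle, cv=LeaveOneGroupOut())
print(f"per-cycle predictive ability (R^2): {np.round(scores, 2)}")
```

With random fivefold splits instead of `LeaveOneGroupOut`, lines from the same cycle would appear in both training and test sets, inflating the apparent accuracy.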

  1. Progress of a Cross-correlation Based Optical Strain Measurement Technique for Detecting Radial Growth on a Rotating Disk

    NASA Technical Reports Server (NTRS)

    Clem, Michelle M.; Woike, Mark; Abdul-Aziz, Ali

    2013-01-01

    The Aeronautical Sciences Project under NASA's Fundamental Aeronautics Program is extremely interested in the development of fault detection technologies, such as optical surface measurements in the internal parts of a flow path, for in situ health monitoring of gas turbine engines. In situ health monitoring has the potential to detect flaws, i.e. cracks in key components, such as engine turbine disks, before the flaws lead to catastrophic failure. In the present study, a cross-correlation imaging technique is investigated in a proof-of-concept study as a possible optical technique to measure the radial growth and strain field on an already cracked sub-scale turbine engine disk under loaded conditions in the NASA Glenn Research Center's High Precision Rotordynamics Laboratory. The optical strain measurement technique under investigation offers potential fault detection using an applied background consisting of a high-contrast random speckle pattern and imaging the background under unloaded and loaded conditions with a CCD camera. Spinning the cracked disk at high speeds induces an external load, resulting in a radial growth of the disk of approximately 50.8-µm in the flawed region and hence, a localized strain field. Relative to images of the disk under static conditions, the background in the flawed region will appear shifted under rotation. The resulting background displacements between the two images will then be measured using the two-dimensional cross-correlation algorithms implemented in standard Particle Image Velocimetry (PIV) software to track the disk growth, which facilitates calculation of the localized strain field. In order to develop and validate this optical strain measurement technique an initial proof-of-concept experiment is carried out in a controlled environment. Using PIV optimization principles and guidelines, three potential backgrounds, for future use on the rotating disk, are developed and investigated in the controlled experiment. A range of known shifts is induced on the backgrounds; reference and data images are acquired before and after the induced shift, respectively, and the images are processed using the cross-correlation algorithms in order to determine the background displacements. The effectiveness of each background at resolving the known shift is evaluated and discussed in order to choose the most suitable background to be implemented onto a rotating disk in the Rotordynamics Lab. Although testing on the rotating disk has not yet been performed, the driving principles behind the development of the present optical technique are based upon critical aspects of the future experiment, such as the amount of expected radial growth, disk analysis, and experimental design, and are therefore addressed in the paper.
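
The displacement-detection core of this PIV-style processing can be sketched with a 2-D cross-correlation: a known shift applied to a synthetic speckle background is recovered from the location of the correlation peak (plain NumPy/SciPy, not the actual PIV software):

```python
# Recover a known 2-D shift of a random speckle pattern from the peak of the
# cross-correlation between reference and shifted images.
import numpy as np
from scipy.signal import fftconvolve

rng = np.random.default_rng(5)
reference = rng.random((64, 64))
reference -= reference.mean()                       # zero-mean speckle pattern
shifted = np.roll(reference, (3, 5), axis=(0, 1))   # "loaded" image, known shift

# Cross-correlation computed as FFT convolution with a flipped template.
corr = fftconvolve(shifted, reference[::-1, ::-1], mode="full")
peak = np.unravel_index(np.argmax(corr), corr.shape)
dy, dx = peak[0] - 63, peak[1] - 63                 # displacement of the peak
print(f"recovered shift: ({dy}, {dx})")
```

Production PIV codes do this per interrogation window and add sub-pixel peak fitting; the principle of locating the correlation maximum is the same.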

  2. [Assessment of knowledge in the use of metered dose inhalers in parents of asthmatic school children].

    PubMed

    Aquino-Pérez, Dulce María; Peña-Cadena, Daniel; Trujillo-García, José Ubaldo; Jiménez-Sandoval, Jaime Omar; Machorro-Muñoz, Olga Stephanie

    2013-01-01

    The use of the metered dose inhaler (MDI) is key in the treatment of asthma; its effectiveness is related to proper technique. The purpose of this study was to evaluate the metered dose inhaler technique of the parents or guardians of school children with asthma. In this cross-sectional study, we used a sample of 221 individual caregivers (parent or guardian) of asthmatic children from 5 to 12 years old who use an MDI. We designed a validated questionnaire consisting of 27 items addressing the handling of the inhaler technique. Descriptive statistics were used. Caregivers were rated as having a "good technique" in 41 fathers (18.6%), 77 mothers (34.8%) and 9 tutors (4.1%), and a "regular technique" in 32 fathers (14.5%), 48 mothers (21.2%) and 14 guardians (6.3%). Asthmatic children aged 9 were rated as having a "good technique" in 24 cases (10.9%). According to gender, we found a "good technique" in 80 boys (36.2%) and 47 girls (21.3%), and a "regular technique" in 59 boys (26.7%) and 35 girls (15.8%), P = 0.0973, RP = 0.9. We found a "regular technique" mainly in those asthmatic children diagnosed at ages between 1 and 3 years. Most of the participants had a good technical qualification; however, major mistakes were made at key points in the performance of the technique.

  3. Speed and heart-rate profiles in skating and classical cross-country skiing competitions.

    PubMed

    Bolger, Conor M; Kocbach, Jan; Hegge, Ann Magdalen; Sandbakk, Øyvind

    2015-10-01

    To compare the speed and heart-rate profiles during international skating and classical competitions in male and female world-class cross-country skiers. Four male and 5 female skiers performed individual time trials of 15 km (men) and 10 km (women) in the skating and classical techniques on 2 consecutive days. Races were performed on the same 5-km course. The course was mapped with GPS and a barometer to provide a valid course and elevation profile. Time, speed, and heart rate were determined for uphill, flat, and downhill terrains throughout the entire competition by wearing a GPS and a heart-rate monitor. Times in uphill, flat, and downhill terrain were ~55%, 15-20%, and 25-30%, respectively, of the total race time for both techniques and genders. The average speed differences between skating and classical skiing were 9% and 11% for men and women, respectively, and these values were 12% and 15% for uphill, 8% and 13% for flat (all P < .05), and 2% and 1% for downhill terrain. The average speeds for men were 9% and 11% faster than for women in skating and classical, respectively, with corresponding numbers of 11% and 14% for uphill, 6% and 11% for flat, and 4% and 5% for downhill terrain (all P < .05). Heart-rate profiles were relatively independent of technique and gender. The greatest performance differences between the skating and classical techniques and between the 2 genders were found on uphill terrain. Therefore, these speed differences could not be explained by variations in exercise intensity.

  4. The Arthroscopic Surgical Skill Evaluation Tool (ASSET).

    PubMed

    Koehler, Ryan J; Amsdell, Simon; Arendt, Elizabeth A; Bisson, Leslie J; Braman, Jonathan P; Butler, Aaron; Cosgarea, Andrew J; Harner, Christopher D; Garrett, William E; Olson, Tyson; Warme, Winston J; Nicandri, Gregg T

    2013-06-01

    Surgeries employing arthroscopic techniques are among the most commonly performed in orthopaedic clinical practice; however, valid and reliable methods of assessing the arthroscopic skill of orthopaedic surgeons are lacking. The Arthroscopic Surgery Skill Evaluation Tool (ASSET) will demonstrate content validity, concurrent criterion-oriented validity, and reliability when used to assess the technical ability of surgeons performing diagnostic knee arthroscopic surgery on cadaveric specimens. Cross-sectional study; Level of evidence, 3. Content validity was determined by a group of 7 experts using the Delphi method. Intra-articular performance of a right and left diagnostic knee arthroscopic procedure was recorded for 28 residents and 2 sports medicine fellowship-trained attending surgeons. Surgeon performance was assessed by 2 blinded raters using the ASSET. Concurrent criterion-oriented validity, interrater reliability, and test-retest reliability were evaluated. Content validity: The content development group identified 8 arthroscopic skill domains to evaluate using the ASSET. Concurrent criterion-oriented validity: Significant differences in the total ASSET score (P < .05) between novice, intermediate, and advanced experience groups were identified. Interrater reliability: The ASSET scores assigned by each rater were strongly correlated (r = 0.91, P < .01), and the intraclass correlation coefficient between raters for the total ASSET score was 0.90. Test-retest reliability: There was a significant correlation between ASSET scores for both procedures attempted by each surgeon (r = 0.79, P < .01). The ASSET appears to be a useful, valid, and reliable method for assessing surgeon performance of diagnostic knee arthroscopic surgery in cadaveric specimens. Studies are ongoing to determine its generalizability to other procedures as well as to the live operating room and other simulated environments.

  5. Uniting statistical and individual-based approaches for animal movement modelling.

    PubMed

    Latombe, Guillaume; Parrott, Lael; Basille, Mathieu; Fortin, Daniel

    2014-01-01

    The dynamic nature of their internal states and the environment directly shape animals' spatial behaviours and give rise to emergent properties at broader scales in natural systems. However, integrating these dynamic features into habitat selection studies remains challenging, due to the practical impossibility of accessing internal states through field work and the inability of current statistical models to produce dynamic outputs. To address these issues, we developed a robust method which combines statistical and individual-based modelling (IBM). Using a statistical technique for forward modelling of the IBM has the advantage of being faster for parameterization than a pure inverse modelling technique and allows for robust selection of parameters. Using GPS locations from caribou monitored in Québec, caribou movements were modelled based on generative mechanisms accounting for dynamic variables at a low level of emergence. These variables were accessed by replicating real individuals' movements in parallel sub-models, and movement parameters were then empirically parameterized using Step Selection Functions. The final IBM was validated using both k-fold cross-validation and emergent-pattern validation, and was tested for two different scenarios with varying hardwood encroachment. Our results highlighted a functional response in habitat selection, which suggests that our method was able to capture the complexity of the natural system and adequately provide projections on future possible states of the system in response to different management plans. This is especially relevant for testing the long-term impact of scenarios corresponding to environmental configurations that have yet to be observed in real systems.

  6. Uniting Statistical and Individual-Based Approaches for Animal Movement Modelling

    PubMed Central

    Latombe, Guillaume; Parrott, Lael; Basille, Mathieu; Fortin, Daniel

    2014-01-01

    The dynamic nature of their internal states and the environment directly shape animals' spatial behaviours and give rise to emergent properties at broader scales in natural systems. However, integrating these dynamic features into habitat selection studies remains challenging, due to the practical impossibility of accessing internal states through field work and the inability of current statistical models to produce dynamic outputs. To address these issues, we developed a robust method which combines statistical and individual-based modelling (IBM). Using a statistical technique for forward modelling of the IBM has the advantage of being faster for parameterization than a pure inverse modelling technique and allows for robust selection of parameters. Using GPS locations from caribou monitored in Québec, caribou movements were modelled based on generative mechanisms accounting for dynamic variables at a low level of emergence. These variables were accessed by replicating real individuals' movements in parallel sub-models, and movement parameters were then empirically parameterized using Step Selection Functions. The final IBM was validated using both k-fold cross-validation and emergent-pattern validation, and was tested for two different scenarios with varying hardwood encroachment. Our results highlighted a functional response in habitat selection, which suggests that our method was able to capture the complexity of the natural system and adequately provide projections on future possible states of the system in response to different management plans. This is especially relevant for testing the long-term impact of scenarios corresponding to environmental configurations that have yet to be observed in real systems. PMID:24979047

  7. Cross-validation and hypothesis testing in neuroimaging: An irenic comment on the exchange between Friston and Lindquist et al.

    PubMed

    Reiss, Philip T

    2015-08-01

    The "ten ironic rules for statistical reviewers" presented by Friston (2012) prompted a rebuttal by Lindquist et al. (2013), which was followed by a rejoinder by Friston (2013). A key issue left unresolved in this discussion is the use of cross-validation to test the significance of predictive analyses. This note discusses the role that cross-validation-based and related hypothesis tests have come to play in modern data analyses, in neuroimaging and other fields. It is shown that such tests need not be suboptimal and can fill otherwise-unmet inferential needs. Copyright © 2015 Elsevier Inc. All rights reserved.
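
One concrete form of the cross-validation-based hypothesis tests discussed above is a permutation test on the cross-validated score: the analysis is refit under many random relabelings of the outcome, and the observed score is compared against that null distribution. A sketch with scikit-learn's `permutation_test_score` on synthetic two-class data:

```python
# Permutation test of cross-validated classification accuracy: is the
# observed CV score better than chance?
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import permutation_test_score

X, y = make_classification(n_samples=120, n_features=20, random_state=0)
score, perm_scores, p_value = permutation_test_score(
    LogisticRegression(max_iter=1000), X, y,
    cv=5, n_permutations=200, random_state=0)
print(f"CV accuracy = {score:.2f}, permutation p-value = {p_value:.3f}")
```

The p-value is the fraction of permuted scores at least as good as the observed one, so its resolution is limited by `n_permutations`.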

  8. Accelerating cross-validation with total variation and its application to super-resolution imaging

    NASA Astrophysics Data System (ADS)

    Obuchi, Tomoyuki; Ikeda, Shiro; Akiyama, Kazunori; Kabashima, Yoshiyuki

    2017-12-01

    We develop an approximation formula for the cross-validation error (CVE) of a sparse linear regression penalized by ℓ_1-norm and total variation terms, which is based on a perturbative expansion utilizing the largeness of both the data dimensionality and the model. The developed formula allows us to reduce the necessary computational cost of the CVE evaluation significantly. The practicality of the formula is tested through application to simulated black-hole image reconstruction on the event-horizon scale with super resolution. The results demonstrate that our approximation reproduces the CVE values obtained via literally conducted cross-validation with reasonably good precision.
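
The general idea of replacing literally conducted cross-validation with a cheap closed-form evaluation can be illustrated in its simplest exact setting: the classical leave-one-out shortcut for ridge regression, a linear smoother. This is not the authors' perturbative formula for the ℓ_1 plus total-variation penalty, just the same accelerate-the-CVE idea on a toy problem:

```python
# Leave-one-out CVE for ridge regression two ways: the O(1)-per-point
# leverage shortcut versus literally refitting n times.
import numpy as np

rng = np.random.default_rng(6)
X = rng.normal(size=(50, 10))
y = X @ rng.normal(size=10) + 0.1 * rng.normal(size=50)
lam = 1.0

# Hat matrix of ridge; for linear smoothers the LOO error has the closed
# form CVE = mean_i ((y_i - yhat_i) / (1 - H_ii))^2.
H = X @ np.linalg.solve(X.T @ X + lam * np.eye(10), X.T)
resid = y - H @ y
cve_fast = np.mean((resid / (1 - np.diag(H))) ** 2)

# Literal leave-one-out cross-validation for comparison.
errs = []
for i in range(50):
    mask = np.arange(50) != i
    beta = np.linalg.solve(X[mask].T @ X[mask] + lam * np.eye(10),
                           X[mask].T @ y[mask])
    errs.append((y[i] - X[i] @ beta) ** 2)
cve_slow = np.mean(errs)
print(f"shortcut CVE = {cve_fast:.5f}, literal CVE = {cve_slow:.5f}")
```

For ridge the shortcut is exact (via the Sherman-Morrison identity); for non-smooth penalties like ℓ_1 + TV no such closed form exists, which is why approximation formulas of the kind developed in this paper are valuable.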

  9. The cross-cultural validity of posttraumatic stress disorder: implications for DSM-5.

    PubMed

    Hinton, Devon E; Lewis-Fernández, Roberto

    2011-09-01

    There is considerable debate about the cross-cultural applicability of the posttraumatic stress disorder (PTSD) category as currently specified. Concerns include the possible status of PTSD as a Western culture-bound disorder and the validity of individual items and criteria thresholds. This review examines various types of cross-cultural validity of the PTSD criteria as defined in DSM-IV-TR, and presents options and preliminary recommendations to be considered for DSM-5. Searches were conducted of the mental health literature, particularly since 1994, regarding cultural-, race-, or ethnicity-related factors that might limit the universal applicability of the diagnostic criteria of PTSD in DSM-IV-TR and the possible criteria for DSM-5. Substantial evidence of the cross-cultural validity of PTSD was found. However, evidence of cross-cultural variability in certain areas suggests the need for further research: the relative salience of avoidance/numbing symptoms, the role of the interpretation of trauma-caused symptoms in shaping symptomatology, and the prevalence of somatic symptoms. This review also indicates the need to modify certain criteria, such as the items on distressing dreams and on foreshortened future, to increase their cross-cultural applicability. Text additions are suggested to increase the applicability of the manual across cultural contexts: specifying that cultural syndromes-such as those indicated in the DSM-IV-TR Glossary-may be a prominent part of the trauma response in certain cultures, and that those syndromes may influence PTSD symptom salience and comorbidity. The DSM-IV-TR PTSD category demonstrates various types of validity. Criteria modification and textual clarifications are suggested to further improve its cross-cultural applicability. © 2010 Wiley-Liss, Inc.

  10. Progress of a Cross-Correlation Based Optical Strain Measurement Technique for Detecting Radial Growth on a Rotating Disk

    NASA Technical Reports Server (NTRS)

    Clem, Michelle M.; Woike, Mark R.; Abdul-Aziz, Ali

    2014-01-01

    The Aeronautical Sciences Project under NASA's Fundamental Aeronautics Program is interested in the development of novel measurement technologies, such as optical surface measurements for the in situ health monitoring of critical constituents of the internal flow path. In situ health monitoring has the potential to detect flaws, i.e. cracks in key components, such as engine turbine disks, before the flaws lead to catastrophic failure. The present study aims to further validate and develop an optical strain measurement technique to measure the radial growth and strain field of an already cracked disk, mimicking the geometry of a sub-scale turbine engine disk, under loaded conditions in the NASA Glenn Research Center's High Precision Rotordynamics Laboratory. The technique offers potential fault detection by imaging an applied high-contrast random speckle pattern under unloaded and loaded conditions with a CCD camera. Spinning the cracked disk at high speeds (loaded conditions) induces an external load, resulting in a radial growth of the disk of approximately 50.0-µm in the flawed region and hence, a localized strain field. When imaging the cracked disk under static conditions, the disk will be undistorted; however, during rotation the cracked region will grow radially, thus causing the applied particle pattern to be 'shifted'. The resulting particle displacements between the two images are measured using the two-dimensional cross-correlation algorithms implemented in standard Particle Image Velocimetry (PIV) software to track the disk growth, which facilitates calculation of the localized strain field. A random particle distribution is adhered onto the surface of the cracked disk and two bench-top experiments are carried out to evaluate the technique's ability to measure the induced particle displacements. The disk is shifted manually using a translation stage equipped with a fine micrometer, and a hotplate is used to induce thermal growth of the disk, causing the particles to become shifted. For both experiments, reference and test images are acquired before and after the induced shifts, respectively, and then processed using PIV software. The controlled manual translation of the disk resulted in detection of the particle displacements accurate to 1.75% of full scale, and the thermal expansion experiment resulted in successful detection of the disk's thermal growth as compared to the calculated thermal expansion results. After validation of the technique through the induced-shift experiments, the technique is implemented in the Rotordynamics Lab for preliminary assessment in a simulated engine environment. The findings and plans for future work to improve upon the results are discussed in the paper.

  11. MIMoSA: An Automated Method for Intermodal Segmentation Analysis of Multiple Sclerosis Brain Lesions.

    PubMed

    Valcarcel, Alessandra M; Linn, Kristin A; Vandekar, Simon N; Satterthwaite, Theodore D; Muschelli, John; Calabresi, Peter A; Pham, Dzung L; Martin, Melissa Lynne; Shinohara, Russell T

    2018-03-08

    Magnetic resonance imaging (MRI) is crucial for in vivo detection and characterization of white matter lesions (WMLs) in multiple sclerosis. While WMLs have been studied for over two decades using MRI, automated segmentation remains challenging. Although the majority of statistical techniques for the automated segmentation of WMLs are based on single imaging modalities, recent advances have used multimodal techniques for identifying WMLs. Complementary modalities emphasize different tissue properties, which help identify interrelated features of lesions. Method for Inter-Modal Segmentation Analysis (MIMoSA), a fully automatic lesion segmentation algorithm that uses novel covariance features from intermodal coupling regression, in addition to mean structure, to model the probability that a lesion is contained in each voxel, is proposed. MIMoSA was validated by comparison with both expert manual and other automated segmentation methods in two datasets. The first included 98 subjects imaged at Johns Hopkins Hospital, in which bootstrap cross-validation was used to compare the performance of MIMoSA against OASIS and LesionTOADS, two popular automatic segmentation approaches. For a secondary validation, publicly available data from a segmentation challenge were used for performance benchmarking. In the Johns Hopkins study, MIMoSA yielded an average Sørensen-Dice coefficient (DSC) of 0.57 and a partial AUC of 0.68 calculated with false positive rates up to 1%. This was superior to performance using OASIS and LesionTOADS. The proposed method also performed competitively in the segmentation challenge dataset. MIMoSA resulted in statistically significant improvements in lesion segmentation performance compared with LesionTOADS and OASIS, and performed competitively in an additional validation study. Copyright © 2018 by the American Society of Neuroimaging.
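
MIMoSA's segmentation accuracy is summarized by the Sørensen-Dice coefficient (DSC). The metric itself is straightforward to compute between an automated and a manual binary mask; the tiny masks below are illustrative, not study data.

```python
import numpy as np

def dice_coefficient(a, b):
    """Sørensen–Dice coefficient between two binary masks:
    2|A ∩ B| / (|A| + |B|)."""
    a = np.asarray(a, dtype=bool)
    b = np.asarray(b, dtype=bool)
    denom = a.sum() + b.sum()
    if denom == 0:          # both masks empty: define DSC as 1
        return 1.0
    return 2.0 * np.logical_and(a, b).sum() / denom

auto   = np.array([[1, 1, 0], [0, 1, 0]])   # automated segmentation
manual = np.array([[1, 0, 0], [0, 1, 1]])   # expert manual mask
print(dice_coefficient(auto, manual))  # 2*2/(3+3) ≈ 0.667
```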

  12. Computer-aided Assessment of Regional Abdominal Fat with Food Residue Removal in CT

    PubMed Central

    Makrogiannis, Sokratis; Caturegli, Giorgio; Davatzikos, Christos; Ferrucci, Luigi

    2014-01-01

    Rationale and Objectives: Separate quantification of abdominal subcutaneous and visceral fat regions is essential to understand the role of regional adiposity as a risk factor in epidemiological studies. Fat quantification is often based on computed tomography (CT) because fat density is distinct from other tissue densities in the abdomen. However, the presence of intestinal food residues with densities similar to fat may reduce fat quantification accuracy. We introduce an abdominal fat quantification method in CT with emphasis on food residue removal. Materials and Methods: Total fat was identified in the feature space of Hounsfield units and divided into subcutaneous and visceral components using model-based segmentation. Regions of food residues were identified and removed from visceral fat using a machine learning method integrating intensity, texture, and spatial information. Cost-weighting and bagging techniques were investigated to address class imbalance. Results: We validated our automated food residue removal technique against semimanual quantifications. Our feature selection experiments indicated that joint intensity and texture features produce the highest classification accuracy at 95%. We explored generalization capability using k-fold cross-validation and receiver operating characteristic (ROC) analysis with variable k. Losses in accuracy and area under the ROC curve between maximum and minimum k were limited to 0.1% and 0.3%. We validated tissue segmentation against reference semimanual delineations. The Dice similarity scores were as high as 93.1 for subcutaneous fat and 85.6 for visceral fat. Conclusions: Computer-aided regional abdominal fat quantification is a reliable computational tool for large-scale epidemiological studies. Our proposed intestinal food residue reduction scheme is an original contribution of this work. Validation experiments indicate very good accuracy and generalization capability. PMID:24119354

  13. Computer-aided assessment of regional abdominal fat with food residue removal in CT.

    PubMed

    Makrogiannis, Sokratis; Caturegli, Giorgio; Davatzikos, Christos; Ferrucci, Luigi

    2013-11-01

    Separate quantification of abdominal subcutaneous and visceral fat regions is essential to understand the role of regional adiposity as a risk factor in epidemiological studies. Fat quantification is often based on computed tomography (CT) because fat density is distinct from other tissue densities in the abdomen. However, the presence of intestinal food residues with densities similar to fat may reduce fat quantification accuracy. We introduce an abdominal fat quantification method in CT with emphasis on food residue removal. Total fat was identified in the feature space of Hounsfield units and divided into subcutaneous and visceral components using model-based segmentation. Regions of food residues were identified and removed from visceral fat using a machine learning method integrating intensity, texture, and spatial information. Cost-weighting and bagging techniques were investigated to address class imbalance. We validated our automated food residue removal technique against semimanual quantifications. Our feature selection experiments indicated that joint intensity and texture features produce the highest classification accuracy at 95%. We explored generalization capability using k-fold cross-validation and receiver operating characteristic (ROC) analysis with variable k. Losses in accuracy and area under the ROC curve between maximum and minimum k were limited to 0.1% and 0.3%. We validated tissue segmentation against reference semimanual delineations. The Dice similarity scores were as high as 93.1 for subcutaneous fat and 85.6 for visceral fat. Computer-aided regional abdominal fat quantification is a reliable computational tool for large-scale epidemiological studies. Our proposed intestinal food residue reduction scheme is an original contribution of this work. Validation experiments indicate very good accuracy and generalization capability. Published by Elsevier Inc.
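
The generalization experiment above runs k-fold cross-validation with variable k. The fold logic can be sketched as follows; this is an illustration only, in which a nearest-centroid classifier on synthetic blobs stands in for the paper's actual food-residue classifier.

```python
import numpy as np

def kfold_accuracy(X, y, k, seed=0):
    """Mean accuracy of a nearest-centroid classifier under k-fold
    cross-validation (each fold held out exactly once)."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    folds = np.array_split(idx, k)
    accs = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        # Class centroids are estimated from the training folds only.
        classes = np.unique(y[train])
        centroids = np.array([X[train][y[train] == c].mean(axis=0)
                              for c in classes])
        # Assign each held-out sample to its nearest centroid.
        d = np.linalg.norm(X[test][:, None, :] - centroids[None, :, :],
                           axis=2)
        pred = classes[d.argmin(axis=1)]
        accs.append(float((pred == y[test]).mean()))
    return float(np.mean(accs))

# Two well-separated Gaussian blobs: accuracy should stay high for any k,
# mirroring the paper's small loss between maximum and minimum k.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(6, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
for k in (3, 5, 10):
    print(k, kfold_accuracy(X, y, k))
```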

  14. AVP-IC50 Pred: Multiple machine learning techniques-based prediction of peptide antiviral activity in terms of half maximal inhibitory concentration (IC50).

    PubMed

    Qureshi, Abid; Tandon, Himani; Kumar, Manoj

    2015-11-01

    Peptide-based antiviral therapeutics have gradually paved their way into mainstream drug discovery research. Experimental determination of peptides' antiviral activity, as expressed by their IC50 values, requires considerable effort. Therefore, we have developed "AVP-IC50 Pred," a regression-based algorithm to predict the antiviral activity in terms of IC50 values (μM). A total of 759 non-redundant peptides from AVPdb and HIPdb were divided into a training/test set having 683 peptides (T(683)) and a validation set with 76 independent peptides (V(76)) for evaluation. We utilized important peptide sequence features like amino-acid compositions, binary profile of N8-C8 residues, physicochemical properties and their hybrids. Four different machine learning techniques (MLTs), namely Support vector machine, Random Forest, Instance-based classifier, and K-Star, were employed. During 10-fold cross-validation, we achieved maximum Pearson correlation coefficients (PCCs) of 0.66, 0.64, 0.56, and 0.55, respectively, for the above MLTs using the best combination of feature sets. All the predictive models performed well on the independent validation dataset, achieving maximum PCCs of 0.74, 0.68, 0.59, and 0.57, respectively, on the best combination of feature sets. The AVP-IC50 Pred web server is anticipated to assist the researchers working on antiviral therapeutics by enabling them to computationally screen many compounds and focus experimental validation on the most promising set of peptides, thus reducing cost and time efforts. The server is available at http://crdd.osdd.net/servers/ic50avp. © 2015 Wiley Periodicals, Inc.
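
Model quality above is reported as Pearson correlation coefficients (PCCs) between predicted and observed IC50 values. A self-contained sketch of the metric itself (the example numbers are hypothetical, not values from the study):

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between predicted and
    observed values."""
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc * yc).sum() /
                 np.sqrt((xc ** 2).sum() * (yc ** 2).sum()))

observed  = [1.0, 2.0, 3.0, 4.0, 5.0]   # hypothetical IC50 values (μM)
predicted = [1.1, 1.9, 3.2, 3.8, 5.1]   # hypothetical model output
print(round(pearson_r(observed, predicted), 3))
```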

  15. Cross-talk between Clinical and Host Response Parameters of Periodontitis in Smokers

    PubMed Central

    Nagarajan, R.; Miller, C.S.; Dawson, D.; Al-Sabbagh, M.; Ebersole, J.L.

    2016-01-01

    Periodontal diseases are a major public health concern leading to tooth loss and have also been shown to be associated with several chronic systemic diseases. Smoking is a major risk factor for developing numerous systemic diseases, as well as periodontitis. While it is clear that smokers have a significantly enhanced risk for developing periodontitis leading to tooth loss, the population varies with regard to susceptibility to disease associated with smoking. This investigation focuses on identifying differences in four broad sets of variables consisting of: (a) host response molecules, (b) periodontal clinical parameters, (c) antibody measures for periodontal pathogens and oral commensal bacteria challenge, and (d) other variables of interest in a smoking population with (n = 171) and without periodontitis (n = 117). Subsequently, Bayesian network structure learning (BNSL) techniques were used to investigate potential associations and cross-talk between the four broad sets of variables. BNSL revealed two broad communities with markedly different topology between the non-periodontitis and periodontitis smoking population. Confidence of the edges in the resulting network also showed marked variations within and between the periodontitis and non-periodontitis groups. The results presented validated known associations and discovered new ones with minimal precedence that may warrant further investigation and novel hypothesis generation. Cross-talk between the clinical variables and antibody profiles of bacteria was especially pronounced in the case of periodontitis and mediated by the antibody response profile to P. gingivalis. PMID:27431617

  16. A cross-sectional evaluation of meditation experience on electroencephalography data by artificial neural network and support vector machine classifiers

    PubMed Central

    Lee, Yu-Hao; Hsieh, Ya-Ju; Shiah, Yung-Jong; Lin, Yu-Huei; Chen, Chiao-Yun; Tyan, Yu-Chang; GengQiu, JiaCheng; Hsu, Chung-Yao; Chen, Sharon Chia-Ju

    2017-01-01

    Abstract To quantitate the meditation experience is a subjective and complex issue because it is confounded by many factors such as emotional state, method of meditation, and personal physical condition. In this study, we propose a strategy with a cross-sectional analysis to evaluate the meditation experience with 2 artificial intelligence techniques: artificial neural network and support vector machine. Within this analysis system, 3 features of the electroencephalography alpha spectrum and variant normalizing scaling are manipulated as the evaluating variables for detection accuracy. Thereafter, by modulating the sliding window (the period of the analyzed data) and shifting interval of the window (the time interval to shift the analyzed data), the effect of immediate analysis for the 2 methods is compared. This analysis system is performed on 3 meditation groups, categorizing their meditation experiences in 10-year intervals from novice to junior and to senior. After exhaustive calculation and cross-validation across all variables, a high accuracy rate (>98%) is achievable under the criterion of a 0.5-minute sliding window and a 2-second shifting interval for both methods. In short, the minimum analyzable data length is 0.5 minute and the minimum recognizable temporal resolution is 2 seconds in the decision of meditative classification. Our proposed classifier of meditation experience provides a rapid evaluation system for distinguishing meditation experience and a beneficial application of artificial intelligence techniques to big-data analysis. PMID:28422856

  17. A cross-sectional evaluation of meditation experience on electroencephalography data by artificial neural network and support vector machine classifiers.

    PubMed

    Lee, Yu-Hao; Hsieh, Ya-Ju; Shiah, Yung-Jong; Lin, Yu-Huei; Chen, Chiao-Yun; Tyan, Yu-Chang; GengQiu, JiaCheng; Hsu, Chung-Yao; Chen, Sharon Chia-Ju

    2017-04-01

    To quantitate the meditation experience is a subjective and complex issue because it is confounded by many factors such as emotional state, method of meditation, and personal physical condition. In this study, we propose a strategy with a cross-sectional analysis to evaluate the meditation experience with 2 artificial intelligence techniques: artificial neural network and support vector machine. Within this analysis system, 3 features of the electroencephalography alpha spectrum and variant normalizing scaling are manipulated as the evaluating variables for detection accuracy. Thereafter, by modulating the sliding window (the period of the analyzed data) and shifting interval of the window (the time interval to shift the analyzed data), the effect of immediate analysis for the 2 methods is compared. This analysis system is performed on 3 meditation groups, categorizing their meditation experiences in 10-year intervals from novice to junior and to senior. After exhaustive calculation and cross-validation across all variables, a high accuracy rate (>98%) is achievable under the criterion of a 0.5-minute sliding window and a 2-second shifting interval for both methods. In short, the minimum analyzable data length is 0.5 minute and the minimum recognizable temporal resolution is 2 seconds in the decision of meditative classification. Our proposed classifier of meditation experience provides a rapid evaluation system for distinguishing meditation experience and a beneficial application of artificial intelligence techniques to big-data analysis.
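
The sliding-window/shifting-interval scheme can be sketched directly. A minimal NumPy example, assuming a hypothetical 256-Hz recording (the abstracts do not state the study's sampling rate), segmented into 0.5-minute windows advanced by 2 seconds:

```python
import numpy as np

def sliding_windows(signal, fs, window_s, shift_s):
    """Split a signal into analysis windows of length `window_s`
    seconds, advanced by `shift_s` seconds (windows may overlap)."""
    win = int(window_s * fs)
    hop = int(shift_s * fs)
    starts = range(0, len(signal) - win + 1, hop)
    return np.array([signal[s:s + win] for s in starts])

# 5 minutes of a hypothetical 256-Hz recording,
# 0.5-minute windows shifted by 2 s.
fs = 256
x = np.arange(5 * 60 * fs, dtype=float)
w = sliding_windows(x, fs, window_s=30, shift_s=2)
print(w.shape)  # one row per analysis window
```

Each row would then be reduced to alpha-spectrum features and fed to the classifier.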

  18. CROSS-CULTURAL ADAPTATION AND VALIDATION OF THE KOREAN VERSION OF THE CUMBERLAND ANKLE INSTABILITY TOOL.

    PubMed

    Ko, Jupil; Rosen, Adam B; Brown, Cathleen N

    2015-12-01

    The Cumberland Ankle Instability Tool (CAIT) is a valid and reliable patient reported outcome used to assess the presence and severity of chronic ankle instability (CAI). The CAIT has been cross-culturally adapted into other languages for use in non-English speaking populations. However, there are no valid questionnaires to assess CAI in individuals who speak Korean. The purpose of this study was to translate, cross-culturally adapt, and validate the CAIT, for use in a Korean-speaking population with CAI. Cross-cultural reliability study. The CAIT was cross-culturally adapted into Korean according to accepted guidelines and renamed the Cumberland Ankle Instability Tool-Korean (CAIT-K). Twenty-three participants (12 males, 11 females) who were bilingual in English and Korean were recruited and completed the original and adapted versions to assess agreement between versions. An additional 168 national level Korean athletes (106 males, 62 females; age = 20.3 ± 1.1 years), who participated in ≥ 90 minutes of physical activity per week, completed the final version of the CAIT-K twice within 14 days. Their completed questionnaires were assessed for internal consistency, test-retest reliability, criterion validity, and construct validity. For bilingual participants, intra-class correlation coefficients (ICC2,1) between the CAIT and the CAIT-K for test-retest reliability were 0.95 (SEM=1.83) and 0.96 (SEM=1.50) in right and left limbs, respectively. The Cronbach's alpha coefficients were 0.92 and 0.90 for the CAIT-K in right and left limbs, respectively. For native Korean speakers, the CAIT-K had high internal consistency (Cronbach's α=0.89) and intra-class correlation coefficient (ICC2,1 = 0.94, SEM=1.72), correlation with the physical component score (rho=0.70, p = 0.001) of the Short-Form Health Survey (SF-36), and the Kaiser-Meyer-Olkin score was 0.87. The original CAIT was translated, cross-culturally adapted, and validated from English to Korean. 
The CAIT-K appears to be valid and reliable and could be useful in assessing the Korean speaking population with CAI.
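
Internal consistency above is reported as Cronbach's alpha. A minimal implementation over a subjects-by-items score matrix (the sample scores are hypothetical, not CAIT-K data):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_subjects x n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / total-score variance)."""
    items = np.asarray(items, float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return float(k / (k - 1) * (1.0 - item_vars.sum() / total_var))

# Hypothetical responses of 4 subjects to 3 questionnaire items.
scores = np.array([[2, 3, 3],
                   [4, 4, 5],
                   [1, 2, 2],
                   [3, 3, 4]], float)
print(round(cronbach_alpha(scores), 3))
```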

  19. Cross-cultural adaptation and validation of the osteoporosis assessment questionnaire short version (OPAQ-SV) for Chinese osteoporotic fracture females.

    PubMed

    Zhang, Yin-Ping; Wei, Huan-Huan; Wang, Wen; Xia, Ru-Yi; Zhou, Xiao-Ling; Porr, Caroline; Lammi, Mikko

    2016-04-01

    The Osteoporosis Assessment Questionnaire Short Version (OPAQ-SV) was cross-culturally adapted to measure health-related quality of life in Chinese osteoporotic fracture females and then validated in China for its psychometric properties. Cross-cultural adaptation, including translation of the original OPAQ-SV into Mandarin Chinese language, was performed according to published guidelines. Validation of the newly cross-culturally adapted OPAQ-SV was conducted by sampling 234 Chinese osteoporotic fracture females and also a control group of 235 Chinese osteoporotic females without fractures, producing robust content, construct, and discriminant validation results. Major categories of reliability were also met: the Cronbach alpha coefficient was 0.975, indicating good internal consistency; the test-retest reliability was 0.80; and principal component analysis resulted in a 6-factor structure explaining 75.847 % of the total variance. Further, the Comparative Fit Index result was 0.922 following the modified model confirmatory factor analysis, and the chi-squared test was 1.98. The root mean squared error of approximation was 0.078. Moreover, significant differences were revealed between females with fractures and those without fractures across all domains (p < 0.001). Overall, the newly cross-culturally adapted OPAQ-SV appears to possess adequate validity and reliability and may be utilized in clinical trials to assess the health-related quality of life in Chinese osteoporotic fracture females.

  20. Correcting for Optimistic Prediction in Small Data Sets

    PubMed Central

    Smith, Gordon C. S.; Seaman, Shaun R.; Wood, Angela M.; Royston, Patrick; White, Ian R.

    2014-01-01

    The C statistic is a commonly reported measure of screening test performance. Optimistic estimation of the C statistic is a frequent problem because of overfitting of statistical models in small data sets, and methods exist to correct for this issue. However, many studies do not use such methods, and those that do correct for optimism use diverse methods, some of which are known to be biased. We used clinical data sets (United Kingdom Down syndrome screening data from Glasgow (1991–2003), Edinburgh (1999–2003), and Cambridge (1990–2006), as well as Scottish national pregnancy discharge data (2004–2007)) to evaluate different approaches to adjustment for optimism. We found that sample splitting, cross-validation without replication, and leave-1-out cross-validation produced optimism-adjusted estimates of the C statistic that were biased and/or associated with greater absolute error than other available methods. Cross-validation with replication, bootstrapping, and a new method (leave-pair-out cross-validation) all generated unbiased optimism-adjusted estimates of the C statistic and had similar absolute errors in the clinical data set. Larger simulation studies confirmed that all 3 methods performed similarly with 10 or more events per variable, or when the C statistic was 0.9 or greater. However, with lower events per variable or lower C statistics, bootstrapping tended to be optimistic but with lower absolute and mean squared errors than both methods of cross-validation. PMID:24966219
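
The leave-pair-out idea can be made concrete: each (event, non-event) pair is held out in turn, the model is refit on the remaining subjects, and the cross-validated C statistic is the fraction of pairs in which the event subject receives the higher score. In this toy sketch a simple difference-of-class-means score stands in for a fitted risk model, and the data are synthetic.

```python
import numpy as np

def lpo_cv_cstat(X, y):
    """Leave-pair-out cross-validated C statistic (ties count 1/2)."""
    X, y = np.asarray(X, float), np.asarray(y)
    events = np.where(y == 1)[0]
    nonevents = np.where(y == 0)[0]
    wins = 0.0
    for i in events:
        for j in nonevents:
            keep = np.ones(len(y), bool)
            keep[[i, j]] = False
            # "Model": project onto the difference of class means,
            # refit without the held-out pair.
            w = (X[keep][y[keep] == 1].mean(axis=0) -
                 X[keep][y[keep] == 0].mean(axis=0))
            si, sj = X[i] @ w, X[j] @ w
            wins += 1.0 if si > sj else 0.5 if si == sj else 0.0
    return wins / (len(events) * len(nonevents))

# Synthetic cohort: 20 non-events and 20 events with shifted features.
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0, 1, (20, 3)), rng.normal(1.5, 1, (20, 3))])
y = np.array([0] * 20 + [1] * 20)
print(lpo_cv_cstat(X, y))
```

Because every training set excludes the scored pair, the estimate avoids the optimism that refitting and scoring on the same subjects would introduce.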

  1. Cross-cultural adaption and validation of the Persian version of the SWAL-QOL.

    PubMed

    Tarameshlu, Maryam; Azimi, Amir Reza; Jalaie, Shohreh; Ghelichi, Leila; Ansari, Noureddin Nakhostin

    2017-06-01

    The aim of this study was to translate and cross-culturally adapt the swallowing quality-of-life questionnaire (SWAL-QOL) to Persian language and to determine validity and reliability of the Persian version of the swallow quality-of-life questionnaire (PSWAL-QOL) in the patients with oropharyngeal dysphagia. The cross-sectional survey was designed to translate and cross-culturally adapt SWAL-QOL to Persian language following steps recommended in guideline. A total of 142 patients with dysphagia (mean age = 56.7 ± 12.22 years) were selected by a non-probability consecutive sampling method to evaluate construct validity and internal consistency. Thirty patients with dysphagia completed the PSWAL-QOL again 2 weeks later for test-retest reliability. The PSWAL-QOL was favorably accepted with no missing items. The floor effect ranged from 0% to 21% and the ceiling effect from 0% to 16%. The construct validity was established via exploratory factor analysis. Internal consistency was confirmed with Cronbach α >0.7 for all scales except eating duration (α = 0.68). The test-retest reliability was excellent with intraclass correlation coefficient (ICC) ≥0.75 for all scales. The SWAL-QOL was cross-culturally adapted to Persian and demonstrated to be a valid and reliable self-report questionnaire to measure the impact of dysphagia on the quality-of-life in the Persian patients with oropharyngeal dysphagia.

  2. Cross-cultural adaption and validation of the Persian version of the SWAL-QOL

    PubMed Central

    Tarameshlu, Maryam; Azimi, Amir Reza; Jalaie, Shohreh; Ghelichi, Leila; Ansari, Noureddin Nakhostin

    2017-01-01

    Abstract The aim of this study was to translate and cross-culturally adapt the swallowing quality-of-life questionnaire (SWAL-QOL) to Persian language and to determine validity and reliability of the Persian version of the swallow quality-of-life questionnaire (PSWAL-QOL) in the patients with oropharyngeal dysphagia. The cross-sectional survey was designed to translate and cross-culturally adapt SWAL-QOL to Persian language following steps recommended in guideline. A total of 142 patients with dysphagia (mean age = 56.7 ± 12.22 years) were selected by a non-probability consecutive sampling method to evaluate construct validity and internal consistency. Thirty patients with dysphagia completed the PSWAL-QOL again 2 weeks later for test–retest reliability. The PSWAL-QOL was favorably accepted with no missing items. The floor effect ranged from 0% to 21% and the ceiling effect from 0% to 16%. The construct validity was established via exploratory factor analysis. Internal consistency was confirmed with Cronbach α >0.7 for all scales except eating duration (α = 0.68). The test–retest reliability was excellent with intraclass correlation coefficient (ICC) ≥0.75 for all scales. The SWAL-QOL was cross-culturally adapted to Persian and demonstrated to be a valid and reliable self-report questionnaire to measure the impact of dysphagia on the quality-of-life in the Persian patients with oropharyngeal dysphagia. PMID:28658118
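
Test-retest reliability above is quantified with the intraclass correlation coefficient. A sketch of ICC(2,1), the two-way random-effects, absolute-agreement, single-measurement form (Shrout-Fleiss), with hypothetical test and retest scores:

```python
import numpy as np

def icc_2_1(scores):
    """ICC(2,1) for an (n_subjects x k_occasions) score matrix,
    here k = 2 occasions (test and retest)."""
    Y = np.asarray(scores, float)
    n, k = Y.shape
    grand = Y.mean()
    row_means = Y.mean(axis=1)
    col_means = Y.mean(axis=0)
    msr = k * ((row_means - grand) ** 2).sum() / (n - 1)   # subjects
    msc = n * ((col_means - grand) ** 2).sum() / (k - 1)   # occasions
    sse = ((Y - row_means[:, None] - col_means[None, :] + grand) ** 2).sum()
    mse = sse / ((n - 1) * (k - 1))                        # residual
    return float((msr - mse) /
                 (msr + (k - 1) * mse + k * (msc - mse) / n))

# Hypothetical questionnaire totals for 6 subjects, 2 weeks apart.
test_scores   = [18, 15, 12, 20, 16, 9]
retest_scores = [17, 15, 13, 20, 15, 10]
print(round(icc_2_1(np.column_stack([test_scores, retest_scores])), 3))
```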

  3. Certification in Structural Health Monitoring Systems

    DTIC Science & Technology

    2011-09-01

    validation [3,8]. This may be accomplished by computing the sum of squares of pure error (SSPE) and its associated squared correlation [3,8]. To compute...these values, a cross-validation sample must be established. In general, if the SSPE is high, the model does not predict well on independent data...plethora of cross-validation methods, some of which are more useful for certain models than others [3,8]. When possible, a disclosure of the SSPE
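
The snippet above describes computing the sum of squares of pure error (SSPE) and its associated squared correlation on a held-out cross-validation sample. A minimal sketch of that computation (the fitted model and all data are hypothetical):

```python
import numpy as np

def sspe(model, X_val, y_val):
    """Sum of squares of pure error on a held-out validation sample,
    plus the squared correlation between predicted and observed
    responses."""
    pred = model(X_val)
    ss = float(((y_val - pred) ** 2).sum())
    r = np.corrcoef(pred, y_val)[0, 1]
    return ss, float(r ** 2)

# Fit a line on training data, then evaluate on the validation sample.
x_train = np.array([0.0, 1.0, 2.0, 3.0])
y_train = np.array([0.1, 2.1, 3.9, 6.1])
slope, intercept = np.polyfit(x_train, y_train, 1)
model = lambda x: slope * x + intercept

x_val = np.array([4.0, 5.0, 6.0])
y_val = np.array([8.0, 10.2, 11.9])
ss, r2 = sspe(model, x_val, y_val)
print(ss, r2)  # low SSPE and high r² indicate good independent prediction
```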

  4. A New Reassigned Spectrogram Method in Interference Detection for GNSS Receivers.

    PubMed

    Sun, Kewen; Jin, Tian; Yang, Dongkai

    2015-09-02

    Interference detection is very important for Global Navigation Satellite System (GNSS) receivers. Current work on interference detection in GNSS receivers has mainly focused on time-frequency (TF) analysis techniques, such as spectrogram and Wigner-Ville distribution (WVD), where the spectrogram approach presents the TF resolution trade-off problem, since the analysis window is used, and the WVD method suffers from the very serious cross-term problem, due to its quadratic TF distribution nature. In order to solve the cross-term problem and to preserve good TF resolution in the TF plane at the same time, in this paper, a new TF distribution by using a reassigned spectrogram has been proposed in interference detection for GNSS receivers. This proposed reassigned spectrogram method efficiently combines the elimination of the cross-term provided by the spectrogram itself according to its inherent nature and the improvement of the TF aggregation property achieved by the reassignment method. Moreover, a notch filter has been adopted in interference mitigation for GNSS receivers, where receiver operating characteristics (ROCs) are used as metrics for the characterization of interference mitigation performance. The proposed interference detection method by using a reassigned spectrogram is evaluated by experiments on GPS L1 signals in the disturbing scenarios in comparison to the state-of-the-art TF analysis approaches. The analysis results show that the proposed interference detection technique effectively overcomes the cross-term problem and also keeps good TF localization properties, which has been proven to be valid and effective to enhance the interference detection performance; in addition, the adoption of the notch filter in interference mitigation has shown a significant acquisition performance improvement in terms of ROC curves for GNSS receivers in jamming environments.

  5. A New Reassigned Spectrogram Method in Interference Detection for GNSS Receivers

    PubMed Central

    Sun, Kewen; Jin, Tian; Yang, Dongkai

    2015-01-01

    Interference detection is very important for Global Navigation Satellite System (GNSS) receivers. Current work on interference detection in GNSS receivers has mainly focused on time-frequency (TF) analysis techniques, such as spectrogram and Wigner–Ville distribution (WVD), where the spectrogram approach presents the TF resolution trade-off problem, since the analysis window is used, and the WVD method suffers from the very serious cross-term problem, due to its quadratic TF distribution nature. In order to solve the cross-term problem and to preserve good TF resolution in the TF plane at the same time, in this paper, a new TF distribution by using a reassigned spectrogram has been proposed in interference detection for GNSS receivers. This proposed reassigned spectrogram method efficiently combines the elimination of the cross-term provided by the spectrogram itself according to its inherent nature and the improvement of the TF aggregation property achieved by the reassignment method. Moreover, a notch filter has been adopted in interference mitigation for GNSS receivers, where receiver operating characteristics (ROCs) are used as metrics for the characterization of interference mitigation performance. The proposed interference detection method by using a reassigned spectrogram is evaluated by experiments on GPS L1 signals in the disturbing scenarios in comparison to the state-of-the-art TF analysis approaches. The analysis results show that the proposed interference detection technique effectively overcomes the cross-term problem and also keeps good TF localization properties, which has been proven to be valid and effective to enhance the interference detection performance; in addition, the adoption of the notch filter in interference mitigation has shown a significant acquisition performance improvement in terms of ROC curves for GNSS receivers in jamming environments. PMID:26364637
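
Reassignment adds phase-based corrections on top of an ordinary spectrogram; the plain short-time Fourier spectrogram it starts from can be sketched in a few lines (rectangular window, synthetic tone — an illustration, not the papers' GNSS processing chain).

```python
import numpy as np

def spectrogram(x, win_len, hop):
    """Magnitude-squared spectrogram via a short-time Fourier
    transform with a rectangular analysis window."""
    frames = np.array([x[s:s + win_len]
                       for s in range(0, len(x) - win_len + 1, hop)])
    return np.abs(np.fft.rfft(frames, axis=1)) ** 2

# A pure tone concentrates its energy in one frequency bin per frame.
fs, f0 = 128, 16
t = np.arange(4 * fs) / fs
x = np.sin(2 * np.pi * f0 * t)
S = spectrogram(x, win_len=32, hop=16)
print(S.shape, S[0].argmax())  # peak bin = f0 * win_len / fs = 4
```

The window length sets the TF resolution trade-off noted above: longer windows sharpen frequency localization but blur time localization.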

  6. Calibration of the Dutch-Flemish PROMIS Pain Behavior item bank in patients with chronic pain.

    PubMed

    Crins, M H P; Roorda, L D; Smits, N; de Vet, H C W; Westhovens, R; Cella, D; Cook, K F; Revicki, D; van Leeuwen, J; Boers, M; Dekker, J; Terwee, C B

    2016-02-01

    The aims of the current study were to calibrate the item parameters of the Dutch-Flemish PROMIS Pain Behavior item bank using a sample of Dutch patients with chronic pain and to evaluate cross-cultural validity between the Dutch-Flemish and the US PROMIS Pain Behavior item banks. Furthermore, reliability and construct validity of the Dutch-Flemish PROMIS Pain Behavior item bank were evaluated. The 39 items in the bank were completed by 1042 Dutch patients with chronic pain. To evaluate unidimensionality, a one-factor confirmatory factor analysis (CFA) was performed. A graded response model (GRM) was used to calibrate the items. To evaluate cross-cultural validity, differential item functioning (DIF) for language (Dutch vs. English) was examined. Reliability of the item bank was also examined and construct validity was studied using several legacy instruments, e.g., the Roland Morris Disability Questionnaire. CFA supported the unidimensionality of the Dutch-Flemish PROMIS Pain Behavior item bank (CFI = 0.960, TLI = 0.958), the data also fit the GRM, and demonstrated good coverage across the pain behavior construct (threshold parameters range: -3.42 to 3.54). Analysis showed good cross-cultural validity (only six DIF items), reliability (Cronbach's α = 0.95) and construct validity (all correlations ≥0.53). The Dutch-Flemish PROMIS Pain Behavior item bank was found to have good cross-cultural validity, reliability and construct validity. The development of the Dutch-Flemish PROMIS Pain Behavior item bank will serve as the basis for Dutch-Flemish PROMIS short forms and computer adaptive testing (CAT). © 2015 European Pain Federation - EFIC®

  7. The reliability, validity, sensitivity, specificity and predictive values of the Chinese version of the Rowland Universal Dementia Assessment Scale.

    PubMed

    Chen, Chia-Wei; Chu, Hsin; Tsai, Chia-Fen; Yang, Hui-Ling; Tsai, Jui-Chen; Chung, Min-Huey; Liao, Yuan-Mei; Chi, Mei-Ju; Chou, Kuei-Ru

    2015-11-01

    The purpose of this study was to translate the Rowland Universal Dementia Assessment Scale into Chinese and to evaluate the psychometric properties (reliability and validity) and the diagnostic properties (sensitivity, specificity and predictive values) of the Chinese version of the Rowland Universal Dementia Assessment Scale. The accurate detection of early dementia requires screening tools with favourable cross-cultural linguistic properties and appropriate sensitivity, specificity, and predictive values, particularly for Chinese-speaking populations. This was a cross-sectional, descriptive study. Overall, 130 participants suspected to have cognitive impairment were enrolled in the study. A test-retest for determining reliability was scheduled four weeks after the initial test. Content validity was determined by five experts, whereas construct validity was established by using the contrasted-groups technique. The participants' clinical diagnoses were used as the standard in calculating the sensitivity, specificity, positive predictive value and negative predictive value. The study revealed that the Chinese version of the Rowland Universal Dementia Assessment Scale exhibited a test-retest reliability of 0.90, an internal consistency reliability of 0.71, an inter-rater reliability (kappa value) of 0.88 and a content validity index of 0.97. The patient and healthy contrast groups differed significantly in cognitive ability. The optimal cut-off points for the Chinese version of the Rowland Universal Dementia Assessment Scale in the test for mild cognitive impairment and dementia were 24 and 22, respectively; moreover, for these two conditions, the sensitivities of the scale were 0.79 and 0.76, the specificities were 0.91 and 0.81, the areas under the curve were 0.85 and 0.78, the positive predictive values were 0.99 and 0.83 and the negative predictive values were 0.96 and 0.91, respectively. 
The Chinese version of the Rowland Universal Dementia Assessment Scale exhibited sound reliability, validity, sensitivity, specificity and predictive values. This scale can help clinical staff members to quickly and accurately diagnose cognitive impairment and provide appropriate treatment as early as possible. © 2015 John Wiley & Sons Ltd.
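The diagnostic properties reported above (sensitivity, specificity, positive and negative predictive values) all derive from the four cells of a 2x2 confusion matrix. A minimal sketch of those definitions in Python; the function name and the example counts are illustrative, not the study's data:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard diagnostic properties from 2x2 confusion counts.

    tp: screen-positive and truly impaired    fp: screen-positive but healthy
    fn: screen-negative but truly impaired    tn: screen-negative and healthy
    """
    return {
        "sensitivity": tp / (tp + fn),  # impaired cases correctly flagged
        "specificity": tn / (tn + fp),  # healthy cases correctly cleared
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Hypothetical counts chosen only to mirror the scale of the study:
m = diagnostic_metrics(tp=76, fp=9, fn=24, tn=91)
```

Predictive values, unlike sensitivity and specificity, shift with the prevalence of impairment in the sample, which is why a screening cutoff tuned in one clinical population needs revalidation in another.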

  8. Comparative assessment of three standardized robotic surgery training methods.

    PubMed

    Hung, Andrew J; Jayaratna, Isuru S; Teruya, Kara; Desai, Mihir M; Gill, Inderbir S; Goh, Alvin C

    2013-10-01

    To evaluate three standardized robotic surgery training methods, inanimate, virtual reality and in vivo, for their construct validity. To explore the concept of cross-method validity, where the relative performance of each method is compared. Robotic surgical skills were prospectively assessed in 49 participating surgeons who were classified as follows: 'novice/trainee': urology residents, previous experience <30 cases (n = 38) and 'experts': faculty surgeons, previous experience ≥30 cases (n = 11). Three standardized, validated training methods were used: (i) structured inanimate tasks; (ii) virtual reality exercises on the da Vinci Skills Simulator (Intuitive Surgical, Sunnyvale, CA, USA); and (iii) a standardized robotic surgical task in a live porcine model with performance graded by the Global Evaluative Assessment of Robotic Skills (GEARS) tool. A Kruskal-Wallis test was used to evaluate performance differences between novices and experts (construct validity). Spearman's correlation coefficient (ρ) was used to measure the association of performance across inanimate, simulation and in vivo methods (cross-method validity). Novice and expert surgeons had previously performed a median (range) of 0 (0-20) and 300 (30-2000) robotic cases, respectively (P < 0.001). Construct validity: experts consistently outperformed residents with all three methods (P < 0.001). Cross-method validity: overall performance of inanimate tasks significantly correlated with virtual reality robotic performance (ρ = -0.7, P < 0.001) and in vivo robotic performance based on GEARS (ρ = -0.8, P < 0.0001). Virtual reality performance and in vivo tissue performance were also found to be strongly correlated (ρ = 0.6, P < 0.001). We propose the novel concept of cross-method validity, which may provide a method of evaluating the relative value of various forms of skills education and assessment. We externally confirmed the construct validity of each featured training tool. 
© 2013 BJU International.
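The cross-method validity analysis above rests on Spearman's rank correlation ρ, which is simply the Pearson correlation of the rank-transformed scores (ties receiving their average rank). A self-contained sketch; the helper names are mine, not from the study:

```python
def _ranks(values):
    """Average ranks (1-based); tied values share their mean rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1                      # extend over a run of ties
        mean_rank = (i + j) / 2.0 + 1.0
        for k in range(i, j + 1):
            ranks[order[k]] = mean_rank
        i = j + 1
    return ranks

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def spearman(x, y):
    """Spearman's rho: Pearson correlation of the ranks."""
    return pearson(_ranks(x), _ranks(y))
```

Negative ρ values like those reported (−0.7, −0.8) simply reflect opposed scoring directions between instruments, for example error counts on one method versus proficiency scores on another.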

  9. Survey of statistical techniques used in validation studies of air pollution prediction models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bornstein, R D; Anderson, S F

    1979-03-01

    Statistical techniques used by meteorologists to validate predictions made by air pollution models are surveyed. Techniques are divided into the following three groups: graphical, tabular, and summary statistics. Some of the practical problems associated with verification are also discussed. Characteristics desired in any validation program are listed and a suggested combination of techniques that possesses many of these characteristics is presented.

  10. Attrition from an Adolescent Addiction Treatment Program: A Cross Validation.

    ERIC Educational Resources Information Center

    Mathisen, Kenneth S.; Meyers, Kathleen

    Treatment attrition is a major problem for programs treating adolescent substance abusers. To isolate and cross validate factors which are predictive of addiction treatment attrition among adolescent substance abusers, screening interview and diagnostic variables from 119 adolescent in-patients were submitted to a discriminant equation analysis.…

  11. 75 FR 80870 - Self-Regulatory Organizations; Chicago Stock Exchange, Inc.; Notice of Filing and Order Granting...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-23

    ... Proposed Rule Change To Eliminate the Validated Cross Trade Entry Functionality December 16, 2010. Pursuant... eliminate the Validated Cross Trade Entry Functionality for Exchange-registered Institutional Brokers. The... Brokers (``Institutional Brokers'') by eliminating the ability of an Institutional Broker to execute...

  12. Cross-Cultural Validation of TEMAS, a Minority Projective Test.

    ERIC Educational Resources Information Center

    Costantino, Giuseppe; And Others

    The theoretical framework and cross-cultural validation of Tell-Me-A-Story (TEMAS), a projective test developed to measure personality development in ethnic minority children, is presented. The TEMAS test consists of 23 chromatic pictures which incorporate the following characteristics: (1) representation of antithetical concepts which the…

  13. Genome-based prediction of test cross performance in two subsequent breeding cycles.

    PubMed

    Hofheinz, Nina; Borchardt, Dietrich; Weissleder, Knuth; Frisch, Matthias

    2012-12-01

Genome-based prediction of genetic values is expected to overcome shortcomings that limit the application of QTL mapping and marker-assisted selection in plant breeding. Our goal was to study the genome-based prediction of test cross performance with genetic effects that were estimated using genotypes from the preceding breeding cycle. In particular, our objectives were to employ a ridge regression approach that approximates best linear unbiased prediction of genetic effects, to compare cross validation with validation using genetic material of the subsequent breeding cycle, and to investigate the prospects of genome-based prediction in sugar beet breeding. We focused on the traits sugar content and standard molasses loss (ML) and used a set of 310 sugar beet lines to estimate genetic effects at 384 SNP markers. In cross validation, correlations >0.8 between observed and predicted test cross performance were observed for both traits. However, in validation with 56 lines from the next breeding cycle, a correlation of 0.8 could be observed only for sugar content; for standard ML the correlation dropped to 0.4. We found that ridge regression based on preliminary estimates of the heritability provided a very good approximation of best linear unbiased prediction and was not accompanied by a loss in prediction accuracy. We conclude that prediction accuracy assessed with cross validation within one cycle of a breeding program cannot be used as an indicator of the accuracy of predicting lines of the next cycle. Prediction of lines of the next cycle seems promising for traits with high heritabilities.
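Ridge regression, the estimator used above, augments least squares with a penalty λ that shrinks effect estimates toward zero. For a single centred predictor with no intercept the solution collapses to one line, which makes the shrinkage visible; this toy special case is my illustration, not the study's multi-marker BLUP machinery:

```python
def ridge_1d(x, y, lam):
    """Ridge estimate for one predictor, no intercept.

    Minimises sum((y_i - w*x_i)^2) + lam * w^2, whose closed form is
    w = sum(x*y) / (sum(x^2) + lam); lam = 0 recovers ordinary least squares.
    """
    sxy = sum(a * b for a, b in zip(x, y))
    sxx = sum(a * a for a in x)
    return sxy / (sxx + lam)

x, y = [1.0, 2.0, 3.0], [2.0, 4.0, 6.0]  # noiseless data with slope 2
w_ols = ridge_1d(x, y, lam=0.0)          # unpenalised estimate: 2.0
w_ridge = ridge_1d(x, y, lam=14.0)       # penalised estimate shrunk to 1.0
```

With hundreds of SNP effects estimated from a few hundred lines, this shrinkage is what keeps the marker effects estimable at all; the heritability-based choice of λ mentioned in the abstract is what makes the ridge solution approximate BLUP.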

  14. Monitoring by forward scatter radar techniques: an improved second-order analytical model

    NASA Astrophysics Data System (ADS)

    Falconi, Marta Tecla; Comite, Davide; Galli, Alessandro; Marzano, Frank S.; Pastina, Debora; Lombardo, Pierfrancesco

    2017-10-01

    In this work, a second-order phase approximation is introduced to provide an improved analytical model of the signal received in forward scatter radar systems. A typical configuration with a rectangular metallic object illuminated while crossing the baseline, in far- or near-field conditions, is considered. An improved second-order model is compared with a simplified one already proposed by the authors and based on a paraxial approximation. A phase error analysis is carried out to investigate benefits and limitations of the second-order modeling. The results are validated by developing full-wave numerical simulations implementing the relevant scattering problem on a commercial tool.

  15. Joint fMRI analysis and subject clustering using sparse dictionary learning

    NASA Astrophysics Data System (ADS)

    Kim, Seung-Jun; Dontaraju, Krishna K.

    2017-08-01

Multi-subject fMRI data analysis methods based on sparse dictionary learning are proposed. In addition to identifying the component spatial maps by exploiting the sparsity of the maps, clusters of the subjects are learned by postulating that the fMRI volumes admit a subspace clustering structure. Furthermore, in order to tune the associated hyper-parameters systematically, a cross-validation strategy is developed based on entry-wise sampling of the fMRI dataset. Efficient algorithms for solving the proposed constrained dictionary learning formulations are developed. Numerical tests performed on synthetic fMRI data show promising results and provide insights into the proposed technique.

  16. Experimental light scattering by ultrasonically controlled small particles - Implications for Planetary Science

    NASA Astrophysics Data System (ADS)

    Gritsevich, M.; Penttilä, A.; Maconi, G.; Kassamakov, I.; Markkanen, J.; Martikainen, J.; Väisänen, T.; Helander, P.; Puranen, T.; Salmi, A.; Hæggström, E.; Muinonen, K.

    2017-09-01

    We present the results obtained with our newly developed 3D scatterometer - a setup for precise multi-angular measurements of light scattered by mm- to µm-sized samples held in place by sound. These measurements are cross-validated against the modeled light-scattering characteristics of the sample, i.e., the intensity and the degree of linear polarization of the reflected light, calculated with state-of-the-art electromagnetic techniques. We demonstrate a unique non-destructive approach to derive the optical properties of small grain samples which facilitates research on highly valuable planetary materials, such as samples returned from space missions or rare meteorites.

  17. Medical application of artificial immune recognition system (AIRS): diagnosis of atherosclerosis from carotid artery Doppler signals.

    PubMed

    Latifoğlu, Fatma; Kodaz, Halife; Kara, Sadik; Güneş, Salih

    2007-08-01

This study was conducted to distinguish between patients with atherosclerosis and healthy subjects. Hence, we employed the maximum envelope of the carotid artery Doppler sonograms derived from the Fast Fourier Transformation-Welch method and the Artificial Immune Recognition System (AIRS). The fuzzy appearance of carotid artery Doppler signals makes physicians suspicious about the existence of disease and sometimes causes false diagnoses. Our technique circumvents this problem by using AIRS to decide, assisting the physician in making the final judgment with confidence. AIRS reached 99.29% classification accuracy using 10-fold cross validation. The results show that the proposed method classified the Doppler signals successfully.
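The 99.29% figure above is a 10-fold cross-validation estimate: the data are partitioned into ten folds, each fold is held out once as the test set while the classifier trains on the other nine, and the ten accuracies are averaged. A minimal fold generator, independent of AIRS itself:

```python
def kfold_indices(n, k):
    """Partition indices 0..n-1 into k folds; yield (train, test) pairs.

    Every index lands in exactly one test fold, so each sample is
    evaluated exactly once across the k rounds.
    """
    folds = [list(range(i, n, k)) for i in range(k)]
    for i, test in enumerate(folds):
        train = [j for f_idx, f in enumerate(folds) if f_idx != i for j in f]
        yield sorted(train), test

# Ten folds over 50 samples; the test folds tile the whole index range.
splits = list(kfold_indices(50, 10))
```

In practice the indices would be shuffled (or stratified by class) before folding; the stride-based assignment here is kept deterministic so the partition property is easy to check.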

  18. Validation and divergence of the activation energy barrier crossing transition at the AOT/lecithin reverse micellar interface.

    PubMed

    Narayanan, S Shankara; Sinha, Sudarson Sekhar; Sarkar, Rupa; Pal, Samir Kumar

    2008-03-13

In this report, the validity and divergence of the activation energy barrier crossing model for the bound-type to free-type water transition at the interface of the AOT/lecithin mixed reverse micelle (RM) has been investigated for the first time over a wide range of temperatures by time-resolved solvation of fluorophores. Here, picosecond-resolved solvation dynamics of two fluorescent probes, ANS (1-anilino-8-naphthalenesulfonic acid, ammonium salt) and Coumarin 500 (C-500), in the mixed RM have been carefully examined at 293, 313, 328, and 343 K. Using the dynamic light scattering (DLS) technique, the size of the mixed RMs was found to change insignificantly with temperature. The solvation process at the reverse micellar interface has been found to be of the activation energy barrier crossing type, in which interface-bound-type water molecules are converted into free-type water molecules. The activation energies, Ea, calculated for ANS and C-500 are 7.4 and 3.9 kcal mol(-1), respectively, which are in good agreement with those obtained by molecular dynamics simulation studies. However, deviation from regular Arrhenius-type behavior was observed for ANS around 343 K, which has been attributed to the spatial heterogeneity of the probe environments. Time-resolved fluorescence anisotropy decay of the probes indicates that the dyes occupy a range of locations in the RM. With increasing temperature, the overall anisotropy decay becomes faster, revealing the lability of the microenvironment at elevated temperatures.
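Activation energies like the 7.4 and 3.9 kcal/mol quoted above follow from the Arrhenius relation k = A·exp(−Ea/RT): rates measured at two temperatures give Ea = R·ln(k2/k1)/(1/T1 − 1/T2). A sketch of that two-point estimate; the rate values are generated for the round trip, not taken from the paper:

```python
import math

R_KCAL = 1.987e-3  # gas constant, kcal mol^-1 K^-1

def activation_energy(k1, t1, k2, t2):
    """Two-point Arrhenius estimate: Ea = R * ln(k2/k1) / (1/t1 - 1/t2)."""
    return R_KCAL * math.log(k2 / k1) / (1.0 / t1 - 1.0 / t2)

# Round trip: synthesise rates consistent with Ea = 7.4 kcal/mol at the
# study's temperature extremes, then recover Ea from them.
ea_true, t1, t2 = 7.4, 293.0, 343.0
k1 = 1.0
k2 = k1 * math.exp(-ea_true / R_KCAL * (1.0 / t2 - 1.0 / t1))
ea_est = activation_energy(k1, t1, k2, t2)
```

An Arrhenius plot of ln(k) against 1/T makes the reported deviation visible directly: barrier-crossing kinetics give a straight line, and the ANS data falling off that line near 343 K is the divergence the title refers to.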

  19. Derivation and Cross-Validation of Cutoff Scores for Patients With Schizophrenia Spectrum Disorders on WAIS-IV Digit Span-Based Performance Validity Measures.

    PubMed

    Glassmire, David M; Toofanian Ross, Parnian; Kinney, Dominique I; Nitch, Stephen R

    2016-06-01

    Two studies were conducted to identify and cross-validate cutoff scores on the Wechsler Adult Intelligence Scale-Fourth Edition Digit Span-based embedded performance validity (PV) measures for individuals with schizophrenia spectrum disorders. In Study 1, normative scores were identified on Digit Span-embedded PV measures among a sample of patients (n = 84) with schizophrenia spectrum diagnoses who had no known incentive to perform poorly and who put forth valid effort on external PV tests. Previously identified cutoff scores resulted in unacceptable false positive rates and lower cutoff scores were adopted to maintain specificity levels ≥90%. In Study 2, the revised cutoff scores were cross-validated within a sample of schizophrenia spectrum patients (n = 96) committed as incompetent to stand trial. Performance on Digit Span PV measures was significantly related to Full Scale IQ in both studies, indicating the need to consider the intellectual functioning of examinees with psychotic spectrum disorders when interpreting scores on Digit Span PV measures. © The Author(s) 2015.

  20. Validation of the Economic and Health Outcomes Model of Type 2 Diabetes Mellitus (ECHO-T2DM).

    PubMed

    Willis, Michael; Johansen, Pierre; Nilsson, Andreas; Asseburg, Christian

    2017-03-01

    The Economic and Health Outcomes Model of Type 2 Diabetes Mellitus (ECHO-T2DM) was developed to address study questions pertaining to the cost-effectiveness of treatment alternatives in the care of patients with type 2 diabetes mellitus (T2DM). Naturally, the usefulness of a model is determined by the accuracy of its predictions. A previous version of ECHO-T2DM was validated against actual trial outcomes and the model predictions were generally accurate. However, there have been recent upgrades to the model, which modify model predictions and necessitate an update of the validation exercises. The objectives of this study were to extend the methods available for evaluating model validity, to conduct a formal model validation of ECHO-T2DM (version 2.3.0) in accordance with the principles espoused by the International Society for Pharmacoeconomics and Outcomes Research (ISPOR) and the Society for Medical Decision Making (SMDM), and secondarily to evaluate the relative accuracy of four sets of macrovascular risk equations included in ECHO-T2DM. We followed the ISPOR/SMDM guidelines on model validation, evaluating face validity, verification, cross-validation, and external validation. Model verification involved 297 'stress tests', in which specific model inputs were modified systematically to ascertain correct model implementation. Cross-validation consisted of a comparison between ECHO-T2DM predictions and those of the seminal National Institutes of Health model. In external validation, study characteristics were entered into ECHO-T2DM to replicate the clinical results of 12 studies (including 17 patient populations), and model predictions were compared to observed values using established statistical techniques as well as measures of average prediction error, separately for the four sets of macrovascular risk equations supported in ECHO-T2DM. Sub-group analyses were conducted for dependent vs. independent outcomes and for microvascular vs. macrovascular vs. 
mortality endpoints. All stress tests were passed. ECHO-T2DM replicated the National Institutes of Health cost-effectiveness application with numerically similar results. In external validation of ECHO-T2DM, model predictions agreed well with observed clinical outcomes. For all sets of macrovascular risk equations, the results were close to the intercept and slope coefficients corresponding to a perfect match, resulting in high R² and failure to reject concordance using an F test. The results were similar for sub-groups of dependent and independent validation, with some degree of under-prediction of macrovascular events. ECHO-T2DM continues to match health outcomes in clinical trials in T2DM, with prediction accuracy similar to other leading models of T2DM.
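The concordance check described above regresses observed outcomes on predicted ones and asks whether the fitted line is compatible with intercept 0 and slope 1, the line a perfectly calibrated model would produce. A bare-bones least-squares fit illustrating that comparison (values invented):

```python
def fit_line(x, y):
    """Ordinary least squares for y ~ a + b*x; returns (intercept, slope)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    slope = sxy / sxx
    return my - slope * mx, slope

# Perfectly calibrated predictions sit on the identity line:
predicted = [0.10, 0.20, 0.30, 0.40]   # hypothetical predicted event rates
intercept, slope = fit_line(predicted, predicted)
```

Systematic under-prediction of macrovascular events, as reported in the sub-group analysis, would surface here as a fitted slope above 1 (or a positive intercept) when observed rates are regressed on predicted ones.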

  1. Application of Multivariable Analysis and FTIR-ATR Spectroscopy to the Prediction of Properties in Campeche Honey

    PubMed Central

    Pat, Lucio; Ali, Bassam; Guerrero, Armando; Córdova, Atl V.; Garduza, José P.

    2016-01-01

Attenuated total reflectance-Fourier transform infrared spectrometry combined with chemometric models was used for the determination of physicochemical properties (pH, redox potential, free acidity, electrical conductivity, moisture, total soluble solids (TSS), ash, and HMF) in honey samples. The reference values of 189 honey samples of different botanical origin were determined using Association of Official Analytical Chemists (AOAC, 1990), Codex Alimentarius (2001), and International Honey Commission (2002) methods. Multivariate calibration models were built using partial least squares (PLS) for the measurands studied. The developed models were validated using cross-validation and external validation; several statistical parameters were obtained to determine the robustness of the calibration models: the optimum number of principal components (PCs), the standard error of cross-validation (SECV), the coefficient of determination of cross-validation (R²cal), the standard error of validation (SEP), the coefficient of determination for external validation (R²val), and the coefficient of variation (CV). The prediction accuracy for pH, redox potential, electrical conductivity, moisture, TSS, and ash was good, while for free acidity and HMF it was poor. The results demonstrate that attenuated total reflectance-Fourier transform infrared spectrometry is a valuable, rapid, and nondestructive tool for the quantification of physicochemical properties of honey. PMID:28070445

  2. Validity and cultural equivalence of the standard Greene Climacteric Scale in Hong Kong.

    PubMed

    Chen, Run Qiu; Davis, Susan R; Wong, Chit Ming; Lam, Tai Hing

    2010-01-01

The aim of this study was to translate the standard Greene Climacteric Scale (GCS) and a urogenital symptom scale into colloquial Chinese (Hong Kong) and test their validity and reliability in Hong Kong Chinese women. The scales were translated with standard techniques, and cross-cultural construct validity, internal consistency, test-retest reliability, and responsiveness were tested on samples of women aged 40 to 60 years recruited from the community. A total of 611 women, with a mean (SD) age of 48.9 (5.3) years, provided completed scales for the study. Confirmatory factor analysis demonstrated construct validity of the translated standard GCS. The items were found to have good homogeneity in measuring the scale concepts (Cronbach alpha > 0.7). However, the three-item urogenital scale had poor internal consistency (Cronbach alpha = 0.43), and a combination of this scale with the standard GCS resulted in a reduced model fit to the data. Test-retest reliability for the GCS was good in the women recruited for a retest (n = 52). The translated GCS was found to be responsive to change over time (effect size, 0.59; n = 19). The Chinese (Hong Kong) version of the standard GCS is a valid and culturally equivalent instrument. Our data do not support adding the urogenital scale to the standard GCS; measurement of urogenital symptoms is a subject for further study.
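Cronbach's alpha, the internal-consistency statistic cited above (and throughout these records), compares the sum of the individual item variances with the variance of the total score: alpha = k/(k−1) · (1 − Σvar(item)/var(total)). A compact sketch with invented scores:

```python
def cronbach_alpha(items):
    """items: one list of scores per item, all over the same respondents.

    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score)).
    """
    def var(v):  # sample variance (n - 1 denominator)
        m = sum(v) / len(v)
        return sum((x - m) ** 2 for x in v) / (len(v) - 1)

    k = len(items)
    totals = [sum(resp) for resp in zip(*items)]  # per-respondent total score
    return k / (k - 1) * (1.0 - sum(var(it) for it in items) / var(totals))

# Two items in perfect agreement give alpha = 1.0 (invented data):
alpha = cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4]])
```

The contrast in the abstract is exactly what alpha captures: items that move together inflate the total-score variance relative to the item variances (alpha > 0.7), while the weakly related three-item urogenital scale leaves them comparable (alpha = 0.43).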

  3. Reliability and Validity Study of the Chamorro Assisted Gait Scale for People with Sprained Ankles, Walking with Forearm Crutches

    PubMed Central

    Ridao-Fernández, Carmen; Ojeda, Joaquín; Benítez-Lugo, Marisa; Sevillano, José Luis

    2016-01-01

Objective The aim of this study was to design and validate a functional assessment scale for assisted gait with forearm crutches (Chamorro Assisted Gait Scale—CHAGS) and to assess its reliability in people with sprained ankles. Design Thirty subjects who suffered a sprained ankle (anterior talofibular ligament, first and second degree) were included in the study. A modified Delphi technique was used to establish content validity. The selected items were: pelvic and scapular girdle dissociation (1), deviation of the center of gravity (2), crutch inclination (3), step rhythm (4), symmetry of step length (5), cross support (6), simultaneous support of foot and crutch (7), forearm off (8), facing forward (9) and fluency (10). Two raters each twice viewed video recordings of the subjects' gait. Criterion-related validity was determined by the correlation between CHAGS and the Coding of eight criteria of qualitative gait analysis (Viel Coding). Internal consistency and inter- and intra-rater reliability were also tested. Results CHAGS showed a high, negative correlation with Viel Coding. We obtained good internal consistency, the intra-class correlation coefficients ranged between 0.97 and 0.99, and the minimal detectable changes were acceptable. Conclusion The CHAGS scale is a valid and reliable tool for assessing assisted gait with crutches in people with sprained ankles performing partial weight relief of the lower limbs. PMID:27168236

  4. A Validation Study of the Impression Replica Technique.

    PubMed

    Segerström, Sofia; Wiking-Lima de Faria, Johanna; Braian, Michael; Ameri, Arman; Ahlgren, Camilla

    2018-04-17

To validate the well-known and often-used impression replica technique for measuring the fit between a preparation and a crown in vitro. The validation consisted of three steps. First, a measuring instrument was validated to elucidate its accuracy. Second, a specimen consisting of male and female counterparts was created and validated with the measuring instrument. Calculations were made for the exact values of three gaps between the male and female parts. Finally, impression replicas were produced of the specimen gaps and sectioned into four pieces. The replicas were then measured with the use of a light microscope. The values obtained from measuring the specimen were then compared with the values obtained from the impression replicas, and the technique was thereby validated. The impression replica technique overestimated all measured gaps. Depending on the location of the three measuring sites, the difference between the specimen and the impression replicas varied from 47 to 130 μm. The impression replica technique overestimates gaps within the range of 2% to 11%. The validation of the replica technique enables the method to be used as a reference when testing other methods for evaluating fit in dentistry. © 2018 by the American College of Prosthodontists.

  5. Developing a model of competence in the operating theatre: psychometric validation of the perceived perioperative competence scale-revised.

    PubMed

    Gillespie, Brigid M; Polit, Denise F; Hamlin, Lois; Chaboyer, Wendy

    2012-01-01

This paper describes the development and validation of the Perceived Perioperative Competence Scale-Revised (PPCS-R). There is a lack of psychometrically sound self-assessment tools to measure nurses' perceived competence in the operating room. Content validity was established by a panel of international experts, and the original 98-item scale was pilot tested with 345 nurses in Queensland, Australia. Following the removal of several items, a national sample that included all 3209 nurses who were members of the Australian College of Operating Room Nurses was surveyed using the 94-item version. Psychometric testing assessed content validity using exploratory factor analysis, internal consistency using Cronbach's alpha, and construct validity using the "known groups" technique. During item reduction, several preliminary factor analyses were performed on two random halves of the sample (n=550). Usable data for psychometric assessment were obtained from 1122 nurses. The original 94-item scale was reduced to 40 items. The final factor analysis using the entire sample resulted in a 40-item, six-factor solution. Cronbach's alpha for the 40-item scale was .96. Construct validation demonstrated significant differences (p<.0001) in perceived competence scores relative to years of operating room experience and receipt of specialty education. On the basis of these results, the psychometric properties of the PPCS-R were considered encouraging. Further testing of the tool in different samples of operating room nurses is necessary to enable cross-cultural comparisons. Copyright © 2011 Elsevier Ltd. All rights reserved.

  6. Characterization of Nitinol Laser-Weld Joints by Nondestructive Testing

    NASA Astrophysics Data System (ADS)

    Wohlschlögel, Markus; Gläßel, Gunter; Sanchez, Daniela; Schüßler, Andreas; Dillenz, Alexander; Saal, David; Mayr, Peter

    2015-12-01

Joining technology is an integral part of today's Nitinol medical device manufacturing. Besides crimping and riveting, laser welding is often applied to join components made from Nitinol to Nitinol, as well as Nitinol components to dissimilar materials. Other Nitinol joining techniques include adhesive bonding, soldering, and brazing. Typically, the performance of joints is assessed by destructive mechanical testing on a process-validation basis. In this study, a nondestructive testing method—photothermal radiometry—is applied to characterize small Nitinol laser-weld joints used to connect two wire ends via a sleeve. Two different wire diameters are investigated. Effective joint connection cross sections are visualized using metallography techniques. Results of the nondestructive testing are correlated with data from destructive torsion testing, in which the maximum torque at fracture is evaluated for the same joints, and criteria for differentiating good from poor laser-weld quality by nondestructive testing are established.

  7. Soft computing techniques toward modeling the water supplies of Cyprus.

    PubMed

    Iliadis, L; Maris, F; Tachos, S

    2011-10-01

This research effort aims at the application of soft computing techniques to water resources management. More specifically, the target is the development of reliable soft computing models capable of estimating the water supply for the case of the "Germasogeia" mountainous watersheds in Cyprus. Initially, ε-Regression Support Vector Machine (ε-RSVM) and fuzzy weighted ε-RSVM models have been developed that accept five input parameters. At the same time, reliable artificial neural networks have been developed to perform the same job. The 5-fold cross validation approach has been employed in order to eliminate bad local behaviors and to produce a more representative training data set. Thus, the fuzzy weighted Support Vector Regression (SVR) combined with the fuzzy partition has been employed in an effort to enhance the quality of the results. Several rational and reliable models have been produced that can enhance the efficiency of water policy designers. Copyright © 2011 Elsevier Ltd. All rights reserved.

  8. Dual-Polarization Ku-Band Compact Spaceborne Antenna Based on Dual-Reflectarray Optics †

    PubMed Central

    Tienda, Carolina; Encinar, Jose A.; Barba, Mariano

    2018-01-01

This article demonstrates an accurate analysis technique for dual-reflectarray antennas that takes into account the angle of incidence of the impinging electric field on the main reflectarray cells. The reflected field on the sub- and main-reflectarray surfaces is computed using the Method of Moments in the spectral domain, assuming local periodicity. The sub-reflectarray is divided into groups of elements, and the field radiated by each group is used to compute the incident and reflected fields on the main reflectarray cells. A 50-cm demonstrator in Ku-band that provides European coverage has been designed, manufactured and tested to validate the analysis technique. The measured radiation patterns match the simulations and fulfill the coverage requirements, achieving a cross-polar discrimination better than 25 dB in the frequency range 12.975–14.25 GHz. PMID:29621155

  9. Exploring QSARs of the interaction of flavonoids with GABA (A) receptor using MLR, ANN and SVM techniques.

    PubMed

    Deeb, Omar; Shaik, Basheerulla; Agrawal, Vijay K

    2014-10-01

Quantitative Structure-Activity Relationship (QSAR) models for the binding affinity constants (log Ki) of 78 flavonoid ligands towards the benzodiazepine site of the GABA (A) receptor complex were calculated using two machine learning methods: artificial neural network (ANN) and support vector machine (SVM) techniques. The models obtained were compared with those obtained using multiple linear regression (MLR) analysis. The descriptor selection and model building were performed with 10-fold cross-validation using the training data set. The SVM and MLR coefficient of determination values are 0.944 and 0.879, respectively, for the training set and are higher than those of the ANN models. Though the SVM model shows improved training-set fitting, the ANN model was superior to SVM and MLR in predicting the test set. A randomization test was employed to check the suitability of the models.
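The coefficients of determination compared above follow the usual definition R² = 1 − SSres/SStot, and the randomization (y-scrambling) check refits the model after shuffling the activities, expecting R² to collapse toward zero if the original fit was genuine. The metric itself in a few lines; the example values are invented:

```python
def r_squared(y_true, y_pred):
    """Coefficient of determination: 1 - SS_residual / SS_total."""
    m = sum(y_true) / len(y_true)
    ss_tot = sum((y - m) ** 2 for y in y_true)
    ss_res = sum((y - p) ** 2 for y, p in zip(y_true, y_pred))
    return 1.0 - ss_res / ss_tot

# A perfect fit scores 1.0; predicting the mean everywhere scores 0.0.
r2_perfect = r_squared([1.0, 2.0, 3.0], [1.0, 2.0, 3.0])
r2_mean = r_squared([1.0, 2.0, 3.0], [2.0, 2.0, 2.0])
```

The gap the abstract highlights, SVM beating ANN on the training set but losing on the test set, is the classic signature of overfitting, which is precisely why the descriptor selection was wrapped in 10-fold cross-validation.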

  10. Rapid prediction of total petroleum hydrocarbons concentration in contaminated soil using vis-NIR spectroscopy and regression techniques.

    PubMed

    Douglas, R K; Nawar, S; Alamar, M C; Mouazen, A M; Coulon, F

    2018-03-01

Visible and near infrared spectrometry (vis-NIRS) coupled with data mining techniques can offer fast and cost-effective quantitative measurement of total petroleum hydrocarbons (TPH) in contaminated soils. However, the literature shows significant differences in vis-NIRS performance between linear and non-linear calibration methods. This study compared the performance of linear partial least squares regression (PLSR) with a non-linear random forest (RF) regression for the calibration of vis-NIRS when analysing TPH in soils. 88 soil samples (3 uncontaminated and 85 contaminated) collected from three sites located in the Niger Delta were scanned using an analytical spectral device (ASD) spectrophotometer (350-2500 nm) in diffuse reflectance mode. Sequential ultrasonic solvent extraction-gas chromatography (SUSE-GC) was used as the reference quantification method for TPH, which equals the sum of the aliphatic and aromatic fractions ranging between C10 and C35. Prior to model development, spectra were subjected to pre-processing including noise cut, maximum normalization, first derivative and smoothing. Then 65 samples were selected as the calibration set and the remaining 20 samples as the validation set. Both the vis-NIR spectrometry and gas chromatography profiles of the 85 soil samples were subjected to RF and PLSR with leave-one-out cross-validation (LOOCV) for the calibration models. Results showed that the RF calibration model, with a coefficient of determination (R²) of 0.85, a root mean square error of prediction (RMSEP) of 68.43 mg kg⁻¹, and a residual prediction deviation (RPD) of 2.61, outperformed PLSR (R² = 0.63, RMSEP = 107.54 mg kg⁻¹ and RPD = 2.55) in cross-validation. These results indicate that the RF modelling approach accounts for the non-linearity of the soil spectral responses, hence providing significantly higher prediction accuracy compared with the linear PLSR. It is recommended to adopt vis-NIRS coupled with the RF modelling approach as a portable and cost-effective method for the rapid quantification of TPH in soils. Copyright © 2017 Elsevier B.V. All rights reserved.
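Two of the figures of merit above are mechanical to compute: RMSEP is the root-mean-square error over the held-out predictions, and RPD divides the standard deviation of the reference values by that error (values around 2 or above are conventionally read as usable prediction). A sketch with invented reference/predicted pairs:

```python
def rmsep(y_true, y_pred):
    """Root-mean-square error of prediction over a validation set."""
    n = len(y_true)
    return (sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n) ** 0.5

def rpd(y_true, y_pred):
    """Residual prediction deviation: SD of reference values / RMSEP."""
    n = len(y_true)
    m = sum(y_true) / n
    sd = (sum((t - m) ** 2 for t in y_true) / (n - 1)) ** 0.5
    return sd / rmsep(y_true, y_pred)

# Invented reference vs predicted TPH values (mg/kg), not the study data:
ref = [100.0, 200.0, 300.0, 400.0]
pred = [110.0, 190.0, 310.0, 390.0]
```

Because RPD scales the error by the spread of the reference data, it explains how RF and PLSR can have similar RPDs (2.61 vs 2.55) despite quite different RMSEPs: the two statistics were computed over differently dispersed prediction sets.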

  11. Noninvasive Diagnosis of Nonalcoholic Fatty Liver Disease and Quantification of Liver Fat Using a New Quantitative Ultrasound Technique.

    PubMed

    Lin, Steven C; Heba, Elhamy; Wolfson, Tanya; Ang, Brandon; Gamst, Anthony; Han, Aiguo; Erdman, John W; O'Brien, William D; Andre, Michael P; Sirlin, Claude B; Loomba, Rohit

    2015-07-01

    Liver biopsy analysis is the standard method used to diagnose nonalcoholic fatty liver disease (NAFLD). Advanced magnetic resonance imaging is a noninvasive procedure that can accurately diagnose and quantify steatosis, but is expensive. Conventional ultrasound is more accessible but identifies steatosis with low levels of sensitivity, specificity, and quantitative accuracy, and results vary among technicians. A new quantitative ultrasound (QUS) technique can identify steatosis in animal models. We assessed the accuracy of QUS in the diagnosis and quantification of hepatic steatosis, comparing findings with those from magnetic resonance imaging proton density fat fraction (MRI-PDFF) analysis as a reference. We performed a prospective, cross-sectional analysis of a cohort of adults (N = 204) with NAFLD (MRI-PDFF, ≥5%) and without NAFLD (controls). Subjects underwent MRI-PDFF and QUS analyses of the liver on the same day at the University of California, San Diego, from February 2012 through March 2014. QUS parameters and backscatter coefficient (BSC) values were calculated. Patients were assigned randomly to training (n = 102; mean age, 51 ± 17 y; mean body mass index, 31 ± 7 kg/m²) and validation (n = 102; mean age, 49 ± 17 y; body mass index, 30 ± 6 kg/m²) groups; 69% of patients in each group had NAFLD. BSC (range, 0.00005-0.25 1/cm-sr) correlated with MRI-PDFF (Spearman ρ = 0.80; P < .0001). In the training group, the BSC analysis identified patients with NAFLD with an area under the curve value of 0.98 (95% confidence interval, 0.95-1.00; P < .0001). The optimal BSC cut-off value identified patients with NAFLD in the training and validation groups with 93% and 87% sensitivity, 97% and 91% specificity, 86% and 76% negative predictive values, and 99% and 95% positive predictive values, respectively. QUS measurements of BSC can accurately diagnose and quantify hepatic steatosis, based on a cross-sectional analysis that used MRI-PDFF as the reference.
With further validation, QUS could be an inexpensive, widely available method to screen the general or at-risk population for NAFLD. Copyright © 2015 AGA Institute. Published by Elsevier Inc. All rights reserved.
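The sensitivity/specificity/PPV/NPV figures reported above all follow from a 2x2 confusion table at the chosen BSC cutoff. A minimal sketch, with made-up BSC values and a placeholder cutoff, not the study's data:

```python
import numpy as np

def diagnostic_metrics(bsc, has_disease, cutoff):
    """2x2 confusion-table metrics for a 'positive if BSC >= cutoff' rule."""
    pred = bsc >= cutoff
    tp = np.sum(pred & has_disease)
    tn = np.sum(~pred & ~has_disease)
    fp = np.sum(pred & ~has_disease)
    fn = np.sum(~pred & has_disease)
    return {"sensitivity": tp / (tp + fn),   # true-positive rate
            "specificity": tn / (tn + fp),   # true-negative rate
            "ppv": tp / (tp + fp),           # positive predictive value
            "npv": tn / (tn + fn)}           # negative predictive value

# illustrative BSC values (1/cm-sr) and labels; the 0.005 cutoff is hypothetical
bsc = np.array([0.0001, 0.002, 0.01, 0.05, 0.2, 0.0003, 0.004, 0.08])
has_nafld = np.array([False, False, True, True, True, False, False, True])
print(diagnostic_metrics(bsc, has_nafld, cutoff=0.005))
```

In practice the cutoff is chosen on the training group (e.g. from the ROC curve) and then applied unchanged to the validation group, as in the study design above.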

  12. Cross-Cultural Adaptation and Initial Validation of the Stroke-Specific Quality of Life Scale into the Yoruba Language

    ERIC Educational Resources Information Center

    Akinpelu, Aderonke O.; Odetunde, Marufat O.; Odole, Adesola C.

    2012-01-01

    Stroke-Specific Quality of Life 2.0 (SS-QoL 2.0) scale is used widely and has been cross-culturally adapted to many languages. This study aimed at the cross-cultural adaptation of SS-QoL 2.0 to Yoruba, the indigenous language of south-western Nigeria, and to carry out an initial investigation on its validity. English SS-QoL 2.0 was first adapted…

  13. The DC-8 Submillimeter-Wave Cloud Ice Radiometer

    NASA Technical Reports Server (NTRS)

    Walter, Steven J.; Batelaan, Paul; Siegel, Peter; Evans, K. Franklin; Evans, Aaron; Balachandra, Balu; Gannon, Jade; Guldalian, John; Raz, Guy; Shea, James

    2000-01-01

    An airborne radiometer is being developed to demonstrate the capability of radiometry at submillimeter-wavelengths to characterize cirrus clouds. At these wavelengths, cirrus clouds scatter upwelling radiation from water vapor in the lower troposphere. Radiometric measurements made at multiple widely spaced frequencies permit flux variations caused by changes in scattering due to crystal size to be distinguished from changes in cloud ice content. Measurements at dual polarizations can also be used to constrain the mean crystal shape. An airborne radiometer measuring the upwelling submillimeter-wave flux should then be able to retrieve both bulk and microphysical cloud properties. The radiometer is being designed to make measurements at four frequencies (183 GHz, 325 GHz, 448 GHz, and 643 GHz) with dual-polarization capability at 643 GHz. The instrument is being developed for flight on NASA's DC-8 and will scan cross-track through an aircraft window. Measurements with this radiometer in combination with independent ground-based and airborne measurements will validate the submillimeter-wave radiometer retrieval techniques. The goal of this effort is to develop a technique to enable spaceborne characterization of cirrus, which will meet a key climate measurement need. The development of an airborne radiometer to validate cirrus retrieval techniques is a critical step toward development of space-based radiometers to investigate and monitor cirrus on a global scale. The radiometer development is a cooperative effort of the University of Colorado, Colorado State University, Swales Aerospace, and Jet Propulsion Laboratory and is funded by the NASA Instrument Incubator Program.

  14. Shape Optimization by Bayesian-Validated Computer-Simulation Surrogates

    NASA Technical Reports Server (NTRS)

    Patera, Anthony T.

    1997-01-01

    A nonparametric-validated, surrogate approach to optimization has been applied to the computational optimization of eddy-promoter heat exchangers and to the experimental optimization of a multielement airfoil. In addition to the baseline surrogate framework, a surrogate-Pareto framework has been applied to the two-criteria, eddy-promoter design problem. The Pareto analysis improves the predictability of the surrogate results, preserves generality, and provides a means to rapidly determine design trade-offs. Significant contributions have been made in the geometric description used for the eddy-promoter inclusions as well as to the surrogate framework itself. A level-set-based geometric description has been developed to define the shape of the eddy-promoter inclusions. The level-set technique allows for topology changes (from single-body, eddy-promoter configurations to two-body configurations) without requiring any additional logic. The continuity of the output responses for input variations that cross the boundary between topologies has been demonstrated. Input-output continuity is required for the straightforward application of surrogate techniques in which simplified, interpolative models are fitted through a construction set of data. The surrogate framework developed previously has been extended in a number of ways. First, the formulation for a general, two-output, two-performance-metric problem is presented. Surrogates are constructed and validated for the outputs. The performance metrics can be functions of both outputs, as well as explicitly of the inputs, and serve to characterize the design preferences. By segregating the outputs and the performance metrics, an additional level of flexibility is provided to the designer. The validated outputs can be used in future design studies and the error estimates provided by the output validation step still apply, and require no additional appeals to the expensive analysis.
Second, a candidate-based a posteriori error analysis capability has been developed which provides probabilistic error estimates on the true performance for a design randomly selected near the surrogate-predicted optimal design.

  15. Scale/TSUNAMI Sensitivity Data for ICSBEP Evaluations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T; Reed, Davis Allan; Lefebvre, Robert A

    2011-01-01

    The Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI) software developed at Oak Ridge National Laboratory (ORNL) as part of the Scale code system provides unique methods for code validation, gap analysis, and experiment design. For TSUNAMI analysis, sensitivity data are generated for each application and each existing or proposed experiment used in the assessment. The validation of diverse sets of applications requires potentially thousands of data files to be maintained and organized by the user, and a growing number of these files are available through the International Handbook of Evaluated Criticality Safety Benchmark Experiments (IHECSBE) distributed through the International Criticality Safety Benchmark Evaluation Program (ICSBEP). To facilitate the use of the IHECSBE benchmarks in rigorous TSUNAMI validation and gap analysis techniques, ORNL generated SCALE/TSUNAMI sensitivity data files (SDFs) for several hundred benchmarks for distribution with the IHECSBE. For the 2010 edition of the IHECSBE, the sensitivity data were generated using 238-group cross-section data based on ENDF/B-VII.0 for 494 benchmark experiments. Additionally, ORNL has developed a quality assurance procedure to guide the generation of Scale inputs and sensitivity data, as well as a graphical user interface to facilitate the use of sensitivity data in identifying experiments and applying them in validation studies.

  16. Urdu translation and validation of shorter version of Positive Affect and Negative Affect Schedule (PANAS) on Pakistani bank employees.

    PubMed

    Akhter, Noreen

    2017-10-01

    To translate, adapt and validate a shorter version of the positive affect and negative affect scale on Pakistani corporate employees. This cross-sectional study was conducted in the twin cities of Islamabad and Rawalpindi from October 2014 to December 2015. The study was completed in two independent parts. In part one, the scale was translated by forward translation. It was then pilot-tested and administered on customer services employees from commercial banks and the telecommunication sector. Data from the pilot study were analysed using exploratory factor analysis to extract the initial factor structure of the positive affect and negative affect scale. Part two comprised the main study. Commercial bank employees were included in the sample using a convenience sampling technique. Data from the main study were analysed using confirmatory factor analysis in order to establish the construct validity of the positive affect and negative affect scale. There were 145 participants in the first part of the study and 495 in the second. Results of confirmatory factor analysis confirmed the two-factor structure of the positive affect and negative affect scale, suggesting that the scale has two distinct domains, i.e. positive affect and negative affect. The shorter version of the positive affect and negative affect scale was found to be a valid and reliable measure.

  17. Recommendations for elaboration, transcultural adaptation and validation process of tests in Speech, Hearing and Language Pathology.

    PubMed

    Pernambuco, Leandro; Espelt, Albert; Magalhães, Hipólito Virgílio; Lima, Kenio Costa de

    2017-06-08

    To present a guide with recommendations for the translation, adaptation, elaboration and validation of tests in Speech and Language Pathology. The recommendations were based on international guidelines focusing on the elaboration, translation, cross-cultural adaptation and validation of tests. The recommendations were grouped into two charts, one with procedures for translation and transcultural adaptation and the other for obtaining evidence of validity, reliability and measures of accuracy of the tests. A guide with norms for the organization and systematization of the process of elaboration, translation, cross-cultural adaptation and validation of tests in Speech and Language Pathology was created.

  18. An Evaluation of the Cross-Cultural Validity of Holland's Theory: Career Choices by Workers in India.

    ERIC Educational Resources Information Center

    Leong, Frederick T. L.; Austin, James T.; Sekaran, Uma; Komarraju, Meera

    1998-01-01

    Natives of India (n=172) completed Holland's Vocational Preference Inventory and job satisfaction measures. The inventory did not exhibit high external validity with this population. Congruence, consistency, and differentiation did not predict job or occupational satisfaction, suggesting cross-cultural limits on Holland's theory. (SK)

  19. Psychometric Evaluation of the Exercise Identity Scale among Greek Adults and Cross-Cultural Validity

    ERIC Educational Resources Information Center

    Vlachopoulos, Symeon P.; Kaperoni, Maria; Moustaka, Frederiki C.; Anderson, Dean F.

    2008-01-01

    The present study reported on translating the Exercise Identity Scale (EIS: Anderson & Cychosz, 1994) into Greek and examining its psychometric properties and cross-cultural validity based on U.S. individuals' EIS responses. Using four samples comprising 33, 103, and 647 Greek individuals, including exercisers and nonexercisers, and a similar…

  20. Cross-Validation of FITNESSGRAM® Health-Related Fitness Standards in Hungarian Youth

    ERIC Educational Resources Information Center

    Laurson, Kelly R.; Saint-Maurice, Pedro F.; Karsai, István; Csányi, Tamás

    2015-01-01

    Purpose: The purpose of this study was to cross-validate FITNESSGRAM® aerobic and body composition standards in a representative sample of Hungarian youth. Method: A nationally representative sample (N = 405) of Hungarian adolescents from the Hungarian National Youth Fitness Study (ages 12-18.9 years) participated in an aerobic capacity assessment…

  1. Caregivers' Agreement and Validity of Indirect Functional Analysis: A Cross Cultural Evaluation across Multiple Problem Behavior Topographies

    ERIC Educational Resources Information Center

    Virues-Ortega, Javier; Segui-Duran, David; Descalzo-Quero, Alberto; Carnerero, Jose Julio; Martin, Neil

    2011-01-01

    The Motivation Assessment Scale is an aid for hypothesis-driven functional analysis. This study presents its Spanish cross-cultural validation while examining psychometric attributes not yet explored. The study sample comprised 80 primary caregivers of children with autism. Acceptability, scaling assumptions, internal consistency, factor…

  2. Cross-Cultural Validation of the Counselor Burnout Inventory in Hong Kong

    ERIC Educational Resources Information Center

    Shin, Hyojung; Yuen, Mantak; Lee, Jayoung; Lee, Sang Min

    2013-01-01

    This study investigated the cross-cultural validation of the Chinese translation of the Counselor Burnout Inventory (CBI) with a sample of school counselors in Hong Kong. Specifically, this study examined the CBI's factor structure using confirmatory factor analysis and calculated the effect size, to compare burnout scores among the counselors of…

  3. Cross-Validation of a Short Form of the Marlowe-Crowne Social Desirability Scale.

    ERIC Educational Resources Information Center

    Zook, Avery, II; Sipps, Gary J.

    1985-01-01

    Presents a cross-validation of Reynolds' short form of the Marlowe-Crowne Social Desirability Scale (N=233). Researchers administered 13 items as a separate entity, calculated Cronbach's Alpha for each sex, and computed test-retest correlation for one group. Concluded that the short form is a viable alternative. (Author/NRB)

  4. Selection of Marine Corps Drill Instructors

    DTIC Science & Technology

    1980-03-01

    Key-Construction and Cross-Validation Statistics for Drill Instructor School Performance Success Keys... Race, and School Attrition... Key-Construction and Cross-Validation Statistics for Drill... constructed form, the Alternation Ranking of Series Drill Instructors. In this form, DIs in a Series are ranked from highest to lowest in terms of their

  5. Studying Cross-Cultural Differences in Temperament in the First Year of Life: United States and Italy

    ERIC Educational Resources Information Center

    Montirosso, Rosario; Cozzi, Patrizia; Putnam, Samuel P.; Gartstein, Maria A.; Borgatti, Renato

    2011-01-01

    An Italian translation of the Infant Behavior Questionnaire-Revised (IBQ-R) was developed and evaluated with 110 infants, demonstrating satisfactory internal consistency, discriminant validity, and construct validity in the form of gender and age differences, as well as factorial integrity. Cross-cultural differences were subsequently evaluated…

  6. The Halpern Critical Thinking Assessment and Real-World Outcomes: Cross-National Applications

    ERIC Educational Resources Information Center

    Butler, Heather A.; Dwyer, Christopher P.; Hogan, Michael J.; Franco, Amanda; Rivas, Silvia F.; Saiz, Carlos; Almeida, Leandro S.

    2012-01-01

    The Halpern Critical Thinking Assessment (HCTA) is a reliable measure of critical thinking that has been validated with numerous qualitatively different samples and measures of academic success (Halpern, 2010a). This paper presents several cross-national applications of the assessment, and recent work to expand the validation of the HCTA with…

  7. γ production and neutron inelastic scattering cross sections for 76Ge

    NASA Astrophysics Data System (ADS)

    Rouki, C.; Domula, A. R.; Drohé, J. C.; Koning, A. J.; Plompen, A. J. M.; Zuber, K.

    2013-11-01

    The 2040.7-keV γ ray from the 69th excited state of 76Ge was investigated in the interest of Ge-based double-β-decay experiments like the Germanium Detector Array (GERDA) experiment. The predicted transition could interfere with valid 0νββ events at 2039.0 keV, creating false signals in large-volume 76Ge enriched detectors. The measurement was performed with the Gamma Array for Inelastic Neutron Scattering (GAINS) at the Geel Electron Linear Accelerator (GELINA) white neutron source, using the (n,n'γ) technique and focusing on the strongest γ rays originating from the level. Upper limits obtained for the production cross section of the 2040.7-keV γ ray showed no possible influence on GERDA data. Additional analysis of the data yielded high-resolution cross sections for the low-lying states of 76Ge and related γ rays, improving the accuracy and extending existing data for five transitions and five levels. The inelastic scattering cross section for 76Ge was determined for incident neutron energies up to 2.23 MeV, significantly increasing the energy range for which experimental data are available. Comparisons with model calculations using the talys code are presented indicating that accounting for the recently established asymmetric rotor structure should lead to an improved description of the data.

  8. A Comparison Study of Classifier Algorithms for Cross-Person Physical Activity Recognition

    PubMed Central

    Saez, Yago; Baldominos, Alejandro; Isasi, Pedro

    2016-01-01

    Physical activity is widely known to be one of the key elements of a healthy life. The many benefits of physical activity described in the medical literature include weight loss and reductions in the risk factors for chronic diseases. With the recent advances in wearable devices, such as smartwatches or physical activity wristbands, motion tracking sensors are becoming pervasive, which has led to an impressive growth in the amount of physical activity data available and an increasing interest in recognizing which specific activity a user is performing. Moreover, big data and machine learning are now cross-fertilizing each other in an approach called “deep learning”, which consists of massive artificial neural networks able to detect complicated patterns from enormous amounts of input data to learn classification models. This work compares various state-of-the-art classification techniques for automatic cross-person activity recognition under different scenarios that vary widely in how much information is available for analysis. We have incorporated deep learning by using Google’s TensorFlow framework. The data used in this study were acquired from PAMAP2 (Physical Activity Monitoring in the Ageing Population), a publicly available dataset containing physical activity data. To perform cross-person prediction, we used the leave-one-subject-out (LOSO) cross-validation technique. When working with large training sets, the best classifiers obtain very high average accuracies (e.g., 96% using extra randomized trees). However, when the data volume is drastically reduced (where available data are only 0.001% of the continuous data), deep neural networks performed the best, achieving 60% in overall prediction accuracy. We found that even when working with only approximately 22.67% of the full dataset, we can statistically obtain the same results as when working with the full dataset. 
This finding enables the design of more energy-efficient devices and facilitates cold starts and big data processing of physical activity records. PMID:28042838

  9. A Comparison Study of Classifier Algorithms for Cross-Person Physical Activity Recognition.

    PubMed

    Saez, Yago; Baldominos, Alejandro; Isasi, Pedro

    2016-12-30

    Physical activity is widely known to be one of the key elements of a healthy life. The many benefits of physical activity described in the medical literature include weight loss and reductions in the risk factors for chronic diseases. With the recent advances in wearable devices, such as smartwatches or physical activity wristbands, motion tracking sensors are becoming pervasive, which has led to an impressive growth in the amount of physical activity data available and an increasing interest in recognizing which specific activity a user is performing. Moreover, big data and machine learning are now cross-fertilizing each other in an approach called "deep learning", which consists of massive artificial neural networks able to detect complicated patterns from enormous amounts of input data to learn classification models. This work compares various state-of-the-art classification techniques for automatic cross-person activity recognition under different scenarios that vary widely in how much information is available for analysis. We have incorporated deep learning by using Google's TensorFlow framework. The data used in this study were acquired from PAMAP2 (Physical Activity Monitoring in the Ageing Population), a publicly available dataset containing physical activity data. To perform cross-person prediction, we used the leave-one-subject-out (LOSO) cross-validation technique. When working with large training sets, the best classifiers obtain very high average accuracies (e.g., 96% using extra randomized trees). However, when the data volume is drastically reduced (where available data are only 0.001% of the continuous data), deep neural networks performed the best, achieving 60% in overall prediction accuracy. We found that even when working with only approximately 22.67% of the full dataset, we can statistically obtain the same results as when working with the full dataset. 
This finding enables the design of more energy-efficient devices and facilitates cold starts and big data processing of physical activity records.
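The leave-one-subject-out (LOSO) protocol used in the study above maps directly onto scikit-learn's LeaveOneGroupOut splitter. A minimal sketch with synthetic sensor windows and placeholder subject IDs, not the PAMAP2 data:

```python
import numpy as np
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(120, 6))            # 120 sensor windows, 6 motion features
y = rng.integers(0, 3, size=120)         # 3 activity classes (placeholder labels)
subjects = np.repeat(np.arange(8), 15)   # 8 subjects, 15 windows each

# each fold holds out every window from one subject, so the classifier is
# always evaluated on a person it has never seen during training
scores = cross_val_score(ExtraTreesClassifier(n_estimators=100, random_state=0),
                         X, y, groups=subjects, cv=LeaveOneGroupOut())
print("per-subject accuracy:", np.round(scores, 2), "mean:", round(scores.mean(), 2))
```

Unlike plain k-fold, this split never mixes one person's windows between training and test, which is what makes the accuracy estimate "cross-person".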

  10. Application of a novel hybrid method for spatiotemporal data imputation: A case study of the Minqin County groundwater level

    NASA Astrophysics Data System (ADS)

    Zhang, Zhongrong; Yang, Xuan; Li, Hao; Li, Weide; Yan, Haowen; Shi, Fei

    2017-10-01

    The techniques for data analyses have been widely developed in past years; however, missing data still represent a ubiquitous problem in many scientific fields. In particular, dealing with missing spatiotemporal data presents an enormous challenge. Nonetheless, in recent years, a considerable amount of research has focused on spatiotemporal problems, making spatiotemporal missing data imputation methods increasingly indispensable. In this paper, a novel spatiotemporal hybrid method is proposed to verify and impute spatiotemporal missing values. This new method, termed SOM-FLSSVM, flexibly combines three advanced techniques: self-organizing feature map (SOM) clustering, the fruit fly optimization algorithm (FOA) and the least squares support vector machine (LSSVM). We employ a cross-validation (CV) procedure and an FOA swarm intelligence optimization strategy that can search the available parameters and determine the optimal imputation model. The spatiotemporal underground water data for Minqin County, China, were selected to test the reliability and imputation ability of SOM-FLSSVM. We carried out a validation experiment and compared three well-studied models with SOM-FLSSVM using missing data ratios from 0.1 to 0.8 in the same data set. The results demonstrate that the new hybrid method performs well in terms of both robustness and accuracy for spatiotemporal missing data.
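The validation design described above (mask a known fraction of the series, impute, score against the held-out truth) can be sketched as follows; a simple linear interpolator stands in for SOM-FLSSVM, and the groundwater series is synthetic:

```python
import numpy as np

def impute_linear(series, mask):
    """Fill masked entries by linear interpolation (stand-in imputer)."""
    idx = np.arange(series.size)
    filled = series.copy()
    filled[mask] = np.interp(idx[mask], idx[~mask], series[~mask])
    return filled

rng = np.random.default_rng(2)
t = np.linspace(0.0, 4.0 * np.pi, 200)
truth = np.sin(t) + 0.05 * rng.normal(size=t.size)   # synthetic "groundwater level"

for ratio in (0.1, 0.4, 0.8):
    mask = rng.random(t.size) < ratio     # hide this fraction of the series
    mask[[0, -1]] = False                 # keep the endpoints observed
    imputed = impute_linear(truth, mask)
    rmse = np.sqrt(np.mean((imputed[mask] - truth[mask]) ** 2))
    print(f"missing ratio {ratio:.1f}: imputation RMSE = {rmse:.3f}")
```

Scoring only the masked positions, as the study does across ratios 0.1 to 0.8, shows how an imputer degrades as less observed data remain.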

  11. Marketing blood drives to students: a case study.

    PubMed

    Leigh, Laurence; Bist, Michael; Alexe, Roxana

    2007-01-01

    The aim of this paper is to motivate blood donation among international students and demonstrate the applicability of marketing techniques in the health care sector. The paper uses a combination of focus groups and a questionnaire-based survey. The paper finds that donors primarily find gratification from their altruistic acts through awareness of their contribution to saving lives. Receiving information on how each individual donation is used is seen as a powerful means of reinforcement. Practical benefits such as receiving free blood test information are also useful motivators, while communicating the professionalism of the blood collection techniques is important for reassuring the minority of prospective donors who expressed fears about possible risks associated with blood donation. Since this was a small-scale study among Hungarian and international students in Budapest, further research is necessary to validate its results among other demographic groups. Findings were reported to the International Federation of Red Cross and Red Crescent Societies in Hungary in order to increase blood donations among students in Hungary. Subject to validation through further research, applying the recommended approaches in different countries and other demographic groups is suggested. This is the first research paper on motivation toward blood donation among international students and it offers new and practical suggestions for increasing their level of participation in blood drives.

  12. A new modal-based approach for modelling the bump foil structure in the simultaneous solution of foil-air bearing rotor dynamic problems

    NASA Astrophysics Data System (ADS)

    Bin Hassan, M. F.; Bonello, P.

    2017-05-01

    Recently-proposed techniques for the simultaneous solution of foil-air bearing (FAB) rotor dynamic problems have been limited to a simple bump foil model in which the individual bumps were modelled as independent spring-damper (ISD) subsystems. The present paper addresses this limitation by introducing a modal model of the bump foil structure into the simultaneous solution scheme. The dynamics of the corrugated bump foil structure are first studied using the finite element (FE) technique. This study is experimentally validated using a purpose-made corrugated foil structure. Based on the findings of this study, it is proposed that the dynamics of the full foil structure, including bump interaction and foil inertia, can be represented by a modal model comprising a limited number of modes. This full foil structure modal model (FFSMM) is then adapted into the rotordynamic FAB problem solution scheme, instead of the ISD model. Preliminary results using the FFSMM under static and unbalance excitation conditions are shown to be reliable by comparison against the corresponding ISD foil model results and by cross-correlating different methods for computing the deflection of the full foil structure. The rotor-bearing model is also validated against experimental and theoretical results in the literature.

  13. Imaging Modalities Relevant to Intracranial Pressure Assessment in Astronauts: A Case-Based Discussion

    NASA Technical Reports Server (NTRS)

    Sargsyan, Ashot E.; Kramer, Larry A.; Hamilton, Douglas R.; Hamilton, Douglas R.; Fogarty, Jennifer; Polk, J. D.

    2010-01-01

    Introduction: Intracranial pressure (ICP) elevation has been inferred or documented in a number of space crewmembers. Recent advances in noninvasive imaging technology offer new possibilities for ICP assessment. Most International Space Station (ISS) partner agencies have adopted a battery of occupational health monitoring tests including magnetic resonance imaging (MRI) pre- and postflight, and high-resolution sonography of the orbital structures in all mission phases including during flight. We hypothesize that joint consideration of data from the two techniques has the potential to improve quality and continuity of crewmember monitoring and care. Methods: Specially designed MRI and sonographic protocols were used to image eyes and optic nerves (ON) including the meningeal sheaths. Specific crewmembers' multi-modality imaging data were analyzed to identify points of mutual validation as well as unique features of complementary nature. Results and Conclusion: Magnetic resonance imaging (MRI) and high-resolution sonography are both tomographic methods; however, images obtained by the two modalities are based on different physical phenomena and use different acquisition principles. Consideration of the images acquired by these two modalities allows cross-validating findings related to the volume and fluid content of the ON subarachnoid space, shape of the globe, and other anatomical features of the orbit. Each of the imaging modalities also has unique advantages, making them complementary techniques.

  14. A Computer Program for Practical Semivariogram Modeling and Ordinary Kriging: A Case Study of Porosity Distribution in an Oil Field

    NASA Astrophysics Data System (ADS)

    Mert, Bayram Ali; Dag, Ahmet

    2017-12-01

    In this study, firstly, a practical and educational geostatistical program (JeoStat) was developed, and then an example analysis of porosity parameter distribution, using oilfield data, was presented. With this program, two- or three-dimensional variogram analysis can be performed using normal, log-normal or indicator-transformed data. In these analyses, JeoStat offers seven commonly used theoretical variogram models (Spherical, Gaussian, Exponential, Linear, Generalized Linear, Hole Effect and Paddington Mix) to the user. These theoretical models can be easily and quickly fitted to experimental models using a mouse. JeoStat uses the ordinary kriging interpolation technique to compute point or block estimates, and cross-validation tests to validate the fitted theoretical model. All the results obtained by the analysis, as well as all the graphics such as histogram, variogram and kriging estimation maps, can be saved to the hard drive, including digitised graphics and maps. The numerical values of any point in a map can be monitored using the mouse and text boxes. This program is available to students, researchers, consultants and corporations of any size free of charge. The JeoStat software package and source codes are available at: http://www.jeostat.com/JeoStat_2017.0.rar.
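The cross-validation test described above amounts to leave-one-out re-estimation of each sample point from its neighbours. A minimal sketch, with inverse distance weighting standing in for ordinary kriging (so no variogram fit is needed) and synthetic well locations in place of the oilfield data:

```python
import numpy as np

def idw(xy_known, z_known, xy_query, power=2.0):
    """Inverse-distance-weighted estimate at one query location."""
    d = np.linalg.norm(xy_known - xy_query, axis=1)
    w = 1.0 / np.maximum(d, 1e-12) ** power
    return np.sum(w * z_known) / np.sum(w)

rng = np.random.default_rng(3)
xy = rng.uniform(0.0, 10.0, size=(40, 2))         # 40 sample locations
z = np.sin(xy[:, 0]) + 0.1 * rng.normal(size=40)  # porosity-like values

# leave-one-out: re-estimate each sample from the other 39 and record the error
errors = []
for i in range(len(z)):
    keep = np.arange(len(z)) != i
    errors.append(idw(xy[keep], z[keep], xy[i]) - z[i])
rmse = np.sqrt(np.mean(np.square(errors)))
print(f"LOO cross-validation RMSE: {rmse:.3f}")
```

In a kriging workflow the same loop is run with the fitted variogram model; a low LOO error (and roughly unbiased residuals) supports the chosen theoretical model.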

  15. Simultaneous determination of vitamin B12 and its derivatives using some of multivariate calibration 1 (MVC1) techniques

    NASA Astrophysics Data System (ADS)

    Samadi-Maybodi, Abdolraouf; Darzi, S. K. Hassani Nejad

    2008-10-01

    Resolution of binary mixtures of vitamin B12, methylcobalamin and B12 coenzyme with minimum sample pre-treatment and without analyte separation has been successfully achieved by partial least squares with one dependent variable (PLS1), orthogonal signal correction/partial least squares (OSC/PLS), principal component regression (PCR) and hybrid linear analysis (HLA). Data for the analyses were obtained from UV-vis spectra. The UV-vis spectra of vitamin B12, methylcobalamin and B12 coenzyme were recorded under the same spectral conditions. The method of central composite design was used in the ranges of 10-80 mg L⁻¹ for vitamin B12 and methylcobalamin and 20-130 mg L⁻¹ for B12 coenzyme. Model refinement and validation were performed by cross-validation. The minimum root mean square error of prediction (RMSEP) was 2.26 mg L⁻¹ for vitamin B12 with PLS1, 1.33 mg L⁻¹ for methylcobalamin with OSC/PLS and 3.24 mg L⁻¹ for B12 coenzyme with HLA. Figures of merit such as selectivity, sensitivity, analytical sensitivity and LOD were determined for the three compounds. The procedure was successfully applied to the simultaneous determination of the three compounds in synthetic mixtures and in a pharmaceutical formulation.

  16. Spatial prediction of near surface soil water retention functions using hydrogeophysics

    NASA Astrophysics Data System (ADS)

    Gibson, J. P.; Franz, T. E.

    2017-12-01

    The hydrological community often turns to widely available spatial datasets such as SSURGO to characterize the spatial variability of soil across a landscape of interest. This has served as a reasonable first approximation when localized soil data are lacking. However, previous work has shown that information loss within land surface models stems primarily from parameterization. Localized soil sampling is both expensive and time-intensive, so a need exists for connecting spatial datasets with ground observations. Given that hydrogeophysics is data-dense, rapid, and relatively easy to adopt, it is a promising means of dovetailing localized soil sampling with larger spatial datasets. In this work, we utilize two geophysical techniques, the cosmic-ray neutron probe and electromagnetic induction, to identify temporally stable soil moisture patterns. This is achieved by measuring numerous times over a range of wet to dry field conditions in order to apply an empirical orthogonal function analysis. We then present measured water retention functions of shallow cores extracted within each temporally stable zone. Lastly, we use the soil moisture patterns as a covariate to predict soil hydraulic properties in areas without measurements, and validate using a leave-one-out cross-validation analysis. Using these approaches to better constrain soil hydraulic property variability, we speculate that further research can better estimate hydrologic fluxes in areas of interest.
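    The final validation step, predicting a soil property from a covariate and checking it by leave-one-out cross-validation, can be sketched generically. The covariate values, the linear model and the sample size below are all hypothetical stand-ins, not the paper's data.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(2)
# Hypothetical stand-in data: a soil-moisture EOF pattern as the covariate and
# a water-retention parameter measured on a handful of cores at the same sites.
eof_pattern = rng.standard_normal(15)
retention = 0.3 + 0.1 * eof_pattern + 0.02 * rng.standard_normal(15)

# Leave-one-out: each core is predicted from a model fitted to the others.
scores = cross_val_score(
    LinearRegression(), eof_pattern.reshape(-1, 1), retention,
    cv=LeaveOneOut(), scoring="neg_mean_absolute_error")
print("LOO mean absolute error:", -scores.mean())
```

    Mean absolute error is used here because per-fold R² is undefined when each test fold contains a single observation.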

  17. Micro Blowing Simulations Using a Coupled Finite-Volume Lattice-Boltzmann LES Approach

    NASA Technical Reports Server (NTRS)

    Menon, S.; Feiz, H.

    1990-01-01

    Three-dimensional large-eddy simulations (LES) of single and multiple jets in crossflow (JICF) are conducted using the 19-bit Lattice Boltzmann Equation (LBE) method coupled with a conventional finite-volume (FV) scheme. In this coupled LBE-FV approach, the LBE-LES is employed to simulate the flow inside the jet nozzles while the FV-LES is used to simulate the crossflow. The key application of this technique is the study of the micro-blowing technique (MBT) for drag control, similar to recent experiments at NASA/GRC. It is necessary to resolve the flow inside the micro-blowing and suction holes with high resolution without being limited by the FV time-step restriction, and the coupled LBE-FV-LES approach achieves this objective in a computationally efficient manner. A single jet in crossflow is used for validation purposes, and the results are compared with experimental data and a full LBE-LES simulation, with good agreement. Subsequently, MBT over a flat plate with a porosity of 25% is simulated using 9 jets in a compressible crossflow at a Mach number of 0.4. It is shown that MBT suppresses the near-wall vortices and reduces the skin friction by up to 50 percent, in good agreement with experimental data.

  18. How to test validity in orthodontic research: a mixed dentition analysis example.

    PubMed

    Donatelli, Richard E; Lee, Shin-Jae

    2015-02-01

    The data used to test the validity of a prediction method should be different from the data used to generate the prediction model. In this study, we explored whether an independent data set is mandatory for testing the validity of a new prediction method and how validity can be tested without independent new data. Several validation methods were compared in an example using the data from a mixed dentition analysis with a regression model. The validation errors of real mixed dentition analysis data and simulation data were analyzed for increasingly large data sets. The validation results of both the real and the simulation studies demonstrated that the leave-1-out cross-validation method had the smallest errors. The largest errors occurred in the traditional simple validation method. The differences between the validation methods diminished as the sample size increased. The leave-1-out cross-validation method seems to be an optimal validation method for improving the prediction accuracy in a data set with limited sample sizes. Copyright © 2015 American Association of Orthodontists. Published by Elsevier Inc. All rights reserved.
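    The comparison at the heart of this study, a single traditional train/test split versus leave-one-out cross-validation on the same small data set, can be sketched as follows. The data are a hypothetical stand-in for the mixed dentition measurements (predicting permanent-tooth widths from deciduous-tooth widths), not the paper's sample.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import LeaveOneOut, cross_val_score, train_test_split

rng = np.random.default_rng(3)
# Hypothetical tooth-width data: two deciduous predictors, one outcome.
X = rng.normal(7.0, 0.5, size=(30, 2))
y = 2.0 + 0.8 * X[:, 0] + 0.5 * X[:, 1] + 0.15 * rng.standard_normal(30)

# Traditional simple validation: one random train/test split.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
simple_mse = mean_squared_error(y_te, LinearRegression().fit(X_tr, y_tr).predict(X_te))

# Leave-one-out cross-validation: every case serves once as the test set.
loo_mse = -cross_val_score(LinearRegression(), X, y, cv=LeaveOneOut(),
                           scoring="neg_mean_squared_error").mean()
print(f"simple validation MSE={simple_mse:.4f}, LOO CV MSE={loo_mse:.4f}")
```

    With a limited sample, the single-split estimate depends heavily on which cases land in the test set, which is the instability the study attributes to the traditional method.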

  19. [Practical aspects for minimizing errors in the cross-cultural adaptation and validation of quality of life questionnaires].

    PubMed

    Lauffer, A; Solé, L; Bernstein, S; Lopes, M H; Francisconi, C F

    2013-01-01

    The development and validation of questionnaires for evaluating quality of life (QoL) has become an important area of research. However, there is a proliferation of non-validated measuring instruments in the health setting that do not contribute to advances in scientific knowledge. To present, through the analysis of available validated questionnaires, a checklist of the practical aspects of how to carry out the cross-cultural adaptation of QoL questionnaires (generic or disease-specific) so that no step is overlooked in the evaluation process, and thus help prevent the elaboration of insufficient or incomplete validations. We consulted basic textbooks and the PubMed database using the keywords "quality of life", "questionnaires", and "gastroenterology", confined to validation studies in English, Spanish, and Portuguese, with no time limit, for the purpose of analyzing the translation and validation of the questionnaires available through the Mapi Institute and PROQOLID websites. A checklist is presented to aid in planning and carrying out the cross-cultural adaptation of QoL questionnaires, together with a glossary of key terms in the area. The acronym DSTAC was used, referring to each of the 5 stages of the recommended procedure. In addition, we provide a table of the QoL instruments that have been validated in Spanish. This article provides information on how to adapt QoL questionnaires from a cross-cultural perspective, and on how to minimize common errors. Copyright © 2012 Asociación Mexicana de Gastroenterología. Published by Masson Doyma México S.A. All rights reserved.

  20. Cross-cultural adaptation and validation of the neonatal/infant Braden Q risk assessment scale.

    PubMed

    de Lima, Edson Luiz; de Brito, Maria José Azevedo; de Souza, Diba Maria Sebba Tosta; Salomé, Geraldo Magela; Ferreira, Lydia Masako

    2016-02-01

    To translate into Brazilian Portuguese and cross-culturally adapt the Neonatal/Infant Braden Q Risk Assessment Scale (Neonatal/Infant Braden Q Scale), and test the psychometric properties, reproducibility and validity of the instrument. There is a lack of studies on the development of pressure ulcers in children, especially in neonates. Thirty professionals participated in the cross-cultural adaptation of the Brazilian-Portuguese version of the scale. Fifty neonates of both sexes were assessed between July 2013 and June 2014. Reliability and reproducibility were tested in 20 neonates and construct validity was measured by correlating the Neonatal/Infant Braden Q Scale with the Braden Q Risk Assessment Scale (Braden Q Scale). Discriminant validity was assessed by comparing the scores of neonates with and without ulcers. The scale showed inter-rater reliability (ICC = 0.98; P < 0.001) and intra-rater reliability (ICC = 0.79; P < 0.001). A strong correlation was found between the Neonatal/Infant Braden Q Scale and Braden Q Scale (r = 0.96; P < 0.001). The cross-culturally adapted Brazilian version of the Neonatal/Infant Braden Q Scale is a reliable instrument, showing face, content and construct validity. Copyright © 2015 Tissue Viability Society. Published by Elsevier Ltd. All rights reserved.

  1. Bayesian cross-entropy methodology for optimal design of validation experiments

    NASA Astrophysics Data System (ADS)

    Jiang, X.; Mahadevan, S.

    2006-07-01

    An important concern in the design of validation experiments is how to incorporate the mathematical model in the design in order to allow conclusive comparisons of model prediction with experimental output in model assessment. The classical experimental design methods are more suitable for phenomena discovery and may result in a subjective, expensive, time-consuming and ineffective design that may adversely impact these comparisons. In this paper, an integrated Bayesian cross-entropy methodology is proposed to perform the optimal design of validation experiments incorporating the computational model. The expected cross entropy, an information-theoretic distance between the distributions of model prediction and experimental observation, is defined as a utility function to measure the similarity of two distributions. A simulated annealing algorithm is used to find optimal values of input variables through minimizing or maximizing the expected cross entropy. The measured data after testing with the optimum input values are used to update the distribution of the experimental output using Bayes theorem. The procedure is repeated to adaptively design the required number of experiments for model assessment, each time ensuring that the experiment provides effective comparison for validation. The methodology is illustrated for the optimal design of validation experiments for a three-leg bolted joint structure and a composite helicopter rotor hub component.
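    The search step of this methodology, optimizing a design input by simulated annealing against an information-theoretic distance between the model-prediction and observation distributions, can be sketched in miniature. Everything below is an assumption for illustration: both distributions are taken as Gaussian so the Kullback-Leibler divergence has a closed form, and the dependence of their means and spreads on the design input x is invented.

```python
import numpy as np
from scipy.optimize import dual_annealing

def kl_gauss(m1, s1, m2, s2):
    """Closed-form KL divergence KL(N(m1,s1^2) || N(m2,s2^2))."""
    return np.log(s2 / s1) + (s1 ** 2 + (m1 - m2) ** 2) / (2 * s2 ** 2) - 0.5

def neg_utility(x):
    x = x[0]
    model_mean, model_sd = np.sin(x), 0.2            # assumed model prediction
    obs_mean, obs_sd = np.sin(x) + 0.1 * x, 0.3      # assumed observation process
    # Negated because dual_annealing minimizes; here we seek the design input
    # where the two distributions disagree most (an informative experiment).
    return -kl_gauss(model_mean, model_sd, obs_mean, obs_sd)

# Simulated annealing over the design space, as in the paper's search step.
res = dual_annealing(neg_utility, bounds=[(0.0, 5.0)], seed=42)
print("optimal design input:", res.x[0], "utility:", -res.fun)
```

    In the full methodology this search would alternate with Bayesian updating of the observation distribution after each experiment; that loop is omitted here.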

  2. Three-dimensional ultrasound strain imaging of skeletal muscles

    NASA Astrophysics Data System (ADS)

    Gijsbertse, K.; Sprengers, A. M. J.; Nillesen, M. M.; Hansen, H. H. G.; Lopata, R. G. P.; Verdonschot, N.; de Korte, C. L.

    2017-01-01

    In this study, a multi-dimensional strain estimation method is presented to assess local relative deformation in three orthogonal directions in 3D space of skeletal muscles during voluntary contractions. A rigid translation and a compressive deformation of a block phantom that mimics muscle contraction are used for experimental validation of the 3D technique and to compare its performance with a 2D-based technique. Axial, lateral and (in the case of 3D) elevational displacements are estimated using a cross-correlation based displacement estimation algorithm. After transformation of the displacements to a Cartesian coordinate system, strain is derived using a least-squares strain estimator. The performance of both methods is compared by calculating the root-mean-squared error between the estimated displacements and the theoretical displacements of the phantom experiments. We observe that the 3D technique delivers more accurate displacement estimates than the 2D technique, especially in the translation experiment, where out-of-plane motion hampers the 2D technique. In vivo application of the 3D technique in the musculus vastus intermedius shows good resemblance between the measured strain and the force pattern. Similarity of the strain curves across repetitive measurements indicates the reproducibility of voluntary contractions. These results indicate that 3D ultrasound is a valuable imaging tool for quantifying complex tissue motion, especially when there is motion in three directions, which produces out-of-plane errors for 2D techniques.
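    The two-stage pipeline, cross-correlation based displacement estimation followed by a least-squares strain estimator, can be illustrated in 1D (the paper works in 2D/3D; the window sizes, signal model and 2% strain below are assumptions for illustration).

```python
import numpy as np

def displacement_1d(ref, deformed, win=32, step=16, search=25):
    """Estimate local displacement by cross-correlating signal windows."""
    centres, shifts = [], []
    for start in range(0, len(ref) - win - search, step):
        w = ref[start:start + win]
        best_s, best_cc = 0, -np.inf
        for s in range(-search, search + 1):
            lo = start + s
            if lo < 0 or lo + win > len(deformed):
                continue
            seg = deformed[lo:lo + win]
            cc = np.dot(w - w.mean(), seg - seg.mean())   # zero-mean correlation
            if cc > best_cc:
                best_cc, best_s = cc, s
        centres.append(start + win // 2)
        shifts.append(best_s)
    return np.array(centres), np.array(shifts, dtype=float)

rng = np.random.default_rng(4)
x = np.arange(1024)
ref = rng.standard_normal(1024)          # synthetic RF-like reference signal
strain_true = 0.02
# Under uniform compression, a feature at depth p in the reference appears
# near depth p*(1 + strain) in the deformed signal.
deformed = np.interp(x * (1 - strain_true), x, ref)

centres, disp = displacement_1d(ref, deformed)
# Least-squares strain estimator: strain is the slope of displacement vs depth.
strain_est = np.polyfit(centres, disp, 1)[0]
print(f"estimated strain: {strain_est:.4f}")
```

    In the real method the displacement search runs in all three directions and the least-squares gradient is taken along each axis; the 1D slope fit above is the simplest instance of that estimator.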

  3. 41 CFR 60-3.6 - Use of selection procedures which have not been validated.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... validation techniques contemplated by these guidelines. In such circumstances, the user should utilize... a formal and scored selection procedure is used which has an adverse impact, the validation... user cannot or need not follow the validation techniques anticipated by these guidelines, the user...

  4. 29 CFR 1607.6 - Use of selection procedures which have not been validated.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... circumstances in which a user cannot or need not utilize the validation techniques contemplated by these... which has an adverse impact, the validation techniques contemplated by these guidelines usually should be followed if technically feasible. Where the user cannot or need not follow the validation...

  5. 41 CFR 60-3.6 - Use of selection procedures which have not been validated.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... validation techniques contemplated by these guidelines. In such circumstances, the user should utilize... a formal and scored selection procedure is used which has an adverse impact, the validation... user cannot or need not follow the validation techniques anticipated by these guidelines, the user...

  6. 29 CFR 1607.6 - Use of selection procedures which have not been validated.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... circumstances in which a user cannot or need not utilize the validation techniques contemplated by these... which has an adverse impact, the validation techniques contemplated by these guidelines usually should be followed if technically feasible. Where the user cannot or need not follow the validation...

  7. 29 CFR 1607.6 - Use of selection procedures which have not been validated.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... circumstances in which a user cannot or need not utilize the validation techniques contemplated by these... which has an adverse impact, the validation techniques contemplated by these guidelines usually should be followed if technically feasible. Where the user cannot or need not follow the validation...

  8. 41 CFR 60-3.6 - Use of selection procedures which have not been validated.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... validation techniques contemplated by these guidelines. In such circumstances, the user should utilize... a formal and scored selection procedure is used which has an adverse impact, the validation... user cannot or need not follow the validation techniques anticipated by these guidelines, the user...

  9. 29 CFR 1607.6 - Use of selection procedures which have not been validated.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... circumstances in which a user cannot or need not utilize the validation techniques contemplated by these... which has an adverse impact, the validation techniques contemplated by these guidelines usually should be followed if technically feasible. Where the user cannot or need not follow the validation...

  10. 41 CFR 60-3.6 - Use of selection procedures which have not been validated.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... validation techniques contemplated by these guidelines. In such circumstances, the user should utilize... a formal and scored selection procedure is used which has an adverse impact, the validation... user cannot or need not follow the validation techniques anticipated by these guidelines, the user...

  11. 29 CFR 1607.6 - Use of selection procedures which have not been validated.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... circumstances in which a user cannot or need not utilize the validation techniques contemplated by these... which has an adverse impact, the validation techniques contemplated by these guidelines usually should be followed if technically feasible. Where the user cannot or need not follow the validation...

  12. Teaching "Instant Experience" with Graphical Model Validation Techniques

    ERIC Educational Resources Information Center

    Ekstrøm, Claus Thorn

    2014-01-01

    Graphical model validation techniques for linear normal models are often used to check the assumptions underlying a statistical model. We describe an approach that provides "instant experience" in looking at a graphical model validation plot, so that it becomes easier to judge whether any of the underlying assumptions are violated.
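    One common way to build this kind of "instant experience" is a lineup: hide the observed residual plot among panels of residuals simulated from the fitted model, and see whether it can be singled out. The sketch below is a hedged, non-graphical skeleton of that idea (in practice each panel would be plotted); the data and model are invented.

```python
import numpy as np

rng = np.random.default_rng(9)
# Fit a simple linear normal model to synthetic data where the model holds.
x = rng.uniform(0.0, 10.0, 40)
y = 1.0 + 2.0 * x + rng.standard_normal(40)
slope, intercept = np.polyfit(x, y, 1)
resid = y - (intercept + slope * x)
sigma = resid.std(ddof=2)                 # residual scale for the null panels

# 19 null panels simulated from the fitted model, plus the observed residuals.
panels = [rng.normal(0.0, sigma, 40) for _ in range(19)] + [resid]
order = rng.permutation(20)
lineup = [panels[i] for i in order]
true_panel = int(np.where(order == 19)[0][0])
# A viewer who cannot pick out the real panel from the lineup has no visual
# evidence against the model assumptions.
print("the observed residuals are hidden in panel", true_panel)
```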

  13. Cross-cultural validity of the ABILOCO questionnaire for individuals with stroke, based on Rasch analysis.

    PubMed

    Avelino, Patrick Roberto; Magalhães, Lívia Castro; Faria-Fortini, Iza; Basílio, Marluce Lopes; Menezes, Kênia Kiefer Parreiras; Teixeira-Salmela, Luci Fuscaldi

    2018-06-01

    The purpose of this study was to evaluate the cross-cultural validity of the Brazilian version of the ABILOCO questionnaire for stroke subjects. Cross-cultural adaptation of the original English version of the ABILOCO to Brazilian Portuguese followed standardized procedures. The adapted version was administered to 136 stroke subjects and its measurement properties were assessed using Rasch analysis. Cross-cultural validity was based on cultural invariance analyses. Goodness-of-fit analysis revealed one misfitting item. The principal component analysis of the residuals showed that the first dimension explained 45% of the variance in locomotion ability; however, the eigenvalue was 1.92. The ABILOCO-Brazil divided the sample into two levels of ability and the items into about seven levels of difficulty. The item-person map showed some ceiling effect. Cultural invariance analyses revealed that although there were differences in item calibrations between the original ABILOCO and the ABILOCO-Brazil, they did not impact the measures of locomotion ability. The ABILOCO-Brazil demonstrated satisfactory measurement properties for use within both clinical and research contexts in Brazil, as well as cross-cultural validity for use in international/multicentric studies. However, the presence of a ceiling effect suggests that it may not be appropriate for assessing individuals with high levels of locomotion ability. Implications for rehabilitation: Self-report measures of locomotion ability are clinically important, since they describe the abilities of individuals in real-life contexts. The ABILOCO questionnaire, specific to stroke survivors, demonstrated satisfactory measurement properties but may not be the most appropriate for assessing individuals with high levels of locomotion ability. The results of the cross-cultural validity analyses showed that the original ABILOCO and ABILOCO-Brazil calibrations may be used interchangeably.

  14. Translation, Cross-cultural Adaptation and Psychometric Validation of the Korean-Language Cardiac Rehabilitation Barriers Scale (CRBS-K).

    PubMed

    Baek, Sora; Park, Hee-Won; Lee, Yookyung; Grace, Sherry L; Kim, Won-Seok

    2017-10-01

    To perform a translation and cross-cultural adaptation of the Cardiac Rehabilitation Barriers Scale (CRBS) for use in Korea, followed by psychometric validation. The CRBS was developed to assess patients' perception of the degree to which patient-, provider- and health-system-level barriers affect their cardiac rehabilitation (CR) participation. The CRBS consists of 21 items (barriers to adherence) rated on a 5-point Likert scale. The first phase was to translate and cross-culturally adapt the CRBS to the Korean language. After back-translation, both versions were reviewed by a committee. Face validity was assessed through semi-structured interviews in a sample of Korean patients (n=53) with a history of acute myocardial infarction who did not participate in CR. The second phase was to assess the construct and criterion validity of the Korean translation, as well as internal reliability, through administration of the translated version to 104 patients, principal component analysis with varimax rotation, and cross-referencing against CR use, respectively. The length, readability, and clarity of the questionnaire were rated well, demonstrating face validity. Analysis revealed a six-factor solution, demonstrating construct validity. Cronbach's alpha was greater than 0.65. The barriers rated highest included not knowing about CR and not being contacted by a program. The mean CRBS score was significantly higher among non-attendees (2.71±0.26) than CR attendees (2.51±0.18) (p<0.01). The Korean version of the CRBS has demonstrated face, content and criterion validity, suggesting it may be useful for assessing barriers to CR utilization in Korea.

  15. PCA as a practical indicator of OPLS-DA model reliability.

    PubMed

    Worley, Bradley; Powers, Robert

    Principal Component Analysis (PCA) and Orthogonal Projections to Latent Structures Discriminant Analysis (OPLS-DA) are powerful statistical modeling tools that provide insights into separations between experimental groups based on high-dimensional spectral measurements from NMR, MS or other analytical instrumentation. However, when used without validation, these tools may lead investigators to statistically unreliable conclusions. This danger is especially real for Partial Least Squares (PLS) and OPLS, which aggressively force separations between experimental groups. As a result, OPLS-DA is often used as an alternative method when PCA fails to expose group separation, but this practice is highly dangerous. Without rigorous validation, OPLS-DA can easily yield statistically unreliable group separation. A Monte Carlo analysis of PCA group separations and OPLS-DA cross-validation metrics was performed on NMR datasets with statistically significant separations in scores-space. A linearly increasing amount of Gaussian noise was added to each data matrix followed by the construction and validation of PCA and OPLS-DA models. With increasing added noise, the PCA scores-space distance between groups rapidly decreased and the OPLS-DA cross-validation statistics simultaneously deteriorated. A decrease in correlation between the estimated loadings (added noise) and the true (original) loadings was also observed. While the validity of the OPLS-DA model diminished with increasing added noise, the group separation in scores-space remained basically unaffected. Supported by the results of Monte Carlo analyses of PCA group separations and OPLS-DA cross-validation metrics, we provide practical guidelines and cross-validatory recommendations for reliable inference from PCA and OPLS-DA models.
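    The core of the Monte Carlo argument, that PCA group separation deteriorates as noise is added, can be reproduced in miniature. This is a hedged sketch with synthetic two-group "spectral" data (not the paper's NMR data); separation is reported relative to the within-group spread in scores space.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(5)
# Toy stand-in for two-group spectral data with a genuine mean difference.
n, p = 20, 50
signal = np.zeros(p)
signal[:5] = 3.0
base = rng.standard_normal((2 * n, p))
base[:n] += signal                        # group A is offset from group B

def separation(noise_sd):
    """Scaled distance between group means in 2-component PCA scores space."""
    X = base + noise_sd * rng.standard_normal(base.shape)
    t = PCA(n_components=2).fit_transform(X)
    d = np.linalg.norm(t[:n].mean(axis=0) - t[n:].mean(axis=0))
    spread = 0.5 * (t[:n].std(axis=0).mean() + t[n:].std(axis=0).mean())
    return d / spread

seps = {sd: separation(sd) for sd in (0.0, 2.0, 8.0)}
for sd, d in seps.items():
    print(f"added noise sd={sd}: scaled PCA group separation = {d:.2f}")
```

    In the paper's full analysis the same noise sweep is run many times while OPLS-DA cross-validation metrics are tracked alongside the PCA separation, which is what exposes unreliable OPLS-DA separations.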

  16. The Arthroscopic Surgical Skill Evaluation Tool (ASSET)

    PubMed Central

    Koehler, Ryan J.; Amsdell, Simon; Arendt, Elizabeth A; Bisson, Leslie J; Braman, Jonathan P; Butler, Aaron; Cosgarea, Andrew J; Harner, Christopher D; Garrett, William E; Olson, Tyson; Warme, Winston J.; Nicandri, Gregg T.

    2014-01-01

    Background: Surgeries employing arthroscopic techniques are among the most commonly performed in orthopaedic clinical practice; however, valid and reliable methods of assessing the arthroscopic skill of orthopaedic surgeons are lacking. Hypothesis: The Arthroscopic Surgery Skill Evaluation Tool (ASSET) will demonstrate content validity, concurrent criterion-oriented validity, and reliability when used to assess the technical ability of surgeons performing diagnostic knee arthroscopy on cadaveric specimens. Study Design: Cross-sectional study; Level of evidence, 3. Methods: Content validity was determined by a group of seven experts using a Delphi process. Intra-articular performance of a right and left diagnostic knee arthroscopy was recorded for twenty-eight residents and two sports medicine fellowship-trained attending surgeons. Subject performance was assessed by two blinded raters using the ASSET. Concurrent criterion-oriented validity, inter-rater reliability, and test-retest reliability were evaluated. Results: Content validity: the content development group identified 8 arthroscopic skill domains to evaluate using the ASSET. Concurrent criterion-oriented validity: significant differences in total ASSET score (p<0.05) between novice, intermediate, and advanced experience groups were identified. Inter-rater reliability: the ASSET scores assigned by each rater were strongly correlated (r=0.91, p<0.01) and the intra-class correlation coefficient between raters for the total ASSET score was 0.90. Test-retest reliability: there was a significant correlation between ASSET scores for both procedures attempted by each individual (r=0.79, p<0.01). Conclusion: The ASSET appears to be a useful, valid, and reliable method for assessing surgeon performance of diagnostic knee arthroscopy in cadaveric specimens. Studies are ongoing to determine its generalizability to other procedures as well as to the live OR and other simulated environments. PMID:23548808

  17. Measuring the statistical validity of summary meta-analysis and meta-regression results for use in clinical practice.

    PubMed

    Willis, Brian H; Riley, Richard D

    2017-09-20

    An important question for clinicians appraising a meta-analysis is: are the findings likely to be valid in their own practice? That is, does the reported effect accurately represent the effect that would occur in their own clinical population? To this end we advance the concept of statistical validity, where the parameter being estimated equals the corresponding parameter for a new independent study. Using a simple ('leave-one-out') cross-validation technique, we demonstrate how we may test meta-analysis estimates for statistical validity using a new validation statistic, Vn, and derive its distribution. We compare this with the usual approach of investigating heterogeneity in meta-analyses and demonstrate the link between statistical validity and homogeneity. Using a simulation study, the properties of Vn and the Q statistic are compared for univariate random effects meta-analysis and a tailored meta-regression model, where information from the setting (included as model covariates) is used to calibrate the summary estimate to the setting of application. Their properties are found to be similar when there are 50 studies or more, but for fewer studies Vn has greater power but a higher type 1 error rate than Q. The power and type 1 error rate of Vn are also shown to depend on the within-study variance, between-study variance, study sample size, and the number of studies in the meta-analysis. Finally, we apply Vn to two published meta-analyses and conclude that it usefully augments standard methods when deciding upon the likely validity of summary meta-analysis estimates in clinical practice. © 2017 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
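    The leave-one-out idea behind this approach can be sketched for a fixed-effect inverse-variance meta-analysis: each study is left out in turn and compared with the pooled estimate from the remaining studies. This is a hedged illustration with invented study effects; the paper's Vn statistic, its standardization and its distribution are not reproduced here.

```python
import numpy as np

# Hypothetical study-level effect estimates and within-study variances.
effects = np.array([0.42, 0.38, 0.55, 0.47, 0.31, 0.50])
variances = np.array([0.02, 0.03, 0.05, 0.02, 0.04, 0.03])

def pooled(e, v):
    """Fixed-effect inverse-variance pooled estimate."""
    w = 1.0 / v
    return np.sum(w * e) / np.sum(w)

full = pooled(effects, variances)
# Leave each study out in turn and ask how well the remaining studies predict
# it (for simplicity, ignoring the uncertainty of the leave-one-out estimate).
for i in range(len(effects)):
    keep = np.arange(len(effects)) != i
    est = pooled(effects[keep], variances[keep])
    z = (effects[i] - est) / np.sqrt(variances[i])
    print(f"study {i}: left-out z = {z:+.2f}")
print(f"pooled estimate: {full:.3f}")
```

    Large standardized discrepancies flag studies that the rest of the meta-analysis would not have predicted, which is the intuition the formal Vn statistic builds on.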

  18. Comparing Approaches to Deal With Non-Gaussianity of Rainfall Data in Kriging-Based Radar-Gauge Rainfall Merging

    NASA Astrophysics Data System (ADS)

    Cecinati, F.; Wani, O.; Rico-Ramirez, M. A.

    2017-11-01

    Merging radar and rain gauge rainfall data is a technique used to improve the quality of spatial rainfall estimates, and in particular Kriging with External Drift (KED) is a very effective radar-rain gauge merging technique. However, kriging interpolation assumes Gaussianity of the process. Rainfall has a strongly skewed, positive probability distribution, characterized by a discontinuity due to intermittency. In KED, rainfall residuals are used, implicitly calculated as the difference between rain gauge data and a linear function of the radar estimates; these residuals are non-Gaussian as well. The aim of this work is to evaluate the impact of applying KED to non-Gaussian rainfall residuals and to assess the best techniques for improving Gaussianity. We compare Box-Cox transformations with λ parameters equal to 0.5, 0.25, and 0.1, Box-Cox with time-variant optimization of λ, normal score transformation, and a singularity analysis technique. The results suggest that Box-Cox with λ = 0.1 and the singularity analysis are not suitable for KED. Normal score transformation and Box-Cox with optimized λ, or λ = 0.25, produce satisfactory results in terms of Gaussianity of the residuals, probability distribution of the merged rainfall products, and rainfall estimate quality, when validated through cross-validation. However, it is observed that Box-Cox transformations are strongly dependent on the temporal and spatial variability of rainfall and on the units used for the rainfall intensity. Overall, applying transformations results in a quantitative improvement of the rainfall estimates only if the correct transformations for the specific data set are used.
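    The fixed-λ and optimized-λ Box-Cox variants compared in this work can be illustrated directly with SciPy. The synthetic lognormal "rainfall residuals" below are an assumption for illustration; real residuals also contain zeros from intermittency, which would need an offset before Box-Cox.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
# Synthetic positively skewed "rainfall" amplitudes (lognormal stand-in).
rain = rng.lognormal(mean=0.0, sigma=1.0, size=2000)

# Fixed lambda values, as in the paper's comparison.
for lam in (0.5, 0.25, 0.1):
    t = stats.boxcox(rain, lmbda=lam)
    print(f"lambda={lam}: skewness = {stats.skew(t):+.3f}")

# Optimized lambda: SciPy picks lambda by maximum likelihood, analogous to
# the paper's time-variant optimization applied per time step.
t_opt, lam_opt = stats.boxcox(rain)
print(f"optimised lambda = {lam_opt:.3f}, skewness = {stats.skew(t_opt):+.3f}")
```

    The skewness of the transformed residuals is one simple way to check how close each variant brings the data to the Gaussianity that kriging assumes.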

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Akiyama, Kazunori; Fish, Vincent L.; Doeleman, Sheperd S.

    We propose a new imaging technique for radio and optical/infrared interferometry. The proposed technique reconstructs the image from the visibility amplitude and closure phase, which are standard data products of short-millimeter very long baseline interferometers such as the Event Horizon Telescope (EHT) and optical/infrared interferometers, by utilizing two regularization functions: the ℓ₁-norm and total variation (TV) of the brightness distribution. In the proposed method, optimal regularization parameters, which represent the sparseness and effective spatial resolution of the image, are derived from the data themselves using cross-validation (CV). As an application of this technique, we present simulated observations of M87 with the EHT based on four physically motivated models. We confirm that ℓ₁ + TV regularization can achieve an optimal resolution of ∼20%-30% of the diffraction limit λ/D_max, which is the nominal spatial resolution of a radio interferometer. With the proposed technique, the EHT can robustly and reasonably achieve super-resolution sufficient to clearly resolve the black hole shadow. These results make it promising for the EHT to provide an unprecedented view of the event-horizon-scale structure in the vicinity of the supermassive black hole in M87 and also the Galactic center Sgr A*.
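    The central idea, letting cross-validation choose the regularization parameter of a sparse reconstruction from the data themselves, can be sketched with an ℓ₁-only toy problem. Everything here is a stand-in: a sparse "brightness" vector observed through a random linear operator replaces the visibility sampling, and the TV term and closure phases of the real method are omitted.

```python
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(7)
# Toy analogue of sparse image reconstruction: a 3-pixel-bright "image"
# observed through an underdetermined random linear operator.
n_pix, n_obs = 100, 60
truth = np.zeros(n_pix)
truth[[5, 30, 70]] = [2.0, 1.5, 1.0]
A = rng.standard_normal((n_obs, n_pix))
y = A @ truth + 0.05 * rng.standard_normal(n_obs)

# Cross-validation selects the l1 penalty (the sparseness parameter) from the
# data, mirroring the paper's CV-based choice of regularization parameters.
model = LassoCV(cv=5, random_state=0).fit(A, y)
recovered = model.coef_
print("chosen alpha:", model.alpha_)
print("largest recovered pixels:", np.argsort(recovered)[-3:])
```

    With a well-chosen penalty the reconstruction concentrates flux on the truly bright pixels, which is the sense in which the sparsity prior buys super-resolution.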

  20. Multi-channel Spiral Twist Extrusion (MCSTE): A Novel Severe Plastic Deformation Technique for Grain Refinement

    NASA Astrophysics Data System (ADS)

    El-Garaihy, W. H.; Fouad, D. M.; Salem, H. G.

    2018-07-01

    Multi-channel Spiral Twist Extrusion (MCSTE) is introduced as a novel severe plastic deformation (SPD) technique for producing superior mechanical properties associated with ultrafine grained structure in bulk metals and alloys. The MCSTE design is based on inserting a uniform square cross-sectioned billet within stacked disks that guarantee shear strain accumulation. In an attempt to validate the technique and evaluate its plastic deformation characteristics, a series of experiments were conducted. The influence of the number of MCSTE passes on the mechanical properties and microstructural evolution of AA1100 alloy were investigated. Four passes of MCSTE, at a relatively low twisting angle of 30 deg, resulted in increasing the strength and hardness coupled with retention of ductility. Metallographic observations indicated a significant grain size reduction of 72 pct after 4 passes of MCSTE compared with the as-received (AR) condition. Moreover, the structural uniformity increased with the number of passes, which was reflected in the hardness distribution from the peripheries to the center of the extrudates. The current study showed that the MCSTE technique could be an effective, adaptable SPD die design with a promising potential for industrial applications compared to its counterparts.

  1. A novel ultrasound technique for detection of osteochondral defects in the ankle joint: a parametric and feasibility study.

    PubMed

    Sarkalkan, Nazli; Loeve, Arjo J; van Dongen, Koen W A; Tuijthof, Gabrielle J M; Zadpoor, Amir A

    2014-12-24

    (Osteo)chondral defects (OCDs) in the ankle are currently diagnosed with modalities that are not convenient to use in long-term follow-ups. Ultrasound (US) imaging, which is a cost-effective and non-invasive alternative, has limited ability to discriminate OCDs. We aim to develop a new diagnostic technique based on US wave propagation through the ankle joint. The presence of OCDs is identified when a US signal deviates from a reference signal associated with the healthy joint. The feasibility of the proposed technique is studied using experimentally-validated 2D finite-difference time-domain models of the ankle joint. The normalized maximum cross correlation of experiments and simulation was 0.97. Effects of variables relevant to the ankle joint, US transducers and OCDs were evaluated. Variations in joint space width and transducer orientation made noticeable alterations to the reference signal: normalized root mean square error ranged from 6.29% to 65.25% and from 19.59% to 8064.2%, respectively. The results suggest that the new technique could be used for detection of OCDs, if the effects of other parameters (i.e., parameters related to the ankle joint and US transducers) can be reduced.
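    The detection principle, comparing a measured US signal against a healthy-joint reference via normalized cross-correlation, can be sketched as follows. The waveform, noise level and the way a defect distorts the signal are all invented for illustration; they are not the paper's simulation model.

```python
import numpy as np

def max_ncc(a, b):
    """Maximum of the normalized cross-correlation between two signals."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    cc = np.correlate(a, b, mode="full") / len(a)
    return cc.max()

rng = np.random.default_rng(8)
t = np.linspace(0.0, 1.0, 500)
reference = np.sin(2 * np.pi * 5 * t)                 # healthy-joint reference
healthy = reference + 0.05 * rng.standard_normal(500)
# A defect is modelled here simply as a localized attenuation of the waveform.
defect = reference.copy()
defect[200:260] *= 0.2
defect += 0.05 * rng.standard_normal(500)

print(f"NCC healthy vs reference: {max_ncc(reference, healthy):.3f}")
print(f"NCC defect  vs reference: {max_ncc(reference, defect):.3f}")
```

    A signal whose correlation with the reference drops below some calibrated threshold would flag a possible defect; choosing that threshold robustly is exactly where joint-geometry and transducer-orientation variability become the limiting factors the study quantifies.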

  2. Multi-channel Spiral Twist Extrusion (MCSTE): A Novel Severe Plastic Deformation Technique for Grain Refinement

    NASA Astrophysics Data System (ADS)

    El-Garaihy, W. H.; Fouad, D. M.; Salem, H. G.

    2018-04-01

    Multi-channel Spiral Twist Extrusion (MCSTE) is introduced as a novel severe plastic deformation (SPD) technique for producing superior mechanical properties associated with ultrafine grained structure in bulk metals and alloys. The MCSTE design is based on inserting a uniform square cross-sectioned billet within stacked disks that guarantee shear strain accumulation. In an attempt to validate the technique and evaluate its plastic deformation characteristics, a series of experiments were conducted. The influence of the number of MCSTE passes on the mechanical properties and microstructural evolution of AA1100 alloy were investigated. Four passes of MCSTE, at a relatively low twisting angle of 30 deg, resulted in increasing the strength and hardness coupled with retention of ductility. Metallographic observations indicated a significant grain size reduction of 72 pct after 4 passes of MCSTE compared with the as-received (AR) condition. Moreover, the structural uniformity increased with the number of passes, which was reflected in the hardness distribution from the peripheries to the center of the extrudates. The current study showed that the MCSTE technique could be an effective, adaptable SPD die design with a promising potential for industrial applications compared to its counterparts.

  3. Simulation verification techniques study: Simulation performance validation techniques document. [for the space shuttle system

    NASA Technical Reports Server (NTRS)

    Duncan, L. M.; Reddell, J. P.; Schoonmaker, P. B.

    1975-01-01

    Techniques and support software for the efficient performance of simulation validation are discussed. Overall validation software structure, the performance of validation at various levels of simulation integration, guidelines for check case formulation, methods for real time acquisition and formatting of data from an all up operational simulator, and methods and criteria for comparison and evaluation of simulation data are included. Vehicle subsystems modules, module integration, special test requirements, and reference data formats are also described.

  4. Cross-cultural adaptation and validation of the VISA-A questionnaire for German-speaking Achilles tendinopathy patients.

    PubMed

    Lohrer, Heinz; Nauck, Tanja

    2009-10-30

    Achilles tendinopathy is the predominant overuse injury in runners. To further investigate this overload injury in transverse and longitudinal studies, a valid, responsive and reliable outcome measure is needed. Most questionnaires have been developed for English-speaking populations. This is also true for the VISA-A score, so far the only valid, reliable, and disease-specific questionnaire for Achilles tendinopathy. To compare research results internationally, to perform multinational studies, or to exclude bias originating from subpopulations speaking different languages within one country, an equivalent instrument is needed in different languages. The aim of this study was therefore to cross-culturally adapt and validate the VISA-A questionnaire for German-speaking Achilles tendinopathy patients. According to the "guidelines for the process of cross-cultural adaptation of self-report measures", the VISA-A score was cross-culturally adapted into German (VISA-A-G) in six steps: translation, synthesis, back translation, expert committee review, pretesting (n = 77), and appraisal of the adaptation process by an advisory committee determining the adequacy of the cross-cultural adaptation. The resulting VISA-A-G was then subjected to an analysis of reliability, validity, and internal consistency in 30 Achilles tendinopathy patients and 79 asymptomatic people. Concurrent validity was tested against a generic tendon grading system (Percy and Conochie) and against a classification system for the effect of pain on athletic performance (Curwin and Stanish). The advisory committee judged the translation of the VISA-A-G questionnaire to be "acceptable". The VISA-A-G questionnaire showed moderate to excellent test-retest reliability (ICC = 0.60 to 0.97). Concurrent validity showed good coherence when correlated with the grading system of Curwin and Stanish (rho = -0.95) and with the Percy and Conochie grade of severity (rho = 0.95). Internal consistency (Cronbach's alpha) for the total VISA-A-G scores of the patients was 0.737. The VISA-A questionnaire was successfully cross-culturally adapted and validated for use in German-speaking populations. The psychometric properties of the VISA-A-G questionnaire are similar to those of the original English version. It can therefore be recommended as a sufficiently robust tool for measuring clinical severity of Achilles tendinopathy in German-speaking patients.
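The internal-consistency figure quoted above (Cronbach's alpha = 0.737) follows the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A minimal sketch, with function and variable names of our own choosing:

```python
def cronbach_alpha(items):
    """Cronbach's alpha; `items` holds one list of respondent scores per item."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # Total score per respondent across all k items.
    totals = [sum(item[i] for item in items) for i in range(n)]
    return (k / (k - 1)) * (1.0 - sum(var(it) for it in items) / var(totals))
```

Perfectly parallel items yield alpha = 1; values around 0.7, as reported here, are conventionally taken as acceptable internal consistency.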

  5. Cross-cultural adaptation and validation of the VISA-A questionnaire for German-speaking Achilles tendinopathy patients

    PubMed Central

    Lohrer, Heinz; Nauck, Tanja

    2009-01-01

    Background Achilles tendinopathy is the predominant overuse injury in runners. To further investigate this overload injury in transverse and longitudinal studies, a valid, responsive and reliable outcome measure is needed. Most questionnaires have been developed for English-speaking populations. This is also true for the VISA-A score, so far the only valid, reliable, and disease-specific questionnaire for Achilles tendinopathy. To compare research results internationally, to perform multinational studies, or to exclude bias originating from subpopulations speaking different languages within one country, an equivalent instrument is needed in different languages. The aim of this study was therefore to cross-culturally adapt and validate the VISA-A questionnaire for German-speaking Achilles tendinopathy patients. Methods According to the "guidelines for the process of cross-cultural adaptation of self-report measures", the VISA-A score was cross-culturally adapted into German (VISA-A-G) in six steps: translation, synthesis, back translation, expert committee review, pretesting (n = 77), and appraisal of the adaptation process by an advisory committee determining the adequacy of the cross-cultural adaptation. The resulting VISA-A-G was then subjected to an analysis of reliability, validity, and internal consistency in 30 Achilles tendinopathy patients and 79 asymptomatic people. Concurrent validity was tested against a generic tendon grading system (Percy and Conochie) and against a classification system for the effect of pain on athletic performance (Curwin and Stanish). Results The advisory committee judged the translation of the VISA-A-G questionnaire to be "acceptable". The VISA-A-G questionnaire showed moderate to excellent test-retest reliability (ICC = 0.60 to 0.97). Concurrent validity showed good coherence when correlated with the grading system of Curwin and Stanish (rho = -0.95) and with the Percy and Conochie grade of severity (rho = 0.95). Internal consistency (Cronbach's alpha) for the total VISA-A-G scores of the patients was 0.737. Conclusion The VISA-A questionnaire was successfully cross-culturally adapted and validated for use in German-speaking populations. The psychometric properties of the VISA-A-G questionnaire are similar to those of the original English version. It can therefore be recommended as a sufficiently robust tool for measuring clinical severity of Achilles tendinopathy in German-speaking patients. PMID:19878572

  6. Validated electrochemical and chromatographic quantifications of some antibiotic residues in pharmaceutical industrial waste water.

    PubMed

    Ibrahim, Heba K; Abdel-Moety, Mona M; Abdel-Gawad, Sherif A; Al-Ghobashy, Medhat A; Kawy, Mohamed Abdel

    2017-03-01

    Realistic implementation of ion selective electrodes (ISEs) into environmental monitoring programs has always been a challenging task, largely owing to difficulties in validating ISE assay results. In this study, the electrochemical response of amoxicillin trihydrate (AMX), ciprofloxacin hydrochloride (CPLX), trimethoprim (TMP), and norfloxacin (NFLX) was studied through the fabrication of sensitive membrane electrodes of two ISE types: polyvinyl chloride (PVC) membrane electrodes and glassy carbon (GC) electrodes. Linear response for the membrane electrodes was obtained in the concentration range of 10^-5 to 10^-2 mol/L. For the PVC membrane electrodes, Nernstian slopes of 55.1, 56.5, and 56.5 mV/decade were achieved over pH 4-8 for AMX, CPLX, and NFLX, respectively, and 54.0 mV/decade over pH 3-6 for TMP. For the GC electrodes, Nernstian slopes of 59.1, 58.2, and 57.0 mV/decade were achieved over pH 4-8 for AMX, CPLX, and NFLX, respectively, and 58.2 mV/decade over pH 3-6 for TMP. In addition to assay validation to international industry standards, the fabricated electrodes were also cross-validated against conventional separation techniques: high performance liquid chromatography (HPLC) and thin layer chromatography (TLC)-densitometry. The HPLC assay was applied over a concentration range of 0.5-10.0 μg/mL for all target analytes. TLC-densitometry was adopted over a concentration range of 0.3-1.0 μg/band for AMX, and 0.1-0.9 μg/band for CPLX, NFLX, and TMP. The proposed techniques were successfully applied to quantification of the selected drugs both in pure form and in waste water samples obtained from pharmaceutical plants. The actual waste water samples were subjected to solid phase extraction (SPE) pretreatment prior to the application of the chromatographic techniques (HPLC and TLC-densitometry). The fabricated electrodes, on the other hand, were successfully applied to quantification of the antibiotic residues in actual waste water samples without any pretreatment, confirming the suitability of the fabricated ISEs for environmental analysis.
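The "Nernstian slope" figures above can be checked against the theoretical value 2.303·RT/(zF), roughly 59.16 mV per decade of activity for a monovalent ion at 25 °C. A minimal sketch (function and parameter names are illustrative):

```python
import math

R = 8.314462618   # gas constant, J/(mol*K)
F = 96485.33212   # Faraday constant, C/mol

def nernstian_slope_mV(T_kelvin, z=1):
    """Theoretical Nernstian slope in mV per decade for charge number z."""
    return math.log(10) * R * T_kelvin / (z * F) * 1000.0
```

Experimental slopes of 54-59 mV/decade, as reported here, are therefore close to ideal monovalent Nernstian response.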

  7. An Automated Statistical Technique for Counting Distinct Multiple Sclerosis Lesions.

    PubMed

    Dworkin, J D; Linn, K A; Oguz, I; Fleishman, G M; Bakshi, R; Nair, G; Calabresi, P A; Henry, R G; Oh, J; Papinutto, N; Pelletier, D; Rooney, W; Stern, W; Sicotte, N L; Reich, D S; Shinohara, R T

    2018-04-01

    Lesion load is a common biomarker in multiple sclerosis, yet it has historically shown modest association with clinical outcome. Lesion count, which encapsulates the natural history of lesion formation and is thought to provide complementary information, is difficult to assess in patients with confluent (i.e., spatially overlapping) lesions. We introduce a statistical technique for cross-sectionally counting pathologically distinct lesions. MR imaging was used to assess the probability of a lesion at each location. The texture of this map was quantified using a novel technique, and clusters resembling the center of a lesion were counted. Validity compared with a criterion standard count was demonstrated in 60 subjects observed longitudinally, and reliability was determined using 14 scans of a clinically stable subject acquired at 7 sites. The proposed count and the criterion standard count were highly correlated (r = 0.97, P < .001) and not significantly different (t(59) = -0.83, P = .41), and the variability of the proposed count across repeat scans was equivalent to that of lesion load. After accounting for lesion load and age, lesion count was negatively associated (t(58) = -2.73, P < .01) with the Expanded Disability Status Scale. Average lesion size had a higher association with the Expanded Disability Status Scale (r = 0.35, P < .01) than lesion load (r = 0.10, P = .44) or lesion count (r = -0.12, P = .36) alone. This study introduces a novel technique for counting pathologically distinct lesions using cross-sectional data and demonstrates its ability to recover obscured longitudinal information. The proposed count allows more accurate estimation of lesion size, which correlated more closely with disability scores than either lesion load or lesion count alone. © 2018 by American Journal of Neuroradiology.

  8. Measuring Adolescent Social and Academic Self-Efficacy: Cross-Ethnic Validity of the SEQ-C

    ERIC Educational Resources Information Center

    Minter, Anthony; Pritzker, Suzanne

    2017-01-01

    Objective: This study examines the psychometric strength, including cross-ethnic validity, of two subscales of Muris' Self-Efficacy Questionnaire for Children: Academic Self-Efficacy (ASE) and Social Self-Efficacy (SSE). Methods: A large ethnically diverse sample of 3,358 early and late adolescents completed surveys including the ASE and SSE.…

  9. How Nonrecidivism Affects Predictive Accuracy: Evidence from a Cross-Validation of the Ontario Domestic Assault Risk Assessment (ODARA)

    ERIC Educational Resources Information Center

    Hilton, N. Zoe; Harris, Grant T.

    2009-01-01

    Prediction effect sizes such as ROC area are important for demonstrating a risk assessment's generalizability and utility. How a study defines recidivism might affect predictive accuracy. Nonrecidivism is problematic when predicting specialized violence (e.g., domestic violence). The present study cross-validates the ability of the Ontario…

  10. Cross-validation of generalised body composition equations with diverse young men and women: the Training Intervention and Genetics of Exercise Response (TIGER) Study

    USDA-ARS?s Scientific Manuscript database

    Generalised skinfold equations developed in the 1970s are commonly used to estimate laboratory-measured percentage fat (BF%). The equations were developed on predominately white individuals using Siri's two-component percentage fat equation (BF%-GEN). We cross-validated the Jackson-Pollock (JP) gene...

  11. Comprehensive Assessment of Emotional Disturbance: A Cross-Validation Approach

    ERIC Educational Resources Information Center

    Fisher, Emily S.; Doyon, Katie E.; Saldana, Enrique; Allen, Megan Redding

    2007-01-01

    Assessing a student for emotional disturbance is a serious and complex task given the stigma of the label and the ambiguities of the federal definition. One way that school psychologists can be more confident in their assessment results is to cross validate data from different sources using the RIOT approach (Review, Interview, Observe, Test).…

  12. The Learning Transfer System Inventory (LTSI) in Ukraine: The Cross-Cultural Validation of the Instrument

    ERIC Educational Resources Information Center

    Yamkovenko, Bogdan V.; Holton, Elwood, III; Bates, R. A.

    2007-01-01

    Purpose: The purpose of this research is to expand cross-cultural research and validate the Learning Transfer System Inventory in Ukraine. The researchers seek to translate the LTSI into Ukrainian and investigate the internal structure of this translated version of the questionnaire. Design/methodology/approach: The LTSI is translated into…

  13. Validation of annual growth rings in freshwater mussel shells using cross-dating

    Treesearch

    Andrew L. Rypel; Wendell R. Haag; Robert H. Findlay

    2009-01-01

    We examined the usefulness of dendrochronological cross-dating methods for studying long-term, interannual growth patterns in freshwater mussels, including validation of annual shell ring formation. Using 13 species from three rivers, we measured increment widths between putative annual rings on shell thin sections and then removed age-related variation by...

  14. Cross-Cultural Validation of Stages of Exercise Change Scale among Chinese College Students

    ERIC Educational Resources Information Center

    Keating, Xiaofen D.; Guan, Jianmin; Huang, Yong; Deng, Mingying; Wu, Yifeng; Qu, Shuhua

    2005-01-01

    The purpose of the study was to test the cross-cultural concurrent validity of the stages of exercise change scale (SECS) in Chinese college students. The original SECS was translated into Chinese (C-SECS). Students from four Chinese universities (N = 1843) participated in the study. The leisure-time exercise (LTE) questionnaire was used to…

  15. [Maslach Burnout Inventory - Student Survey: Portugal-Brazil cross-cultural adaptation].

    PubMed

    Campos, Juliana Alvares Duarte Bonini; Maroco, João

    2012-10-01

    To perform a cross-cultural adaptation of the Portuguese version of the Maslach Burnout Inventory for students (MBI-SS), and to investigate its reliability, validity and cross-cultural invariance. Face validity involved the participation of a multidisciplinary team, and content validity was assessed. The Portuguese version was completed in 2009, on the internet, by 958 Brazilian and 556 Portuguese university students from urban areas. Confirmatory factor analysis was carried out using the following fit indices: χ²/df, the Comparative Fit Index (CFI), the Goodness of Fit Index (GFI) and the Root Mean Square Error of Approximation (RMSEA). To verify the stability of the factor solution relative to the original English version, cross-validation was performed on 2/3 of the total sample and replicated on the remaining 1/3. Convergent validity was estimated by the average variance extracted and composite reliability. Discriminant validity was assessed, and internal consistency was estimated by Cronbach's alpha coefficient. Concurrent validity was estimated by correlational analysis of the mean scores of the Portuguese version and the Copenhagen Burnout Inventory, and divergent validity was assessed against the Beck Depression Inventory. The invariance of the model between the Brazilian and the Portuguese samples was assessed. The three-factor model of Exhaustion, Disengagement and Efficacy showed good fit (χ²/df = 8.498, CFI = 0.916, GFI = 0.902, RMSEA = 0.086). The factor structure was stable (λ: χ²dif = 11.383, p = 0.50; Cov: χ²dif = 6.479, p = 0.372; Residuals: χ²dif = 21.514, p = 0.121). Adequate convergent validity (VEM = 0.45;0.64, CC = 0.82;0.88), discriminant validity (ρ² = 0.06;0.33) and internal consistency (α = 0.83;0.88) were observed. The concurrent validity of the Portuguese version with the Copenhagen Inventory was adequate (r = 0.21;0.74). The assessment of divergent validity was hampered by the closeness of the theoretical concepts underlying the Exhaustion and Disengagement dimensions of the Portuguese version and the Beck Depression Inventory. Invariance of the instrument between the Brazilian and Portuguese samples was not observed (λ: χ²dif = 84.768, p < 0.001; Cov: χ²dif = 129.206, p < 0.001; Residuals: χ²dif = 518.760, p < 0.001). The Portuguese version of the Maslach Burnout Inventory for students showed adequate reliability and validity, but its factor structure was not invariant between the countries, indicating the absence of cross-cultural stability.
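The RMSEA fit index reported above is conventionally computed from the model chi-square, its degrees of freedom, and the sample size. A hedged sketch of the standard formula, with names of our own choosing:

```python
import math

def rmsea(chi2, df, n):
    """Root Mean Square Error of Approximation for a fitted CFA/SEM model:
    sqrt(max(chi2 - df, 0) / (df * (n - 1))). Values below ~0.08 are
    conventionally read as acceptable fit."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))
```

Note that RMSEA is zero whenever the chi-square does not exceed its degrees of freedom, i.e. when the model fits at least as well as expected by chance.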

  16. Empirical gradient threshold technique for automated segmentation across image modalities and cell lines.

    PubMed

    Chalfoun, J; Majurski, M; Peskin, A; Breen, C; Bajcsy, P; Brady, M

    2015-10-01

    New microscopy technologies are enabling image acquisition of terabyte-sized data sets consisting of hundreds of thousands of images. In order to retrieve and analyze the biological information in these large data sets, segmentation is needed to detect the regions containing cells or cell colonies. Our work with hundreds of large images (each 21,000×21,000 pixels) requires a segmentation method that: (1) yields high segmentation accuracy, (2) is applicable to multiple cell lines with various densities of cells and cell colonies, and several imaging modalities, (3) can process large data sets in a timely manner, (4) has a low memory footprint and (5) has a small number of user-set parameters that do not require adjustment during the segmentation of large image sets. None of the currently available segmentation methods meet all these requirements. Segmentation based on image gradient thresholding is fast and has a low memory footprint. However, existing techniques that automate the selection of the gradient image threshold do not work across image modalities, multiple cell lines, and a wide range of foreground/background densities (requirement 2), and all failed the requirement for robust parameters that do not require re-adjustment with time (requirement 5). We present a novel and empirically derived image gradient threshold selection method for separating foreground and background pixels in an image that meets all the requirements listed above. We quantify the difference between our approach and existing ones in terms of accuracy, execution speed, memory usage and number of adjustable parameters on a reference data set. This reference data set consists of 501 validation images with manually determined segmentations and image sizes ranging from 0.36 Megapixels to 850 Megapixels. It includes four different cell lines and two image modalities: phase contrast and fluorescence. Our new technique, called Empirical Gradient Threshold (EGT), is derived from this reference data set with a 10-fold cross-validation method. EGT segments cells or colonies with resulting Dice accuracy index measurements above 0.92 for all cross-validation data sets. EGT results have also been visually verified on a much larger data set that includes bright field and Differential Interference Contrast (DIC) images, 16 cell lines and 61 time-sequence data sets, for a total of 17,479 images. This method is implemented as an open-source plugin to ImageJ as well as a standalone executable that can be downloaded from the following link: https://isg.nist.gov/. © 2015 The Authors Journal of Microscopy © 2015 Royal Microscopical Society.
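The Dice accuracy index and the 10-fold cross-validation design used to derive EGT are both standard; a minimal sketch of each follows (function names are illustrative, not taken from the EGT implementation):

```python
def dice_index(a, b):
    """Dice similarity between two binary masks given as flat 0/1 sequences:
    2*|A∩B| / (|A| + |B|)."""
    inter = sum(1 for x, y in zip(a, b) if x == 1 and y == 1)
    return 2.0 * inter / (sum(a) + sum(b))

def k_fold_splits(items, k=10):
    """Yield k (train, test) partitions; each fold is held out exactly once."""
    folds = [items[i::k] for i in range(k)]
    for i in range(k):
        test = folds[i]
        train = [x for j, fold in enumerate(folds) if j != i for x in fold]
        yield train, test
```

In a design like the one described, the threshold model would be fitted on each train split and its Dice index evaluated on the corresponding held-out split; reporting "above 0.92 for all cross-validation data sets" means the minimum over the k held-out folds exceeded 0.92.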

  17. Methodology and issues of integral experiments selection for nuclear data validation

    NASA Astrophysics Data System (ADS)

    Tatiana, Ivanova; Ivanov, Evgeny; Hill, Ian

    2017-09-01

    Nuclear data validation involves a large suite of Integral Experiments (IEs) for criticality, reactor physics and dosimetry applications. [1] Often benchmarks are taken from international Handbooks. [2, 3] Depending on the application, IEs have different degrees of usefulness in validation, and usually the use of a single benchmark is not advised; indeed, it may lead to erroneous interpretation and results. [1] This work aims at quantifying the importance of benchmarks used in application-dependent cross section validation. The approach is based on the well-known Generalised Linear Least Squares Method (GLLSM), extended to establish biases and uncertainties for given cross sections (within a given energy interval). The statistical treatment results in a vector of weighting factors for the integral benchmarks. These factors characterize the value added by a benchmark for nuclear data validation for the given application. The methodology is illustrated by one example, selecting benchmarks for 239Pu cross section validation. The studies were performed in the framework of Subgroup 39 (Methods and approaches to provide feedback from nuclear and covariance data adjustment for improvement of nuclear data files) established at the Working Party on International Nuclear Data Evaluation Cooperation (WPEC) of the Nuclear Science Committee under the Nuclear Energy Agency (NEA/OECD).

  18. Investigation of anisotropic thermal transport in cross-linked polymers

    NASA Astrophysics Data System (ADS)

    Simavilla, David Nieto

    Thermal transport in lightly cross-linked polyisoprene and polybutadiene subjected to uniaxial elongation is investigated experimentally. We employ two experimental techniques to assess the effect that deformation has on this class of materials. The first technique, based on Forced Rayleigh Scattering (FRS), allows us to measure the two independent components of the thermal diffusivity tensor as a function of deformation. These measurements, along with independent measurements of the tensile stress and birefringence, are used to evaluate the stress-thermal and stress-optic rules. The stress-thermal rule is found to be valid for the entire range of elongations applied. In contrast, the stress-optic rule fails for moderate to large stretch ratios. This suggests that the degree of anisotropy in thermal conductivity depends on both orientation and tension in polymer chain segments. The second technique, based on infrared thermography (IRT), allows us to measure anisotropy in thermal conductivity and strain-induced changes in heat capacity. We validate this method's measurements of anisotropic thermal conductivity by comparing them with those obtained using FRS and find excellent agreement between the two techniques. Uncertainty in the infrared thermography measurements is estimated to be about 2-5%. The accuracy of the method and its potential application to non-transparent materials make it a good alternative for extending current research on anisotropic thermal transport in polymeric materials. A second IRT application allows us to investigate the dependence of heat capacity on deformation. We find that heat capacity increases with stretch ratio in polyisoprene specimens under uniaxial extension. The deviation from the equilibrium value of heat capacity is consistent with an independent set of experiments comparing anisotropy in thermal diffusivity and conductivity employing the FRS and IRT techniques. We identify finite extensibility and strain-induced crystallization as the possible causes explaining our observations and evaluate their contributions making use of classical rubber elasticity results. Finally, we study the role of evaporation-induced thermal effects in the well-known phenomenon of tears of wine. We develop a transport model and support its predictions by experimentally measuring the temperature gradients present in wine and cognac films using IRT. Our results demonstrate that the Marangoni flow responsible for wine tears results from both composition and temperature gradients, whose relative contributions strongly depend on the thermodynamic properties of ethanol-water mixtures. The methods developed here can be used to obtain a deeper understanding of Marangoni flows, which are ubiquitous in nature and modern technology.

  19. An Analysis Technique/Automated Tool for Comparing and Tracking Analysis Modes of Different Finite Element Models

    NASA Technical Reports Server (NTRS)

    Towner, Robert L.; Band, Jonathan L.

    2012-01-01

    An analysis technique was developed to compare and track mode shapes for different Finite Element Models. The technique may be applied to a variety of structural dynamics analyses, including model reduction validation (comparing unreduced and reduced models), mode tracking for various parametric analyses (e.g., launch vehicle model dispersion analysis to identify sensitivities to modal gain for Guidance, Navigation, and Control), comparing models of different mesh fidelity (e.g., a coarse model for a preliminary analysis compared to a higher-fidelity model for a detailed analysis) and mode tracking for a structure with properties that change over time (e.g., a launch vehicle from liftoff through end-of-burn, with propellant being expended during the flight). Mode shapes for different models are compared and tracked using several numerical indicators, including traditional Cross-Orthogonality and Modal Assurance Criteria approaches, as well as numerical indicators obtained by comparing modal strain energy and kinetic energy distributions. This analysis technique has been used to reliably identify correlated mode shapes for complex Finite Element Models that would otherwise be difficult to compare using traditional techniques. This improved approach also utilizes an adaptive mode tracking algorithm that allows for automated tracking when working with complex models and/or comparing a large group of models.
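The Modal Assurance Criterion mentioned above compares two mode-shape vectors on a 0-to-1 scale, where 1 indicates identical (up to scaling) shapes. A minimal sketch (the function name is ours, not from the tool described):

```python
def mac(phi_a, phi_b):
    """Modal Assurance Criterion between two real mode-shape vectors:
    |phi_a . phi_b|^2 / ((phi_a . phi_a) * (phi_b . phi_b))."""
    dot_ab = sum(a * b for a, b in zip(phi_a, phi_b))
    dot_aa = sum(a * a for a in phi_a)
    dot_bb = sum(b * b for b in phi_b)
    return dot_ab * dot_ab / (dot_aa * dot_bb)
```

Mode tracking across two models then amounts to pairing each mode of one model with the mode of the other that maximizes this criterion, which is why additional indicators (strain and kinetic energy distributions) help when several MAC values are close.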

  20. Validation of Multilevel Constructs: Validation Methods and Empirical Findings for the EDI

    ERIC Educational Resources Information Center

    Forer, Barry; Zumbo, Bruno D.

    2011-01-01

    The purposes of this paper are to highlight the foundations of multilevel construct validation, describe two methodological approaches and associated analytic techniques, and then apply these approaches and techniques to the multilevel construct validation of a widely-used school readiness measure called the Early Development Instrument (EDI;…

  1. Satellite laser ranging to low Earth orbiters: orbit and network validation

    NASA Astrophysics Data System (ADS)

    Arnold, Daniel; Montenbruck, Oliver; Hackel, Stefan; Sośnica, Krzysztof

    2018-04-01

    Satellite laser ranging (SLR) to low Earth orbiters (LEOs) provides optical distance measurements with mm-to-cm-level precision. SLR residuals, i.e., differences between measured and modeled ranges, serve as a common figure of merit for the quality assessment of orbits derived by radiometric tracking techniques. We discuss relevant processing standards for the modeling of SLR observations and highlight the importance of line-of-sight-dependent range corrections for the various types of laser retroreflector arrays. A 1-3 cm consistency of SLR observations and GPS-based precise orbits is demonstrated for a wide range of past and present LEO missions supported by the International Laser Ranging Service (ILRS). A parameter estimation approach is presented to investigate systematic orbit errors and it is shown that SLR validation of LEO satellites is not only able to detect radial but also along-track and cross-track offsets. SLR residual statistics clearly depend on the employed precise orbit determination technique (kinematic vs. reduced-dynamic, float vs. fixed ambiguities) but also reveal pronounced differences in the ILRS station performance. Using the residual-based parameter estimation approach, corrections to ILRS station coordinates, range biases, and timing offsets are derived. As a result, root-mean-square residuals of 5-10 mm have been achieved over a 1-year data arc in 2016 using observations from a subset of high-performance stations and ambiguity-fixed orbits of four LEO missions. As a final contribution, we demonstrate that SLR can not only validate single-satellite orbit solutions but also precise baseline solutions of formation flying missions such as GRACE, TanDEM-X, and Swarm.

  2. Rapid determination of crocins in saffron by near-infrared spectroscopy combined with chemometric techniques

    NASA Astrophysics Data System (ADS)

    Li, Shuailing; Shao, Qingsong; Lu, Zhonghua; Duan, Chengli; Yi, Haojun; Su, Liyang

    2018-02-01

    Saffron is an expensive spice. Its primary effective constituents are crocin I and II, and the contents of these compounds directly affect the quality and commercial value of saffron. In this study, near-infrared spectroscopy was combined with chemometric techniques for the determination of crocin I and II in saffron. Partial least squares regression models were built for the quantification of crocin I and II. By comparing different spectral ranges and spectral pretreatment methods (no pretreatment, vector normalization, subtract a straight line, multiplicative scatter correction, minimum-maximum normalization, eliminate the constant offset, first derivative, and second derivative), optimum models were developed. The root mean square error of cross-validation values of the best partial least squares models for crocin I and II were 1.40 and 0.30, respectively. The coefficients of determination for crocin I and II were 93.40 and 96.30, respectively. These results show that near-infrared spectroscopy can be combined with chemometric techniques to determine the contents of crocin I and II in saffron quickly and efficiently.
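The RMSECV criterion used above to rank candidate models is simply the root mean square of held-out prediction errors under cross-validation. A hedged sketch using leave-one-out cross-validation with an ordinary least squares line as a stand-in for the PLS model (all names are illustrative assumptions, not from the paper):

```python
import math

def fit_line(xs, ys):
    """Ordinary least squares line y = a + b*x (a stand-in for a PLS model)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def predict_line(model, x):
    a, b = model
    return a + b * x

def rmsecv(xs, ys, fit=fit_line, predict=predict_line):
    """Leave-one-out RMSECV: refit with each sample held out, score errors."""
    errs = []
    for i in range(len(xs)):
        model = fit(xs[:i] + xs[i + 1:], ys[:i] + ys[i + 1:])
        errs.append(predict(model, xs[i]) - ys[i])
    return math.sqrt(sum(e * e for e in errs) / len(errs))
```

Model selection as described in the paper would compute this score for each candidate (spectral range, pretreatment, number of PLS factors) and keep the combination with the lowest RMSECV.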

  3. Fabrication of setup for high temperature thermal conductivity measurement.

    PubMed

    Patel, Ashutosh; Pandey, Sudhir K

    2017-01-01

    In this work, we report the fabrication of an experimental setup for high temperature thermal conductivity (κ) measurement. It can characterize samples of various dimensions and shapes. A steady-state axial heat flow technique is used for κ measurement, and heat loss is measured using the parallel thermal conductance technique. A simple, lightweight, and compact sample holder is developed using a thin heater and a limited number of components. A low heat loss is achieved by using an insulator block of very low thermal conductivity and small cross-sectional area. Power delivered to the heater is measured accurately using the 4-wire technique; for this purpose, the heater is fabricated with four wires. The setup is validated using Bi0.36Sb1.45Te3, polycrystalline bismuth, gadolinium, and alumina samples. The data obtained for these samples are in good agreement with reported data, with a maximum deviation of 6% in the value of κ, observed for the gadolinium sample. We also report the thermal conductivity of polycrystalline tellurium from 320 K to 550 K, where a nonmonotonous behavior of κ with temperature is observed.

  4. Measurement of process variables in solid-state fermentation of wheat straw using FT-NIR spectroscopy and synergy interval PLS algorithm.

    PubMed

    Jiang, Hui; Liu, Guohai; Mei, Congli; Yu, Shuang; Xiao, Xiahong; Ding, Yuhan

    2012-11-01

    The feasibility of rapid determination of the process variables (i.e. pH and moisture content) in solid-state fermentation (SSF) of wheat straw using Fourier transform near infrared (FT-NIR) spectroscopy was studied. Synergy interval partial least squares (siPLS) algorithm was implemented to calibrate regression model. The number of PLS factors and the number of subintervals were optimized simultaneously by cross-validation. The performance of the prediction model was evaluated according to the root mean square error of cross-validation (RMSECV), the root mean square error of prediction (RMSEP) and the correlation coefficient (R). The measurement results of the optimal model were obtained as follows: RMSECV=0.0776, R(c)=0.9777, RMSEP=0.0963, and R(p)=0.9686 for pH model; RMSECV=1.3544% w/w, R(c)=0.8871, RMSEP=1.4946% w/w, and R(p)=0.8684 for moisture content model. Finally, compared with classic PLS and iPLS models, the siPLS model revealed its superior performance. The overall results demonstrate that FT-NIR spectroscopy combined with siPLS algorithm can be used to measure process variables in solid-state fermentation of wheat straw, and NIR spectroscopy technique has a potential to be utilized in SSF industry. Copyright © 2012 Elsevier B.V. All rights reserved.
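    The figures of merit reported above (RMSECV, RMSEP, and the correlation coefficient R) are generic and straightforward to reproduce. A minimal sketch: the same RMSE function yields RMSECV or RMSEP depending on whether cross-validated or independent test-set predictions are passed in.

```python
import math

def rmse(y_ref, y_pred):
    # root mean square error between reference and predicted values
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(y_ref, y_pred)) / len(y_ref))

def corr(y_ref, y_pred):
    # Pearson correlation coefficient R between reference and predicted values
    n = len(y_ref)
    mx, my = sum(y_ref) / n, sum(y_pred) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(y_ref, y_pred))
    sxx = sum((a - mx) ** 2 for a in y_ref)
    syy = sum((b - my) ** 2 for b in y_pred)
    return sxy / math.sqrt(sxx * syy)
```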

  5. Well-being of Taiwanese and Singaporean college students: cross-cultural validity of a modified social cognitive model.

    PubMed

    Sheu, Hung-Bin; Chong, Shiqin Stephanie; Chen, Hsih-Fang; Lin, Wen-Chien

    2014-07-01

    This study tested the cross-cultural validity of a modified version of Lent's (2004) normative well-being model. Data of 317 Taiwanese and 259 Singaporean college students were collected using the Mandarin and English versions of the survey and were analyzed using structural equation modeling techniques. Satisfactory fit showed that the modified model offered a reasonable representation of the relations among the constructs and accounted for substantial amounts of the variances in academic well-being and life satisfaction for both samples. Results of the bootstrapping procedure revealed that indirect effects of personality traits and self-construal variables on well-being outcomes were mediated mostly by pathways that involved academic self-efficacy, academic goal progress, and/or academic supports. Academic well-being also filtered the effects of other predictors on life satisfaction. Multigroup structural equation modeling analyses indicated the presence of measurement equivalence across these 2 groups. However, several structural paths differed significantly between the Taiwanese and the Singaporean samples. Overall, this study provides evidence for the applicability of the modified well-being model to college students in Taiwan and Singapore and suggests that students in these 2 Asian countries might pursue and maintain their well-being through different psychological mechanisms. Practical implications for interventions and outreach programs as well as directions for future research are discussed. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  6. Can we really use available scales for child and adolescent psychopathology across cultures? A systematic review of cross-cultural measurement invariance data.

    PubMed

    Stevanovic, Dejan; Jafari, Peyman; Knez, Rajna; Franic, Tomislav; Atilola, Olayinka; Davidovic, Nikolina; Bagheri, Zahra; Lakic, Aneta

    2017-02-01

    In this systematic review, we assessed available evidence for cross-cultural measurement invariance of assessment scales for child and adolescent psychopathology as an indicator of cross-cultural validity. A literature search was conducted using the Medline, PsychInfo, Scopus, Web of Science, and Google Scholar databases. Cross-cultural measurement invariance data were available for 26 scales. Based on the aggregation of the evidence from the studies under review, none of the evaluated scales has strong evidence for cross-cultural validity and suitability for cross-cultural comparison. A few of the studies showed a moderate level of measurement invariance for some scales (such as the Fear Survey Schedule for Children-Revised, Multidimensional Anxiety Scale for Children, Revised Child Anxiety and Depression Scale, Revised Children's Manifest Anxiety Scale, Mood and Feelings Questionnaire, and Disruptive Behavior Rating Scale), which may make them suitable for cross-cultural comparative studies. The remaining scales showed either weak measurement invariance or an outright lack of it. This review showed only limited testing for measurement invariance across cultural groups of scales for pediatric psychopathology, with evidence of cross-cultural validity for only a few scales. This study also revealed a need to improve practices of statistical analysis reporting in testing measurement invariance. Implications for future research are discussed.

  7. Nuclear Quantum Effects in Water at the Triple Point: Using Theory as a Link Between Experiments.

    PubMed

    Cheng, Bingqing; Behler, Jörg; Ceriotti, Michele

    2016-06-16

    One of the most prominent consequences of the quantum nature of light atomic nuclei is that their kinetic energy does not follow a Maxwell-Boltzmann distribution. Deep inelastic neutron scattering (DINS) experiments can measure this effect. Thus, the nuclear quantum kinetic energy can be probed directly in both ordered and disordered samples. However, the relation between the quantum kinetic energy and the atomic environment is a very indirect one, and cross-validation with theoretical modeling is therefore urgently needed. Here, we use state-of-the-art path integral molecular dynamics techniques to compute the kinetic energy of hydrogen and oxygen nuclei in liquid, solid, and gas-phase water close to the triple point, comparing three different interatomic potentials and validating our results against equilibrium isotope fractionation measurements. We then show how accurate simulations can draw a link between extremely precise fractionation experiments and DINS, therefore establishing a reliable benchmark for future measurements and providing key insights to further increase the accuracy of interatomic potentials for water.

  8. Attenuated total reflectance-FT-IR spectroscopy for gunshot residue analysis: potential for ammunition determination.

    PubMed

    Bueno, Justin; Sikirzhytski, Vitali; Lednev, Igor K

    2013-08-06

    The ability to link a suspect to a particular shooting incident is a principal task for many forensic investigators. Here, we attempt to achieve this goal by analysis of gunshot residue (GSR) through the use of attenuated total reflectance (ATR) Fourier transform infrared spectroscopy (FT-IR) combined with statistical analysis. The firearm discharge process is analogous to a complex chemical process. Therefore, the products of this process (GSR) will vary based upon numerous factors, including the specific combination of the firearm and ammunition which was discharged. Differentiation of FT-IR data, collected from GSR particles originating from three different firearm-ammunition combinations (0.38 in., 0.40 in., and 9 mm calibers), was achieved using projection to latent structures discriminant analysis (PLS-DA). The technique was cross-validated (leave-one-out) both internally and externally. External validation was achieved via assignment (caliber identification) of unknown FT-IR spectra from unknown GSR particles. The results demonstrate great potential for ATR-FT-IR spectroscopic analysis of GSR for forensic purposes.

  9. A new technique for rapid assessment of eutrophication status of coastal waters using a support vector machine

    NASA Astrophysics Data System (ADS)

    Kong, Xianyu; Che, Xiaowei; Su, Rongguo; Zhang, Chuansong; Yao, Qingzhen; Shi, Xiaoyong

    2017-05-01

    There is an urgent need to develop efficient evaluation tools that use easily measured variables to make rapid and timely eutrophication assessments, which are important for marine health management, and to implement eutrophication monitoring programs. In this study, an approach for rapidly assessing the eutrophication status of coastal waters with three easily measured parameters (turbidity, chlorophyll a and dissolved oxygen) was developed by the grid search (GS) optimized support vector machine (SVM), with trophic index TRIX classification results as the reference. With the optimized penalty parameter C = 64 and the kernel parameter γ = 1, the classification accuracy rates reached 89.3% for the training data, 88.3% for the cross-validation, and 88.5% for the validation dataset. Because the developed approach only used three easy-to-measure variables, its application could facilitate the rapid assessment of the eutrophication status of coastal waters, resulting in potential cost savings in marine monitoring programs and assisting in the provision of timely advice for marine management.
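    The grid-search-with-cross-validation workflow above is easy to sketch. To keep the example dependency-free, a plain k-nearest-neighbor classifier is swapped in for the SVM, and the grid runs over the neighbor count rather than (C, γ); the search skeleton is the same.

```python
def kfold_indices(n, k):
    # contiguous k-fold split of the indices 0..n-1
    folds, start = [], 0
    for i in range(k):
        size = n // k + (1 if i < n % k else 0)
        folds.append(list(range(start, start + size)))
        start += size
    return folds

def knn_predict(train_X, train_y, x, n_neighbors):
    # majority vote among the n_neighbors closest training samples
    order = sorted(range(len(train_X)),
                   key=lambda i: sum((a - b) ** 2 for a, b in zip(train_X[i], x)))
    votes = [train_y[i] for i in order[:n_neighbors]]
    return max(set(votes), key=votes.count)

def cv_accuracy(X, y, n_neighbors, k=3):
    # k-fold cross-validated accuracy for one hyperparameter value
    correct = 0
    for fold in kfold_indices(len(X), k):
        train = [i for i in range(len(X)) if i not in fold]
        tX, ty = [X[i] for i in train], [y[i] for i in train]
        correct += sum(knn_predict(tX, ty, X[i], n_neighbors) == y[i] for i in fold)
    return correct / len(X)

def grid_search(X, y, grid):
    # exhaustive search: keep the hyperparameter with the best CV accuracy
    return max(grid, key=lambda p: cv_accuracy(X, y, p))
```

For the SVM case in the paper, the outer structure is identical; only the model inside `cv_accuracy` and the two-dimensional (C, γ) grid change.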

  10. Concepts for the translation of genome-based innovations into public health: a comprehensive overview.

    PubMed

    Syurina, Elena V; Schulte In den Bäumen, Tobias; Brand, Angela; Ambrosino, Elena; Feron, Frans Jm

    2013-03-01

    Recent vast and rapid development of genome-related sciences has been followed by the development of different assessment techniques, or attempts to adapt existing ones. The aim of this article is to give an overview of existing concepts for the assessment and translation of innovations into healthcare, applying a descriptive analysis of their present use by public health specialists and policy makers. The international literature review identified eight concepts, including Health Technology Assessment; analytic validity; clinical validity; clinical utility; ethical, legal, and social implications; the Public Health Wheel; and others, together with the level of their current use. Despite the heterogeneity of the analyzed concepts and differences in their use in everyday healthcare practice, the cross-integration of these concepts is important in order to improve translation speed and quality. Finally, some recommendations are made regarding the most applicable translational concepts.

  11. Fast and simultaneous prediction of animal feed nutritive values using near infrared reflectance spectroscopy

    NASA Astrophysics Data System (ADS)

    Samadi; Wajizah, S.; Munawar, A. A.

    2018-02-01

    Feed plays an important role in animal production. The purpose of this study is to apply the NIRS method to determining feed values. NIRS spectra were acquired for feed samples in the wavelength range of 1000-2500 nm (32 scans, 0.2 nm wavelength). Spectral data were corrected by de-trending (DT) and standard normal variate (SNV) methods. Prediction models for in vitro dry matter digestibility (IVDMD) and in vitro organic matter digestibility (IVOMD) were established using principal component regression (PCR) and validated using leave-one-out cross-validation (LOOCV). Prediction performance was quantified using the correlation coefficient (r) and the residual predictive deviation (RPD) index. The results showed that IVDMD and IVOMD can be predicted using SNV-corrected spectra, with r and RPD values of 0.93 and 2.78 for IVDMD and 0.90 and 2.35 for IVOMD, respectively. In conclusion, the NIRS technique appears feasible for predicting animal feed nutritive values.
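    The RPD index quoted above is commonly defined as the standard deviation of the reference values divided by the standard error of prediction. A minimal sketch, using the RMSE of the residuals as the SEP, which is one of several conventions in the NIRS literature:

```python
import math

def rpd(y_ref, y_pred):
    # residual predictive deviation: SD of the reference values over the
    # standard error of prediction (here the plain RMSE of the residuals)
    n = len(y_ref)
    mean = sum(y_ref) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in y_ref) / (n - 1))
    sep = math.sqrt(sum((a - b) ** 2 for a, b in zip(y_ref, y_pred)) / n)
    return sd / sep
```

An RPD above roughly 2, as reported for both digestibility models, is usually read as adequate for screening-level prediction.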

  12. Time and frequency pump-probe multiplexing to enhance the signal response of Brillouin optical time-domain analyzers.

    PubMed

    Soto, Marcelo A; Ricchiuti, Amelia Lavinia; Zhang, Liang; Barrera, David; Sales, Salvador; Thévenaz, Luc

    2014-11-17

    A technique to enhance the response and performance of Brillouin distributed fiber sensors is proposed and experimentally validated. The method consists in creating a multi-frequency pump pulse interacting with a matching multi-frequency continuous-wave probe. To avoid nonlinear cross-interaction between spectral lines, the method requires that the distinct pump pulse components and temporal traces reaching the photo-detector are subject to wavelength-selective delaying. This way the total pump and probe powers launched into the fiber can be incrementally boosted beyond the thresholds imposed by nonlinear effects. As a consequence of the multiplied pump-probe Brillouin interactions occurring along the fiber, the sensor response can be enhanced in exact proportion to the number of spectral components. The method is experimentally validated in a 50 km-long distributed optical fiber sensor augmented to 3 pump-probe spectral pairs, demonstrating a signal-to-noise ratio enhancement of 4.8 dB.
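    The reported 4.8 dB signal-to-noise enhancement for three spectral pairs is consistent with a response scaling linearly with the number of pump-probe pairs, counted on a power (10 log10) scale; a quick check:

```python
import math

def snr_gain_db(n_pairs):
    # sensor response grows in proportion to the number of pump-probe
    # spectral pairs; expressed in decibels (power scale) this is 10*log10(N)
    return 10 * math.log10(n_pairs)
```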

  13. Sino-Nasal Outcome Test-22: Translation, Cross-cultural Adaptation, and Validation in Hebrew-Speaking Patients.

    PubMed

    Shapira Galitz, Yael; Halperin, Doron; Bavnik, Yosef; Warman, Meir

    2016-05-01

    To perform the translation, cross-cultural adaptation, and validation of the Sino-Nasal Outcome Test-22 (SNOT-22) questionnaire to the Hebrew language. A single-center prospective cross-sectional study. Seventy-three chronic rhinosinusitis (CRS) patients and 73 patients without sinonasal disease filled the Hebrew version of the SNOT-22 questionnaire. Fifty-one CRS patients underwent endoscopic sinus surgery, out of which 28 filled a postoperative questionnaire. Seventy-three healthy volunteers without sinonasal disease also answered the questionnaire. Internal consistency, test-retest reproducibility, validity, and responsiveness of the questionnaire were evaluated. Questionnaire reliability was excellent, with a high internal consistency (Cronbach's alpha coefficient, 0.91-0.936) and test-retest reproducibility (Spearman's coefficient, 0.962). Mean scores for the preoperative, postoperative, and control groups were 50.44, 29.64, and 13.15, respectively (P < .0001 for CRS vs controls, P < .001 for preoperative vs postoperative), showing validity and responsiveness of the questionnaire. The Hebrew version of SNOT-22 questionnaire is a valid outcome measure for patients with CRS with or without nasal polyps. © American Academy of Otolaryngology—Head and Neck Surgery Foundation 2016.

  14. Dynamic MRI to quantify musculoskeletal motion: A systematic review of concurrent validity and reliability, and perspectives for evaluation of musculoskeletal disorders.

    PubMed

    Borotikar, Bhushan; Lempereur, Mathieu; Lelievre, Mathieu; Burdin, Valérie; Ben Salem, Douraied; Brochard, Sylvain

    2017-01-01

    To report evidence for the concurrent validity and reliability of dynamic MRI techniques to evaluate in vivo joint and muscle mechanics, and to propose recommendations for their use in the assessment of normal and impaired musculoskeletal function. The search was conducted on articles published in Web of Science, PubMed, Scopus, Academic Search Premier, and Cochrane Library between 1990 and August 2017. Studies that reported the concurrent validity and/or reliability of dynamic MRI techniques for in vivo evaluation of joint or muscle mechanics were included after assessment by two independent reviewers. Selected articles were assessed using an adapted quality assessment tool and a data extraction process. Results for concurrent validity and reliability were categorized as poor, moderate, or excellent. Twenty articles fulfilled the inclusion criteria with a mean quality assessment score of 66% (±10.4%). Concurrent validity and/or reliability of eight dynamic MRI techniques were reported, with the knee being the most evaluated joint (seven studies). Moderate to excellent concurrent validity and reliability were reported for seven out of eight dynamic MRI techniques. Cine phase contrast and real-time MRI appeared to be the most valid and reliable techniques to evaluate joint motion, and spin tag for muscle motion. Dynamic MRI techniques are promising for the in vivo evaluation of musculoskeletal mechanics; however, results should be evaluated with caution since validity and reliability have not been determined for all joints and muscles, nor for many pathological conditions.

  15. Image correlation microscopy for uniform illumination.

    PubMed

    Gaborski, T R; Sealander, M N; Ehrenberg, M; Waugh, R E; McGrath, J L

    2010-01-01

    Image cross-correlation microscopy is a technique that quantifies the motion of fluorescent features in an image by measuring the temporal autocorrelation function decay in a time-lapse image sequence. Image cross-correlation microscopy has traditionally employed laser-scanning microscopes because the technique emerged as an extension of laser-based fluorescence correlation spectroscopy. In this work, we show that image correlation can also be used to measure fluorescence dynamics in uniform illumination or wide-field imaging systems and we call our new approach uniform illumination image correlation microscopy. Wide-field microscopy is not only a simpler, less expensive imaging modality, but it offers the capability of greater temporal resolution over laser-scanning systems. In traditional laser-scanning image cross-correlation microscopy, lateral mobility is calculated from the temporal de-correlation of an image, where the characteristic length is the illuminating laser beam width. In wide-field microscopy, the diffusion length is defined by the feature size using the spatial autocorrelation function. Correlation function decay in time occurs as an object diffuses from its original position. We show that theoretical and simulated comparisons between Gaussian and uniform features indicate the temporal autocorrelation function depends strongly on particle size and not particle shape. In this report, we establish the relationships between the spatial autocorrelation function feature size, temporal autocorrelation function characteristic time and the diffusion coefficient for uniform illumination image correlation microscopy using analytical, Monte Carlo and experimental validation with particle tracking algorithms. Additionally, we demonstrate uniform illumination image correlation microscopy analysis of adhesion molecule domain aggregation and diffusion on the surface of human neutrophils.
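    The temporal autocorrelation decay at the heart of the method can be sketched in a few lines: subtract the mean intensity, then correlate the trace with lagged copies of itself. The normalization below is one common convention, and the input series in the test is synthetic.

```python
def temporal_acf(series, max_lag):
    # normalized temporal autocorrelation G(tau) of an intensity trace:
    # G(tau) = <dI(t) dI(t+tau)> / <dI(t)^2>, with dI the mean-subtracted signal
    n = len(series)
    mean = sum(series) / n
    dev = [v - mean for v in series]
    var = sum(d * d for d in dev) / n
    return [sum(dev[t] * dev[t + lag] for t in range(n - lag)) / ((n - lag) * var)
            for lag in range(max_lag + 1)]
```

G(0) is unity by construction; for diffusing features the decay time of G(tau) relates to the diffusion coefficient through the feature size, as established in the paper.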

  16. An improved method of early diagnosis of smoking-induced respiratory changes using machine learning algorithms.

    PubMed

    Amaral, Jorge L M; Lopes, Agnaldo J; Jansen, José M; Faria, Alvaro C D; Melo, Pedro L

    2013-12-01

    The purpose of this study was to develop an automatic classifier to increase the accuracy of the forced oscillation technique (FOT) for diagnosing early respiratory abnormalities in smoking patients. The data consisted of FOT parameters obtained from 56 volunteers, 28 healthy and 28 smokers with low tobacco consumption. Many supervised learning techniques were investigated, including logistic linear classifiers, k nearest neighbor (KNN), neural networks and support vector machines (SVM). To evaluate performance, the ROC curve of the most accurate parameter was established as baseline. To determine the best input features and classifier parameters, we used genetic algorithms and a 10-fold cross-validation using the average area under the ROC curve (AUC). In the first experiment, the original FOT parameters were used as input. We observed a significant improvement in accuracy (KNN=0.89 and SVM=0.87) compared with the baseline (0.77). The second experiment performed a feature selection on the original FOT parameters. This selection did not cause any significant improvement in accuracy, but it was useful in identifying more adequate FOT parameters. In the third experiment, we performed a feature selection on the cross products of the FOT parameters. This selection resulted in a further increase in AUC (KNN=SVM=0.91), which allows for high diagnostic accuracy. In conclusion, machine learning classifiers can help identify early smoking-induced respiratory alterations. The use of FOT cross products and the search for the best features and classifier parameters can markedly improve the performance of machine learning classifiers. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
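    The evaluation metric above, the area under the ROC curve, can be computed without constructing the curve at all via the Mann-Whitney formulation: the probability that a randomly chosen positive case scores higher than a randomly chosen negative one. A minimal sketch; averaging over a 10-fold split, as in the study, would simply repeat this on each held-out fold.

```python
def auc(labels, scores):
    # Mann-Whitney form of the area under the ROC curve: the fraction of
    # (positive, negative) pairs ranked correctly, counting ties as one half
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum(1.0 if p > q else 0.5 if p == q else 0.0
               for p in pos for q in neg)
    return wins / (len(pos) * len(neg))
```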

  17. Investigation of a Cross-Correlation Based Optical Strain Measurement Technique for Detecting Radial Growth on a Rotating Disk

    NASA Technical Reports Server (NTRS)

    Clem, Michelle M.; Woike, Mark R.

    2013-01-01

    The Aeronautical Sciences Project under NASA's Fundamental Aeronautics Program is extremely interested in the development of novel measurement technologies, such as optical surface measurements in the internal parts of a flow path, for in situ health monitoring of gas turbine engines. In situ health monitoring has the potential to detect flaws, i.e. cracks in key components, such as engine turbine disks, before the flaws lead to catastrophic failure. In the present study, a cross-correlation imaging technique is investigated in a proof-of-concept study as a possible optical technique to measure the radial growth and strain field on an already cracked sub-scale turbine engine disk under loaded conditions in the NASA Glenn Research Center's High Precision Rotordynamics Laboratory. The optical strain measurement technique under investigation offers potential fault detection using an applied high-contrast random speckle pattern and imaging the pattern under unloaded and loaded conditions with a CCD camera. Spinning the cracked disk at high speeds induces an external load, resulting in a radial growth of the disk of approximately 50.0 μm in the flawed region and hence, a localized strain field. When imaging the cracked disk under static conditions, the disk will be undistorted; however, during rotation the cracked region will grow radially, thus causing the applied particle pattern to be 'shifted'. The resulting particle displacements between the two images will then be measured using the two-dimensional cross-correlation algorithms implemented in standard Particle Image Velocimetry (PIV) software to track the disk growth, which facilitates calculation of the localized strain field. In order to develop and validate this optical strain measurement technique an initial proof-of-concept experiment is carried out in a controlled environment.
Using PIV optimization principles and guidelines, three potential speckle patterns, for future use on the rotating disk, are developed and investigated in the controlled experiment. A range of known shifts are induced on the patterns; reference and data images are acquired before and after the induced shift, respectively, and the images are processed using the cross-correlation algorithms in order to determine the particle displacements. The effectiveness of each pattern at resolving the known shift is evaluated and discussed in order to choose the most suitable pattern to be implemented onto a rotating disk in the Rotordynamics Lab. Although testing on the rotating disk has not yet been performed, the driving principles behind the development of the present optical technique are based upon critical aspects of the future experiment, such as the amount of expected radial growth, disk analysis, and experimental design and are therefore addressed in the paper.
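    The shift-recovery idea behind the technique can be illustrated without PIV software: cross-correlate a reference signal with its displaced copy and take the shift that maximizes the correlation. A dependency-free 1D brute-force sketch; real PIV correlates 2D interrogation windows, typically via FFT.

```python
def best_shift(reference, shifted, max_shift):
    # brute-force discrete cross-correlation: try every candidate shift and
    # keep the one maximizing the overlap correlation sum
    best, best_score = 0, float("-inf")
    for s in range(-max_shift, max_shift + 1):
        score = sum(reference[i] * shifted[i + s]
                    for i in range(len(reference))
                    if 0 <= i + s < len(shifted))
        if score > best_score:
            best, best_score = s, score
    return best
```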

  18. Cross-Stream PIV Measurements of Jets With Internal Lobed Mixers

    NASA Technical Reports Server (NTRS)

    Bridges, James; Wernet, Mark P.

    2004-01-01

    With emphasis being placed on enhanced mixing of jet plumes for noise reduction and on predictions of jet noise based upon turbulent kinetic energy, unsteady measurements of jet plumes are a very important part of jet noise studies. Given that hot flows are of most practical interest, optical techniques such as Particle Image Velocimetry (PIV) are applicable. When the flow has strong azimuthal features, such as those generated by chevrons or lobed mixers, traditional PIV, which aligns the measurement plane parallel to the dominant flow direction is very inefficient, requiring many planes of data to be acquired and stacked up to produce the desired flow cross-sections. This paper presents PIV data acquired in a plane normal to the jet axis, directly measuring the cross-stream gradients and features of an internally mixed nozzle operating at aircraft engine flow conditions. These nozzle systems included variations in lobed mixer penetration, lobe count, lobe scalloping, and nozzle length. Several cases validating the accuracy of the PIV data are examined along with examples of its use in answering questions about the jet noise generation processes in these nozzles. Of most interest is the relationship of low frequency aft-directed noise with turbulence kinetic energy and mean velocity.

  19. Micro X-Ray Computed Tomography Mass Loss Assessment of Different UHMWPE: A Hip Joint Simulator Study on Standard vs. Cross-Linked Polyethylene

    PubMed Central

    Zanini, Filippo; Carmignato, Simone

    2017-01-01

    More than 60,000 hip arthroplasties are performed every year in Italy. Although Ultra-High-Molecular-Weight Polyethylene (UHMWPE) remains the most used material for the acetabular cup, in vivo wear of this material induces over time a foreign-body response and, consequently, osteolysis, pain, and the need for implant revision. Furthermore, oxidative wear of the polyethylene provokes several severe failures. To address these problems, highly cross-linked polyethylene and Vitamin-E-stabilized polyethylene were introduced in recent years. In in vitro experiments, various efforts have been made to compare the wear behavior of standard PE and vitamin-E-infused liners. In this study we compared the in vitro wear behavior of two different configurations of cross-linked polyethylene (with and without the addition of Vitamin E) vs. standard polyethylene acetabular cups. The aim of the present study was to validate a micro X-ray computed tomography technique to assess the wear of different commercially available polyethylene acetabular cups after wear simulation; in particular, the gravimetric method was used to provide reference wear values. The agreement between the two methods is documented in this paper. PMID:28107468

  20. Transient and chaotic low-energy transfers in a system with bistable nonlinearity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Romeo, F., E-mail: francesco.romeo@uniroma1.it; Manevitch, L. I.; Bergman, L. A.

    2015-05-15

    The low-energy dynamics of a two-dof system composed of a grounded linear oscillator coupled to a lightweight mass by means of a spring with both cubic nonlinear and negative linear components is investigated. The mechanisms leading to intense energy exchanges between the linear oscillator, excited by a low-energy impulse, and the nonlinear attachment are addressed. For lightly damped systems, it is shown that two main mechanisms arise: aperiodic alternating in-well and cross-well oscillations of the nonlinear attachment, and secondary nonlinear beats occurring once the dynamics evolves solely in-well. The former dissipative phenomenon is described in a two-dimensional projection of the phase space, where transitions between in-well and cross-well oscillations are associated with sequences of crossings across a pseudo-separatrix, whereas the second mechanism is described in terms of secondary limiting phase trajectories of the nonlinear attachment under certain resonance conditions. The analytical treatment of the two aforementioned low-energy transfer mechanisms relies on the reduction of the nonlinear dynamics and consequent analysis of the reduced dynamics by asymptotic techniques. Direct numerical simulations fully validate our analytical predictions.

  1. An approach to define semantics for BPM systems interoperability

    NASA Astrophysics Data System (ADS)

    Rico, Mariela; Caliusco, María Laura; Chiotti, Omar; Rosa Galli, María

    2015-04-01

    This article proposes defining semantics for Business Process Management systems interoperability through the ontology of Electronic Business Documents (EBD) used to interchange the information required to perform cross-organizational processes. The semantic model generated allows aligning an enterprise's business processes to support cross-organizational processes by matching the business ontology of each business partner with the EBD ontology. The result is a flexible software architecture that allows dynamically defining cross-organizational business processes by reusing the EBD ontology. For developing the semantic model, a method is presented, which is based on a strategy for discovering entity features whose interpretation depends on the context, and representing them for enriching the ontology. The proposed method complements ontology learning techniques that cannot infer semantic features not represented in data sources. In order to improve the representation of these entity features, the method proposes using widely accepted ontologies for representing time entities and relations, physical quantities, measurement units, official country names, and currencies and funds, among others. When ontology reuse is not possible, the method proposes identifying whether the feature is simple or complex, and defines a strategy to be followed. An empirical validation of the approach has been performed through a case study.

  2. A global food demand model for the assessment of complex human-earth systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    EDMONDS, JAMES A.; LINK, ROBERT; WALDHOFF, STEPHANIE T.

    Demand for agricultural products is an important problem in climate change economics. Food consumption will shape and be shaped by climate change and emissions mitigation policies through interactions with bioenergy and afforestation, two critical issues in meeting international climate goals such as two degrees. We develop a model of food demand for staple and nonstaple commodities that evolves with changing incomes and prices. The model addresses a long-standing issue in estimating food demands: the evolution of demand relationships across large changes in income and prices. We discuss the model and some of its properties and limitations. We estimate parameter values using pooled cross-sectional time-series observations and the Metropolis Monte Carlo method, and cross-validate the model by estimating parameters on a subset of the observations and testing its ability to project into the unused observations. Finally, we apply bias correction techniques borrowed from the climate-modeling community and report results.
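    The Metropolis Monte Carlo estimation step mentioned above can be sketched with a generic random-walk sampler. The standard-normal target used in the test is a stand-in for illustration, not the paper's demand-model posterior.

```python
import math
import random

def metropolis(logpost, start, step, n_samples, seed=0):
    # random-walk Metropolis: propose theta' = theta + U(-step, step) and
    # accept with probability min(1, exp(logpost' - logpost))
    rng = random.Random(seed)
    theta, lp = start, logpost(start)
    samples = []
    for _ in range(n_samples):
        prop = theta + rng.uniform(-step, step)
        lp_prop = logpost(prop)
        if lp_prop >= lp or rng.random() < math.exp(lp_prop - lp):
            theta, lp = prop, lp_prop
        samples.append(theta)
    return samples
```

In a parameter-estimation setting, `logpost` would be the log-likelihood of the pooled observations plus any prior, and each retained sample is a candidate parameter vector.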

  3. Quasi-simultaneous Measurements of Ionic Currents by Vibrating Probe and pH Distribution by Ion-selective Microelectrode

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Isaacs, H.S.; Lamaka, S.V.; Taryba, M.

    2011-01-01

    This work reports a new methodology to measure, quasi-simultaneously, the local electric fields and the distribution of specific ions in a solution via selective microelectrodes. The field produced by the net electric current was detected using the scanning vibrating electrode technique (SVET), with quasi-simultaneous measurements of pH by an ion-selective microelectrode (pH-SME). The measurements were performed in a validation cell providing the cross section of a 48 μm diameter Pt wire as a source of electric current. The time lag between acquiring each current density and pH data point was 1.5 s, due to the response time of the pH-SME. Quasi-simultaneous SVET-pH measurements that correlate electrochemical oxidation-reduction processes with acid-base chemical equilibria are reported for the first time. No cross-talk between the vibrating microelectrode and the ion-selective microelectrode could be detected under the given experimental conditions.

  4. Fine-tuning convolutional deep features for MRI based brain tumor classification

    NASA Astrophysics Data System (ADS)

    Ahmed, Kaoutar B.; Hall, Lawrence O.; Goldgof, Dmitry B.; Liu, Renhao; Gatenby, Robert A.

    2017-03-01

    Prediction of survival time from brain tumor magnetic resonance images (MRI) is not commonly performed and would ordinarily be a time-consuming process. However, current cross-sectional imaging techniques, particularly MRI, can be used to generate many features that may provide information on the patient's prognosis, including survival. This information can potentially be used to identify individuals who would benefit from more aggressive therapy. Rather than using pre-defined and hand-engineered features, as with current radiomics methods, we investigated the use of deep features extracted from pre-trained convolutional neural networks (CNNs) in predicting survival time. We also provide evidence for the power of domain-specific fine-tuning in improving the performance of a pre-trained CNN, even though our data set is small. We fine-tuned a CNN initially trained on a large natural image recognition dataset (ImageNet ILSVRC) and transferred the learned feature representations to the survival time prediction task, obtaining over 81% accuracy in leave-one-out cross-validation.
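    Leave-one-out cross-validation, the evaluation protocol reported above, holds out each sample in turn, trains on the remainder, and scores the single held-out prediction. A minimal sketch follows; the nearest-centroid classifier and toy 2-D points are stand-ins for the paper's CNN-derived features, not its method:

    ```python
    def loocv_accuracy(features, labels, classify):
        """Leave-one-out cross-validation: each sample is held out once,
        the classifier is fit on the rest, and the held-out prediction scored."""
        hits = 0
        for i in range(len(features)):
            train_x = features[:i] + features[i + 1:]
            train_y = labels[:i] + labels[i + 1:]
            if classify(train_x, train_y, features[i]) == labels[i]:
                hits += 1
        return hits / len(features)

    def nearest_centroid(train_x, train_y, query):
        """Toy stand-in for a real model: assign the query to the class
        whose training-set mean is closest in squared distance."""
        best, best_d = None, float("inf")
        for cls in set(train_y):
            pts = [x for x, y in zip(train_x, train_y) if y == cls]
            centroid = [sum(col) / len(pts) for col in zip(*pts)]
            d = sum((a - b) ** 2 for a, b in zip(query, centroid))
            if d < best_d:
                best, best_d = cls, d
        return best

    # two well-separated clusters of 2-D "features" (hypothetical values)
    data = [[0.1, 0.2], [0.0, 0.3], [0.2, 0.1],
            [1.9, 2.1], [2.0, 2.2], [2.1, 1.8]]
    labs = ["short", "short", "short", "long", "long", "long"]
    acc = loocv_accuracy(data, labs, nearest_centroid)
    ```

    Leave-one-out is the natural choice for small data sets like the one above, since every sample serves as a test case exactly once.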

  5. New strategy for drug discovery by large-scale association analysis of molecular networks of different species.

    PubMed

    Zhang, Bo; Fu, Yingxue; Huang, Chao; Zheng, Chunli; Wu, Ziyin; Zhang, Wenjuan; Yang, Xiaoyan; Gong, Fukai; Li, Yuerong; Chen, Xiaoyu; Gao, Shuo; Chen, Xuetong; Li, Yan; Lu, Aiping; Wang, Yonghua

    2016-02-25

    The development of modern omics technology has not significantly improved the efficiency of drug development, and precise, targeted drug discovery remains an unsolved problem. Here a large-scale cross-species molecular network association (CSMNA) approach for targeted drug screening from natural sources is presented. The algorithm integrates molecular network omics data from humans and 267 plants and microbes, establishing the biological relationships between them and extracting evolutionarily convergent chemicals. This technique allows the researcher to assess targeted drugs for specific human diseases based on specific plant or microbe pathways. In a prospective validation, connections between the plant Halliwell-Asada (HA) cycle and the human Nrf2-ARE pathway were verified, and the manner in which the HA cycle molecules act on the human Nrf2-ARE pathway as antioxidants was determined. This shows the potential applicability of this approach in drug discovery. The current method integrates disparate evolutionary species into chemico-biologically coherent circuits, suggesting a new cross-species omics analysis strategy for rational drug development.

  6. An Improved Time-Frequency Analysis Method in Interference Detection for GNSS Receivers

    PubMed Central

    Sun, Kewen; Jin, Tian; Yang, Dongkai

    2015-01-01

    In this paper, an improved joint time-frequency (TF) analysis method based on a reassigned smoothed pseudo Wigner-Ville distribution (RSPWVD) is proposed for interference detection in Global Navigation Satellite System (GNSS) receivers. In the RSPWVD, a two-dimensional low-pass smoothing function is introduced to eliminate the cross-terms present in the quadratic TF distribution, and at the same time the reassignment method is adopted to improve the TF concentration of the auto-terms of the signal components. The proposed interference detection method is evaluated in experiments on GPS L1 signals in jamming scenarios and compared with state-of-the-art interference detection approaches. The results show that the proposed technique effectively overcomes the cross-term problem while preserving good TF localization, enhancing the interference detection performance of GNSS receivers, particularly in jamming environments. PMID:25905704

  7. Phage display peptide libraries in molecular allergology: from epitope mapping to mimotope-based immunotherapy.

    PubMed

    Luzar, J; Štrukelj, B; Lunder, M

    2016-11-01

    Identification of allergen epitopes is a key component in proper understanding of the pathogenesis of type I allergies, for understanding cross-reactivity and for the development of mimotope immunotherapeutics. Phage particles have garnered recognition in the field of molecular allergology due to their value not only in competitive immunoscreening of peptide libraries but also as immunogenic carriers of allergen mimotopes. They integrate epitope discovery technology and immunization functions into a single platform. This article provides an overview of allergen mimotopes identified through the phage display technique. We discuss the contribution of phage display peptide libraries in determining dominant B-cell epitopes of allergens, in developing mimotope immunotherapy, in understanding cross-reactivity, and in determining IgE epitope profiles of individual patients to improve diagnostics and individualize immunotherapy. We also discuss the advantages and pitfalls of the methodology used to identify and validate the mimotopes. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  8. A simulation-based study on the influence of beam hardening in X-ray computed tomography for dimensional metrology.

    PubMed

    Lifton, Joseph J; Malcolm, Andrew A; McBride, John W

    2015-01-01

    X-ray computed tomography (CT) is a radiographic scanning technique for visualising cross-sectional images of an object non-destructively. From these cross-sectional images it is possible to evaluate internal dimensional features of a workpiece which may otherwise be inaccessible to tactile and optical instruments. Beam hardening is a physical process that degrades the quality of CT images and has previously been suggested to influence dimensional measurements. Using a validated simulation tool, the influence of spectrum pre-filtration and beam hardening correction are evaluated for internal and external dimensional measurements. Beam hardening is shown to influence internal and external dimensions in opposition, and to have a greater influence on outer dimensions compared to inner dimensions. The results suggest the combination of spectrum pre-filtration and a local gradient-based surface determination method are able to greatly reduce the influence of beam hardening in X-ray CT for dimensional metrology.

  9. Using a combined computational-experimental approach to predict antibody-specific B cell epitopes.

    PubMed

    Sela-Culang, Inbal; Benhnia, Mohammed Rafii-El-Idrissi; Matho, Michael H; Kaever, Thomas; Maybeno, Matt; Schlossman, Andrew; Nimrod, Guy; Li, Sheng; Xiang, Yan; Zajonc, Dirk; Crotty, Shane; Ofran, Yanay; Peters, Bjoern

    2014-04-08

    Antibody epitope mapping is crucial for understanding B cell-mediated immunity and required for characterizing therapeutic antibodies. In contrast to T cell epitope mapping, no computational tools are in widespread use for prediction of B cell epitopes. Here, we show that, utilizing the sequence of an antibody, it is possible to identify discontinuous epitopes on its cognate antigen. The predictions are based on residue-pairing preferences and other interface characteristics. We combined these antibody-specific predictions with results of cross-blocking experiments that identify groups of antibodies with overlapping epitopes to improve the predictions. We validate the high performance of this approach by mapping the epitopes of a set of antibodies against the previously uncharacterized D8 antigen, using complementary techniques to reduce method-specific biases (X-ray crystallography, peptide ELISA, deuterium exchange, and site-directed mutagenesis). These results suggest that antibody-specific computational predictions and simple cross-blocking experiments allow for accurate prediction of residues in conformational B cell epitopes. Copyright © 2014 Elsevier Ltd. All rights reserved.

  10. Validation and cross-cultural pilot testing of compliance with standard precautions scale: self-administered instrument for clinical nurses.

    PubMed

    Lam, Simon C

    2014-05-01

    To perform detailed psychometric testing of the compliance with standard precautions scale (CSPS) in measuring compliance with standard precautions of clinical nurses and to conduct cross-cultural pilot testing and assess the relevance of the CSPS on an international platform. A cross-sectional and correlational design with repeated measures. Nursing students from a local registered nurse training university, nurses from different hospitals in Hong Kong, and experts in an international conference. The psychometric properties of the CSPS were evaluated via internal consistency, 2-week and 3-month test-retest reliability, concurrent validation, and construct validation. The cross-cultural pilot testing and relevance check was examined by experts on infection control from various developed and developing regions. Among 453 participants, 193 were nursing students, 165 were enrolled nurses, and 95 were registered nurses. The results showed that the CSPS had satisfactory reliability (Cronbach α = 0.73; intraclass correlation coefficient, 0.79 for 2-week test-retest and 0.74 for 3-month test-retest) and validity (optimum correlation with criterion measure; r = 0.76, P < .001; satisfactory results on known-group method and hypothesis testing). A total of 19 experts from 16 countries assured that most of the CSPS findings were relevant and globally applicable. The CSPS demonstrated satisfactory results on the basis of the standard international criteria on psychometric testing, which ascertained the reliability and validity of this instrument in measuring the compliance of clinical nurses with standard precautions. The cross-cultural pilot testing further reinforced the instrument's relevance and applicability in most developed and developing regions.

  11. Validation of Yoon's Critical Thinking Disposition Instrument.

    PubMed

    Shin, Hyunsook; Park, Chang Gi; Kim, Hyojin

    2015-12-01

    The lack of reliable and valid evaluation tools targeting Korean nursing students' critical thinking (CT) abilities has been reported as one of the barriers to instructing and evaluating students in undergraduate programs. Yoon's Critical Thinking Disposition (YCTD) instrument was developed for Korean nursing students, but few studies have assessed its validity. This study aimed to validate the YCTD. Specifically, the YCTD was assessed to identify its cross-sectional and longitudinal measurement invariance. This was a validation study in which a cross-sectional and longitudinal (prenursing and postnursing practicum) survey was used to validate the YCTD using 345 nursing students at three universities in Seoul, Korea. The participants' CT abilities were assessed using the YCTD before and after completing an established pediatric nursing practicum. The validity of the YCTD was estimated, and then a group invariance test using multigroup confirmatory factor analysis was performed to confirm the measurement compatibility of the multiple groups. A test of the seven-factor model showed that the YCTD demonstrated good construct validity. Multigroup confirmatory factor analysis findings for measurement invariance suggested that the model structure demonstrated strong invariance between groups (i.e., configural, factor loading, and intercept combined) but weak invariance within a group (i.e., configural and factor loading combined). In general, traditional methods for assessing instrument validity have been less than thorough. In this study, multigroup confirmatory factor analysis using cross-sectional and longitudinal measurement data allowed validation of the YCTD. This study concluded that the YCTD can be used for evaluating Korean nursing students' CT abilities. Copyright © 2015. Published by Elsevier B.V.

  12. Women's preference for traditional birth attendants and modern health care practitioners in Akpabuyo community of Cross River State, Nigeria.

    PubMed

    Akpabio, Idongesit I; Edet, Olaide B; Etifit, Rita E; Robinson-Bassey, Grace C

    2014-01-01

    The proportion of women who patronized traditional birth attendants (TBAs) or modern health care practitioners (MHCPs) was compared, including reasons for their choices. A comparative design was adopted to study 300 respondents selected through a multistage systematic random sampling technique. The instrument for data collection was a validated 21-item structured questionnaire. We observed that 75 (25%) patronized and 80 (27%) preferred TBAs, while 206 (69%) patronized and 220 (75%) preferred MHCPs; 19 (6%) patronized both. The view that TBAs prayed before conducting deliveries was supported by a majority, 75 (94%), of the respondents who preferred them. Factors associated with preference for TBAs should be addressed.

  13. A multiscale decomposition approach to detect abnormal vasculature in the optic disc.

    PubMed

    Agurto, Carla; Yu, Honggang; Murray, Victor; Pattichis, Marios S; Nemeth, Sheila; Barriga, Simon; Soliz, Peter

    2015-07-01

    This paper presents a multiscale method to detect neovascularization in the optic disc (NVD) using fundus images. Our method is applied to a manually selected region of interest (ROI) containing the optic disc. All the vessels in the ROI are segmented by adaptively combining contrast enhancement methods with a vessel segmentation technique. Textural features extracted using multiscale amplitude-modulation frequency-modulation, morphological granulometry, and fractal dimension are used. A linear SVM is used to perform the classification, which is tested by means of 10-fold cross-validation. The performance is evaluated using 300 images achieving an AUC of 0.93 with maximum accuracy of 88%. Copyright © 2015 Elsevier Ltd. All rights reserved.
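    The 10-fold evaluation used above is an instance of k-fold cross-validation: the data are split into k folds, and each fold serves once as the held-out test set. A minimal generic sketch follows; the one-dimensional threshold "classifier" and toy feature values stand in for the paper's SVM and multiscale texture features:

    ```python
    def k_fold_indices(n, k):
        """Split range(n) into k roughly equal, contiguous folds."""
        fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
        folds, start = [], 0
        for size in fold_sizes:
            folds.append(list(range(start, start + size)))
            start += size
        return folds

    def cross_validate(xs, ys, k, fit, predict):
        """Generic k-fold CV: fit on k-1 folds, score the held-out fold,
        and return the mean held-out accuracy."""
        accs = []
        for test_idx in k_fold_indices(len(xs), k):
            held_out = set(test_idx)
            train_idx = [i for i in range(len(xs)) if i not in held_out]
            model = fit([xs[i] for i in train_idx], [ys[i] for i in train_idx])
            hits = sum(predict(model, xs[i]) == ys[i] for i in test_idx)
            accs.append(hits / len(test_idx))
        return sum(accs) / k

    # toy 1-D "feature": the fitted model is just a midpoint threshold
    fit = lambda xs, ys: (min(x for x, y in zip(xs, ys) if y == 1)
                          + max(x for x, y in zip(xs, ys) if y == 0)) / 2
    predict = lambda thr, x: 1 if x > thr else 0
    feats = [0.1, 0.2, 0.3, 0.4, 0.9, 1.0, 1.1, 1.2, 0.15, 0.95]
    labels = [0, 0, 0, 0, 1, 1, 1, 1, 0, 1]
    acc = cross_validate(feats, labels, k=5, fit=fit, predict=predict)
    ```

    In practice the data are usually shuffled (often stratified by class) before folding; the contiguous split above is kept deliberately simple.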

  14. A preliminary comparison of scanning vs nonscanning radiometer data from Earth Radiation Budget Satellite (ERBS)

    NASA Technical Reports Server (NTRS)

    Wu, X.; Smith, W. L.; Herman, L. D.

    1986-01-01

    A cross validation technique is used to simulate the radiation flux detected by the nonscanning wide FOV (WFOV) and medium FOV (MFOV) radiometers on the ERBS by integrating the top of atmosphere spectral radiance recorded with narrow FOV (NFOV) sensors. Consideration is given to both bidirectional and isotropic radiance contributions, including all shortwave and longwave components. A weighting procedure is defined to adjust for missing or inaccurate data records and a coordinate transformation is devised to account for angular discrepancies among the views of the WFOV, MFOV and NFOV sensors. Student t-test values were calculated for values generated for whole orbit average, morning, noon, evening and night subsatellite views.

  15. HDX Workbench: Software for the Analysis of H/D Exchange MS Data

    NASA Astrophysics Data System (ADS)

    Pascal, Bruce D.; Willis, Scooter; Lauer, Janelle L.; Landgraf, Rachelle R.; West, Graham M.; Marciano, David; Novick, Scott; Goswami, Devrishi; Chalmers, Michael J.; Griffin, Patrick R.

    2012-09-01

    Hydrogen/deuterium exchange mass spectrometry (HDX-MS) is an established method for the interrogation of protein conformation and dynamics. While the data analysis challenge of HDX-MS has been addressed by a number of software packages, new computational tools are needed to keep pace with the improved methods and throughput of this technique. To address these needs, we report an integrated desktop program titled HDX Workbench, which facilitates automation, management, visualization, and statistical cross-comparison of large HDX data sets. Using the software, validated data analysis can be achieved at the rate at which data are generated. The application is available at the project home page http://hdx.florida.scripps.edu.

  16. All-fiber variable optical delay line for applications in optical coherence tomography: feasibility study for a novel delay line.

    PubMed

    Choi, Eunseo; Na, Jihoon; Ryu, Seon; Mudhana, Gopinath; Lee, Byeong

    2005-02-21

    We have implemented an all-fiber optical delay line using two linearly chirped fiber Bragg gratings cascaded in reverse order together with all-fiber optic components. The features of the proposed all-fiber technique for a variable delay line are discussed theoretically and demonstrated experimentally. Non-invasive cross-sectional images of biomedical samples, as well as of a transparent glass plate, obtained with the implemented all-fiber delay line, with an axial resolution of 100 μm and a dynamic range of 50 dB, are presented to validate the imaging performance and demonstrate the feasibility of the delay line for optical coherence tomography.

  17. Dutch population specific sex estimation formulae using the proximal femur.

    PubMed

    Colman, K L; Janssen, M C L; Stull, K E; van Rijn, R R; Oostra, R J; de Boer, H H; van der Merwe, A E

    2018-05-01

    Sex estimation techniques are frequently applied in forensic anthropological analyses of unidentified human skeletal remains. While morphological sex estimation methods are able to endure population differences, the classification accuracy of metric sex estimation methods is population-specific. No metric sex estimation method currently exists for the Dutch population. The purpose of this study is to create Dutch population-specific sex estimation formulae by means of osteometric analyses of the proximal femur. Since the Netherlands lacks a representative contemporary skeletal reference population, 2D plane reconstructions derived from clinical computed tomography (CT) data were used as an alternative source for a representative reference sample. The first part of this study assesses the intra- and inter-observer error, or reliability, of twelve measurements of the proximal femur. The technical error of measurement (TEM) and relative TEM (%TEM) were calculated using 26 dry adult femora. In addition, the agreement, or accuracy, between the dry bone and CT-based measurements was determined by percent agreement. Only reliable and accurate measurements were retained for the logistic regression sex estimation formulae; a training set (n=86) was used to create the models, while an independent testing set (n=28) was used to validate them. Due to high levels of multicollinearity, only single-variable models were created. Cross-validated classification accuracies ranged from 86% to 92%. These high accuracies indicate that the developed formulae can contribute to the biological profile, and specifically to sex estimation, of unidentified human skeletal remains in the Netherlands. Furthermore, the results indicate that clinical CT data can be a valuable alternative source of data when representative skeletal collections are unavailable. Copyright © 2017 Elsevier B.V. All rights reserved.
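    A single-variable logistic regression sex estimation formula with held-out validation, the model class described above, can be sketched as follows. The measurement name, group means, spreads, and sample sizes below are invented for illustration and do not reflect the Dutch reference data:

    ```python
    import math, random

    def fit_logistic_1d(xs, ys, lr=0.5, epochs=500):
        """Single-predictor logistic regression fitted by batch gradient
        descent; features are standardized internally for stable convergence."""
        m = sum(xs) / len(xs)
        s = (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5
        zs = [(x - m) / s for x in xs]
        w = b = 0.0
        for _ in range(epochs):
            gw = gb = 0.0
            for z, y in zip(zs, ys):
                p = 1.0 / (1.0 + math.exp(-(w * z + b)))  # predicted P(male)
                gw += (p - y) * z
                gb += p - y
            w -= lr * gw / len(zs)
            b -= lr * gb / len(zs)
        return m, s, w, b

    def predict(model, x):
        m, s, w, b = model
        return 1 if w * (x - m) / s + b > 0 else 0  # 1 = male, 0 = female

    # hypothetical "femoral head diameter" (mm); values are made up
    random.seed(1)
    train_x = ([random.gauss(42, 1.5) for _ in range(40)]     # females
               + [random.gauss(48, 1.5) for _ in range(40)])  # males
    train_y = [0] * 40 + [1] * 40
    test_x = ([random.gauss(42, 1.5) for _ in range(15)]
              + [random.gauss(48, 1.5) for _ in range(15)])
    test_y = [0] * 15 + [1] * 15

    model = fit_logistic_1d(train_x, train_y)
    accuracy = sum(predict(model, x) == y
                   for x, y in zip(test_x, test_y)) / len(test_y)
    ```

    Scoring the model on an independent test set, as the study does with its n=28 validation sample, guards against the optimistic bias of evaluating on the training data.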

  18. Parallel image logical operations using cross correlation

    NASA Technical Reports Server (NTRS)

    Strong, J. P., III

    1972-01-01

    Methods are presented for counting areas in an image in a parallel manner using noncoherent optical techniques. The techniques presented include the Levialdi algorithm for counting, optical techniques for binary operations, and cross-correlation.
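    The counting-by-correlation idea can be sketched digitally: slide a binary template over a binary image and threshold the cross-correlation at the template's total mass, so that only fully covered windows are flagged. This is a generic illustration of correlation-based counting, not the Levialdi algorithm or the noncoherent optical implementation described above:

    ```python
    def cross_correlate_count(image, template):
        """Count windows where the binary cross-correlation reaches the
        template's total mass, i.e. windows fully covered by ones.
        (Overlapping blobs larger than the template match more than once.)"""
        ih, iw = len(image), len(image[0])
        th, tw = len(template), len(template[0])
        mass = sum(map(sum, template))
        count = 0
        for r in range(ih - th + 1):
            for c in range(iw - tw + 1):
                corr = sum(image[r + dr][c + dc] * template[dr][dc]
                           for dr in range(th) for dc in range(tw))
                if corr == mass:
                    count += 1
        return count

    # two isolated 2x2 blobs in a 4x5 binary image
    img = [[0, 1, 1, 0, 0],
           [0, 1, 1, 0, 0],
           [0, 0, 0, 1, 1],
           [0, 0, 0, 1, 1]]
    blob = [[1, 1],
            [1, 1]]
    n = cross_correlate_count(img, blob)
    ```

    In the optical setting the same correlation is computed in parallel for all shifts at once, which is the appeal of the noncoherent approach.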

  19. Biomechanical comparison of the double-push technique and the conventional skate skiing technique in cross-country sprint skiing.

    PubMed

    Stöggl, Thomas; Müller, Erich; Lindinger, Stefan

    2008-09-01

    The aims of the study were to: (1) adapt the "double-push" technique from inline skating to cross-country skiing; (2) compare this new skiing technique with the conventional skate skiing cross-country technique; and (3) test the hypothesis that the double-push technique improves skiing speed in a short sprint. 13 elite skiers performed maximum-speed sprints over 100 m using the double-push skate skiing technique and using the conventional "V2" skate skiing technique. Pole and plantar forces, knee angle, cycle characteristics, and electromyography of nine lower body muscles were analysed. We found that the double-push technique could be successfully transferred to cross-country skiing, and that this new technique is faster than the conventional skate skiing technique. The double-push technique was 2.9 +/- 2.2% faster (P < 0.001), which corresponds to a time advantage of 0.41 +/- 0.31 s over 100 m. The double-push technique had a longer cycle length and a lower cycle rate, and it was characterized by higher muscle activity, higher knee extension amplitudes and velocities, and higher peak foot forces, especially in the first phase of the push-off. Also, the foot was more loaded laterally in the double-push technique than in the conventional skate skiing technique.

  20. Validation of MCNP6 Version 1.0 with the ENDF/B-VII.1 Cross Section Library for Uranium Metal, Oxide, and Solution Systems on the High Performance Computing Platform Moonlight

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chapman, Bryan Scott; MacQuigg, Michael Robert; Wysong, Andrew Russell

    In this document, the code MCNP is validated with ENDF/B-VII.1 cross section data, under the purview of ANSI/ANS-8.24-2007, for use with uranium systems. MCNP is a computer code based on Monte Carlo transport methods. While MCNP has wide-ranging capability in nuclear transport simulation, this validation is limited to the functionality related to neutron transport and the calculation of criticality parameters such as keff.

  1. A Cross-Cultural Test of Sex Bias in the Predictive Validity of Scholastic Aptitude Examinations: Some Israeli Findings.

    ERIC Educational Resources Information Center

    Zeidner, Moshe

    1987-01-01

    This study examined the cross-cultural validity of the sex bias contention with respect to standardized aptitude testing, used for academic prediction purposes in Israel. Analyses were based on the grade point average and scores of 1778 Jewish and 1017 Arab students who were administered standardized college entrance test batteries. (Author/LMO)

  2. Assessing Autistic Traits in a Taiwan Preschool Population: Cross-Cultural Validation of the Social Responsiveness Scale (SRS)

    ERIC Educational Resources Information Center

    Wang, Jessica; Lee, Li-Ching; Chen, Ying-Sheue; Hsu, Ju-Wei

    2012-01-01

    The cross-cultural validity of the Mandarin-adaptation of the social responsiveness scale (SRS) was examined in a sample of N = 307 participants in Taiwan, 140 typically developing and 167 with clinically-diagnosed developmental disorders. This scale is an autism assessment tool that provides a quantitative rather than categorical measure of…

  3. A New Symptom Model for Autism Cross-Validated in an Independent Sample

    ERIC Educational Resources Information Center

    Boomsma, A.; Van Lang, N. D. J.; De Jonge, M. V.; De Bildt, A. A.; Van Engeland, H.; Minderaa, R. B.

    2008-01-01

    Background: Results from several studies indicated that a symptom model other than the DSM triad might better describe symptom domains of autism. The present study focused on a) investigating the stability of a new symptom model for autism by cross-validating it in an independent sample and b) examining the invariance of the model regarding three…

  4. Translation, cross-cultural adaptation and validation of an HIV/AIDS knowledge and attitudinal instrument.

    PubMed

    Zometa, Carlos S; Dedrick, Robert; Knox, Michael D; Westhoff, Wayne; Siri, Rodrigo Simán; Debaldo, Ann

    2007-06-01

    An instrument developed in the United States by the Centers for Disease Control and Prevention to assess HIV/AIDS knowledge and four attitudinal dimensions (Peer Pressure, Abstinence, Drug Use, and Threat of HIV Infection) and an instrument developed by Basen-Engquist et al. (1999) to measure abstinence and condom use were translated, cross-culturally adapted, and validated for use with Spanish-speaking high school students in El Salvador. A back-translation of the English version was cross-culturally adapted using two different review panels and pilot-tested with Salvadorian students. An expert panel established content validity, and confirmatory factor analysis provided support for construct validity. Results indicated that the methodology was successful in cross-culturally adapting the instrument developed by the Centers for Disease Control and Prevention and the instrument developed by Basen-Engquist et al. The psychometric properties of the knowledge section were acceptable and there was partial support for the four-factor attitudinal model underlying the CDC instrument and the two-factor model underlying the Basen-Engquist et al. instrument. Additional studies with Spanish-speaking populations (either in the United States or Latin America) are needed to evaluate the generalizability of the present results.

  5. Development and Validation of Noninvasive Magnetic Resonance Relaxometry for the In Vivo Assessment of Tissue-Engineered Graft Oxygenation

    PubMed Central

    Einstein, Samuel A.; Weegman, Bradley P.; Firpo, Meri T.; Papas, Klearchos K.

    2016-01-01

    Techniques to monitor the oxygen partial pressure (pO2) within implanted tissue-engineered grafts (TEGs) are critically necessary for TEG development, but current methods are invasive and inaccurate. In this study, we developed an accurate and noninvasive technique to monitor TEG pO2 utilizing proton (1H) or fluorine (19F) magnetic resonance spectroscopy (MRS) relaxometry. The value of the spin-lattice relaxation rate constant (R1) of some biocompatible compounds is sensitive to dissolved oxygen (and temperature), while insensitive to other external factors. Through this physical mechanism, MRS can measure the pO2 of implanted TEGs. We evaluated six potential MRS pO2 probes and measured their oxygen and temperature sensitivities and their intrinsic R1 values at 16.4 T. Acellular TEGs were constructed by emulsifying porcine plasma with perfluoro-15-crown-5-ether, injecting the emulsion into a macroencapsulation device, and cross-linking the plasma with a thrombin solution. A multiparametric calibration equation containing R1, pO2, and temperature was empirically generated from MRS data and validated with fiber optic (FO) probes in vitro. TEGs were then implanted in a dorsal subcutaneous pocket in a murine model and evaluated with MRS up to 29 days postimplantation. R1 measurements from the TEGs were converted to pO2 values using the established calibration equation and these in vivo pO2 measurements were simultaneously validated with FO probes. Additionally, MRS was used to detect increased pO2 within implanted TEGs that received supplemental oxygen delivery. Finally, based on a comparison of our MRS data with previously reported data, ultra-high-field (16.4 T) is shown to have an advantage for measuring hypoxia with 19F MRS. Results from this study show MRS relaxometry to be a precise, accurate, and noninvasive technique to monitor TEG pO2 in vitro and in vivo. PMID:27758135

  6. Development and Validation of Noninvasive Magnetic Resonance Relaxometry for the In Vivo Assessment of Tissue-Engineered Graft Oxygenation.

    PubMed

    Einstein, Samuel A; Weegman, Bradley P; Firpo, Meri T; Papas, Klearchos K; Garwood, Michael

    2016-11-01

    Techniques to monitor the oxygen partial pressure (pO2) within implanted tissue-engineered grafts (TEGs) are critically necessary for TEG development, but current methods are invasive and inaccurate. In this study, we developed an accurate and noninvasive technique to monitor TEG pO2 utilizing proton (1H) or fluorine (19F) magnetic resonance spectroscopy (MRS) relaxometry. The value of the spin-lattice relaxation rate constant (R1) of some biocompatible compounds is sensitive to dissolved oxygen (and temperature), while insensitive to other external factors. Through this physical mechanism, MRS can measure the pO2 of implanted TEGs. We evaluated six potential MRS pO2 probes and measured their oxygen and temperature sensitivities and their intrinsic R1 values at 16.4 T. Acellular TEGs were constructed by emulsifying porcine plasma with perfluoro-15-crown-5-ether, injecting the emulsion into a macroencapsulation device, and cross-linking the plasma with a thrombin solution. A multiparametric calibration equation containing R1, pO2, and temperature was empirically generated from MRS data and validated with fiber optic (FO) probes in vitro. TEGs were then implanted in a dorsal subcutaneous pocket in a murine model and evaluated with MRS up to 29 days postimplantation. R1 measurements from the TEGs were converted to pO2 values using the established calibration equation and these in vivo pO2 measurements were simultaneously validated with FO probes. Additionally, MRS was used to detect increased pO2 within implanted TEGs that received supplemental oxygen delivery. Finally, based on a comparison of our MRS data with previously reported data, ultra-high-field (16.4 T) is shown to have an advantage for measuring hypoxia with 19F MRS. Results from this study show MRS relaxometry to be a precise, accurate, and noninvasive technique to monitor TEG pO2 in vitro and in vivo.

  7. Spanish translation, cross-cultural adaptation, and validation of the Questionnaire for Diabetes-Related Foot Disease (Q-DFD)

    PubMed Central

    Castillo-Tandazo, Wilson; Flores-Fortty, Adolfo; Feraud, Lourdes; Tettamanti, Daniel

    2013-01-01

    Purpose To translate, cross-culturally adapt, and validate the Questionnaire for Diabetes-Related Foot Disease (Q-DFD), originally created and validated in Australia, for its use in Spanish-speaking patients with diabetes mellitus. Patients and methods The translation and cross-cultural adaptation were based on international guidelines. The Spanish version of the survey was applied to a community-based (sample A) and a hospital clinic-based sample (samples B and C). Samples A and B were used to determine criterion and construct validity comparing the survey findings with clinical evaluation and medical records, respectively; while sample C was used to determine intra- and inter-rater reliability. Results After completing the rigorous translation process, only four items were considered problematic and required a new translation. In total, 127 patients were included in the validation study: 76 to determine criterion and construct validity and 41 to establish intra- and inter-rater reliability. For an overall diagnosis of diabetes-related foot disease, a substantial level of agreement was obtained when we compared the Q-DFD with the clinical assessment (kappa 0.77, sensitivity 80.4%, specificity 91.5%, positive likelihood ratio [LR+] 9.46, negative likelihood ratio [LR−] 0.21); while an almost perfect level of agreement was obtained when it was compared with medical records (kappa 0.88, sensitivity 87%, specificity 97%, LR+ 29.0, LR− 0.13). Survey reliability showed substantial levels of agreement, with kappa scores of 0.63 and 0.73 for intra- and inter-rater reliability, respectively. Conclusion The translated and cross-culturally adapted Q-DFD showed good psychometric properties (validity, reproducibility, and reliability) that allow its use in Spanish-speaking diabetic populations. PMID:24039434
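    The agreement statistics reported above (kappa, sensitivity, specificity, and the likelihood ratios) all derive from a single 2×2 validation table. A minimal sketch of the arithmetic, with made-up counts chosen only for illustration:

    ```python
    def diagnostic_stats(tp, fp, fn, tn):
        """Agreement and accuracy statistics for a 2x2 validation table
        (rows: instrument result; columns: reference standard)."""
        n = tp + fp + fn + tn
        sens = tp / (tp + fn)                    # true-positive rate
        spec = tn / (tn + fp)                    # true-negative rate
        po = (tp + tn) / n                       # observed agreement
        # chance agreement: product of marginal "positive" and "negative" rates
        pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
        kappa = (po - pe) / (1 - pe)             # Cohen's kappa
        lr_pos = sens / (1 - spec)               # positive likelihood ratio
        lr_neg = (1 - sens) / spec               # negative likelihood ratio
        return {"sensitivity": sens, "specificity": spec,
                "kappa": kappa, "LR+": lr_pos, "LR-": lr_neg}

    # hypothetical counts, not the study's data
    stats = diagnostic_stats(tp=37, fp=4, fn=9, tn=43)
    ```

    With these invented counts the sensitivity is 37/46 ≈ 80.4% and the specificity 43/47 ≈ 91.5%; the kappa and likelihood ratios follow directly from the same four cells.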

  8. Development of the Brazilian Portuguese version of the Achilles Tendon Total Rupture Score (ATRS BrP): a cross-cultural adaptation with reliability and construct validity evaluation.

    PubMed

    Zambelli, Roberto; Pinto, Rafael Z; Magalhães, João Murilo Brandão; Lopes, Fernando Araujo Silva; Castilho, Rodrigo Simões; Baumfeld, Daniel; Dos Santos, Thiago Ribeiro Teles; Maffulli, Nicola

    2016-01-01

    There is a need for a patient-relevant instrument to evaluate outcome after treatment in patients with a total Achilles tendon rupture. The purpose of this study was to undertake a cross-cultural adaptation of the Achilles Tendon Total Rupture Score (ATRS) into Brazilian Portuguese, determining the test-retest reliability and construct validity of the instrument. A five-step approach was used in the cross-cultural adaptation process: initial translation (two bilingual Brazilian translators), synthesis of translation, back-translation (two native English language translators), consensus version and evaluation (expert committee), and testing phase. A total of 46 patients were recruited to evaluate the test-retest reproducibility and construct validity of the Brazilian Portuguese version of the ATRS. Test-retest reproducibility was assessed by evaluating each participant on two separate occasions. The construct validity was determined by the correlation between the ATRS and the American Orthopaedic Foot and Ankle Society (AOFAS) questionnaires. The final version of the Brazilian Portuguese ATRS had the same number of questions as the original ATRS. For the reliability analysis, an ICC(2,1) of 0.93 (95 % CI: 0.88 to 0.96) with SEM of 1.56 points and MDC of 4.32 was observed, indicating excellent reliability. The construct validity showed excellent correlation with R = 0.76 (95 % CI: 0.52 to 0.89, P < 0.001). The ATRS was successfully cross-culturally validated into Brazilian Portuguese. This version was a reliable and valid measure of function in patients who suffered complete rupture of the Achilles tendon.
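    The reported MDC follows from the SEM via the standard formula MDC95 = 1.96 · √2 · SEM. A quick check of the abstract's numbers (assuming the usual 95% confidence formulation):

```python
import math

def mdc95(sem):
    """Minimal detectable change at 95% confidence:
    MDC95 = 1.96 * sqrt(2) * SEM."""
    return 1.96 * math.sqrt(2) * sem

# SEM of 1.56 points, as reported for the ATRS BrP
print(round(mdc95(1.56), 2))  # → 4.32
```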

  9. Translation, Cross-cultural Adaptation and Psychometric Validation of the Korean-Language Cardiac Rehabilitation Barriers Scale (CRBS-K)

    PubMed Central

    2017-01-01

    Objective To perform a translation and cross-cultural adaptation of the Cardiac Rehabilitation Barriers Scale (CRBS) for use in Korea, followed by psychometric validation. The CRBS was developed to assess patients' perception of the degree to which patient, provider and health system-level barriers affect their cardiac rehabilitation (CR) participation. Methods The CRBS consists of 21 items (barriers to adherence) rated on a 5-point Likert scale. The first phase was to translate and cross-culturally adapt the CRBS to the Korean language. After back-translation, both versions were reviewed by a committee. The face validity was assessed in a sample of Korean patients (n=53) with history of acute myocardial infarction that did not participate in CR through semi-structured interviews. The second phase was to assess the construct and criterion validity of the Korean translation as well as internal reliability, through administration of the translated version in 104 patients, principal component analysis with varimax rotation and cross-referencing against CR use, respectively. Results The length, readability, and clarity of the questionnaire were rated well, demonstrating face validity. Analysis revealed a six-factor solution, demonstrating construct validity. Cronbach's alpha was greater than 0.65. Barriers rated highest included not knowing about CR and not being contacted by a program. The mean CRBS score was significantly higher among non-attendees (2.71±0.26) than CR attendees (2.51±0.18) (p<0.01). Conclusion The Korean version of CRBS has demonstrated face, content and criterion validity, suggesting it may be useful for assessing barriers to CR utilization in Korea. PMID:29201826
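    Cronbach's alpha, used above as the internal-consistency criterion, is computed from a respondents-by-items score matrix as α = k/(k−1) · (1 − Σ item variances / variance of total scores). A minimal sketch (toy data, not the study's):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a (n_respondents, n_items) matrix of item scores."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = scores.sum(axis=1).var(ddof=1)    # variance of total scores
    return k / (k - 1) * (1.0 - item_var / total_var)

# perfectly consistent items give alpha = 1.0
print(cronbach_alpha([[1, 1], [2, 2], [3, 3]]))  # → 1.0
```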

  10. Conceptual Design of a Flight Validation Mission for a Hypervelocity Asteroid Intercept Vehicle

    NASA Technical Reports Server (NTRS)

    Barbee, Brent W.; Wie, Bong; Steiner, Mark; Getzandanner, Kenneth

    2013-01-01

    Near-Earth Objects (NEOs) are asteroids and comets whose orbits approach or cross Earth's orbit. NEOs have collided with our planet in the past, sometimes to devastating effect, and continue to do so today. Collisions with NEOs large enough to do significant damage to the ground are fortunately infrequent, but such events can occur at any time and we therefore need to develop and validate the techniques and technologies necessary to prevent the Earth impact of an incoming NEO. In this paper we provide background on the hazard posed to Earth by NEOs and present the results of a recent study performed by the NASA/Goddard Space Flight Center's Mission Design Lab (MDL) in collaboration with Iowa State University's Asteroid Deflection Research Center (ADRC) to design a flight validation mission for a Hypervelocity Asteroid Intercept Vehicle (HAIV) as part of a Phase 2 NASA Innovative Advanced Concepts (NIAC) research project. The HAIV is a two-body vehicle consisting of a leading kinetic impactor and trailing follower carrying a Nuclear Explosive Device (NED) payload. The HAIV detonates the NED inside the crater in the NEO's surface created by the lead kinetic impactor portion of the vehicle, effecting a powerful subsurface detonation to disrupt the NEO. For the flight validation mission, only a simple mass proxy for the NED is carried in the HAIV. Ongoing and future research topics are discussed following the presentation of the detailed flight validation mission design results produced in the MDL.

  11. Cross-Cultural Applicability of the Montreal Cognitive Assessment (MoCA): A Systematic Review.

    PubMed

    O'Driscoll, Ciarán; Shaikh, Madiha

    2017-01-01

    The Montreal Cognitive Assessment (MoCA) is widely used to screen for mild cognitive impairment (MCI). While there are many available versions, the cross-cultural validity of the assessment has not been explored sufficiently. We aimed to interrogate the validity of the MoCA in a cross-cultural context: in differentiating MCI from normal controls (NC), and in identifying cutoffs and adjustments for age and education where possible. This review sourced a wide range of studies including case-control studies. In addition, we report findings for differentiating dementias from NC and MCI from dementias; however, these were not considered to be an appropriate use of the MoCA. Heterogeneity was assumed across the included studies, and therefore meta-analysis was not conducted. Quality ratings, forest plots of validated studies (sensitivity and specificity) with covariates (suggested cutoffs, age, education and country), and a summary receiver operating characteristic curve are presented. The results showed a wide range in suggested cutoffs for MCI cross-culturally, with variability in levels of sensitivity and specificity ranging from low to high. Poor methodological rigor appears to have affected reported accuracy and validity of the MoCA. The review highlights the necessity for cross-cultural considerations when using the MoCA, and for recognizing it as a screening rather than a diagnostic tool. Appropriate cutoffs and point adjustments for education are suggested.

  12. Cross-cultural validation of the revised temperament and character inventory in the Bulgarian language.

    PubMed

    Tilov, Boris; Dimitrova, Donka; Stoykova, Maria; Tornjova, Bianka; Foreva, Gergana; Stoyanov, Drozdstoj

    2012-12-01

    Health-care professions have long been considered prone to work-related stress, yet recent research in Bulgaria indicates alarmingly high levels of burnout. Cloninger's inventory is used to analyse and evaluate correlation between personality characteristics and degree of burnout syndrome manifestation among the risk categories of health-care professionals. The primary goal of this study was to test the conceptual validity and cross-cultural applicability of the revised TCI (TCI-R), developed in the United States, in a culturally, socially and economically diverse setting. Linguistic validation, test-retest studies, statistical and expert analyses were performed to assess cross-cultural applicability of the revised Cloninger's temperament and character inventory in Bulgarian, its reliability and internal consistency and construct validity. The overall internal consistency of TCI-R and its scales as well as the interscale and test-retest correlations prove that the translated version of the questionnaire is acceptable and cross-culturally applicable for the purposes of studying organizational stress and burnout risk in health-care professionals. In general the cross-cultural adaptation process, even if carried out in a rigorous way, does not always lead to the best target version and suggests it would be useful to develop new scales specific to each culture and, at the same time, to think about the trans-cultural adaptation. © 2012 Blackwell Publishing Ltd.

  13. Cross-cultural adaptation of instruments assessing breastfeeding determinants: a multi-step approach

    PubMed Central

    2014-01-01

    Background Cross-cultural adaptation is a necessary process to effectively use existing instruments in other cultural and language settings. The process of cross-culturally adapting existing instruments, including translation, is considered a critical step in establishing a meaningful instrument for use in another setting. Using a multi-step approach is considered best practice in achieving cultural and semantic equivalence of the adapted version. We aimed to ensure the content validity of our instruments in the cultural context of KwaZulu-Natal, South Africa. Methods The Iowa Infant Feeding Attitudes Scale, Breastfeeding Self-Efficacy Scale-Short Form and additional items comprise our consolidated instrument, which was cross-culturally adapted utilizing a multi-step approach during August 2012. Cross-cultural adaptation was achieved through steps to maintain content validity and attain semantic equivalence in the target version. Specifically, Lynn's recommendation to apply an item-level content validity index score was followed. The revised instrument was translated and back-translated. To ensure semantic equivalence, Brislin's back-translation approach was utilized, followed by a committee review to address any discrepancies that emerged from translation. Results Our consolidated instrument was adapted to be culturally relevant and translated to yield more reliable and valid results for use in our larger research study to measure infant feeding determinants effectively in our target cultural context. Conclusions Undertaking rigorous steps to effectively ensure cross-cultural adaptation increases our confidence that the conclusions we make based on our self-report instrument(s) will be stronger.
In this way, our aim to achieve strong cross-cultural adaptation of our consolidated instruments was achieved while also providing a clear framework for other researchers choosing to utilize existing instruments for work in other cultural, geographic and population settings. PMID:25285151

  14. Cognitive-Motor Interference in an Ecologically Valid Street Crossing Scenario.

    PubMed

    Janouch, Christin; Drescher, Uwe; Wechsler, Konstantin; Haeger, Mathias; Bock, Otmar; Voelcker-Rehage, Claudia

    2018-01-01

    Laboratory-based research has revealed that gait involves higher cognitive processes, leading to performance impairments when executed with a concurrent loading task. Deficits are especially pronounced in older adults. Theoretical approaches like the multiple resource model highlight the role of task similarity and associated attention distribution problems. It has been shown that in cases where these distribution problems are perceived as relevant to participants' risk of falls, older adults prioritize gait and posture over the concurrent loading task. Here we investigate whether findings on task similarity and task prioritization can be transferred to an ecologically valid scenario. Sixty-three younger adults (20-30 years of age) and 61 older adults (65-75 years of age) participated in a virtual street crossing simulation. The participants' task was to identify suitable gaps that would allow them to cross a simulated two-way street safely. To do so, participants walked on a manual treadmill that transferred their forward motion to forward displacements in a virtual city. The task was presented as a single task (crossing only) and as a multitask. In the multitask condition participants were asked, among other tasks, to type in three-digit numbers that were presented either visually or auditorily. We found that for both age groups, street crossing as well as typing performance suffered under multitasking conditions. Impairments were especially pronounced for older adults (e.g., longer crossing initiation phase, more missed opportunities). However, younger and older adults did not differ in the speed and success rate of crossing. Further, deficits were stronger in the visual compared to the auditory task modality for most parameters. Our findings conform to earlier studies that found an age-related decline in multitasking performance in less realistic scenarios. 
However, task similarity effects were inconsistent and question the validity of the multiple resource model within ecologically valid scenarios.

  15. Applying modern psychometric techniques to melodic discrimination testing: Item response theory, computerised adaptive testing, and automatic item generation.

    PubMed

    Harrison, Peter M C; Collins, Tom; Müllensiefen, Daniel

    2017-06-15

    Modern psychometric theory provides many useful tools for ability testing, such as item response theory, computerised adaptive testing, and automatic item generation. However, these techniques have yet to be integrated into mainstream psychological practice. This is unfortunate, because modern psychometric techniques can bring many benefits, including sophisticated reliability measures, improved construct validity, avoidance of exposure effects, and improved efficiency. In the present research we therefore use these techniques to develop a new test of a well-studied psychological capacity: melodic discrimination, the ability to detect differences between melodies. We calibrate and validate this test in a series of studies. Studies 1 and 2 respectively calibrate and validate an initial test version, while Studies 3 and 4 calibrate and validate an updated test version incorporating additional easy items. The results support the new test's viability, with evidence for strong reliability and construct validity. We discuss how these modern psychometric techniques may also be profitably applied to other areas of music psychology and psychological science in general.

  16. Validating Ultrasound-based HIFU Lesion-size Monitoring Technique with MR Thermometry and Histology

    NASA Astrophysics Data System (ADS)

    Zhou, Shiwei; Petruzzello, John; Anand, Ajay; Sethuraman, Shriram; Azevedo, Jose

    2010-03-01

    In order to control and monitor HIFU lesions accurately and cost-effectively in real-time, we have developed an ultrasound-based therapy monitoring technique using acoustic radiation force to track the change in tissue mechanical properties. We validate our method with concurrent MR thermometry and histology. Comparison studies have been completed on in-vitro bovine liver samples. A single-element 1.1 MHz focused transducer was used to deliver HIFU and produce acoustic radiation force (ARF). A 5 MHz single-element transducer was placed co-axially with the HIFU transducer to acquire the RF data, and track the tissue displacement induced by ARF. During therapy, the monitoring procedure was interleaved with HIFU. MR thermometry (Philips Panorama 1T system) and ultrasound monitoring were performed simultaneously. The tissue temperature and thermal dose (CEM43 = 240 mins) were computed from the MR thermometry data. The tissue displacement induced by the acoustic radiation force was calculated from the ultrasound RF data in real-time using a cross-correlation based method. A normalized displacement difference (NDD) parameter was developed and calibrated to estimate the lesion size. The lesion size estimated by the NDD was compared with both MR thermometry prediction and the histology analysis. For lesions smaller than 8 mm, the NDD-based lesion monitoring technique showed performance very similar to that of MR thermometry. The standard deviation of the lesion size error was 0.66 mm, which is comparable to the MR thermal dose contour prediction (0.42 mm). A phased array is needed for tracking displacement in 2D and monitoring lesions larger than 8 mm. The study demonstrates the potential of our ultrasound-based technique to achieve precise HIFU lesion control through real-time monitoring. The results compare well with histology and an established technique such as MR thermometry. 
This approach provides feedback control in real-time to terminate therapy when a pre-determined lesion size is achieved, and can be extended to 2D and implemented on commercial ultrasound scanner systems.
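    The displacement tracking described above relies on locating the peak of a cross-correlation between successive RF signal windows. A simplified 1-D sketch of that idea, using synthetic signals (not the authors' implementation, which operates on real ultrasound RF data with sub-sample interpolation):

```python
import numpy as np

def estimate_shift(ref, shifted):
    """Estimate the integer sample lag between two 1-D signals from the
    peak of their full cross-correlation."""
    xcorr = np.correlate(shifted, ref, mode="full")
    return int(np.argmax(xcorr)) - (len(ref) - 1)

# synthetic RF-like window and a copy delayed by 3 samples
rng = np.random.default_rng(0)
sig = rng.standard_normal(128)
delayed = np.concatenate([np.zeros(3), sig[:-3]])
print(estimate_shift(sig, delayed))  # → 3
```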

  17. Validation and application of Acoustic Mapping Velocimetry

    NASA Astrophysics Data System (ADS)

    Baranya, Sandor; Muste, Marian

    2016-04-01

    The goal of this paper is to introduce a novel methodology to estimate bedload transport in rivers based on an improved bedform tracking procedure. The measurement technique combines components and processing protocols from two contemporary nonintrusive instruments: acoustic and image-based. The bedform mapping is conducted with acoustic surveys, while the estimation of the velocity of the bedforms is obtained with processing techniques pertaining to image-based velocimetry. The technique is therefore called Acoustic Mapping Velocimetry (AMV). The implementation of this technique produces a whole-field velocity map associated with the multi-directional bedform movement. Based on the calculated two-dimensional bedform migration velocity field, the bedload transport estimation is done using the Exner equation. A proof-of-concept experiment was performed to validate the AMV-based bedload estimation in a laboratory flume at IIHR-Hydroscience & Engineering (IIHR). The bedform migration was analysed at three different flow discharges. Repeated bed geometry mapping, using a multiple transducer array (MTA), provided acoustic maps, which were post-processed with a particle image velocimetry (PIV) method. Bedload transport rates were calculated along longitudinal sections using the streamwise components of the bedform velocity vectors and the measured bedform heights. The bulk transport rates were compared with the results from concurrent direct physical samplings and acceptable agreement was found. As a first field implementation of the AMV, an attempt was made to estimate bedload transport for a section of the Ohio River in the United States, where bed geometry maps resulting from repeated multibeam echo sounder (MBES) surveys served as input data. 
    Cross-sectional distributions of bedload transport rates from the AMV-based method were compared with those obtained from another non-intrusive technique (due to the lack of direct samplings), ISSDOTv2, developed by the US Army Corps of Engineers. The good agreement between the results from the two different methods is encouraging and suggests further field tests in varying hydro-morphological situations.
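    The Exner-based step above is often applied in its dune-tracking simplification, where the unit-width bedload rate is q_b = (1 − λ) · α · h · v, with bed porosity λ, bedform height h, migration velocity v, and a shape factor α ≈ 0.5 for roughly triangular dunes. A hedged sketch (the porosity and shape-factor values below are illustrative defaults, not the paper's):

```python
def bedload_rate(bedform_height, migration_velocity, porosity=0.4, shape=0.5):
    """Simplified Exner/dune-tracking bedload estimate per unit width:
    q_b = (1 - porosity) * shape * height * velocity.
    shape ~ 0.5 assumes roughly triangular bedforms."""
    return (1.0 - porosity) * shape * bedform_height * migration_velocity

# e.g. 0.1 m high dunes migrating at 1e-4 m/s
print(f"{bedload_rate(0.1, 1e-4):.2e}")  # m^2/s per unit width → 3.00e-06
```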

  18. Fourier-transform-infrared-spectroscopy based spectral-biomarker selection towards optimum diagnostic differentiation of oral leukoplakia and cancer.

    PubMed

    Banerjee, Satarupa; Pal, Mousumi; Chakrabarty, Jitamanyu; Petibois, Cyril; Paul, Ranjan Rashmi; Giri, Amita; Chatterjee, Jyotirmoy

    2015-10-01

    In search of specific label-free biomarkers for differentiation of two oral lesions, namely oral leukoplakia (OLK) and oral squamous-cell carcinoma (OSCC), Fourier-transform infrared (FTIR) spectroscopy was performed on paraffin-embedded tissue sections from 47 human subjects (eight normal (NOM), 16 OLK, and 23 OSCC). Difference between mean spectra (DBMS), Mann-Whitney's U test, and forward feature selection (FFS) techniques were used for optimising spectral-marker selection. Classification of diseases was performed with linear and quadratic support vector machine (SVM) at 10-fold cross-validation, using different combinations of spectral features. It was observed that six features obtained through FFS enabled differentiation of NOM and OSCC tissue (1782, 1713, 1665, 1545, 1409, and 1161 cm(-1)) and were most significant, able to classify OLK and OSCC with 81.3 % sensitivity, 95.7 % specificity, and 89.7 % overall accuracy. The 43 spectral markers extracted through Mann-Whitney's U test were the least significant when quadratic SVM was used. Considering the high sensitivity and specificity of the FFS technique, extracting only six spectral biomarkers was thus most useful for diagnosis of OLK and OSCC, and to overcome inter- and intra-observer variability experienced in best-practice histopathological diagnostic procedures. By considering the biochemical assignment of these six spectral signatures, this work also revealed altered glycogen and keratin content in histological sections which could discriminate OLK and OSCC. The method was validated through spectral selection by the DBMS technique. Thus this method has potential for diagnostic cost minimisation for oral lesions by label-free biomarker identification.
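    Forward feature selection as used above greedily adds, at each step, the feature (here, a wavenumber) that most improves cross-validated classification accuracy. A self-contained sketch of the scheme, substituting a nearest-centroid classifier for the paper's SVM to avoid external dependencies; all data below is synthetic:

```python
import numpy as np

def kfold_accuracy(X, y, features, k=10, seed=0):
    """Mean k-fold cross-validated accuracy of a nearest-centroid classifier
    restricted to the given feature subset (stand-in for the paper's SVM)."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))
    accs = []
    for fold in np.array_split(idx, k):
        test = np.zeros(len(y), dtype=bool)
        test[fold] = True
        Xtr, ytr = X[~test][:, features], y[~test]
        Xte, yte = X[test][:, features], y[test]
        classes = np.unique(ytr)
        centroids = np.stack([Xtr[ytr == c].mean(axis=0) for c in classes])
        dist = ((Xte[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
        accs.append((classes[dist.argmin(axis=1)] == yte).mean())
    return float(np.mean(accs))

def forward_feature_selection(X, y, n_select, k=10):
    """Greedy forward selection: at each step add the feature that
    maximises k-fold cross-validated accuracy."""
    selected, remaining = [], list(range(X.shape[1]))
    for _ in range(n_select):
        best_score, best_j = max(
            (kfold_accuracy(X, y, selected + [j], k), j) for j in remaining)
        selected.append(best_j)
        remaining.remove(best_j)
    return selected

# synthetic spectra: 6 features, only feature 2 separates the two classes
rng = np.random.default_rng(1)
y = np.repeat([0, 1], 50)
X = rng.standard_normal((100, 6))
X[:, 2] += 3.0 * y
print(forward_feature_selection(X, y, n_select=1))  # → [2]
```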

  19. Estimation of 2D to 3D dimensions and proportionality indices for facial examination.

    PubMed

    Martos, Rubén; Valsecchi, Andrea; Ibáñez, Oscar; Alemán, Inmaculada

    2018-06-01

    Photo-anthropometry is a metric-based facial image comparison technique where measurements of the face are taken from an image using predetermined facial landmarks. In particular, dimensions and proportionality indices (DPIs) are compared to DPIs from another facial image. Different studies concluded that photo-anthropometric facial comparison, as it is currently practiced, is unsuitable for elimination purposes. The major limitation is the need for images acquired under very restrictive, controlled conditions. To overcome this latter issue, we propose a novel methodology to estimate 3D DPIs from 2D ones. It uses computer graphic techniques to simulate thousands of facial photographs under known camera conditions and regression to derive the mathematical relationship between 2D and 3D DPIs automatically. Additionally, we present a methodology that makes use of the estimated 3D DPIs for reducing the number of potential matches of a given unknown facial photograph within a set of known candidates. The error in the estimation of the 3D DPIs can be as large as 35%, but both the first and third quartiles are consistently inside the ±5% range. The methodology for filtering cases has been shown to be useful in the task of narrowing down the list of possible candidates for a given photograph. It is able to remove on average (validated using a cross-validation technique) 57% and 24% of the negative cases, depending on the number of DPIs available. Limitations of the work developed, together with open research lines, are included in the Discussion section. Copyright © 2018 Elsevier B.V. All rights reserved.
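    The 2D-to-3D DPI estimation above pairs a regression step with cross-validated error quartiles. A toy sketch of that validation loop, using a degree-1 np.polyfit in place of the paper's regression model and fully synthetic DPI data (the coefficients and noise level below are illustrative assumptions):

```python
import numpy as np

def cv_relative_errors(dpi_2d, dpi_3d, k=5, seed=0):
    """k-fold cross-validated relative errors of a linear 2D->3D DPI
    regression; each fold is predicted by a model fit on the other folds."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(dpi_2d))
    errors = []
    for fold in np.array_split(idx, k):
        train = np.ones(len(dpi_2d), dtype=bool)
        train[fold] = False
        slope, intercept = np.polyfit(dpi_2d[train], dpi_3d[train], 1)
        pred = slope * dpi_2d[fold] + intercept
        errors.extend((pred - dpi_3d[fold]) / dpi_3d[fold])
    return np.asarray(errors)

# synthetic DPIs: the 3D index is a noisy linear function of the 2D one
rng = np.random.default_rng(2)
d2 = rng.uniform(0.5, 2.0, 200)
d3 = 1.3 * d2 + 0.1 + rng.normal(0.0, 0.01, 200)
errs = cv_relative_errors(d2, d3)
# first and third quartiles of the relative error, in percent
q1, q3 = np.percentile(errs * 100.0, [25, 75])
print(-5.0 < q1 and q3 < 5.0)  # → True
```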

  20. Validation and Cross-Cultural Adaptation of a Chinese Version of the Emotional and Social Dysfunction Questionnaire in Stroke Patients.

    PubMed

    Huang, Hui-Chuan; Shyu, Meei-Ling; Lin, Mei-Feng; Hu, Chaur-Jong; Chang, Chien-Hung; Lee, Hsin-Chien; Chi, Nai-Fang; Chang, Hsiu-Ju

    2017-12-01

    The objectives of this study were to develop a cross-cultural Chinese version of the Emotional and Social Dysfunction Questionnaire (ESDQ-C) and test its validity and reliability among Chinese-speaking stroke patients. Various methods were used to develop the ESDQ-C. A cross-sectional study was used to examine the validity and reliability of the developed questionnaire, which consists of 28 items belonging to six factors: anger, helplessness, emotional dyscontrol, indifference, inertia and fatigue, and euphoria. Satisfactory convergent and known-group validity were confirmed by significant correlations of the ESDQ-C with the Profile of Mood States-Short Form (p < .05) and with the Hospital Anxiety and Depression Scale (p < .05). The internal consistency was represented by Cronbach's alpha, which was .96 and .79 to .92 for the entire scale and subscales, respectively. Appropriate application of the ESDQ-C will be helpful to identify critical adjustment-related types of distress and patients who experience difficulty coping with such distress.

  1. Different Diagnosis, Shared Vulnerabilities: The Value of Cross Disorder Validation of Capacity to Consent.

    PubMed

    Rosen, Allyson; Weitlauf, Julie C

    2015-01-01

    A screening measure of capacity to consent can provide an efficient method of determining the appropriateness of including individuals from vulnerable patient populations in research, particularly in circumstances in which no caregiver is available to provide surrogate consent. Seaman et al. (2015) cross-validate a measure of capacity to consent to research developed by Jeste et al. (2007). They provide data on controls, caregivers, and patients with mild cognitive impairment and dementia. The study demonstrates the importance of validating measures across disorders with different domains of incapacity, as well as the need for timely and appropriate follow-up with potential participants who yield positive screens. Ultimately, clinical measures need to adapt to the dimensional diagnostic approaches put forward in DSM-5. Integrative models of constructs, such as capacity to consent, will make this process more efficient by avoiding the need to test measures in each disorder. Until then, cross-validation studies, such as the work by Seaman et al. (2015), are critical.

  2. Validation of tungsten cross sections in the neutron energy region up to 100 keV

    NASA Astrophysics Data System (ADS)

    Pigni, Marco T.; Žerovnik, Gašper; Leal, Luiz C.; Trkov, Andrej

    2017-09-01

    Following a series of recent cross section evaluations on tungsten isotopes performed at Oak Ridge National Laboratory (ORNL), this paper presents the validation work carried out to test the performance of the evaluated cross sections based on lead-slowing-down (LSD) benchmarks conducted in Grenoble. ORNL completed the resonance parameter evaluation of four tungsten isotopes - 182,183,184,186W - in August 2014 and submitted it as an ENDF-compatible file to be part of the next release of the ENDF/B-VIII.0 nuclear data library. The evaluations were performed with support from the US Nuclear Criticality Safety Program in an effort to provide improved tungsten cross section and covariance data for criticality safety sensitivity analyses. The validation analysis based on the LSD benchmarks showed an improved agreement with the experimental response when the ORNL tungsten evaluations were included in the ENDF/B-VII.1 library. Comparisons with the results obtained with the JEFF-3.2 nuclear data library are also discussed.

  3. Dynamic MRI to quantify musculoskeletal motion: A systematic review of concurrent validity and reliability, and perspectives for evaluation of musculoskeletal disorders

    PubMed Central

    Lempereur, Mathieu; Lelievre, Mathieu; Burdin, Valérie; Ben Salem, Douraied; Brochard, Sylvain

    2017-01-01

    Purpose To report evidence for the concurrent validity and reliability of dynamic MRI techniques to evaluate in vivo joint and muscle mechanics, and to propose recommendations for their use in the assessment of normal and impaired musculoskeletal function. Materials and methods The search was conducted on articles published in Web of science, PubMed, Scopus, Academic search Premier, and Cochrane Library between 1990 and August 2017. Studies that reported the concurrent validity and/or reliability of dynamic MRI techniques for in vivo evaluation of joint or muscle mechanics were included after assessment by two independent reviewers. Selected articles were assessed using an adapted quality assessment tool and a data extraction process. Results for concurrent validity and reliability were categorized as poor, moderate, or excellent. Results Twenty articles fulfilled the inclusion criteria with a mean quality assessment score of 66% (±10.4%). Concurrent validity and/or reliability of eight dynamic MRI techniques were reported, with the knee being the most evaluated joint (seven studies). Moderate to excellent concurrent validity and reliability were reported for seven out of eight dynamic MRI techniques. Cine phase contrast and real-time MRI appeared to be the most valid and reliable techniques to evaluate joint motion, and spin tag for muscle motion. Conclusion Dynamic MRI techniques are promising for the in vivo evaluation of musculoskeletal mechanics; however results should be evaluated with caution since validity and reliability have not been determined for all joints and muscles, nor for many pathological conditions. PMID:29232401

  4. Test-retest reliability and cross validation of the functioning everyday with a wheelchair instrument.

    PubMed

    Mills, Tamara L; Holm, Margo B; Schmeler, Mark

    2007-01-01

    The purpose of this study was to establish the test-retest reliability and content validity of an outcomes tool designed to measure the effectiveness of seating-mobility interventions on the functional performance of individuals who use wheelchairs or scooters as their primary seating-mobility device. The instrument, Functioning Everyday With a Wheelchair (FEW), is a questionnaire designed to measure perceived user function related to wheelchair/scooter use. Using consumer-generated items, FEW Beta Version 1.0 was developed and test-retest reliability was established. Cross-validation of FEW Beta Version 1.0 was then carried out with five samples of seating-mobility users to establish content validity. Based on the content validity study, FEW Version 2.0 was developed and administered to seating-mobility consumers to examine its test-retest reliability. FEW Beta Version 1.0 yielded an intraclass correlation coefficient (ICC) Model (3,k) of .92, p < .001, and the content validity results revealed that FEW Beta Version 1.0 captured 55% of seating-mobility goals reported by consumers across five samples. FEW Version 2.0 yielded ICC(3,k) = .86, p < .001, and captured 98.5% of consumers' seating-mobility goals. The cross-validation study identified new categories of seating-mobility goals for inclusion in FEW Version 2.0, and the content validity of FEW Version 2.0 was confirmed. FEW Beta Version 1.0 and FEW Version 2.0 were highly stable in their measurement of participants' seating-mobility goals over a 1-week interval.

  5. Cross-Cultural Adaptation, Validity, and Reliability of the Persian Version of the Orebro Musculoskeletal Pain Screening Questionnaire.

    PubMed

    Shafeei, Asrin; Mokhtarinia, Hamid Reza; Maleki-Ghahfarokhi, Azam; Piri, Leila

    2017-08-01

    Observational study. To cross-culturally translate the Orebro Musculoskeletal Pain Screening Questionnaire (OMPQ) into Persian and then evaluate its psychometric properties (reliability, validity, ceiling, and flooring effects). To the authors' knowledge, prior to this study there has been no validated instrument to screen the risk of chronicity in Persian-speaking patients with low back pain (LBP) in Iran. The OMPQ was specifically developed as a self-administered screening tool for assessing the risk of LBP chronicity. The forward-backward translation method was used for the translation and cross-cultural adaptation of the original questionnaire. In total, 202 patients with subacute LBP completed the OMPQ and the pain disability questionnaire (PDQ), which was used to assess convergent validity. Sixty-two patients completed the OMPQ a week later as a retest. Slight changes were made to the OMPQ during the translation/cultural adaptation process; face validity of the Persian version was obtained. The Persian OMPQ showed excellent test-retest reliability (intraclass correlation coefficient=0.89). Its internal consistency was 0.71, and its convergent validity was confirmed by a good correlation coefficient between the OMPQ and PDQ total scores (r=0.72, p<0.05). No ceiling or floor effects were observed. The Persian version of the OMPQ is acceptable for the target society in terms of face validity, construct validity, reliability, and consistency. It is therefore considered a useful instrument for screening Iranian patients with LBP.

  6. Validity of the FACT-H&N (v 4.0) among Malaysian oral cancer patients.

    PubMed

    Doss, J G; Thomson, W M; Drummond, B K; Raja Latifah, R J

    2011-07-01

To assess the cross-sectional construct validity of the Malay-translated and cross-culturally adapted FACT-H&N (v 4.0) for discriminative use in a sample of Malaysian oral cancer patients. A cross-sectional study of adults newly diagnosed with oral cancer. HRQOL data were collected using the FACT-H&N (v 4.0), a global question and a supplementary set of eight questions ('MAQ') obtained earlier in pilot work. Of the 76 participants (61.8% female; 23.7% younger than 50), most (96.1%) had oral squamous cell carcinoma; two-thirds were in Stages III or IV. At baseline, patients' mean FACT summary (FACT-G, FACT-H&N, FACT-H&N TOI, and FHNSI) and subscale (pwb, swb, ewb, fwb, and hnsc) scores were towards the higher end of the range. Equal proportions (36.8%) rated their overall HRQOL as 'good' or 'average'; fewer than one-quarter rated it as 'poor', and only two rated it as 'very good'. All six FACT summary scales and most subscales had moderate-to-good internal consistency. For all summary scales, those with 'very poor/poor' self-rated HRQOL differed significantly from the 'good/very good' group. All FACT summary scales correlated strongly (r>0.75). Summary scales showed convergent validity (r>0.90) but little discriminant validity. The discriminant validity of the FHNSI improved with the addition of the MAQ. The FACT-H&N summary scales and most subscales demonstrated acceptable cross-sectional construct validity, reliability and discriminative ability, and thus appear appropriate for further use among Malaysian oral cancer patients. Copyright © 2011 Elsevier Ltd. All rights reserved.

  7. Initial Reliability and Validity of the Perceived Social Competence Scale

    ERIC Educational Resources Information Center

    Anderson-Butcher, Dawn; Iachini, Aidyn L.; Amorose, Anthony J.

    2008-01-01

    Objective: This study describes the development and validation of a perceived social competence scale that social workers can easily use to assess children's and youth's social competence. Method: Exploratory and confirmatory factor analyses were conducted on a calibration and a cross-validation sample of youth. Predictive validity was also…

  8. A simple equation to estimate body fat percentage in children with overweightness or obesity: a retrospective study.

    PubMed

    Cortés-Castell, Ernesto; Juste, Mercedes; Palazón-Bru, Antonio; Monge, Laura; Sánchez-Ferrer, Francisco; Rizo-Baeza, María Mercedes

    2017-01-01

    Dual-energy X-ray absorptiometry (DXA) provides separate measurements of fat mass, fat-free mass and bone mass, and is a quick, accurate, and safe technique, yet one that is not readily available in routine clinical practice. Consequently, we aimed to develop statistical formulas to predict fat mass (%) and fat mass index (FMI) with simple parameters (age, sex, weight and height). We conducted a retrospective observational cross-sectional study in 416 overweight or obese patients aged 4-18 years that involved assessing adiposity by DXA (fat mass percentage and FMI), body mass index (BMI), sex and age. We randomly divided the sample into two parts (construction and validation). In the construction sample, we developed formulas to predict fat mass and FMI using linear multiple regression models. The formulas were validated in the other sample, calculating the intraclass correlation coefficient via bootstrapping. The fat mass percentage formula had a coefficient of determination of 0.65. This value was 0.86 for FMI. In the validation, the constructed formulas had an intraclass correlation coefficient of 0.77 for fat mass percentage and 0.92 for FMI. Our predictive formulas accurately predicted fat mass and FMI with simple parameters (BMI, sex and age) in children with overweight and obesity. The proposed methodology could be applied in other fields. Further studies are needed to externally validate these formulas.
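The construction/validation design described above (fit a predictive formula on one random half of the sample, then check agreement in the other half with a bootstrap) can be sketched as follows. All data, variable names, effect sizes and the use of Pearson's r as the agreement index are illustrative assumptions, not the study's actual data or analysis code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative stand-in for the study data: predict an outcome
# (e.g. fat-mass %) from BMI, sex and age. Values are made up.
n = 416
bmi = rng.normal(27, 4, n)
sex = rng.integers(0, 2, n).astype(float)
age = rng.uniform(4, 18, n)
outcome = 1.2 * bmi + 5.0 * sex - 0.3 * age + rng.normal(0, 2, n)

# Random construction/validation split, mirroring the study design
idx = rng.permutation(n)
train, valid = idx[: n // 2], idx[n // 2:]
X = np.column_stack([np.ones(n), bmi, sex, age])

# Fit the predictive formula on the construction sample only
beta, *_ = np.linalg.lstsq(X[train], outcome[train], rcond=None)
pred = X[valid] @ beta
observed = outcome[valid]

def agreement(a, b):
    # Pearson r used here as a simple stand-in for the ICC reported in the paper
    return np.corrcoef(a, b)[0, 1]

# Bootstrap the agreement index over validation subjects
boot = []
for _ in range(1000):
    s = rng.integers(0, len(valid), len(valid))
    boot.append(agreement(pred[s], observed[s]))
lo, hi = np.percentile(boot, [2.5, 97.5])
```

The bootstrap percentile interval (`lo`, `hi`) quantifies how stable the validation-sample agreement is under resampling of subjects.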

  9. INTERPRETING PHYSICAL AND BEHAVIORAL HEALTH SCORES FROM NEW WORK DISABILITY INSTRUMENTS

    PubMed Central

    Marfeo, Elizabeth E.; Ni, Pengsheng; Chan, Leighton; Rasch, Elizabeth K.; McDonough, Christine M.; Brandt, Diane E.; Bogusz, Kara; Jette, Alan M.

    2015-01-01

Objective To develop a system to guide interpretation of scores generated from 2 new instruments measuring work-related physical and behavioral health functioning (Work Disability – Physical Function (WD-PF) and WD – Behavioral Function (WD-BH)). Design Cross-sectional, secondary data from 3 independent samples to develop and validate the functional levels for physical and behavioral health functioning. Subjects Physical group: 999 general adult subjects, 1,017 disability applicants and 497 work-disabled subjects. Behavioral health group: 1,000 general adult subjects, 1,015 disability applicants and 476 work-disabled subjects. Methods A three-phase analytic approach, including item mapping, a modified Delphi technique, and known-groups validation analysis, was used to develop and validate cut-points for functional levels within each of the WD-PF and WD-BH instruments' scales. Results Four and five functional levels were developed for each of the scales in the WD-PF and WD-BH instruments. Distribution of the comparative samples was in the expected direction: the general adult samples consistently demonstrated scores at higher functional levels compared with the claimant and work-disabled samples. Conclusion Using an item-response theory-based methodology paired with a qualitative process appears to be a feasible and valid approach for translating the WD-BH and WD-PF scores into meaningful levels useful for interpreting a person’s work-related physical and behavioral health functioning. PMID:25729901

  10. The Application of FT-IR Spectroscopy for Quality Control of Flours Obtained from Polish Producers

    PubMed Central

    Ceglińska, Alicja; Reder, Magdalena; Ciemniewska-Żytkiewicz, Hanna

    2017-01-01

Samples of wheat, spelt, rye, and triticale flours produced by different Polish mills were studied by both classic chemical methods and FT-IR MIR spectroscopy. An attempt was made to statistically correlate FT-IR spectral data with reference data with regard to the content of various components, for example, proteins, fats, ash, and fatty acids, as well as properties such as moisture, falling number, and energetic value. This correlation resulted in calibrated and validated statistical models for versatile evaluation of unknown flour samples. The calibration data set was used to construct calibration models using the CSR and PLS methods with the leave-one-out cross-validation technique. The calibrated models were then validated against a separate validation data set. The results obtained confirmed that application of statistical models based on MIR spectral data is a robust, accurate, precise, rapid, inexpensive, and convenient methodology for determination of flour characteristics, as well as for detection of the content of selected flour ingredients. The obtained models' characteristics were as follows: R2 = 0.97, PRESS = 2.14; R2 = 0.96, PRESS = 0.69; R2 = 0.95, PRESS = 1.27; R2 = 0.94, PRESS = 0.76, for content of proteins, lipids, ash, and moisture level, respectively. The best CSR models were obtained for protein, ash, and crude fat (R2 = 0.86, 0.82, and 0.78, respectively). PMID:28243483

  11. Modelling by partial least squares the relationship between the HPLC mobile phases and analytes on phenyl column.

    PubMed

    Markopoulou, Catherine K; Kouskoura, Maria G; Koundourellis, John E

    2011-06-01

Twenty-five descriptors and 61 structurally different analytes were used in a partial least squares (PLS) projection to latent structures technique in order to study their chromatographic interaction mechanism on a phenyl column. According to the model, 240 different retention times of the analytes, expressed as the Y variable (log k) at different % MeOH mobile-phase concentrations, were correlated with their theoretically most important structural or molecular descriptors. The goodness of fit was estimated by the coefficient of multiple determination r(2) (0.919) and the root mean square error of estimation (RMSEE = 0.1283), with a predictive ability (Q(2)) of 0.901. The model was further validated using cross-validation (CV), by 20 response permutations (r(2) (0.0, 0.0146), Q(2) (0.0, -0.136)) and by external prediction. The contribution of certain interaction mechanisms between the analytes, the mobile phase and the column, whether proportional or counterbalancing, is also studied. To evaluate the influence of each variable on Y in the PLS model, a VIP (variable importance in the projection) plot provides evidence that lipophilicity (expressed as Log D and Log P), polarizability, refractivity and the eluting power of the mobile phase are dominant in the retention mechanism on a phenyl column. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Validation of Cross Sections for Monte Carlo Simulation of the Photoelectric Effect

    NASA Astrophysics Data System (ADS)

    Han, Min Cheol; Kim, Han Sung; Pia, Maria Grazia; Basaglia, Tullio; Batič, Matej; Hoff, Gabriela; Kim, Chan Hyeong; Saracco, Paolo

    2016-04-01

Several total and partial photoionization cross section calculations, based on both theoretical and empirical approaches, are quantitatively evaluated with statistical analyses using a large collection of experimental data retrieved from the literature to identify the state of the art for modeling the photoelectric effect in Monte Carlo particle transport. Some of the examined cross section models are available in general purpose Monte Carlo systems, while others have been implemented and subjected to validation tests for the first time to estimate whether they could improve the accuracy of particle transport codes. The validation process identifies Scofield's 1973 non-relativistic calculations, tabulated in the Evaluated Photon Data Library (EPDL), as the model that best reproduces experimental measurements of total cross sections. Specialized total cross section models, some of which derive from more recent calculations, do not provide significant improvements. Scofield's non-relativistic calculations also remain unsurpassed in their agreement with experimental K- and L-shell photoionization cross sections, although in a few test cases Ebel's parameterization produces more accurate results close to absorption edges. Modifications to Biggs and Lighthill's parameterization implemented in Geant4 significantly reduce the accuracy of total cross sections at low energies with respect to its original formulation. The scarcity of suitable experimental data hinders a similar extensive analysis for the simulation of the photoelectron angular distribution, which is limited to a qualitative appraisal.

  13. Cross-Validation of a Recently Published Equation Predicting Energy Expenditure to Run or Walk a Mile in Normal-Weight and Overweight Adults

    ERIC Educational Resources Information Center

    Morris, Cody E.; Owens, Scott G.; Waddell, Dwight E.; Bass, Martha A.; Bentley, John P.; Loftin, Mark

    2014-01-01

    An equation published by Loftin, Waddell, Robinson, and Owens (2010) was cross-validated using ten normal-weight walkers, ten overweight walkers, and ten distance runners. Energy expenditure was measured at preferred walking (normal-weight walker and overweight walkers) or running pace (distance runners) for 5 min and corrected to a mile. Energy…

  14. Future Performance Trend Indicators: A Current Value Approach to Human Resources Accounting. Report III. Multivariate Predictions of Organizational Performance Across Time.

    ERIC Educational Resources Information Center

    Pecorella, Patricia A.; Bowers, David G.

    Multiple regression in a double cross-validated design was used to predict two performance measures (total variable expense and absence rate) by multi-month period in five industrial firms. The regressions do cross-validate, and produce multiple coefficients which display both concurrent and predictive effects, peaking 18 months to two years…

  15. Short Forms of the Wechsler Memory Scale--Revised: Cross- Validation and Derivation of a Two-Subtest Form.

    ERIC Educational Resources Information Center

    van den Broek, Anneke; Golden, Charles J.; Loonstra, Ann; Ghinglia, Katheryne; Goldstein, Diane

    1998-01-01

    Indicated excellent cross-validations with correlation of 0.99 for past formulas (J. L. Woodard and B. N. Axelrod, 1995; B. N. Axelrod et al, 1996) for estimating the Wechsler Memory Scale- Revised General Memory and Delayed Recall Indexes. Over 85% of the estimated scores were within 10 points of actual scores. Age, education, diagnosis, and IQ…

  16. Validation of the Technology Acceptance Measure for Pre-Service Teachers (TAMPST) on a Malaysian Sample: A Cross-Cultural Study

    ERIC Educational Resources Information Center

    Teo, Timothy

    2010-01-01

    Purpose: The purpose of this paper is to assess the cross-cultural validity of the technology acceptance measure for pre-service teachers (TAMPST) on a Malaysian sample. Design/methodology/approach: A total of 193 pre-service teachers from a Malaysian university completed a survey questionnaire measuring their responses to five constructs in the…

  17. Validation of the Pain Resilience Scale in Chinese-speaking patients with temporomandibular disorders pain.

    PubMed

    He, S L; Wang, J H; Ji, P

    2018-03-01

To validate the Pain Resilience Scale (PRS) for use in Chinese patients with temporomandibular disorders (TMD) pain. According to international guidelines, the original PRS was first translated and cross-culturally adapted to formulate the Chinese version of the PRS (PRS-C). A total of 152 patients with TMD pain were recruited to complete a series of questionnaires. Reliability of the PRS-C was investigated using internal consistency and test-retest reliability, and validity was assessed in terms of cross-cultural validity and convergent validity. Cross-cultural validity was evaluated using confirmatory factor analysis (CFA), and convergent validity was examined by correlating the PRS-C scores with those of 2 commonly used pain-related measures (the Connor-Davidson Resilience Scale [CD-RISC] and the Tampa Scale for Kinesiophobia for Temporomandibular Disorders [TSK-TMD]). The PRS-C had a high internal consistency (Cronbach's alpha = 0.92) and good test-retest reliability (intra-class correlation coefficient [ICC] = 0.81). The CFA supported a 2-factor model for the PRS-C with acceptable fit to the data; the fit indices were chi-square/DF = 2.21, GFI = 0.91, TLI = 0.97, CFI = 0.98 and RMSEA = 0.08. Regarding convergent validity, the PRS-C showed moderate-to-good correlations with the CD-RISC and the TSK-TMD. The PRS-C shows good psychometric properties and can be considered a reliable and valid measure for evaluating pain-related resilience in patients with TMD pain. © 2017 John Wiley & Sons Ltd.

  18. Measuring Nigrescence Attitudes in School-Aged Adolescents

    ERIC Educational Resources Information Center

    Gardner-Kitt, Donna L.; Worrell, Frank C.

    2007-01-01

    In this study, we examined the reliability and validity of Cross Racial Identity Scale (CRIS; Vandiver, B. J., Cross Jr., W. E., Fhagen-Smith, P. E., Worrell, F. C., Swim, J. K., & Caldwell, L. D. (2000). "The Cross Racial Identity Scale." Unpublished scale; Worrell, F. C., Vandiver, B. J., & Cross Jr., W. E., (2004). "The Cross Racial Identity…

  19. Development and validation of anthropometric equations to estimate appendicular muscle mass in elderly women.

    PubMed

    Pereira, Piettra Moura Galvão; da Silva, Giselma Alcântara; Santos, Gilberto Moreira; Petroski, Edio Luiz; Geraldes, Amandio Aristides Rihan

    2013-07-02

This study aimed to examine the cross-validity of two commonly used anthropometric equations and to propose simple anthropometric equations to estimate appendicular muscle mass (AMM) in elderly women. Among 234 physically active and functionally independent elderly women, 101 (60 to 89 years) were selected through simple random drawing to compose the study sample. The paired t test and the Pearson correlation coefficient were used to perform cross-validation, and concordance was verified by the intraclass correlation coefficient (ICC) and by the Bland and Altman technique. To propose predictive models, multiple linear regression analysis was used, with the anthropometric measures of body mass (BM), height, girths, skinfolds, body mass index (BMI) and muscle perimeters included in the analysis as independent variables. Dual-energy X-ray absorptiometry (AMMDXA) was used as the criterion measurement. Sample power calculations were carried out by Post Hoc Compute Achieved Power; sample power values from 0.88 to 0.91 were observed. When compared, the two equations tested differed significantly from AMMDXA (p < 0.001 and p = 0.001). Ten population-specific anthropometric equations were developed to estimate AMM; among them, three achieved all the validation criteria used: AMM (E2) = 4.150 + 0.251 [body mass (BM)] - 0.411 [body mass index (BMI)] + 0.011 [right forearm perimeter (PANTd)]²; AMM (E3) = 4.087 + 0.255 (BM) - 0.371 (BMI) + 0.011 (PANTd)² - 0.035 [thigh skinfold (DCCO)]; AMM (E6) = 2.855 + 0.298 (BM) + 0.019 (Age) - 0.082 [hip circumference (PQUAD)] + 0.400 (PANTd) - 0.332 (BMI). These equations' estimates did not differ significantly from the criterion method (p = 0.056 and p = 0.158); they explained 69% to 74% of the variance observed in AMMDXA, with low standard errors of the estimate (1.36 to 1.55 kg) and high concordance (ICC between 0.90 and 0.91; limits of agreement from -2.93 to 2.33 kg).
The two previously published equations were not valid for use in physically active and functionally independent elderly women. The simple anthropometric equations developed in this study showed good practical applicability and high validity for estimating AMM in elderly women.
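For reference, the three validated equations quoted in the abstract can be written as plain functions. The coefficients are transcribed from the abstract (with decimal commas normalized to points); the units noted in the docstrings are inferred from the variable descriptions and should be checked against the full paper before use:

```python
def amm_e2(bm, bmi, pantd):
    """Equation E2: appendicular muscle mass (kg, assumed) from body mass
    (kg), BMI (kg/m², assumed) and right forearm perimeter PANTd (cm)."""
    return 4.150 + 0.251 * bm - 0.411 * bmi + 0.011 * pantd ** 2

def amm_e3(bm, bmi, pantd, dcco):
    """Equation E3: as E2, plus thigh skinfold DCCO (mm, assumed)."""
    return 4.087 + 0.255 * bm - 0.371 * bmi + 0.011 * pantd ** 2 - 0.035 * dcco

def amm_e6(bm, age, pquad, pantd, bmi):
    """Equation E6: uses age (years) and hip circumference PQUAD (cm,
    assumed) in addition to BM, PANTd and BMI."""
    return 2.855 + 0.298 * bm + 0.019 * age - 0.082 * pquad + 0.400 * pantd - 0.332 * bmi
```

These equations are population-specific: per the study's conclusion they apply to physically active, functionally independent elderly women, not to other groups.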

  20. Development and validation of anthropometric equations to estimate appendicular muscle mass in elderly women

    PubMed Central

    2013-01-01

Objective This study aimed to examine the cross-validity of two commonly used anthropometric equations and to propose simple anthropometric equations to estimate appendicular muscle mass (AMM) in elderly women. Methods Among 234 physically active and functionally independent elderly women, 101 (60 to 89 years) were selected through simple random drawing to compose the study sample. The paired t test and the Pearson correlation coefficient were used to perform cross-validation, and concordance was verified by the intraclass correlation coefficient (ICC) and by the Bland and Altman technique. To propose predictive models, multiple linear regression analysis was used, with the anthropometric measures of body mass (BM), height, girths, skinfolds, body mass index (BMI) and muscle perimeters included in the analysis as independent variables. Dual-energy X-ray absorptiometry (AMMDXA) was used as the criterion measurement. Sample power calculations were carried out by Post Hoc Compute Achieved Power; sample power values from 0.88 to 0.91 were observed. Results When compared, the two equations tested differed significantly from AMMDXA (p < 0.001 and p = 0.001). Ten population-specific anthropometric equations were developed to estimate AMM; among them, three achieved all the validation criteria used: AMM (E2) = 4.150 + 0.251 [body mass (BM)] - 0.411 [body mass index (BMI)] + 0.011 [right forearm perimeter (PANTd)]²; AMM (E3) = 4.087 + 0.255 (BM) - 0.371 (BMI) + 0.011 (PANTd)² - 0.035 [thigh skinfold (DCCO)]; AMM (E6) = 2.855 + 0.298 (BM) + 0.019 (Age) - 0.082 [hip circumference (PQUAD)] + 0.400 (PANTd) - 0.332 (BMI). These equations' estimates did not differ significantly from the criterion method (p = 0.056 and p = 0.158); they explained 69% to 74% of the variance observed in AMMDXA, with low standard errors of the estimate (1.36 to 1.55 kg) and high concordance (ICC between 0.90 and 0.91; limits of agreement from -2.93 to 2.33 kg).
Conclusion The two previously published equations were not valid for use in physically active and functionally independent elderly women. The simple anthropometric equations developed in this study showed good practical applicability and high validity for estimating AMM in elderly women. PMID:23815948
