Sample records for approach comparative analysis

  1. A Comparative Study of Data Envelopment Analysis and Other Approaches to Efficiency Evaluation and Estimation.

    DTIC Science & Technology

    1982-11-01

    A Comparative Study of Data Envelopment Analysis and Other Approaches to Efficiency Evaluation and Estimation, by A. Charnes, W.W. Cooper, H.D. Sherman (University of Texas at Austin, Center for Cybernetic Studies)... School of Business, 1981, entitled "Measurement of Hospital Efficiency: A Comparative Analysis of Data Envelopment Analysis and Other Approaches for

  2. Parametric and experimental analysis using a power flow approach

    NASA Technical Reports Server (NTRS)

    Cuschieri, J. M.

    1990-01-01

    A structural power flow approach for the analysis of structure-borne transmission of vibrations is used to analyze the influence of structural parameters on transmitted power. The parametric analysis is also performed using the Statistical Energy Analysis approach and the results are compared with those obtained using the power flow approach. The advantages of structural power flow analysis are demonstrated by comparing the type of results that are obtained by the two analytical methods. Also, to demonstrate that the power flow results represent a direct physical parameter that can be measured on a typical structure, an experimental study of structural power flow is presented. This experimental study presents results for an L-shaped beam for which an analytical solution was already available. Various methods to measure vibrational power flow are compared to study their advantages and disadvantages.

  3. [Approaches to medical training among physicians who teach; analysis of two different educational strategies].

    PubMed

    Loría-Castellanos, Jorge; Rivera-Ibarra, Doris Beatriz; Márquez-Avila, Guadalupe

    2009-01-01

    To compare the outreach of a promotional educational strategy that focuses on active participation with that of a more traditional approach to medical training. A quasi-experimental design was approved by the research committee. We compared the outreach of two different approaches to medical training. We administered a validated instrument that included 72 items that analyze statements used to measure educational tasks in the form of duplets through 3 indicators. A group that included seven physicians that were actively participating in teaching activities was stratified according to teaching approaches. One of the approaches was a traditional one and the other included a promotional strategy aimed at increasing participation. All participants signed informed consent before answering the research instruments. Statistical analysis was done using non-parametric tests. Mann-Whitney results did not show differences among the groups in the preliminary analysis. A second analysis with the same test after the interventions found significant differences (p ≤ 0.018) in favor of those subjects that had participated in the promotional approach, mainly in the indicator measuring "consequence". The Wilcoxon test showed that all participants in the promotional approach increased significantly (p ≤ 0.018) in 3 main indicators as compared with the control group. A promotional strategy aimed at increasing physician participation constitutes a more profitable approach when compared with traditional teaching methods.

  4. Parametric and experimental analysis using a power flow approach

    NASA Technical Reports Server (NTRS)

    Cuschieri, J. M.

    1988-01-01

    Having defined and developed a structural power flow approach for the analysis of structure-borne transmission of structural vibrations, the technique is used to perform an analysis of the influence of structural parameters on the transmitted energy. As a base for comparison, the parametric analysis is first performed using a Statistical Energy Analysis approach and the results compared with those obtained using the power flow approach. The advantages of using structural power flow are thus demonstrated by comparing the type of results obtained by the two methods. Additionally, to demonstrate the advantages of using the power flow method and to show that the power flow results represent a direct physical parameter that can be measured on a typical structure, an experimental investigation of structural power flow is also presented. Results are presented for an L-shaped beam for which an analytical solution has already been obtained. Furthermore, the various methods available to measure vibrational power flow are compared to investigate the advantages and disadvantages of each method.
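    The two power-flow records above both mention comparing methods to measure vibrational power flow. A common way to estimate the injected power is from the cross-spectrum between a drive-point force and the collocated velocity; the Python sketch below illustrates that generic computation on synthetic signals (the sampling rate and signals are assumptions, not data from these studies).

    ```python
    # Sketch: estimating time-averaged input power flow from force and
    # velocity signals via their cross-spectral density (illustrative only).
    import numpy as np
    from scipy.signal import csd

    fs = 4096                      # sampling rate [Hz] (assumed)
    t = np.arange(0, 10, 1 / fs)
    rng = np.random.default_rng(0)

    # Synthetic stand-ins for a measured drive-point force [N] and
    # collocated velocity [m/s]; replace with real measurements.
    force = rng.standard_normal(t.size)
    velocity = 1e-3 * rng.standard_normal(t.size) + 1e-3 * force

    # One-sided cross-spectral density between force and velocity.
    f, S_fv = csd(force, velocity, fs=fs, nperseg=4096)

    # The real part of the force-velocity cross-spectrum (co-spectrum) is
    # the spectral density of the time-averaged power injected into the
    # structure; integrating over frequency gives the total power.
    power_density = np.real(S_fv)                    # [W/Hz]
    total_power = np.sum(power_density) * (f[1] - f[0])
    print(f"estimated injected power: {total_power:.3e} W")
    ```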

  5. Comparative Analysis of Future Cooks' Training in Vocational Institutions in Ukraine and Abroad

    ERIC Educational Resources Information Center

    Kankovsky, Ihor; Krasylnykova, Hanna; Drozich, Iryna

    2017-01-01

    The article deals with a comparative analysis of conceptual approaches and content of cooks' training in Ukraine, European countries, the USA and Eastern Partnership countries. It has been found that national vocational education is grounded in education standards and an activity-based approach to forming the training content, subject-based…

  6. Face recognition using an enhanced independent component analysis approach.

    PubMed

    Kwak, Keun-Chang; Pedrycz, Witold

    2007-03-01

    This paper is concerned with an enhanced independent component analysis (ICA) and its application to face recognition. Typically, face representations obtained by ICA involve unsupervised learning and high-order statistics. In this paper, we develop an enhancement of the generic ICA by augmenting this method by the Fisher linear discriminant analysis (LDA); hence, its abbreviation, FICA. The FICA is systematically developed and presented along with its underlying architecture. A comparative analysis explores four distance metrics, as well as classification with support vector machines (SVMs). We demonstrate that the FICA approach leads to the formation of well-separated classes in low-dimension subspace and is endowed with a great deal of insensitivity to large variation in illumination and facial expression. The comprehensive experiments are completed for the facial-recognition technology (FERET) face database; a comparative analysis demonstrates that FICA comes with improved classification rates when compared with some other conventional approaches such as eigenface, fisherface, and the ICA itself.
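    As a rough illustration of the kind of pipeline this record describes (ICA feature extraction followed by Fisher LDA), here is a minimal scikit-learn sketch on synthetic data. It is not the authors' FICA algorithm; the dataset, component counts, and train/test split are assumptions.

    ```python
    # Sketch: ICA features followed by Fisher LDA, in the spirit of the
    # FICA pipeline described above (illustrative, not the paper's code).
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.decomposition import FastICA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline

    # Synthetic stand-in for vectorized face images (assumption).
    X, y = make_classification(n_samples=400, n_features=100,
                               n_informative=20, n_classes=4, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    # ICA learns statistically independent components; LDA then finds the
    # low-dimensional subspace that best separates the classes.
    model = make_pipeline(FastICA(n_components=30, random_state=0),
                          LinearDiscriminantAnalysis())
    model.fit(X_tr, y_tr)
    print(f"held-out accuracy: {model.score(X_te, y_te):.2f}")
    ```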

  7. [Critic analysis of a comparative meta-analysis on the morbidity, functional and carcinologic results after radical prostatectomy according to surgical approach. Work of cancerology committee of the French urological association].

    PubMed

    Bastide, C; Rozet, F; Salomon, L; Mongiat-Artus, P; Beuzeboc, P; Cormier, L; Eiss, D; Gaschignard, N; Peyromaure, M; Richaud, P; Soulié, M

    2010-09-01

    The surgical approach for radical prostatectomy remains, even today, a subject of debate in the urologic community. Many comparative studies of the retropubic and laparoscopic approaches (robot-assisted or not) have been reported over the past 10 years, without settling the debate between supporters of either approach. The cancer research committee of the French urological association took up this question after the recent publication of a meta-analysis on the subject. Although imperfect, this meta-analysis exists and permits partial conclusions on the supposed advantages and inconveniences of each surgical approach. Regarding morbidity after radical prostatectomy, the only significant difference reported concerns the hemorrhagic risk, in favour of the laparoscopic approach. Regarding oncologic results, the only exploitable data concern the positive surgical margin rate, which is identical whatever the surgical approach. Concerning the functional results, no difference was reported in the literature between the different surgical approaches. Copyright © 2010 Elsevier Masson SAS. All rights reserved.

  8. Comparative Rhetorical Organization of ELT Thesis Introductions Composed by Thai and American Students

    ERIC Educational Resources Information Center

    Wuttisrisiriporn, Niwat

    2017-01-01

    Genre analysis is today's dominant approach for textual analysis, especially in the ESP learning and teaching profession. Adopting this approach, the present study compares the Introduction chapters of MA theses in ELT (English Language Teaching) written by Thai students to those written by American university students based on the move-step…

  9. Authority in Cross-Racial Teaching and Learning (Re)considering the Transferability of Warm Demander Approaches

    ERIC Educational Resources Information Center

    Ford, Amy Carpenter; Sassi, Kelly

    2014-01-01

    This article compares a White teacher's approach to authority with that of an African American warm demander. Ethnographic methods and discourse analysis illuminated how an African American teacher grounded her authority with African American students in shared culture, history, and frame of reference. A comparative analysis makes visible…

  10. Effect of electrocardiogram interference on cortico-cortical connectivity analysis and a possible solution.

    PubMed

    Govindan, R B; Kota, Srinivas; Al-Shargabi, Tareq; Massaro, An N; Chang, Taeun; du Plessis, Adre

    2016-09-01

    Electroencephalogram (EEG) signals are often contaminated by electrocardiogram (ECG) interference, which affects quantitative characterization of the EEG. We propose null-coherence, a frequency-based approach, to attenuate the ECG interference in EEG using simultaneously recorded ECG as a reference signal. After validating the proposed approach using numerically simulated data, we apply this approach to EEG recorded from six newborns receiving therapeutic hypothermia for neonatal encephalopathy. We compare our approach with independent component analysis (ICA), a previously proposed approach to attenuate ECG artifacts in the EEG signal. The power spectrum and the cortico-cortical connectivity of the ECG-attenuated EEG were compared against those of the raw EEG. The null-coherence approach attenuated the ECG contamination without leaving any residual of the ECG in the EEG. We show that the null-coherence approach performs better than ICA in attenuating the ECG contamination without enhancing cortico-cortical connectivity. Our analysis suggests that using ICA to remove ECG contamination from the EEG suffers from redistribution problems, whereas the null-coherence approach does not. We show that both the null-coherence and ICA approaches attenuate the ECG contamination. However, the EEG obtained after ICA cleaning displayed higher cortico-cortical connectivity compared with that obtained using the null-coherence approach. This suggests that null-coherence is superior to ICA in attenuating the ECG interference in EEG for cortico-cortical connectivity analysis. Copyright © 2016 Elsevier B.V. All rights reserved.
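    The paper's null-coherence algorithm is not spelled out in the abstract. As a loose stand-in only, the sketch below shows a generic coherence-weighted spectral attenuation that uses a simultaneously recorded reference signal; it is NOT the authors' method, and all signals and parameters are assumptions.

    ```python
    # Sketch: generic coherence-weighted attenuation of a reference artifact
    # (here ECG) from a contaminated signal (EEG). Illustrative stand-in,
    # NOT the null-coherence algorithm of the paper.
    import numpy as np
    from scipy.signal import coherence, stft, istft

    fs = 256  # sampling rate [Hz] (assumed)
    rng = np.random.default_rng(1)
    n = fs * 60
    ecg = rng.standard_normal(n)               # stand-in reference ECG
    eeg = rng.standard_normal(n) + 0.5 * ecg   # EEG contaminated by ECG

    # Magnitude-squared coherence between EEG and the ECG reference.
    f, coh = coherence(eeg, ecg, fs=fs, nperseg=512)

    # Attenuate each STFT bin of the EEG by (1 - coherence): bins that are
    # strongly coherent with the ECG reference are suppressed.
    f_s, t_s, Z = stft(eeg, fs=fs, nperseg=512)
    Z_clean = Z * (1.0 - coh)[:, None]
    _, eeg_clean = istft(Z_clean, fs=fs, nperseg=512)

    m = min(eeg_clean.size, ecg.size)
    print("residual ECG correlation:",
          round(float(np.corrcoef(eeg_clean[:m], ecg[:m])[0, 1]), 3))
    ```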

  11. Comparative study of two approaches to model the offshore fish cages

    NASA Astrophysics Data System (ADS)

    Zhao, Yun-peng; Wang, Xin-xin; Decew, Jud; Tsukrov, Igor; Bai, Xiao-dong; Bi, Chun-wei

    2015-06-01

    The goal of this paper is to provide a comparative analysis of two commonly used approaches to discretize offshore fish cages: the lumped-mass approach and the finite element technique. Two case studies are chosen to compare predictions of the LMA (lumped-mass approach) and FEA (finite element analysis) based numerical modeling techniques. In both case studies, we consider several loading conditions consisting of different uniform currents and monochromatic waves. We investigate motion of the cage, its deformation, and the resultant tension in the mooring lines. Both model predictions are sufficiently close to the experimental data, but for the first experiment, the DUT-FlexSim predictions are slightly more accurate than the ones provided by Aqua-FE™. According to the comparisons, both models can be successfully utilized in the design and analysis of offshore fish cages provided that an appropriate safety factor is chosen.

  12. A global optimization approach to multi-polarity sentiment analysis.

    PubMed

    Li, Xinmiao; Li, Jing; Wu, Yukeng

    2015-01-01

    Following the rapid development of social media, sentiment analysis has become an important social media mining technique. The performance of automatic sentiment analysis primarily depends on feature selection and sentiment classification. While information gain (IG) and support vector machines (SVM) are two important techniques, few studies have optimized both approaches in sentiment analysis. The effectiveness of applying a global optimization approach to sentiment analysis remains unclear. We propose a global optimization-based sentiment analysis (PSOGO-Senti) approach to improve sentiment analysis with IG for feature selection and SVM as the learning engine. The PSOGO-Senti approach utilizes a particle swarm optimization algorithm to obtain a global optimal combination of feature dimensions and parameters in the SVM. We evaluate the PSOGO-Senti model on two datasets from different fields. The experimental results showed that the PSOGO-Senti model can improve binary and multi-polarity Chinese sentiment analysis. We compared the optimal feature subset selected by PSOGO-Senti with the features in the sentiment dictionary. The results of this comparison indicated that PSOGO-Senti can effectively remove redundant and noisy features and can select a domain-specific feature subset with a higher explanatory power for a particular sentiment analysis task. The experimental results showed that the PSOGO-Senti approach is effective and robust for sentiment analysis tasks in different domains. By comparing the improvements of two-polarity, three-polarity and five-polarity sentiment analysis results, we found that the five-polarity sentiment analysis delivered the largest improvement. The improvement of the two-polarity sentiment analysis was the smallest. We conclude that PSOGO-Senti achieves a higher improvement for a more complicated sentiment analysis task. We also compared the results of PSOGO-Senti with those of the genetic algorithm (GA) and grid search method. From the results of this comparison, we found that PSOGO-Senti is more suitable for improving a difficult multi-polarity sentiment analysis problem.
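    In the spirit of the PSOGO-Senti idea (a particle swarm searching jointly over the feature-subset size and the SVM parameters), here is a minimal sketch with a hand-rolled PSO and scikit-learn. It is a re-implementation sketch, not the paper's code: mutual information stands in for information gain, and the data, swarm size, and bounds are assumptions.

    ```python
    # Sketch: tiny PSO over (number of selected features, log10 C, log10
    # gamma) for an SVM, in the spirit of PSOGO-Senti (illustrative only).
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.feature_selection import SelectKBest, mutual_info_classif
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=300, n_features=60,
                               n_informative=15, random_state=0)
    rng = np.random.default_rng(0)

    def fitness(p):
        # Decode a particle into a pipeline and score it by 3-fold CV.
        k = int(np.clip(round(p[0]), 1, X.shape[1]))
        C, gamma = 10.0 ** p[1], 10.0 ** p[2]
        model = make_pipeline(SelectKBest(mutual_info_classif, k=k),
                              SVC(C=C, gamma=gamma))
        return cross_val_score(model, X, y, cv=3).mean()

    # Particles live in (k, log10 C, log10 gamma) space.
    lo, hi = np.array([1, -2, -4]), np.array([60, 3, 1])
    pos = rng.uniform(lo, hi, size=(10, 3))
    vel = np.zeros_like(pos)
    pbest, pbest_fit = pos.copy(), np.array([fitness(p) for p in pos])
    gbest = pbest[pbest_fit.argmax()]

    for _ in range(15):                    # PSO iterations
        r1, r2 = rng.random((2, 10, 3))
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        fit = np.array([fitness(p) for p in pos])
        better = fit > pbest_fit
        pbest[better], pbest_fit[better] = pos[better], fit[better]
        gbest = pbest[pbest_fit.argmax()]

    print("best (k, log10 C, log10 gamma):", gbest.round(2))
    ```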

  13. Design rainfall depth estimation through two regional frequency analysis methods in Hanjiang River Basin, China

    NASA Astrophysics Data System (ADS)

    Xu, Yue-Ping; Yu, Chaofeng; Zhang, Xujie; Zhang, Qingqing; Xu, Xiao

    2012-02-01

    Hydrological predictions in ungauged basins are of significant importance for water resources management. In hydrological frequency analysis, regional methods are regarded as useful tools for estimating design rainfall/flood in areas with only little data available. The purpose of this paper is to investigate the performance of two regional methods, namely the Hosking's approach and the cokriging approach, in hydrological frequency analysis. These two methods are employed to estimate 24-h design rainfall depths in Hanjiang River Basin, one of the largest tributaries of the Yangtze River, China. Validation is made by comparing the results to those calculated from the provincial handbook approach, which uses hundreds of rainfall gauge stations. Also for validation purposes, five hypothetically ungauged sites from the middle basin are chosen. The final results show that, compared to the provincial handbook approach, the Hosking's approach often overestimated the 24-h design rainfall depths while the cokriging approach most of the time underestimated them. Overall, the Hosking's approach produced more accurate results than the cokriging approach.
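    Hosking's regional approach is built on L-moments. The sketch below computes the first sample L-moments and L-moment ratios for one site using the standard probability-weighted-moment formulas; the annual-maximum data are synthetic stand-ins, not the Hanjiang River Basin records.

    ```python
    # Sketch: sample L-moments, the building blocks of Hosking's regional
    # frequency analysis (generic formulas; data are synthetic stand-ins).
    import numpy as np

    def sample_lmoments(x):
        """Return (l1, l2, t, t3) = L-location, L-scale, L-CV, L-skewness."""
        x = np.sort(np.asarray(x, dtype=float))
        n = x.size
        i = np.arange(1, n + 1)
        # Probability-weighted moments b0..b2 (unbiased estimators).
        b0 = x.mean()
        b1 = np.sum((i - 1) / (n - 1) * x) / n
        b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
        l1 = b0
        l2 = 2 * b1 - b0
        l3 = 6 * b2 - 6 * b1 + b0
        return l1, l2, l2 / l1, l3 / l2

    rng = np.random.default_rng(0)
    annual_max = rng.gumbel(loc=80.0, scale=25.0, size=40)  # mm (assumed)
    l1, l2, t, t3 = sample_lmoments(annual_max)
    print(f"l1={l1:.1f} mm, l2={l2:.1f} mm, L-CV={t:.3f}, L-skew={t3:.3f}")
    ```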

  14. A Comparative Analysis of Meeting the Whole Child Initiatives through Standardized and Competency-Based Education Systems in Terms of Achievement and Meeting the Whole Child Initiatives: Comparing Professional Perceptions and Identified Measurable Results

    ERIC Educational Resources Information Center

    Ward, Jacqueline M.

    2011-01-01

    Traditional education (TE) largely uses a standardized (SbE) approach, while alternatives (nTE) tend toward more of a competency-based (CbE), or student-centered, approach. This comparative analysis examines essential aspects of such pedagogies in determining the effectiveness of schooling systems in meeting the Whole Child Initiative (Souza, 1999; Carter et…

  15. Building Bridges Between Structural and Program Evaluation Approaches to Evaluating Policy

    PubMed Central

    Heckman, James J.

    2011-01-01

    This paper compares the structural approach to economic policy analysis with the program evaluation approach. It offers a third way to do policy analysis that combines the best features of both approaches. We illustrate the value of this alternative approach by making the implicit economics of LATE explicit, thereby extending the interpretability and range of policy questions that LATE can answer. PMID:21743749

  16. An Analysis Technique/Automated Tool for Comparing and Tracking Analysis Modes of Different Finite Element Models

    NASA Technical Reports Server (NTRS)

    Towner, Robert L.; Band, Jonathan L.

    2012-01-01

    An analysis technique was developed to compare and track mode shapes for different Finite Element Models. The technique may be applied to a variety of structural dynamics analyses, including model reduction validation (comparing unreduced and reduced models), mode tracking for various parametric analyses (e.g., launch vehicle model dispersion analysis to identify sensitivities to modal gain for Guidance, Navigation, and Control), comparing models of different mesh fidelity (e.g., a coarse model for a preliminary analysis compared to a higher-fidelity model for a detailed analysis) and mode tracking for a structure with properties that change over time (e.g., a launch vehicle from liftoff through end-of-burn, with propellant being expended during the flight). Mode shapes for different models are compared and tracked using several numerical indicators, including traditional Cross-Orthogonality and Modal Assurance Criteria approaches, as well as numerical indicators obtained by comparing modal strain energy and kinetic energy distributions. This analysis technique has been used to reliably identify correlated mode shapes for complex Finite Element Models that would otherwise be difficult to compare using traditional techniques. This improved approach also utilizes an adaptive mode tracking algorithm that allows for automated tracking when working with complex models and/or comparing a large group of models.
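    The Modal Assurance Criterion (MAC) mentioned above has a standard closed form, MAC(i, j) = |phi_i' * phi_j|^2 / ((phi_i' * phi_i) * (phi_j' * phi_j)). The sketch below computes a MAC matrix between two mode sets and pairs modes greedily; it is a generic illustration, not the NASA tool, and the mode shapes are synthetic.

    ```python
    # Sketch: Modal Assurance Criterion (MAC) matrix between two sets of
    # real mode shapes, plus a greedy pairing, as used for mode tracking
    # (generic illustration, not the tool described above).
    import numpy as np

    def mac_matrix(phi_a, phi_b):
        """phi_a: (n_dof, n_modes_a), phi_b: (n_dof, n_modes_b)."""
        num = np.abs(phi_a.T @ phi_b) ** 2
        den = np.outer(np.sum(phi_a * phi_a, axis=0),
                       np.sum(phi_b * phi_b, axis=0))
        return num / den

    rng = np.random.default_rng(0)
    phi_a = rng.standard_normal((50, 5))            # baseline model modes
    # Second model: reordered, slightly perturbed versions of the same modes.
    phi_b = phi_a[:, [1, 0, 2, 4, 3]] + 0.05 * rng.standard_normal((50, 5))

    mac = mac_matrix(phi_a, phi_b)
    pairs = mac.argmax(axis=1)   # greedy: best match for each baseline mode
    for i, j in enumerate(pairs):
        print(f"mode {i} of model A tracks mode {j} of model B "
              f"(MAC = {mac[i, j]:.2f})")
    ```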

  17. Analyzing Information Systems Development: A Comparison and Analysis of Eight IS Development Approaches.

    ERIC Educational Resources Information Center

    Iivari, Juhani; Hirschheim, Rudy

    1996-01-01

    Analyzes and compares eight information systems (IS) development approaches: Information Modelling, Decision Support Systems, the Socio-Technical approach, the Infological approach, the Interactionist approach, the Speech Act-based approach, Soft Systems Methodology, and the Scandinavian Trade Unionist approach. Discusses the organizational roles…

  18. Test Cases for Modeling and Validation of Structures with Piezoelectric Actuators

    NASA Technical Reports Server (NTRS)

    Reaves, Mercedes C.; Horta, Lucas G.

    2001-01-01

    A set of benchmark test articles were developed to validate techniques for modeling structures containing piezoelectric actuators using commercially available finite element analysis packages. The paper presents the development, modeling, and testing of two structures: an aluminum plate with surface-mounted patch actuators and a composite box beam with surface-mounted actuators. Three approaches for modeling structures containing piezoelectric actuators using the commercially available packages MSC/NASTRAN and ANSYS are presented. The approaches, applications, and limitations are discussed. Data for both test articles are compared in terms of frequency response functions from deflection and strain data to input voltage to the actuator. Frequency response function results using the three different analysis approaches provided comparable test/analysis results. It is shown that global versus local behavior of the analytical model and test article must be considered when comparing different approaches. Also, improper bonding of actuators greatly reduces the electrical-to-mechanical effectiveness of the actuators, producing anti-resonance errors.

  19. A comparison of Rasch item-fit and Cronbach's alpha item reduction analysis for the development of a Quality of Life scale for children and adolescents.

    PubMed

    Erhart, M; Hagquist, C; Auquier, P; Rajmil, L; Power, M; Ravens-Sieberer, U

    2010-07-01

    This study compares item reduction analysis based on classical test theory (maximizing Cronbach's alpha - approach A), with analysis based on the Rasch Partial Credit Model item-fit (approach B), as applied to children and adolescents' health-related quality of life (HRQoL) items. The reliability and structural, cross-cultural and known-group validity of the measures were examined. Within the European KIDSCREEN project, 3019 children and adolescents (8-18 years) from seven European countries answered 19 HRQoL items of the Physical Well-being dimension of a preliminary KIDSCREEN instrument. The Cronbach's alpha and corrected item total correlation (approach A) were compared with infit mean squares and the Q-index item-fit derived according to a partial credit model (approach B). Cross-cultural differential item functioning (DIF ordinal logistic regression approach), structural validity (confirmatory factor analysis and residual correlation) and relative validity (RV) for socio-demographic and health-related factors were calculated for approaches (A) and (B). Approach (A) led to the retention of 13 items, compared with 11 items with approach (B). The item overlap was 69% for (A) and 78% for (B). The correlation coefficient of the summated ratings was 0.93. The Cronbach's alpha was similar for both versions [0.86 (A); 0.85 (B)]. Both approaches selected some items that are not strictly unidimensional and items displaying DIF. RV ratios favoured (A) with regard to socio-demographic aspects. Approach (B) was superior in RV with regard to health-related aspects. Both types of item reduction analysis should be accompanied by additional analyses. Neither of the two approaches was universally superior with regard to cultural, structural and known-group validity. However, the results support the usability of the Rasch method for developing new HRQoL measures for children and adolescents.
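    Approach (A) above (maximizing Cronbach's alpha) can be stated compactly: alpha = k/(k-1) * (1 - sum of item variances / variance of the total score), dropping whichever item most increases alpha. A generic sketch on synthetic item scores (not the KIDSCREEN data) follows.

    ```python
    # Sketch: item reduction by maximizing Cronbach's alpha (approach A
    # above): repeatedly drop the item whose removal increases alpha most.
    # Generic illustration on synthetic scores, not the KIDSCREEN data.
    import numpy as np

    def cronbach_alpha(items):
        """items: (n_persons, n_items) matrix of item scores."""
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_vars / total_var)

    rng = np.random.default_rng(0)
    trait = rng.standard_normal((500, 1))
    scores = trait + rng.standard_normal((500, 12))   # 12 noisy items
    scores[:, -1] = rng.standard_normal(500)          # one unrelated item

    keep = list(range(scores.shape[1]))
    while len(keep) > 2:
        alpha = cronbach_alpha(scores[:, keep])
        trial = [cronbach_alpha(scores[:, [j for j in keep if j != i]])
                 for i in keep]
        best = int(np.argmax(trial))
        if trial[best] <= alpha:        # no removal improves alpha: stop
            break
        print(f"dropping item {keep[best]}: "
              f"alpha {alpha:.3f} -> {trial[best]:.3f}")
        keep.pop(best)
    print("retained items:", keep)
    ```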

  20. Accounting for standard errors of vision-specific latent trait in regression models.

    PubMed

    Wong, Wan Ling; Li, Xiang; Li, Jialiang; Wong, Tien Yin; Cheng, Ching-Yu; Lamoureux, Ecosse L

    2014-07-11

    To demonstrate the effectiveness of the Hierarchical Bayesian (HB) approach in a modeling framework for association effects that accounts for SEs of vision-specific latent traits assessed using Rasch analysis. A systematic literature review was conducted in four major ophthalmic journals to evaluate Rasch analysis performed on vision-specific instruments. The HB approach was used to synthesize the Rasch model and multiple linear regression model for the assessment of the association effects related to vision-specific latent traits. The effectiveness of this novel HB one-stage "joint-analysis" approach, which allows all model parameters to be estimated simultaneously, was compared with the frequently used two-stage "separate-analysis" approach in our simulation study (Rasch analysis followed by traditional statistical analyses without adjustment for SE of latent trait). Sixty-six reviewed articles performed evaluation and validation of vision-specific instruments using Rasch analysis, and 86.4% (n = 57) performed further statistical analyses on the Rasch-scaled data using traditional statistical methods; none took into consideration SEs of the estimated Rasch-scaled scores. The two models on real data differed in effect size estimations and the identification of "independent risk factors." Simulation results showed that our proposed HB one-stage "joint-analysis" approach produces greater accuracy (average of 5-fold decrease in bias) with comparable power and precision in estimation of associations when compared with the frequently used two-stage "separate-analysis" procedure, despite accounting for greater uncertainty due to the latent trait. Patient-reported data, using Rasch analysis techniques, do not take into account the SE of the latent trait in association analyses. The HB one-stage "joint-analysis" is a better approach, producing accurate effect size estimations and information about the independent association of exposure variables with vision-specific latent traits. Copyright 2014 The Association for Research in Vision and Ophthalmology, Inc.

  1. Meta-analysis of laparoscopic versus open repair of perforated peptic ulcer.

    PubMed

    Antoniou, Stavros A; Antoniou, George A; Koch, Oliver O; Pointner, Rudolph; Granderath, Frank A

    2013-01-01

    Laparoscopic treatment of perforated peptic ulcer (PPU) has been introduced as an alternative procedure to open surgery. It has been postulated that the minimally invasive approach involves less operative stress and results in decreased morbidity and mortality. We conducted a meta-analysis of randomized trials to test this hypothesis. Medline, EMBASE, and the Cochrane Central Register of Randomized Trials databases were searched, with no date or language restrictions. Our literature search identified 4 randomized trials, with a cumulative number of 289 patients, that compared the laparoscopic approach with open sutured repair of perforated ulcer. Analysis of outcomes did not favor either approach in terms of morbidity, mortality, and reoperation rate, although odds ratios seemed to consistently support the laparoscopic approach. Results did not determine the comparative efficiency and safety of laparoscopic or open approach for PPU. In view of an increased interest in the laparoscopic approach, further randomized trials are considered essential to determine the relative effectiveness of laparoscopic and open repair of PPU.

  2. Meta-analysis of Laparoscopic Versus Open Repair of Perforated Peptic Ulcer

    PubMed Central

    Antoniou, George A.; Koch, Oliver O.; Pointner, Rudolph; Granderath, Frank A.

    2013-01-01

    Background and Objectives: Laparoscopic treatment of perforated peptic ulcer (PPU) has been introduced as an alternative procedure to open surgery. It has been postulated that the minimally invasive approach involves less operative stress and results in decreased morbidity and mortality. Methods: We conducted a meta-analysis of randomized trials to test this hypothesis. Medline, EMBASE, and the Cochrane Central Register of Randomized Trials databases were searched, with no date or language restrictions. Results: Our literature search identified 4 randomized trials, with a cumulative number of 289 patients, that compared the laparoscopic approach with open sutured repair of perforated ulcer. Analysis of outcomes did not favor either approach in terms of morbidity, mortality, and reoperation rate, although odds ratios seemed to consistently support the laparoscopic approach. Results did not determine the comparative efficiency and safety of laparoscopic or open approach for PPU. Conclusion: In view of an increased interest in the laparoscopic approach, further randomized trials are considered essential to determine the relative effectiveness of laparoscopic and open repair of PPU. PMID:23743368
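    For pooled odds ratios of the kind reported in these two records, a standard fixed-effect computation is the Mantel-Haenszel estimator, OR_MH = sum(a_i*d_i/n_i) / sum(b_i*c_i/n_i) over trials. The sketch below applies it to hypothetical 2x2 counts, not the four trials' actual data.

    ```python
    # Sketch: Mantel-Haenszel pooled odds ratio across trials, as commonly
    # used in meta-analyses like the one above. Counts are hypothetical.
    import numpy as np

    # Each row: (events_lap, n_lap, events_open, n_open) for one trial.
    trials = np.array([
        [3, 40, 5, 38],
        [2, 35, 4, 36],
        [4, 50, 6, 49],
        [1, 21, 2, 20],
    ], dtype=float)

    a = trials[:, 0]                       # events, laparoscopic
    b = trials[:, 1] - a                   # non-events, laparoscopic
    c = trials[:, 2]                       # events, open
    d = trials[:, 3] - c                   # non-events, open
    n = trials[:, 1] + trials[:, 3]        # trial sample sizes

    or_mh = np.sum(a * d / n) / np.sum(b * c / n)
    print(f"Mantel-Haenszel pooled OR (lap vs open): {or_mh:.2f}")
    ```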

  3. Comparative Analysis of Sustainable Approaches and Systems for Scientific Data Stewardship

    NASA Astrophysics Data System (ADS)

    Downs, R. R.; Chen, R. S.

    2012-12-01

    Sustainable data systems are critical components of the cyberinfrastructure needed to provide long-term stewardship of scientific data, including Earth science data, throughout their entire life cycle. A variety of approaches may help ensure the sustainability of such systems, but these approaches must be able to survive the demands of competing priorities and decreasing budgets. Analyzing and comparing alternative approaches can identify viable aspects of each approach and inform decisions for developing, managing, and supporting the cyberinfrastructure needed to facilitate discovery, access, and analysis of data by future communities of users. A typology of sustainability approaches is proposed, and example use cases are offered for comparing the approaches over time. These examples demonstrate the potential strengths and weaknesses of each approach under various conditions and with regard to different objectives, e.g., open vs. limited access. By applying the results of these analyses to their particular circumstances, systems stakeholders can assess their options for a sustainable systems approach along with other metrics and identify alternative strategies to ensure the sustainability of the scientific data and information for which they are responsible. In addition, comparing sustainability approaches should inform the design of new systems and the improvement of existing systems to meet the needs for long-term stewardship of scientific data, and support education and workforce development efforts needed to ensure that the appropriate scientific and technical skills are available to operate and further develop sustainable cyberinfrastructure.

  4. Variables Influencing the Return on Investment in Management Training Programs: A Utility Analysis of 10 Swiss Cases

    ERIC Educational Resources Information Center

    Chochard, Yves; Davoine, Eric

    2011-01-01

    In this article, we present the utility analysis approach as an alternative and promising approach to measure the return on investment in managerial training programs. This approach, linking economic value with competencies developed by trainees, enables researchers and decision-makers to compare the return on investment from different programs in…

  5. Discourse Analysis and Development of English Listening for Non-English Majors in China

    ERIC Educational Resources Information Center

    Ji, Yinxiu

    2015-01-01

    The traditional approach to listening teaching mainly focuses on the sentence level and regards the listening process in a passive and static way. To compensate for this deficiency, a new listening approach, that is, the discourse-oriented approach, has been introduced into the listening classroom. Although discourse analysis is a comparatively new field…

  6. Comparison between the retropubic and transobturator approaches in the treatment of female stress urinary incontinence: a systematic review and meta-analysis of effectiveness and complications

    PubMed Central

    Sun, Xincheng; Yang, Qingsong; Sun, Feng; Shi, Qinglu

    2015-01-01

    Objective This study aimed to compare the effectiveness and complications between the retropubic and transobturator approaches for the treatment of female stress urinary incontinence (SUI) by conducting a systematic review. Materials and Methods We selected all randomized controlled trials (RCTs) that compared retropubic and transobturator sling placements for treatment of SUI. We estimated pooled odds ratios and 95% confidence intervals for intraoperative and postoperative outcomes and complications. Results Six hundred twelve studies that compared retropubic and transobturator approaches to midurethral sling placement were identified, of which 16 were included in our research. Our study was based on results from 2646 women. We performed a subgroup analysis to compare outcomes and complications between the two approaches. The evidence to support the superior approach that leads to better objective/subjective cure rate was insufficient. The transobturator approach was associated with lower risks of bladder perforation (odds ratio (OR) 0.17, 95% confidence interval (CI) 0.09-0.32), retropubic/vaginal hematoma (OR 0.32, 95% CI 0.16-0.63), and long-term voiding dysfunction (OR 0.32, 95% CI 0.17-0.61). However, the risk of thigh/groin pain seemed higher in the transobturator group (OR 2.53, 95% CI 1.72-3.72). We found no statistically significant differences in the risks of other complications between the two approaches. Conclusions This meta-analysis shows analogical objective and subjective cure rates between the retropubic and transobturator approaches to midurethral sling placement. The transobturator approach was associated with lower risks of several complications. However, good-quality studies with long-term follow-ups are warranted for further research. PMID:26005962

  7. Mind and consciousness in yoga – Vedanta: A comparative analysis with western psychological concepts

    PubMed Central

    Prabhu, H. R. Aravinda; Bhat, P. S.

    2013-01-01

    Study of mind and consciousness through established scientific methods is often difficult due to the observed-observer dichotomy. Cartesian approach of dualism considering the mind and matter as two diverse and unconnected entities has been questioned by oriental schools of Yoga and Vedanta as well as the recent quantum theories of modern physics. Freudian and Neo-freudian schools based on the Cartesian model have been criticized by the humanistic schools which come much closer to the vedantic approach of unitariness. A comparative analysis of the two approaches is discussed. PMID:23858252

  8. Analysis of Tube Free Hydroforming using an Inverse Approach with FLD-based Adjustment of Process Parameters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nguyen, Ba Nghiep; Johnson, Kenneth I.; Khaleel, Mohammad A.

    2003-04-01

    This paper employs an inverse approach (IA) formulation for the analysis of tubes under free hydroforming conditions. The IA formulation is derived from that of Guo et al. established for flat sheet hydroforming analysis using constant strain triangular membrane elements. At first, an incremental analysis of free hydroforming for a hot-dip galvanized (HG/Z140) DP600 tube is performed using the finite element Marc code. The deformed geometry obtained at the last converged increment is then used as the final configuration in the inverse analysis. This comparative study allows us to assess the predictive capability of the inverse analysis. The results will be compared with the experimental values determined by Asnafi and Skogsgardh. After that, a procedure based on a forming limit diagram (FLD) is proposed to adjust the process parameters such as the axial feed and internal pressure. Finally, the adjustment process is illustrated through a re-analysis of the same tube using the inverse approach.

  9. Direct determination approach for the multifractal detrending moving average analysis

    NASA Astrophysics Data System (ADS)

    Xu, Hai-Chuan; Gu, Gao-Feng; Zhou, Wei-Xing

    2017-11-01

    In the canonical framework, we propose an alternative approach for the multifractal analysis based on the detrending moving average method (MF-DMA). We define a canonical measure such that the multifractal mass exponent τ(q) is related to the partition function and the multifractal spectrum f(α) can be directly determined. The performances of the direct determination approach and the traditional approach of the MF-DMA are compared based on three synthetic multifractal and monofractal measures generated from the one-dimensional p-model, the two-dimensional p-model, and the fractional Brownian motions. We find that both approaches have comparable performances to unveil the fractal and multifractal nature. In other words, without loss of accuracy, the multifractal spectrum f(α) can be directly determined using the new approach with less computation cost. We also apply the new MF-DMA approach to the volatility time series of stock prices and confirm the presence of multifractality.
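    A minimal MF-DMA sketch follows, using a backward moving average and the textbook definitions F_q(n) ~ n^h(q) and τ(q) = q·h(q) − 1. It is an illustrative implementation run on white noise, not the authors' code, and the scale and q grids are assumptions.

    ```python
    # Sketch: multifractal detrending moving average (MF-DMA) with a
    # backward moving average, returning generalized Hurst exponents h(q)
    # and mass exponents tau(q) = q*h(q) - 1. Generic textbook-style
    # implementation; white noise is used as a monofractal test input.
    import numpy as np

    def mf_dma(x, scales, qs):
        y = np.cumsum(x - np.mean(x))                  # profile
        logF = np.empty((len(qs), len(scales)))
        for si, n in enumerate(scales):
            kernel = np.ones(n) / n
            ma = np.convolve(y, kernel, mode="valid")  # backward MA
            eps = y[n - 1:] - ma                       # detrended residual
            m = eps.size // n
            F2 = np.mean(eps[: m * n].reshape(m, n) ** 2, axis=1)
            for qi, q in enumerate(qs):
                if q == 0:
                    logF[qi, si] = 0.5 * np.mean(np.log(F2))
                else:
                    logF[qi, si] = np.log(np.mean(F2 ** (q / 2))) / q
        logn = np.log(scales)
        h = np.array([np.polyfit(logn, row, 1)[0] for row in logF])
        return h, np.asarray(qs) * h - 1               # h(q), tau(q)

    rng = np.random.default_rng(0)
    x = rng.standard_normal(2 ** 14)                   # monofractal input
    scales = np.unique(np.logspace(1, 3, 12).astype(int))
    qs = [-4, -2, 0, 2, 4]
    h, tau = mf_dma(x, scales, qs)
    print("h(q):", h.round(2))   # ~0.5 across q for white noise
    ```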

  10. Criteria for Comparing Domain Analysis Approaches Version 01.00.00

    DTIC Science & Technology

    1991-12-01

    Top-Down-Bottom-Up Domain Analysis Process (1990 Version); Figure 8. FODA's Domain Analysis Process... FODA, which uses the Design Approach for Real-Time Systems (DARTS) design method (Gomaa 1984)? 1. Introduction: Domain analysis is still immature... 2. An Overview of Some Domain Analysis Approaches: The FODA report illustrates the process by using the window management

  11. The Covariance Adjustment Approaches for Combining Incomparable Cox Regressions Caused by Unbalanced Covariates Adjustment: A Multivariate Meta-Analysis Study.

    PubMed

    Dehesh, Tania; Zare, Najaf; Ayatollahi, Seyyed Mohammad Taghi

    2015-01-01

    The univariate meta-analysis (UM) procedure, as a technique that provides a single overall result, has become increasingly popular. Neglecting the existence of other concomitant covariates in the models leads to loss of treatment efficiency. Our aim was to propose four new approximation approaches for the covariance matrix of the coefficients, which is not readily available for the multivariate generalized least square (MGLS) method as a multivariate meta-analysis approach. We evaluated the efficiency of four new approaches including zero correlation (ZC), common correlation (CC), estimated correlation (EC), and multivariate multilevel correlation (MMC) on the estimation bias, mean square error (MSE), and 95% probability coverage of the confidence interval (CI) in the synthesis of Cox proportional hazard models coefficients in a simulation study. Comparing the results of the simulation study on the MSE, bias, and CI of the estimated coefficients indicated that the MMC approach was the most accurate procedure compared to the EC, CC, and ZC procedures. The precision ranking of the four approaches according to all above settings was MMC ≥ EC ≥ CC ≥ ZC. This study highlights advantages of MGLS meta-analysis over the UM approach. The results suggested the use of the MMC procedure to overcome the lack of information for having a complete covariance matrix of the coefficients.

  12. Delay differential equations via the matrix Lambert W function and bifurcation analysis: application to machine tool chatter.

    PubMed

    Yi, Sun; Nelson, Patrick W; Ulsoy, A Galip

    2007-04-01

    In a turning process modeled using delay differential equations (DDEs), we investigate the stability of the regenerative machine tool chatter problem. An approach using the matrix Lambert W function for the analytical solution to systems of delay differential equations is applied to this problem and compared with the result obtained using a bifurcation analysis. The Lambert W function, known to be useful for solving scalar first-order DDEs, has recently been extended to a matrix Lambert W function approach to solve systems of DDEs. The essential advantages of the matrix Lambert W approach are not only the similarity to the concept of the state transition matrix in linear ordinary differential equations, enabling its use for general classes of linear delay differential equations, but also the observation that we need only the principal branch among an infinite number of roots to determine the stability of a system of DDEs. The bifurcation method combined with Sturm sequences provides an algorithm for determining the stability of DDEs without restrictive geometric analysis. With this approach, one can obtain the critical values of delay, which determine the stability of a system and hence the preferred operating spindle speed without chatter. We apply both the matrix Lambert W function and the bifurcation analysis approach to the problem of chatter stability in turning, and compare the results obtained to existing methods. The two new approaches show excellent accuracy and certain other advantages, when compared to traditional graphical, computational and approximate methods.
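    For the scalar first-order case the abstract mentions, x'(t) = a·x(t) + b·x(t−h), the characteristic roots are s_k = a + W_k(b·h·e^(−a·h))/h, and the rightmost root comes from the principal branch k = 0. The sketch below evaluates that stability test with SciPy; the parameter values are illustrative, not the machining model of the paper.

    ```python
    # Sketch: stability of the scalar DDE x'(t) = a*x(t) + b*x(t - h)
    # via the Lambert W function (principal branch gives the rightmost
    # characteristic root in this scalar case). Values are illustrative.
    import numpy as np
    from scipy.special import lambertw

    def rightmost_root(a, b, h):
        # s = a + W_0(b*h*exp(-a*h)) / h
        return complex(a + lambertw(b * h * np.exp(-a * h), k=0) / h)

    for a, b, h in [(-1.0, -0.5, 1.0), (-1.0, -2.0, 1.0), (0.5, -2.0, 0.3)]:
        s0 = rightmost_root(a, b, h)
        verdict = "stable" if s0.real < 0 else "unstable"
        print(f"a={a:+.1f}, b={b:+.1f}, h={h:.1f}: "
              f"rightmost root {s0:.3f} -> {verdict}")
    ```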

  13. Comparative effects of different dietary approaches on blood pressure in hypertensive and pre-hypertensive patients: A systematic review and network meta-analysis.

    PubMed

    Schwingshackl, Lukas; Chaimani, Anna; Schwedhelm, Carolina; Toledo, Estefania; Pünsch, Marina; Hoffmann, Georg; Boeing, Heiner

    2018-05-02

    Pairwise meta-analyses have shown beneficial effects of individual dietary approaches on blood pressure but their comparative effects have not been established. Therefore we performed a systematic review of different dietary intervention trials and estimated the aggregate blood pressure effects through network meta-analysis including hypertensive and pre-hypertensive patients. PubMed, Cochrane CENTRAL, and Google Scholar were searched until June 2017. The inclusion criteria were defined as follows: i) Randomized trial with a dietary approach; ii) hypertensive and pre-hypertensive adult patients; and iii) minimum intervention period of 12 weeks. In order to determine the pooled effect of each intervention relative to each of the other intervention for both diastolic and systolic blood pressure (SBP and DBP), random effects network meta-analysis was performed. A total of 67 trials comparing 13 dietary approaches (DASH, low-fat, moderate-carbohydrate, high-protein, low-carbohydrate, Mediterranean, Palaeolithic, vegetarian, low-GI/GL, low-sodium, Nordic, Tibetan, and control) enrolling 17,230 participants were included. In the network meta-analysis, the DASH, Mediterranean, low-carbohydrate, Palaeolithic, high-protein, low-glycaemic index, low-sodium, and low-fat dietary approaches were significantly more effective in reducing SBP (-8.73 to -2.32 mmHg) and DBP (-4.85 to -1.27 mmHg) compared to a control diet. According to the SUCRAs, the DASH diet was ranked the most effective dietary approach in reducing SBP (90%) and DBP (91%), followed by the Palaeolithic, and the low-carbohydrate diet (ranked 3rd for SBP) or the Mediterranean diet (ranked 3rd for DBP). For most comparisons, the credibility of evidence was rated very low to moderate, with the exception for the DASH vs. the low-fat dietary approach for which the quality of evidence was rated high. The present network meta-analysis suggests that the DASH dietary approach might be the most effective dietary measure to reduce blood pressure among hypertensive and pre-hypertensive patients based on high quality evidence.
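    The SUCRA ranking used above is computed from rank probabilities: SUCRA is the average of a treatment's cumulative rank probabilities. The sketch below derives both from posterior effect samples; the samples are synthetic stand-ins, not the review's data.

    ```python
    # Sketch: rank probabilities and SUCRA values from posterior samples
    # of treatment effects, as used to rank dietary approaches above.
    import numpy as np

    rng = np.random.default_rng(0)
    treatments = ["DASH", "Palaeolithic", "low-carbohydrate", "control"]
    # Posterior draws of SBP change vs control [mmHg]; lower is better.
    effects = np.column_stack([
        rng.normal(-8.0, 1.5, 4000),   # DASH (synthetic stand-in)
        rng.normal(-7.0, 2.5, 4000),   # Palaeolithic
        rng.normal(-5.5, 2.0, 4000),   # low-carbohydrate
        rng.normal(0.0, 1.0, 4000),    # control
    ])

    # rank 0 = best (largest reduction) within each posterior draw.
    ranks = effects.argsort(axis=1).argsort(axis=1)
    n_t = len(treatments)
    rank_probs = np.array([[np.mean(ranks[:, t] == r) for r in range(n_t)]
                           for t in range(n_t)])

    # SUCRA = mean of the cumulative rank probabilities (1 = always best).
    sucra = rank_probs.cumsum(axis=1)[:, :-1].mean(axis=1)
    for name, s in zip(treatments, sucra):
        print(f"{name:>17s}: SUCRA = {s:.2f}")
    ```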

  14. Comparative Analysis of Academic Grades in Compulsory Secondary Education in Spain Using Statistical Techniques

    ERIC Educational Resources Information Center

    Veas, Alejandro; Gilar, Raquel; Miñano, Pablo; Castejón, Juan Luis

    2017-01-01

    The present study, based on the construct comparability approach, performs a comparative analysis of general points average for seven courses, using exploratory factor analysis (EFA) and the Partial Credit model (PCM) with a sample of 1398 student subjects (M = 12.5, SD = 0.67) from 8 schools in the province of Alicante (Spain). EFA confirmed a…

  15. Value Creating Education and the Capability Approach: A Comparative Analysis of Soka Education's Facility to Promote Well-Being and Social Justice

    ERIC Educational Resources Information Center

    Sherman, Paul David

    2016-01-01

    The relatively unfamiliar pedagogy of Soka (value creating) education is analysed for its capacity to promote well-being and social justice, using the well-known Capability Approach (CA) as a comparator. Various aspects of Soka education correspond favourably with the CA, indicating its potential as a credible and constructive approach for…

  16. Comparative study on gene set and pathway topology-based enrichment methods.

    PubMed

    Bayerlová, Michaela; Jung, Klaus; Kramer, Frank; Klemm, Florian; Bleckmann, Annalen; Beißbarth, Tim

    2015-10-22

    Enrichment analysis is a popular approach to identify pathways or sets of genes which are significantly enriched in the context of differentially expressed genes. The traditional gene set enrichment approach considers a pathway as a simple gene list disregarding any knowledge of gene or protein interactions. In contrast, the new group of so called pathway topology-based methods integrates the topological structure of a pathway into the analysis. We comparatively investigated gene set and pathway topology-based enrichment approaches, considering three gene set and four topological methods. These methods were compared in two extensive simulation studies and on a benchmark of 36 real datasets, providing the same pathway input data for all methods. In the benchmark data analysis both types of methods showed a comparable ability to detect enriched pathways. The first simulation study was conducted with KEGG pathways, which showed considerable gene overlaps between each other. In this study with original KEGG pathways, none of the topology-based methods outperformed the gene set approach. Therefore, a second simulation study was performed on non-overlapping pathways created by unique gene IDs. Here, methods accounting for pathway topology reached higher accuracy than the gene set methods, however their sensitivity was lower. We conducted one of the first comprehensive comparative works on evaluating gene set against pathway topology-based enrichment methods. The topological methods showed better performance in the simulation scenarios with non-overlapping pathways, however, they were not conclusively better in the other scenarios. This suggests that simple gene set approach might be sufficient to detect an enriched pathway under realistic circumstances. Nevertheless, more extensive studies and further benchmark data are needed to systematically evaluate these methods and to assess what gain and cost pathway topology information introduces into enrichment analysis. Both types of methods for enrichment analysis require further improvements in order to deal with the problem of pathway overlaps.
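    The "simple gene list" enrichment approach referred to above is typically a one-sided hypergeometric (over-representation) test per pathway. A minimal sketch with hypothetical gene counts:

    ```python
    # Sketch: over-representation (gene set) enrichment test for one
    # pathway via the hypergeometric distribution. Counts are hypothetical.
    from scipy.stats import hypergeom

    N = 20000   # genes in the background
    K = 150     # genes annotated to the pathway
    n = 500     # differentially expressed (DE) genes
    k = 12      # DE genes that fall in the pathway

    # P(X >= k) when drawing n genes from N, of which K are in the pathway.
    p_value = hypergeom.sf(k - 1, N, K, n)
    print(f"enrichment p-value: {p_value:.3g}")
    ```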

  17. Scandinavian Approaches to Gender Equality in Academia: A Comparative Study

    ERIC Educational Resources Information Center

    Nielsen, Mathias Wullum

    2017-01-01

    This study investigates how Denmark, Norway, and Sweden approach issues of gender equality in research differently. Based on a comparative document analysis of gender equality activities in six Scandinavian universities, together with an examination of the legislative and political frameworks surrounding these activities, the article provides new…

  18. Communicating Comparative Findings from Meta-Analysis in Educational Research: Some Examples and Suggestions

    ERIC Educational Resources Information Center

    Higgins, Steve; Katsipataki, Maria

    2016-01-01

    This article reviews some of the strengths and limitations of the comparative use of meta-analysis findings, using examples from the Sutton Trust-Education Endowment Foundation Teaching and Learning "Toolkit" which summarizes a range of educational approaches to improve pupil attainment in schools. This comparative use of quantitative…

  19. Evaluation of a cost-effective loads approach. [shock spectra/impedance method for Viking Orbiter

    NASA Technical Reports Server (NTRS)

    Garba, J. A.; Wada, B. K.; Bamford, R.; Trubert, M. R.

    1976-01-01

    A shock spectra/impedance method for loads predictions is used to estimate member loads for the Viking Orbiter, a 7800-lb interplanetary spacecraft that has been designed using transient loads analysis techniques. The transient loads analysis approach leads to a lightweight structure but requires complex and costly analyses. To reduce complexity and cost, a shock spectra/impedance method is currently being used to design the Mariner Jupiter Saturn spacecraft. This method has the advantage of using low-cost in-house loads analysis techniques and typically results in more conservative structural loads. The method is evaluated by comparing the increase in Viking member loads to the loads obtained by the transient loads analysis approach. An estimate of the weight penalty incurred by using this method is presented. The paper also compares the calculated flight loads from the transient loads analyses and the shock spectra/impedance method to measured flight data.

  20. Evaluation of a cost-effective loads approach. [for Viking Orbiter light weight structural design

    NASA Technical Reports Server (NTRS)

    Garba, J. A.; Wada, B. K.; Bamford, R.; Trubert, M. R.

    1976-01-01

    A shock spectra/impedance method for loads prediction is used to estimate member loads for the Viking Orbiter, a 7800-lb interplanetary spacecraft that has been designed using transient loads analysis techniques. The transient loads analysis approach leads to a lightweight structure but requires complex and costly analyses. To reduce complexity and cost a shock spectra/impedance method is currently being used to design the Mariner Jupiter Saturn spacecraft. This method has the advantage of using low-cost in-house loads analysis techniques and typically results in more conservative structural loads. The method is evaluated by comparing the increase in Viking member loads to the loads obtained by the transient loads analysis approach. An estimate of the weight penalty incurred by using this method is presented. The paper also compares the calculated flight loads from the transient loads analyses and the shock spectra/impedance method to measured flight data.

  1. Analysis of Illumina Microbial Assemblies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clum, Alicia; Foster, Brian; Froula, Jeff

    2010-05-28

    Since the emergence of second-generation sequencing technologies, the evaluation of different sequencing approaches and their assembly strategies for different types of genomes has become an important undertaking. Next-generation sequencing technologies dramatically increase sequence throughput while decreasing cost, making them an attractive tool for whole genome shotgun sequencing. To compare different approaches for de-novo whole genome assembly, appropriate tools and a solid understanding of both quantity and quality of the underlying sequence data are crucial. Here, we performed an in-depth analysis of short-read Illumina sequence assembly strategies for bacterial and archaeal genomes. Different types of Illumina libraries as well as different trim parameters and assemblers were evaluated. Results of the comparative analysis and sequencing platforms will be presented. The goal of this analysis is to develop a cost-effective approach for the increased throughput of the generation of high-quality microbial genomes.

  2. Developing comparative criminology and the case of China: an introduction.

    PubMed

    Liu, Jianhong

    2007-02-01

    Although comparative criminology has made significant development during the past decade or so, systematic empirical research has only developed along a few topics. Comparative criminology has never occupied a central position in criminology. This article analyzes the major theoretical and methodological impediments in the development of comparative criminology. It stresses a need to shift methodology from a conventional approach that uses the nation as the primary unit of analysis to an in-depth case study method as a primary methodological approach. The article maintains that the case study method can overcome the limitation of its descriptive tradition and become a promising methodological approach for comparative criminology.

  3. Comparing methods for analysis of biomedical hyperspectral image data

    NASA Astrophysics Data System (ADS)

    Leavesley, Silas J.; Sweat, Brenner; Abbott, Caitlyn; Favreau, Peter F.; Annamdevula, Naga S.; Rich, Thomas C.

    2017-02-01

    Over the past 2 decades, hyperspectral imaging technologies have been adapted to address the need for molecule-specific identification in the biomedical imaging field. Applications have ranged from single-cell microscopy to whole-animal in vivo imaging and from basic research to clinical systems. Enabling this growth has been the availability of faster, more effective hyperspectral filtering technologies and more sensitive detectors. Hence, the potential for growth of biomedical hyperspectral imaging is high, and many hyperspectral imaging options are already commercially available. However, despite the growth in hyperspectral technologies for biomedical imaging, little work has been done to aid users of hyperspectral imaging instruments in selecting appropriate analysis algorithms. Here, we present an approach for comparing the effectiveness of spectral analysis algorithms by combining experimental image data with a theoretical "what if" scenario. This approach allows us to quantify several key outcomes that characterize a hyperspectral imaging study: linearity of sensitivity, positive detection cut-off slope, dynamic range, and false positive events. We present results of using this approach for comparing the effectiveness of several common spectral analysis algorithms for detecting weak fluorescent protein emission in the midst of strong tissue autofluorescence. Results indicate that this approach should be applicable to a very wide range of applications, allowing a quantitative assessment of the effectiveness of the combined biology, hardware, and computational analysis for detecting a specific molecular signature.
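    As one concrete example of a spectral analysis algorithm of the kind compared above, the sketch below performs non-negative least-squares linear unmixing of a measured spectrum into known endmember spectra. The endmembers, wavelengths, and abundances are synthetic assumptions, not the paper's data.

    ```python
    # Sketch: non-negative least-squares (NNLS) linear spectral unmixing,
    # one common hyperspectral analysis algorithm (spectra are synthetic).
    import numpy as np
    from scipy.optimize import nnls

    rng = np.random.default_rng(0)
    wavelengths = np.linspace(450, 700, 64)   # nm (assumed band centers)

    def gaussian(center, width):
        return np.exp(-0.5 * ((wavelengths - center) / width) ** 2)

    # Endmember library: fluorescent protein + tissue autofluorescence.
    E = np.column_stack([gaussian(510, 15),    # GFP-like emission
                         gaussian(550, 40)])   # broad autofluorescence

    true_abundance = np.array([0.2, 1.0])      # weak signal, strong background
    measured = E @ true_abundance + 0.01 * rng.standard_normal(wavelengths.size)

    abundance, residual = nnls(E, measured)
    print("estimated abundances:", abundance.round(3))
    ```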

  4. Choice-Based Conjoint Analysis: Classification vs. Discrete Choice Models

    NASA Astrophysics Data System (ADS)

    Giesen, Joachim; Mueller, Klaus; Taneva, Bilyana; Zolliker, Peter

    Conjoint analysis is a family of techniques that originated in psychology and later became popular in market research. The main objective of conjoint analysis is to measure an individual's or a population's preferences on a class of options that can be described by parameters and their levels. We consider preference data obtained in choice-based conjoint analysis studies, where one observes test persons' choices on small subsets of the options. There are many ways to analyze choice-based conjoint analysis data. Here we discuss the intuition behind a classification-based approach, and compare this approach to one based on statistical assumptions (discrete choice models) and to a regression approach. Our comparison on real and synthetic data indicates that the classification approach outperforms the discrete choice models.

  5. Comparing energy sources for surgical ablation of atrial fibrillation: a Bayesian network meta-analysis of randomized, controlled trials.

    PubMed

    Phan, Kevin; Xie, Ashleigh; Kumar, Narendra; Wong, Sophia; Medi, Caroline; La Meir, Mark; Yan, Tristan D

    2015-08-01

    Simplified maze procedures involving radiofrequency, cryoenergy and microwave energy sources have been increasingly utilized for surgical treatment of atrial fibrillation as an alternative to the traditional cut-and-sew approach. In the absence of direct comparisons, a Bayesian network meta-analysis is another alternative to assess the relative effect of different treatments, using indirect evidence. A Bayesian meta-analysis of indirect evidence was performed using 16 published randomized trials identified from 6 databases. Rank probability analysis was used to rank each intervention in terms of their probability of having the best outcome. Sinus rhythm prevalence beyond the 12-month follow-up was similar between the cut-and-sew, microwave and radiofrequency approaches, which were all ranked better than cryoablation (39, 36, and 25 vs 1%, respectively). The cut-and-sew maze was ranked worst in terms of mortality outcomes compared with microwave, radiofrequency and cryoenergy (2 vs 19, 34, and 24%, respectively). The cut-and-sew maze procedure was associated with significantly lower stroke rates compared with microwave ablation [odds ratio <0.01; 95% confidence interval 0.00, 0.82], and ranked the best in terms of pacemaker requirements compared with microwave, radiofrequency and cryoenergy (81 vs 14, 1, and <0.01%, respectively). Bayesian rank probability analysis shows that the cut-and-sew approach is associated with the best outcomes in terms of sinus rhythm prevalence and stroke outcomes, and remains the gold standard approach for AF treatment. Given the limitations of indirect comparison analysis, these results should be viewed with caution and not over-interpreted. © The Author 2014. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.

  6. Flipped classroom improves student learning in health professions education: a meta-analysis.

    PubMed

    Hew, Khe Foon; Lo, Chung Kwan

    2018-03-15

    The use of the flipped classroom approach has become increasingly popular in health professions education. However, no meta-analysis has been published that specifically examines the effect of the flipped classroom versus the traditional classroom on student learning. This study examined the findings of comparative articles through a meta-analysis in order to summarize the overall effects of teaching with the flipped classroom approach. We focused specifically on a set of flipped classroom studies in which pre-recorded videos were provided before face-to-face class meetings. These comparative articles focused on health care professionals including medical students, residents, doctors, nurses, or learners in other health care professions and disciplines (e.g., dental, pharmacy, environmental or occupational health). Using predefined study eligibility criteria, seven electronic databases were searched in mid-April 2017 for relevant articles. Methodological quality was graded using the Medical Education Research Study Quality Instrument (MERSQI). Effect sizes, heterogeneity estimates, analyses of possible moderators, and publication bias were computed using the COMPREHENSIVE META-ANALYSIS software. A meta-analysis of 28 eligible comparative studies (between-subject design) showed an overall significant effect in favor of flipped classrooms over traditional classrooms for health professions education (standardized mean difference, SMD = 0.33; 95% confidence interval, CI = 0.21-0.46; p < 0.001), with no evidence of publication bias. In addition, the flipped classroom approach was more effective when instructors used quizzes at the start of each in-class session. More respondents reported preferring flipped to traditional classrooms. Current evidence suggests that the flipped classroom approach in health professions education yields a significant improvement in student learning compared with traditional teaching methods.
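
    The study used the commercial Comprehensive Meta-Analysis software; as a hedged illustration of the underlying calculation only, the sketch below pools hypothetical standardized mean differences with the DerSimonian-Laird random-effects estimator.

```python
# Random-effects pooling of SMDs (DerSimonian-Laird between-study variance).
import numpy as np

smd = np.array([0.2, 0.5, 0.1, 0.4, 0.3])       # hypothetical per-study SMDs
var = np.array([0.02, 0.05, 0.03, 0.04, 0.02])  # their sampling variances

w = 1 / var                                     # fixed-effect weights
mu_fe = np.sum(w * smd) / w.sum()
q = np.sum(w * (smd - mu_fe) ** 2)              # Cochran's Q
df = len(smd) - 1
c = w.sum() - np.sum(w ** 2) / w.sum()
tau2 = max(0.0, (q - df) / c)                   # between-study variance

w_re = 1 / (var + tau2)                         # random-effects weights
pooled = np.sum(w_re * smd) / w_re.sum()
se = np.sqrt(1 / w_re.sum())
print(f"SMD = {pooled:.2f}, 95% CI = ({pooled - 1.96*se:.2f}, {pooled + 1.96*se:.2f})")
```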

  7. A Strategic Culture Assessment of the Transatlantic Divide

    DTIC Science & Technology

    2008-03-01

    security divide through the strategic culture lens, taking a comparative case study approach. It analyzes the emergent EU strategic culture by looking... utilize the strategic culture approach in the ensuing case study comparisons. B. WHY THE USE OF STRATEGIC CULTURE? In a study published in 2004... analysis use a comparative cultural approach when a previous comparison of U.S. and EU behavior found these actors’ behavior most aligned with realism’s

  8. A density difference based analysis of orbital-dependent exchange-correlation functionals

    NASA Astrophysics Data System (ADS)

    Grabowski, Ireneusz; Teale, Andrew M.; Fabiano, Eduardo; Śmiga, Szymon; Buksztel, Adam; Della Sala, Fabio

    2014-03-01

    We present a density difference based analysis for a range of orbital-dependent Kohn-Sham functionals. Results for atoms, some members of the neon isoelectronic series and small molecules are reported and compared with ab initio wave function calculations. Particular attention is paid to the quality of approximations to the exchange-only optimised effective potential (OEP) approach: we consider both the localised Hartree-Fock as well as the Krieger-Li-Iafrate methods. Analysis of density differences at the exchange-only level reveals the impact of the approximations on the resulting electronic densities. These differences are further quantified in terms of the ground state energies, frontier orbital energy differences and highest occupied orbital energies obtained. At the correlated level, an OEP approach based on a perturbative second-order correlation energy expression is shown to deliver results comparable with those from traditional wave function approaches, making it suitable for use as a benchmark against which to compare standard density functional approximations.

  9. Latent class analysis derived subgroups of low back pain patients - do they have prognostic capacity?

    PubMed

    Molgaard Nielsen, Anne; Hestbaek, Lise; Vach, Werner; Kent, Peter; Kongsted, Alice

    2017-08-09

    Heterogeneity in patients with low back pain is well recognised and different approaches to subgrouping have been proposed. One statistical technique that is increasingly being used is Latent Class Analysis, as it performs subgrouping based on pattern recognition with high accuracy. Previously, we developed two novel suggestions for subgrouping patients with low back pain based on Latent Class Analysis of patient baseline characteristics (patient history and physical examination), which resulted in 7 subgroups when using a single-stage analysis and 9 subgroups when using a two-stage approach. However, their prognostic capacity was unexplored. This study (i) determined whether the subgrouping approaches were associated with the future outcomes of pain intensity, pain frequency and disability, (ii) assessed whether one of these two approaches was more strongly or more consistently associated with these outcomes, and (iii) assessed the performance of the novel subgroupings as compared with the following variables: two existing subgrouping tools (STarT Back Tool and Quebec Task Force classification), four baseline characteristics and a group of previously identified domain-specific patient categorisations (collectively, the 'comparator variables'). This was a longitudinal cohort study of 928 patients consulting for low back pain in primary care. The associations between each subgroup approach and outcomes at 2 weeks, 3 and 12 months, and with weekly SMS responses, were tested in linear regression models, and their prognostic capacity (variance explained) was compared with that of the comparator variables listed above. The two previously identified subgroupings were similarly associated with all outcomes. The prognostic capacity of both subgroupings was better than that of the comparator variables, except for participants' recovery beliefs and the domain-specific categorisations, but was still limited. The explained variance ranged from 4.3% to 6.9% for pain intensity and from 6.8% to 20.3% for disability, and was highest at the 2-week follow-up. Latent Class-derived subgroups provided additional prognostic information when compared with a range of variables, but the improvements were not substantial enough to warrant further development into a new prognostic tool. Further research could investigate whether these novel subgrouping approaches may help to improve existing tools that subgroup low back pain patients.

  10. Enhancing Critical Thinking by Teaching Two Distinct Approaches to Management

    ERIC Educational Resources Information Center

    Dyck, Bruno; Walker, Kent; Starke, Frederick A.; Uggerslev, Krista

    2012-01-01

    The authors explore the effect on students' critical thinking of teaching only one approach to management versus teaching two approaches to management. Results from a quasiexperiment--which included a survey, interviews, and case analysis--suggest that compared with students who are taught only a conventional approach to management (which…

  11. A Cultural and Comparative Perspective on Outdoor Education in New Zealand and "Friluftsliv" in Denmark

    ERIC Educational Resources Information Center

    Andkjaer, Soren

    2012-01-01

    The paper is based on a comparative and qualitative case study of "friluftsliv" in Denmark and outdoor education in New Zealand. Cultural analysis with a comparative cultural perspective informed the research approach. Configurational analysis was used as an important supplement to focus on cultural patterns linked to bodily movement. It…

  12. Quantitative maps of genetic interactions in yeast - comparative evaluation and integrative analysis.

    PubMed

    Lindén, Rolf O; Eronen, Ville-Pekka; Aittokallio, Tero

    2011-03-24

    High-throughput genetic screening approaches have enabled systematic means to study how interactions among gene mutations contribute to quantitative fitness phenotypes, with the aim of providing insights into the functional wiring diagrams of genetic interaction networks on a global scale. However, it is poorly known how well these quantitative interaction measurements agree across the screening approaches, which hinders their integrated use toward improving the coverage and quality of the genetic interaction maps in yeast and other organisms. Using large-scale data matrices from epistatic miniarray profiling (E-MAP), genetic interaction mapping (GIM), and synthetic genetic array (SGA) approaches, we carried out a systematic comparative evaluation of these quantitative maps of genetic interactions in yeast. The relatively low association between the original interaction measurements or their customized scores could be improved using a matrix-based modelling framework, which enables the use of single- and double-mutant fitness estimates and measurements, respectively, when scoring genetic interactions. Toward an integrative analysis, we show how the detections from the different screening approaches can be combined to suggest novel positive and negative interactions which are complementary to those obtained using any single screening approach alone. The matrix approximation procedure has been made available to support the design and analysis of future screening studies. We have shown here that even if the correlation between the currently available quantitative genetic interaction maps in yeast is relatively low, their comparability can be improved by means of our computational matrix approximation procedure, which will enable integrative analysis and detection of a wider spectrum of genetic interactions using data from the complementary screening approaches.
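
    As a hedged illustration of the kind of scoring such fitness-based frameworks enable (not the paper's exact procedure), the sketch below computes interaction scores under the common multiplicative model: the deviation of measured double-mutant fitness from the product of the single-mutant fitnesses. All fitness values are invented.

```python
# Multiplicative epistasis score: epsilon = W_ab - W_a * W_b.
import numpy as np

single = np.array([0.9, 0.8, 1.0])               # single-mutant fitness, genes A-C
double = np.array([[np.nan, 0.50, 0.90],         # measured double-mutant fitness;
                   [0.50, np.nan, 0.85],         # nan on the diagonal (no self pair)
                   [0.90, 0.85, np.nan]])

expected = np.outer(single, single)              # multiplicative expectation
epsilon = double - expected                      # negative => synthetic sick/lethal
print(np.round(epsilon, 2))
```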

  13. Developing Multidimensional Likert Scales Using Item Factor Analysis: The Case of Four-Point Items

    ERIC Educational Resources Information Center

    Asún, Rodrigo A.; Rdz-Navarro, Karina; Alvarado, Jesús M.

    2016-01-01

    This study compares the performance of two approaches in analysing four-point Likert rating scales with a factorial model: the classical factor analysis (FA) and the item factor analysis (IFA). For FA, maximum likelihood and weighted least squares estimations using Pearson correlation matrices among items are compared. For IFA, diagonally weighted…

  14. Analysis of delamination related fracture processes in composites

    NASA Technical Reports Server (NTRS)

    Armanios, Erian A.

    1992-01-01

    This is a final report that summarizes the results achieved under this grant. The first major accomplishment is the development of the sublaminate modeling approach and shear deformation theory. The sublaminate approach allows the flexibility of considering one ply or groups of plies as a single laminated unit with effective properties. This approach is valid when the characteristic length of the response is small compared to the sublaminate thickness. The sublaminate approach was validated by comparing its predictions with a finite element solution. A shear deformation theory represents an optimum compromise between accuracy and computational effort in delamination analysis of laminated composites. This conclusion was reached by applying several theories with increasing levels of complexity to the prediction of interlaminar stresses and strain energy release rate in a double cracked-lap-shear configuration.

  15. Enhanced annotations and features for comparing thousands of Pseudomonas genomes in the Pseudomonas genome database.

    PubMed

    Winsor, Geoffrey L; Griffiths, Emma J; Lo, Raymond; Dhillon, Bhavjinder K; Shay, Julie A; Brinkman, Fiona S L

    2016-01-04

    The Pseudomonas Genome Database (http://www.pseudomonas.com) is well known for the application of community-based annotation approaches for producing a high-quality Pseudomonas aeruginosa PAO1 genome annotation, and facilitating whole-genome comparative analyses with other Pseudomonas strains. To aid analysis of potentially thousands of complete and draft genome assemblies, this database and analysis platform was upgraded to integrate curated genome annotations and isolate metadata with enhanced tools for larger scale comparative analysis and visualization. Manually curated gene annotations are supplemented with improved computational analyses that help identify putative drug targets and vaccine candidates or assist with evolutionary studies by identifying orthologs, pathogen-associated genes and genomic islands. The database schema has been updated to integrate isolate metadata that will facilitate more powerful analysis of genomes across datasets in the future. We continue to place an emphasis on providing high-quality updates to gene annotations through regular review of the scientific literature and using community-based approaches including a major new Pseudomonas community initiative for the assignment of high-quality gene ontology terms to genes. As we further expand from thousands of genomes, we plan to provide enhancements that will aid data visualization and analysis arising from whole-genome comparative studies including more pan-genome and population-based approaches. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  16. Retrospective Comparison of Single-Port Sleeve Gastrectomy Versus Three-Port Laparoscopic Sleeve Gastrectomy: a Propensity Score Adjustment Analysis.

    PubMed

    Mauriello, Claudio; Chouillard, Elie; d'alessandro, Antonio; Marte, Gianpaolo; Papadimitriou, Argyri; Chahine, Elias; Kassir, Radwan

    2018-04-16

    To evaluate the efficacy of single-port sleeve gastrectomy (SPSG) and compare it with a less invasive three-port sleeve approach (3PSG) using a propensity score (PS) matching analysis. We analyzed all patients who underwent SG through a three-port or a single-port laparoscopic approach. After 2 years, follow-up was completed in 84% of patients treated with 3PSG and 95% of patients in the SPSG group. Excess weight loss (EWL) was comparable between the two groups during the first year of follow-up, except at the 3-month visit, at which the SPSG group showed a higher EWL (p = 0.0243). We demonstrated the efficacy of SPSG in bariatric surgery even compared to another, less invasive, laparoscopic SG approach (three-port).

  17. Using cognitive work analysis to explore activity allocation within military domains.

    PubMed

    Jenkins, D P; Stanton, N A; Salmon, P M; Walker, G H; Young, M S

    2008-06-01

    Cognitive work analysis (CWA) is frequently advocated as an approach for the analysis of complex socio-technical systems. Much of the current CWA literature within the military domain pays particular attention to its initial phases: work domain analysis and contextual task analysis. By comparison, the analysis of the social and organisational constraints receives much less attention. Through the study of a helicopter mission planning system software tool, this paper describes an approach for investigating the constraints affecting the distribution of work. The paper uses this model to evaluate the potential benefits of the social and organisational analysis phase within a military context. The analysis shows that, through its focus on constraints, the approach provides a unique description of the factors influencing the social organisation within a complex domain. This approach appears to be compatible with existing approaches and serves as a validation of more established social analysis techniques. As part of the ergonomic design of mission planning systems, the social organisation and cooperation analysis phase of CWA provides a constraint-based description informing the allocation of function between key actor groups. This approach is useful because it poses questions related to the transfer of information and optimum working practices.

  18. Comparative Pedagogical Analysis of Philologists' Professional Training at American and Ukrainian Universities

    ERIC Educational Resources Information Center

    Bidyuk, Natalya; Ikonnikova, Maryna

    2017-01-01

    The article deals with comparative and pedagogical analysis of philologists' professional training at American and Ukrainian universities on the conceptual (philosophical and pedagogical paradigms, concepts, theories, approaches, teaching goals and strategies), organizational and pedagogical (tuition fee, training duration and modes, entry…

  19. The Effect of Laminar Flow on Rotor Hover Performance

    NASA Technical Reports Server (NTRS)

    Overmeyer, Austin D.; Martin, Preston B.

    2017-01-01

    The topic of laminar flow effects on hover performance is introduced with respect to some historical efforts where laminar flow was either measured or attempted. An analysis method is outlined using a combined blade element momentum method coupled to an airfoil analysis method, which includes the full e^N transition model. The analysis results compared well with the measured hover performance, including the measured location of transition on both the upper and lower blade surfaces. The analysis method is then used to understand the upper limits of hover efficiency as a function of disk loading. The impact of laminar flow is higher at low disk loading, but significant improvement in terms of power loading appears possible even at high disk loading approaching 20 psf. An optimum planform design equation is derived for cases of zero profile drag and finite drag levels. These results are intended to be a guide for design studies and as a benchmark against which to compare higher fidelity analysis results. The details of the analysis method are given to enable other researchers to use the same approach for comparison to other approaches.
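
    As a hedged complement (simple momentum theory, not the paper's coupled blade-element analysis), the sketch below shows why power loading falls with disk loading: ideal hover power is P = T*sqrt(DL/(2*rho)), so power loading PL = T/P = FM*sqrt(2*rho/DL) for a rotor with figure of merit FM. The FM values are hypothetical.

```python
# Momentum-theory hover power loading versus disk loading (US units).
import numpy as np

rho = 0.002377                 # slug/ft^3, sea-level air density
for DL in (5.0, 10.0, 20.0):   # disk loading, psf
    for FM in (0.75, 0.80):    # hypothetical figures of merit
        PL = FM * np.sqrt(2 * rho / DL) * 550  # lb/hp (550 ft*lb/s per hp)
        print(f"DL = {DL:4.1f} psf, FM = {FM:.2f}: PL = {PL:.2f} lb/hp")
```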

  20. Automatic Identification of Character Types from Film Dialogs

    PubMed Central

    Skowron, Marcin; Trapp, Martin; Payr, Sabine; Trappl, Robert

    2016-01-01

    We study the detection of character types from fictional dialog texts such as screenplays. As approaches based on the analysis of utterances’ linguistic properties are not sufficient to identify all fictional character types, we develop an integrative approach that complements linguistic analysis with interactive and communication characteristics, and show that it can improve identification performance. The interactive characteristics of fictional characters are captured by the descriptive analysis of semantic graphs weighted by linguistic markers of expressivity and social role. For this approach, we introduce a new data set of action movie character types with their corresponding sequences of dialogs. The evaluation results demonstrate that the integrated approach outperforms baseline approaches on the presented data set. Comparative in-depth analysis of a single screenplay leads to a discussion of possible limitations of this approach and to directions for future research. PMID:29118463

  1. Development and application of a comparative fatty acid analysis method to investigate voriconazole-induced hepatotoxicity.

    PubMed

    Chen, Guan-yuan; Chiu, Huai-hsuan; Lin, Shu-wen; Tseng, Yufeng Jane; Tsai, Sung-jeng; Kuo, Ching-hua

    2015-01-01

    As fatty acids play an important role in biological regulation, the profiling of fatty acid expression has been used to discover various disease markers and to understand disease mechanisms. This study developed an effective and accurate comparative fatty acid analysis method using differential labeling to speed up the metabolic profiling of fatty acids. Fatty acids were derivatized with unlabeled (D0) or deuterated (D3) methanol, followed by GC-MS analysis. The comparative fatty acid analysis method was validated using a series of samples with different ratios of D0/D3-labeled fatty acid standards and with mouse liver extracts. Using a lipopolysaccharide (LPS)-treated mouse model, we found that the fatty acid profiles after LPS treatment were similar between the conventional single-sample analysis approach and the proposed comparative approach, with a Pearson's correlation coefficient of approximately 0.96. We applied the comparative method to investigate voriconazole-induced hepatotoxicity and revealed the toxicity mechanism as well as the potential of using fatty acids as toxicity markers. In conclusion, the comparative fatty acid profiling technique was determined to be fast and accurate and allowed the discovery of potential fatty acid biomarkers in a more economical and efficient manner. Copyright © 2014 Elsevier B.V. All rights reserved.
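
    A minimal sketch of the comparative readout, assuming each fatty acid yields one peak area per labeling channel, so the D0/D3 area ratio compares the two co-analyzed samples directly; all peak areas below are invented.

```python
# D0/D3 peak-area ratios for a differentially labeled fatty acid comparison.
import numpy as np

fatty_acids = ["C16:0", "C18:0", "C18:1", "C20:4"]
area_d0 = np.array([1.2e6, 8.0e5, 9.5e5, 3.1e5])  # hypothetical areas, sample 1 (D0)
area_d3 = np.array([1.0e6, 7.8e5, 1.4e6, 6.0e5])  # hypothetical areas, sample 2 (D3)

ratio = area_d0 / area_d3
for fa, r in zip(fatty_acids, ratio):
    print(f"{fa}: D0/D3 = {r:.2f}")               # ~1 means no change between samples
```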

  2. EDGAR: A software framework for the comparative analysis of prokaryotic genomes

    PubMed Central

    Blom, Jochen; Albaum, Stefan P; Doppmeier, Daniel; Pühler, Alfred; Vorhölter, Frank-Jörg; Zakrzewski, Martha; Goesmann, Alexander

    2009-01-01

    Background The introduction of next generation sequencing approaches has caused a rapid increase in the number of completely sequenced genomes. As one result of this development, it is now feasible to analyze large groups of related genomes in a comparative approach. A main task in comparative genomics is the identification of orthologous genes in different genomes and the classification of genes as core genes or singletons. Results To support these studies, EDGAR – "Efficient Database framework for comparative Genome Analyses using BLAST score Ratios" – was developed. EDGAR is designed to automatically perform genome comparisons in a high-throughput approach. Comparative analyses for 582 genomes across 75 genus groups taken from the NCBI genomes database were conducted with the software and the results were integrated into an underlying database. To demonstrate a specific application case, we analyzed ten genomes of the bacterial genus Xanthomonas, for which phylogenetic studies were awkward due to divergent taxonomic systems. The phylogeny EDGAR provided was consistent with outcomes from recent traditional approaches and, moreover, it was possible to root each strain with unprecedented accuracy. Conclusion EDGAR provides novel analysis features and significantly simplifies the comparative analysis of related genomes. The software supports a quick survey of evolutionary relationships and simplifies the process of obtaining new biological insights into the differential gene content of kindred genomes. Visualization features, like synteny plots or Venn diagrams, are offered to the scientific community through a web-based and therefore platform-independent user interface, where the precomputed data sets can be browsed. PMID:19457249
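
    A minimal sketch of the BLAST score ratio idea in EDGAR's name: a hit's bit score normalized by the query's self-hit bit score, so 1.0 is a perfect match and pairs above a cutoff are ortholog candidates. The gene names, scores and the 0.3 cutoff below are hypothetical, not EDGAR's actual data or parameters.

```python
# BLAST score ratio (BSR): hit bit score divided by the query self-hit score.
def blast_score_ratio(hit_bits: float, self_bits: float) -> float:
    return hit_bits / self_bits

self_hits = {"geneA": 980.0, "geneB": 450.0}            # query vs itself
cross_hits = {("geneA", "genome2:gene17"): 790.0,       # query vs other genome
              ("geneB", "genome2:gene03"): 95.0}

CUTOFF = 0.3                                            # hypothetical threshold
for (query, subject), bits in cross_hits.items():
    bsr = blast_score_ratio(bits, self_hits[query])
    tag = "ortholog candidate" if bsr >= CUTOFF else "below cutoff"
    print(f"{query} -> {subject}: BSR = {bsr:.2f} ({tag})")
```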

  3. Detection of lobular structures in normal breast tissue.

    PubMed

    Apou, Grégory; Schaadt, Nadine S; Naegel, Benoît; Forestier, Germain; Schönmeyer, Ralf; Feuerhake, Friedrich; Wemmert, Cédric; Grote, Anne

    2016-07-01

    Ongoing research into inflammatory conditions raises an increasing need to evaluate immune cells in histological sections in biologically relevant regions of interest (ROIs). Herein, we compare different approaches to automatically detect lobular structures in human normal breast tissue in digitized whole slide images (WSIs). This automation is required to perform objective and consistent quantitative studies on large data sets. In normal breast tissue from nine healthy patients immunohistochemically stained for different markers, we evaluated and compared three different image analysis methods to automatically detect lobular structures in WSIs: (1) a bottom-up approach using cell-based data for subsequent tissue-level classification, (2) a top-down method starting with texture classification at the tissue level, followed by analysis of cell densities in specific ROIs, and (3) direct texture classification using deep learning technology. All three methods yield comparable overall quality, allowing automated detection of lobular structures, with minor advantages in sensitivity (approach 3), specificity (approach 2), or processing time (approach 1). Combining the outputs of the approaches further improved the precision. Different approaches to automated ROI detection are feasible and should be selected according to the individual needs of biomarker research. Additionally, detected ROIs could be used as a basis for quantification of immune infiltration in lobular structures. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Comparative Performance Analysis of a Hyper-Temporal Ndvi Analysis Approach and a Landscape-Ecological Mapping Approach

    NASA Astrophysics Data System (ADS)

    Ali, A.; de Bie, C. A. J. M.; Scarrott, R. G.; Ha, N. T. T.; Skidmore, A. K.

    2012-07-01

    Both agricultural area expansion and intensification are necessary to cope with the growing demand for food, and the growing threat of food insecurity which is rapidly engulfing poor and under-privileged sections of the global population. Therefore, it is of paramount importance to have the ability to accurately estimate crop area and spatial distribution. Remote sensing has become a valuable tool for estimating and mapping cropland areas, useful in food security monitoring. This work contributes to addressing this broad issue, focusing on the comparative performance analysis of two mapping approaches: (i) a hyper-temporal Normalized Difference Vegetation Index (NDVI) analysis approach and (ii) a landscape-ecological approach. The hyper-temporal NDVI analysis approach utilized SPOT 10-day NDVI imagery from April 1998 to December 2008, whilst the landscape-ecological approach used multitemporal Landsat-7 ETM+ imagery acquired intermittently between 1992 and 2002. Pixels in the time-series NDVI dataset were clustered using an ISODATA clustering algorithm adapted to determine the optimal number of pixel clusters to successfully generalize hyper-temporal datasets. Clusters were then characterized with crop cycle and flooding information to produce an NDVI unit map of rice classes with flood regime and NDVI profile information. A landscape-ecological map was generated using a combination of digitized homogeneous map units in the Landsat-7 ETM+ imagery, a 2005 land use map of the Mekong Delta, and supplementary datasets on the region's terrain, geomorphology and flooding depths. The output maps were validated using reported crop statistics, and regression analyses were used to ascertain the relationship between land use areas estimated from the maps and those reported in district crop statistics. The regression analysis showed that the hyper-temporal NDVI analysis approach explained 74% and 76% of the variability in reported crop statistics in two-rice-crop and three-rice-crop land use systems, respectively. In contrast, 64% and 63% of the variability was explained, respectively, by the landscape-ecological map. Overall, the results indicate the hyper-temporal NDVI analysis approach is more accurate and more useful in exploring when, why and how agricultural land use manifests itself in space and time. Furthermore, the NDVI analysis approach was found to be easier to implement, more cost effective, and to involve less subjective user intervention than the landscape-ecological approach.
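
    A minimal sketch of clustering hyper-temporal NDVI pixel profiles. The study used an adapted ISODATA algorithm that also selects the number of clusters; plain k-means evaluated at a few candidate k values serves here only as a stand-in for that behavior, and the profiles are synthetic.

```python
# Cluster per-pixel NDVI time series and compare candidate cluster counts.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(3)
n_pixels, n_dekads = 500, 36                 # 36 ten-day composites ~ one year
t = np.linspace(0, 2 * np.pi, n_dekads)
profiles = np.vstack([                       # synthetic two- and three-peak profiles
    0.4 + 0.3 * np.sin(2 * t) + rng.normal(0, 0.05, (n_pixels // 2, n_dekads)),
    0.4 + 0.3 * np.sin(3 * t) + rng.normal(0, 0.05, (n_pixels // 2, n_dekads)),
])

for k in (2, 3, 4):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(profiles)
    print(f"k = {k}: silhouette = {silhouette_score(profiles, labels):.2f}")
```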

  5. Anterior Versus Posterior Approach for Multilevel Degenerative Cervical Disease: A Retrospective Propensity Score-Matched Study of the MarketScan Database.

    PubMed

    Cole, Tyler; Veeravagu, Anand; Zhang, Michael; Azad, Tej D; Desai, Atman; Ratliff, John K

    2015-07-01

    Retrospective 2:1 propensity score-matched analysis of a national longitudinal database between 2006 and 2010. To compare rates of adverse events, revision procedure rates, and payment differences for anterior cervical fusion procedures compared with posterior laminectomy and fusion procedures with at least 3 levels of instrumentation. The comparative benefits of anterior versus posterior approaches to multilevel degenerative cervical disease remain controversial, and recent systematic reviews have reached conflicting conclusions. We demonstrate the comparative economic and clinical outcomes of anterior and posterior approaches for multilevel cervical degenerative disk disease. We identified 13,662 patients in a national billing claims database who underwent anterior or posterior cervical fusion procedures with 3 or more levels of instrumentation. Cohorts were balanced using 2:1 propensity score matching and outcomes were compared using bivariate analysis. With the exception of dysphagia (6.4% anterior vs 1.4% posterior), overall 30-day complication rates were lower in the anterior approach group. The rate of any complication excluding dysphagia with anterior approaches was 12.3%, significantly lower (P < 0.0001) than that of posterior approaches, 17.8%. Anterior approaches resulted in lower hospital ($18,346 vs. $23,638) and total payments ($28,963 vs. $33,526). Patients receiving an anterior surgical approach demonstrated a significantly lower rate of 30-day readmission (5.1% vs. 9.9%, P < 0.0001), were less likely to require revision surgery (12.8% vs. 18.1%, P < 0.0001), and had a shorter length of stay by 1.5 nights (P < 0.0001). Anterior approaches in the surgical management of multilevel degenerative cervical disease provide clinical advantages over posterior approaches, including lower overall complication rates, revision procedure rates, and decreased length of stay. Anterior approach procedures are also associated with decreased overall payments. These findings must be interpreted in light of limitations inherent to retrospective longitudinal studies, including the absence of subjective and radiographical outcomes. Level of evidence: 3.
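
    A minimal sketch of 2:1 propensity score matching of the kind described, not the study's actual covariate set: fit a logistic model of treatment assignment on observed covariates, then match each case to its two nearest neighbors on the score (with replacement here, for brevity). All data are synthetic.

```python
# 2:1 nearest-neighbor matching on an estimated propensity score.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(4)
n = 1000
X = np.column_stack([rng.normal(60, 10, n),   # age (hypothetical covariates)
                     rng.integers(0, 2, n),   # comorbidity flag
                     rng.integers(3, 6, n)])  # levels instrumented
treated = rng.random(n) < 1 / (1 + np.exp(-(0.03 * (X[:, 0] - 60) - 0.5)))

ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
nn = NearestNeighbors(n_neighbors=2).fit(ps[~treated].reshape(-1, 1))
_, idx = nn.kneighbors(ps[treated].reshape(-1, 1))  # 2 controls per treated case
print(f"{treated.sum()} treated matched to {idx.size} control slots")
```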

  6. Culture-Independent Analysis of Probiotic Products by Denaturing Gradient Gel Electrophoresis

    PubMed Central

    Temmerman, R.; Scheirlinck, I.; Huys, G.; Swings, J.

    2003-01-01

    In order to obtain functional and safe probiotic products for human consumption, fast and reliable quality control of these products is crucial. Currently, analysis of most probiotics is still based on culture-dependent methods involving the use of specific isolation media and identification of a limited number of isolates, which makes this approach relatively insensitive, laborious, and time-consuming. In this study, a collection of 10 probiotic products, including four dairy products, one fruit drink, and five freeze-dried products, were subjected to microbial analysis by using a culture-independent approach, and the results were compared with the results of a conventional culture-dependent analysis. The culture-independent approach involved extraction of total bacterial DNA directly from the product, PCR amplification of the V3 region of the 16S ribosomal DNA, and separation of the amplicons on a denaturing gradient gel. Digital capturing and processing of denaturing gradient gel electrophoresis (DGGE) band patterns allowed direct identification of the amplicons at the species level. This whole culture-independent approach can be performed in less than 30 h. Compared with culture-dependent analysis, the DGGE approach was found to have a much higher sensitivity for detection of microbial strains in probiotic products in a fast, reliable, and reproducible manner. Unfortunately, as reported in previous studies in which the culture-dependent approach was used, a rather high percentage of probiotic products suffered from incorrect labeling and yielded low bacterial counts, which may decrease their probiotic potential. PMID:12513998

  7. Nonlinear Stochastic PDEs: Analysis and Approximations

    DTIC Science & Technology

    2016-05-23

    numerical performance. Main theoretical and experimental advances include: 1. Introduction of a number of effective approaches to numerical analysis of... Stokes and Euler SPDEs, quasi-geostrophic SPDE, Ginzburg-Landau SPDE and Duffing oscillator...

  8. A Comparison of Alternative Approaches to the Analysis of Interrupted Time-Series.

    ERIC Educational Resources Information Center

    Harrop, John W.; Velicer, Wayne F.

    1985-01-01

    Computer-generated data representative of 16 autoregressive integrated moving average (ARIMA) models were used to compare the results of interrupted time-series analysis using: (1) the known model identification, (2) an assumed (1,0,0) model, and (3) an assumed (3,0,0) model as an approximation to the General Transformation approach. (Author/BW)
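
    A minimal sketch of an interrupted time-series fit under an assumed ARIMA(1,0,0) error model, using a level-shift regressor for the intervention; the simulated series and effect size below are invented.

```python
# Interrupted time series: AR(1) noise plus a step change at the intervention.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(5)
n, break_point = 100, 60
y = np.empty(n)
y[0] = 0.0
for t in range(1, n):                      # AR(1) noise process
    y[t] = 0.5 * y[t - 1] + rng.normal(0, 1)
y[break_point:] += 2.0                     # true intervention effect

step = (np.arange(n) >= break_point).astype(float)
model = ARIMA(y, exog=step, order=(1, 0, 0)).fit()
print(model.params)                        # const, step effect, AR(1) coef, sigma2
```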

  9. Quality, Pedagogy and Governance in Private Higher Education Institutions in Egypt

    ERIC Educational Resources Information Center

    Barsoum, Ghada

    2017-01-01

    Building on a mixed method research approach, this article reports on an analysis of the difference between public and private higher education institutions (HEIs) in Egypt in terms of teaching methods, quality assessment approaches and alumni engagement. An analysis of the survey data compared the experiences of 1,713 graduates of both private…

  10. Bayesian Mediation Analysis

    ERIC Educational Resources Information Center

    Yuan, Ying; MacKinnon, David P.

    2009-01-01

    In this article, we propose Bayesian analysis of mediation effects. Compared with conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian…

  11. Transperitoneal approach versus retroperitoneal approach: a meta-analysis of laparoscopic partial nephrectomy for renal cell carcinoma.

    PubMed

    Ren, Tong; Liu, Yan; Zhao, Xiaowen; Ni, Shaobin; Zhang, Cheng; Guo, Changgang; Ren, Minghua

    2014-01-01

    To compare the efficiency and safety of transperitoneal approaches with retroperitoneal approaches in laparoscopic partial nephrectomy for renal cell carcinoma and provide evidence-based support for clinical treatment. A systematic computer search of PUBMED, EMBASE, and the Cochrane Library was executed to identify retrospective observational and prospective randomized controlled trial studies that compared the outcomes of the two approaches in laparoscopic partial nephrectomy. Two reviewers independently screened, extracted, and evaluated the included studies and executed statistical analysis using the software STATA 12.0. Outcomes of interest included perioperative and postoperative variables, surgical complications and oncological variables. Eight studies assessing transperitoneal laparoscopic partial nephrectomy (TLPN) versus retroperitoneal laparoscopic partial nephrectomy (RLPN) were included. RLPN had a shorter operating time (SMD = 1.001, 95% confidence interval [CI] 0.609-1.393, P < 0.001), a lower estimated blood loss (SMD = 0.403, 95% CI 0.015-0.791, P = 0.042) and a shorter length of hospital stay (WMD = 0.936 days, 95% CI 0.609-1.263, P < 0.001) than TLPN. There were no significant differences between the transperitoneal and retroperitoneal approaches in the other outcomes of interest. This meta-analysis indicates that, in appropriately selected patients, especially patients with a history of intraperitoneal procedures or posteriorly located renal tumors, RLPN can shorten the operation time, reduce the estimated blood loss and shorten the length of hospital stay. RLPN may be equally safe and faster compared with TLPN.

  12. Global GNSS processing based on the raw observation approach

    NASA Astrophysics Data System (ADS)

    Strasser, Sebastian; Zehentner, Norbert; Mayer-Gürr, Torsten

    2017-04-01

    Many global navigation satellite system (GNSS) applications, e.g. Precise Point Positioning (PPP), require high-quality GNSS products, such as precise GNSS satellite orbits and clocks. These products are routinely determined by analysis centers of the International GNSS Service (IGS). The current processing methods of the analysis centers make use of the ionosphere-free linear combination to reduce the ionospheric influence. Some of the analysis centers also form observation differences, in general double-differences, to eliminate several additional error sources. The raw observation approach is a new GNSS processing approach that was developed at Graz University of Technology for kinematic orbit determination of low Earth orbit (LEO) satellites and subsequently adapted to global GNSS processing in general. This new approach offers some benefits compared to well-established approaches, such as a straightforward incorporation of new observables due to the avoidance of observation differences and linear combinations. This becomes especially important in view of the changing GNSS landscape with two new systems, the European system Galileo and the Chinese system BeiDou, currently in deployment. GNSS products generated at Graz University of Technology using the raw observation approach currently comprise precise GNSS satellite orbits and clocks, station positions and clocks, code and phase biases, and Earth rotation parameters. To evaluate the new approach, products generated using the Global Positioning System (GPS) constellation and observations from the global IGS station network are compared to those of the IGS analysis centers. The comparisons show that the products generated at Graz University of Technology are on a similar level of quality to the products determined by the IGS analysis centers. This confirms that the raw observation approach is applicable to global GNSS processing. Some areas requiring further work have been identified, enabling future improvements of the method.

  13. One step versus two step approach for gestational diabetes screening: systematic review and meta-analysis of the randomized trials.

    PubMed

    Saccone, Gabriele; Caissutti, Claudia; Khalifeh, Adeeb; Meltzer, Sara; Scifres, Christina; Simhan, Hyagriv N; Kelekci, Sefa; Sevket, Osman; Berghella, Vincenzo

    2017-12-03

    To compare both the prevalence of gestational diabetes mellitus (GDM) and maternal and neonatal outcomes under either the one-step or the two-step screening approach. Electronic databases were searched from their inception until June 2017. We included all randomized controlled trials (RCTs) comparing the one-step with the two-step approach for the screening and diagnosis of GDM. The primary outcome was the incidence of GDM. Three RCTs (n = 2333 participants) were included in the meta-analysis. In total, 910 women were randomized to the one-step approach (75 g, 2 h) and 1423 to the two-step approach. No significant difference in the incidence of GDM was found comparing the one-step versus the two-step approach (8.4 versus 4.3%; relative risk (RR) 1.64, 95% CI 0.77-3.48). Women screened with the one-step approach had a significantly lower risk of preterm birth (PTB) (3.7 versus 7.6%; RR 0.49, 95% CI 0.27-0.88), cesarean delivery (16.3 versus 22.0%; RR 0.74, 95% CI 0.56-0.99), macrosomia (2.9 versus 6.9%; RR 0.43, 95% CI 0.22-0.82), neonatal hypoglycemia (1.7 versus 4.5%; RR 0.38, 95% CI 0.16-0.90), and admission to the neonatal intensive care unit (NICU) (4.4 versus 9.0%; RR 0.49, 95% CI 0.29-0.84), compared with those randomized to screening with the two-step approach. The one-step and two-step approaches were not associated with a significant difference in the incidence of GDM. However, the one-step approach was associated with better maternal and perinatal outcomes.
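
    As a hedged illustration of the arithmetic behind results such as "RR 0.49, 95% CI 0.27-0.88", the sketch below computes a relative risk with its log-scale confidence interval; the event counts are placeholders, not the trial data.

```python
# Relative risk with a 95% CI from the standard error of log(RR).
import math

def relative_risk(a, n1, b, n2):
    """Events a of n1 in group 1, events b of n2 in group 2."""
    rr = (a / n1) / (b / n2)
    se_log = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)
    lo = math.exp(math.log(rr) - 1.96 * se_log)
    hi = math.exp(math.log(rr) + 1.96 * se_log)
    return rr, lo, hi

rr, lo, hi = relative_risk(30, 800, 61, 800)   # hypothetical counts
print(f"RR {rr:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```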

  14. The Safety and Efficacy of Approaches to Liver Resection: A Meta-Analysis

    PubMed Central

    Hauch, Adam; Hu, Tian; Buell, Joseph F.; Slakey, Douglas P.; Kandil, Emad

    2015-01-01

    Background: The aim of this study is to compare the safety and efficacy of conventional laparotomy with those of robotic and laparoscopic approaches to hepatectomy. Database: Independent reviewers conducted a systematic review of publications in PubMed and Embase, with searches limited to comparative articles of laparoscopic hepatectomy with either conventional or robotic liver approaches. Outcomes included total operative time, estimated blood loss, length of hospitalization, resection margins, postoperative complications, perioperative mortality rates, and cost measures. Outcome comparisons were calculated using random-effects models to pool estimates of mean net differences or of the relative risk between group outcomes. Forty-nine articles, representing 3702 patients, comprise this analysis: 1901 (51.35%) underwent a laparoscopic approach, 1741 (47.03%) underwent an open approach, and 60 (1.62%) underwent a robotic approach. There was no difference in total operative times, surgical margins, or perioperative mortality rates among groups. Across all outcome measures, laparoscopic and robotic approaches showed no difference. As compared with the minimally invasive groups, patients undergoing laparotomy had a greater estimated blood loss (pooled mean net change, 152.0 mL; 95% confidence interval, 103.3–200.8 mL), a longer length of hospital stay (pooled mean difference, 2.22 days; 95% confidence interval, 1.78–2.66 days), and a higher total complication rate (odds ratio, 0.5; 95% confidence interval, 0.42–0.57). Conclusion: Minimally invasive approaches to liver resection are as safe as conventional laparotomy, affording less estimated blood loss, shorter lengths of hospitalization, lower perioperative complication rates, and equitable oncologic integrity and postoperative mortality rates. There was no proven advantage of robotic approaches compared with laparoscopic approaches. PMID:25848191

  15. Toward Accurate and Quantitative Comparative Metagenomics

    PubMed Central

    Nayfach, Stephen; Pollard, Katherine S.

    2016-01-01

    Shotgun metagenomics and computational analysis are used to compare the taxonomic and functional profiles of microbial communities. Leveraging this approach to understand roles of microbes in human biology and other environments requires quantitative data summaries whose values are comparable across samples and studies. Comparability is currently hampered by the use of abundance statistics that do not estimate a meaningful parameter of the microbial community and biases introduced by experimental protocols and data-cleaning approaches. Addressing these challenges, along with improving study design, data access, metadata standardization, and analysis tools, will enable accurate comparative metagenomics. We envision a future in which microbiome studies are replicable and new metagenomes are easily and rapidly integrated with existing data. Only then can the potential of metagenomics for predictive ecological modeling, well-powered association studies, and effective microbiome medicine be fully realized. PMID:27565341

  16. Toward Accurate and Quantitative Comparative Metagenomics.

    PubMed

    Nayfach, Stephen; Pollard, Katherine S

    2016-08-25

    Shotgun metagenomics and computational analysis are used to compare the taxonomic and functional profiles of microbial communities. Leveraging this approach to understand roles of microbes in human biology and other environments requires quantitative data summaries whose values are comparable across samples and studies. Comparability is currently hampered by the use of abundance statistics that do not estimate a meaningful parameter of the microbial community and biases introduced by experimental protocols and data-cleaning approaches. Addressing these challenges, along with improving study design, data access, metadata standardization, and analysis tools, will enable accurate comparative metagenomics. We envision a future in which microbiome studies are replicable and new metagenomes are easily and rapidly integrated with existing data. Only then can the potential of metagenomics for predictive ecological modeling, well-powered association studies, and effective microbiome medicine be fully realized. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. Mammogram registration using the Cauchy-Navier spline

    NASA Astrophysics Data System (ADS)

    Wirth, Michael A.; Choi, Christopher

    2001-07-01

    The process of comparative analysis involves inspecting mammograms for characteristic signs of potential cancer by comparing various analogous mammograms. Factors such as the deformable behavior of the breast, changes in breast positioning, and the amount/geometry of compression may contribute to spatial differences between corresponding structures in corresponding mammograms, thereby significantly complicating comparative analysis. Mammogram registration is a process whereby spatial differences between mammograms can be reduced. Presented in this paper is a nonrigid approach to matching corresponding mammograms based on a physical registration model. Many of the earliest approaches to mammogram registration used spatial transformations which were innately rigid or affine in nature. More recently algorithms have incorporated radial basis functions such as the Thin-Plate Spline to match mammograms. The approach presented here focuses on the use of the Cauchy-Navier Spline, a deformable registration model which offers approximate nonrigid registration. The utility of the Cauchy-Navier Spline is illustrated by matching both temporal and bilateral mammograms.
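
    The Cauchy-Navier Spline is not available in common libraries, so the sketch below illustrates the same landmark-based nonrigid matching workflow with the Thin-Plate Spline the paper cites as the earlier standard; the landmark coordinates are hypothetical.

```python
# Landmark-based thin-plate spline warp: map source landmarks to targets,
# then evaluate the warp at an arbitrary point.
import numpy as np
from scipy.interpolate import RBFInterpolator

src = np.array([[10, 10], [80, 15], [45, 60], [20, 85], [75, 80]], float)
dst = np.array([[12, 11], [78, 18], [47, 63], [18, 88], [74, 83]], float)

# Vector-valued interpolant: both output coordinates fit in one call.
warp = RBFInterpolator(src, dst, kernel="thin_plate_spline")
query = np.array([[50.0, 50.0]])
print("warped point:", warp(query))
```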

  18. Humanistic therapies versus other psychological therapies for depression

    PubMed Central

    Churchill, Rachel; Davies, Philippa; Caldwell, Deborah; Moore, Theresa HM; Jones, Hannah; Lewis, Glyn; Hunot, Vivien

    2014-01-01

    This is the protocol for a review and there is no abstract. The objectives are as follows: To examine the effectiveness and acceptability of all humanistic therapies compared with all other psychological therapy approaches for acute depression. To examine the effectiveness and acceptability of different humanistic therapy models (person-centred, gestalt, process-experiential, transactional analysis, existential and non-directive therapies) compared with all other psychological therapy approaches for acute depression. To examine the effectiveness and acceptability of all humanistic therapies compared with different psychological therapy approaches (psychodynamic, behavioural, humanistic, integrative, cognitive-behavioural) for acute depression. PMID:25278809

  19. Proposing a sequential comparative analysis for assessing multilateral health agency transformation and sustainable capacity: exploring the advantages of institutional theory

    PubMed Central

    2014-01-01

    Background This article proposes an approach to comparing and assessing the adaptive capacity of multilateral health agencies in meeting country and individual healthcare needs. Most studies comparing multilateral health agencies have failed to clearly propose a method for conducting agency comparisons. Methods This study adopted a qualitative case study methodological approach, in which secondary and primary case study literature was used to conduct case study comparisons of multilateral health agencies. Results Through the proposed Sequential Comparative Analysis (SCA), the author found a more effective way to justify the selection of cases, compare and assess organizational transformative capacity, and learn from agency success in policy sustainability processes. Conclusions To more effectively understand and explain why some multilateral health agencies are more capable of adapting to country and individual healthcare needs, SCA provides a methodological approach that may help to better understand why these agencies are so different and what we can learn from successful reform processes. As funding challenges continue to hamper these agencies' adaptive capacity, learning from each other will become increasingly important. PMID:24886283

  20. Proposing a sequential comparative analysis for assessing multilateral health agency transformation and sustainable capacity: exploring the advantages of institutional theory.

    PubMed

    Gómez, Eduardo J

    2014-05-20

    This article proposes an approach to comparing and assessing the adaptive capacity of multilateral health agencies in meeting country and individual healthcare needs. Most studies comparing multilateral health agencies have failed to clearly propose a method for conducting agency comparisons. This study adopted a qualitative case study methodological approach, in which secondary and primary case study literature was used to conduct case study comparisons of multilateral health agencies. Through the proposed Sequential Comparative Analysis (SCA), the author found a more effective way to justify the selection of cases, compare and assess organizational transformative capacity, and learn from agency success in policy sustainability processes. To more effectively understand and explain why some multilateral health agencies are more capable of adapting to country and individual healthcare needs, SCA provides a methodological approach that may help to better understand why these agencies are so different and what we can learn from successful reform processes. As funding challenges continue to hamper these agencies' adaptive capacity, learning from each other will become increasingly important.

  1. Categorical data processing for real estate objects valuation using statistical analysis

    NASA Astrophysics Data System (ADS)

    Parygin, D. S.; Malikov, V. P.; Golubev, A. V.; Sadovnikova, N. P.; Petrova, T. M.; Finogeev, A. G.

    2018-05-01

    Theoretical and practical approaches to the use of statistical methods for studying various properties of infrastructure objects are analyzed in the paper. Methods of forecasting the value of objects are considered. A method for coding categorical variables describing properties of real estate objects is proposed. The analysis of the results of modeling the price of real estate objects using regression analysis and an algorithm based on a comparative approach is carried out.
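
    A minimal sketch of one common way to code categorical property descriptors for a regression-based valuation; the paper does not specify its coding scheme, so this is only illustrative, and all column names and values are hypothetical.

```python
# Dummy-code categorical descriptors, then fit a linear price model.
import pandas as pd
from sklearn.linear_model import LinearRegression

df = pd.DataFrame({
    "area_m2":   [45, 62, 38, 70, 55],
    "district":  ["center", "north", "center", "south", "north"],
    "condition": ["good", "fair", "good", "new", "fair"],
    "price":     [3.1, 3.6, 2.8, 4.9, 3.2],     # millions, hypothetical
})

X = pd.get_dummies(df[["area_m2", "district", "condition"]],
                   drop_first=True, dtype=float)
model = LinearRegression().fit(X, df["price"])
print(dict(zip(X.columns, model.coef_.round(2))))
```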

  2. Analysis of composite plates by using mechanics of structure genome and comparison with ANSYS

    NASA Astrophysics Data System (ADS)

    Zhao, Banghua

    Motivated by a recently introduced concept, the Structure Genome (SG), defined as the smallest mathematical building block of a structure, a new approach named Mechanics of Structure Genome (MSG) to model and analyze composite plates is introduced. MSG is implemented in a general-purpose code named SwiftComp(TM), which provides the constitutive models needed in structural analysis by homogenization and pointwise local fields by dehomogenization. To improve the user friendliness of SwiftComp(TM), a simple graphical user interface (GUI) based on the ANSYS Mechanical APDL platform, called the ANSYS-SwiftComp GUI, was developed, which provides a convenient way to create common or arbitrary customized SG models in ANSYS and invoke SwiftComp(TM) to perform homogenization and dehomogenization. The global structural analysis can also be handled in ANSYS after homogenization, which can predict the global behavior and provide the inputs needed for dehomogenization. To demonstrate the accuracy and efficiency of the MSG approach, several numerical cases are studied and compared using both MSG and ANSYS. In the ANSYS approach, 3D solid element models (ANSYS 3D approach) are used as reference models, and 2D shell element models created by ANSYS Composite PrepPost (ACP approach) are compared with the MSG approach. The results of the MSG approach agree well with the ANSYS 3D approach while being as efficient as the ACP approach. Therefore, the MSG approach provides an efficient and accurate new way to model composite plates.

  3. Anterior approach versus posterior approach for Pipkin I and II femoral head fractures: A systematic review and meta-analysis.

    PubMed

    Wang, Chen-guang; Li, Yao-min; Zhang, Hua-feng; Li, Hui; Li, Zhi-jun

    2016-03-01

    We performed a meta-analysis, pooling the results from controlled clinical trials, to compare the efficiency of anterior and posterior surgical approaches to Pipkin I and II fractures of the femoral head. Potential academic articles were identified from the Cochrane Library, Medline (1966-2015.5), PubMed (1966-2015.5), Embase (1980-2015.5) and ScienceDirect (1966-2015.5) databases. Grey-literature studies were identified from the references of the included literature. Pooling of the data was performed and analyzed with RevMan software, version 5.1. Five case-control trials (CCTs) met the inclusion criteria. There were significant differences in the incidence of heterotopic ossification (HO) between the approaches, but no significant differences were found between the two groups regarding functional outcomes of the hip, general postoperative complications, osteonecrosis of the femoral head or post-traumatic arthritis. The present meta-analysis indicated that the posterior approach decreased the risk of HO compared with the anterior approach for the treatment of Pipkin I and II femoral head fractures. No other complications were related to the anterior or posterior approach. Future high-quality randomized controlled trials (RCTs) are needed to determine the optimal surgical approach and to predict other postoperative complications. Level of evidence: III. Copyright © 2016 IJS Publishing Group Limited. Published by Elsevier Ltd. All rights reserved.

  4. Gearbox Tooth Cut Fault Diagnostics Using Acoustic Emission and Vibration Sensors — A Comparative Study

    PubMed Central

    Qu, Yongzhi; He, David; Yoon, Jae; Van Hecke, Brandon; Bechhoefer, Eric; Zhu, Junda

    2014-01-01

    In recent years, acoustic emission (AE) sensors and AE-based techniques have been developed and tested for gearbox fault diagnosis. In general, AE-based techniques require much higher sampling rates than vibration analysis-based techniques for gearbox fault diagnosis. Therefore, it is questionable whether an AE-based technique would give a better or at least the same performance as the vibration analysis-based techniques using the same sampling rate. To answer the question, this paper presents a comparative study for gearbox tooth damage level diagnostics using AE and vibration measurements, the first known attempt to compare the gearbox fault diagnostic performance of AE- and vibration analysis-based approaches using the same sampling rate. Partial tooth cut faults are seeded in a gearbox test rig and experimentally tested in a laboratory. Results have shown that the AE-based approach has the potential to differentiate gear tooth damage levels in comparison with the vibration-based approach. While vibration signals are easily affected by mechanical resonance, the AE signals show more stable performance. PMID:24424467
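
    As a hedged illustration of the kind of condition indicators such comparisons rely on (not the authors' exact feature set), the sketch below computes RMS and kurtosis for a synthetic healthy signal and one with periodic impacts, as either an AE or a vibration channel sampled at the same rate might show.

```python
# Simple condition indicators: RMS and kurtosis of healthy vs faulty signals.
import numpy as np

rng = np.random.default_rng(6)
fs, n = 100_000, 100_000                       # 100 kHz, 1 s of data
healthy = rng.normal(0, 1.0, n)
faulty = rng.normal(0, 1.0, n)
faulty[::2000] += 8.0                          # periodic impacts from a tooth cut

def indicators(x):
    rms = np.sqrt(np.mean(x ** 2))
    kurtosis = np.mean((x - x.mean()) ** 4) / np.var(x) ** 2  # ~3 for Gaussian
    return rms, kurtosis

for name, sig in [("healthy", healthy), ("faulty", faulty)]:
    rms, kurt = indicators(sig)
    print(f"{name}: RMS = {rms:.2f}, kurtosis = {kurt:.2f}")
```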

  5. Making Sense and Nonsense: Comparing Mediated Discourse and Agential Realist Approaches to Materiality in a Preschool Makerspace

    ERIC Educational Resources Information Center

    Wohlwend, Karen E.; Peppler, Kylie A.; Keune, Anna; Thompson, Naomi

    2017-01-01

    Two approaches to materiality (i.e. mediated discourse and agential realism) are compared to explore their usefulness in tracking literacies in action and artefacts produced during a play and design activity in a preschool makerspace. Mediated discourse analysis has relied on linguistic framing and social semiotics to make sense of multimodality.…

  6. Approach to proliferation risk assessment based on multiple objective analysis framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrianov, A.; Kuptsov, I.

    2013-07-01

    The approach to the assessment of proliferation risk using the methods of multi-criteria decision making and multi-objective optimization is presented. The approach allows taking into account the specific features of the national nuclear infrastructure and possible proliferation strategies (motivations, intentions, and capabilities). Three examples of applying the approach are shown. First, the approach has been used to evaluate the attractiveness of HEU (highly enriched uranium) production scenarios at a clandestine enrichment facility using centrifuge enrichment technology. Secondly, the approach has been applied to assess the attractiveness of scenarios for undeclared production of plutonium or HEU by theft of materials circulating in nuclear fuel cycle facilities and thermal reactors. Thirdly, the approach has been used to perform a comparative analysis of the structures of developing nuclear power systems based on different types of nuclear fuel cycles, the analysis being based on indicators of proliferation risk.

  7. EXTRACTING PRINCIPLE COMPONENTS FOR DISCRIMINANT ANALYSIS OF FMRI IMAGES

    PubMed Central

    Liu, Jingyu; Xu, Lai; Caprihan, Arvind; Calhoun, Vince D.

    2009-01-01

    This paper presents an approach for selecting optimal components for discriminant analysis. Such an approach is useful when further detailed analyses for discrimination or characterization require dimensionality reduction. Our approach can accommodate a categorical variable such as diagnosis (e.g. schizophrenic patient or healthy control), or a continuous variable like severity of the disorder. This information is utilized as a reference for measuring a component's discriminant power after principal component decomposition. After sorting each component according to its discriminant power, we extract the best components for discriminant analysis. An application of our reference selection approach is shown using a functional magnetic resonance imaging data set in which the sample size is much less than the dimensionality. The results show that the reference selection approach provides an improved discriminant component set as compared to other approaches. Our approach is general and provides a solid foundation for further discrimination and classification studies. PMID:20582334

  8. EXTRACTING PRINCIPLE COMPONENTS FOR DISCRIMINANT ANALYSIS OF FMRI IMAGES.

    PubMed

    Liu, Jingyu; Xu, Lai; Caprihan, Arvind; Calhoun, Vince D

    2008-05-12

    This paper presents an approach for selecting optimal components for discriminant analysis. Such an approach is useful when further detailed analyses for discrimination or characterization require dimensionality reduction. Our approach can accommodate a categorical variable such as diagnosis (e.g. schizophrenic patient or healthy control), or a continuous variable like severity of the disorder. This information is utilized as a reference for measuring a component's discriminant power after principal component decomposition. After sorting each component according to its discriminant power, we extract the best components for discriminant analysis. An application of our reference selection approach is shown using a functional magnetic resonance imaging data set in which the sample size is much less than the dimensionality. The results show that the reference selection approach provides an improved discriminant component set as compared to other approaches. Our approach is general and provides a solid foundation for further discrimination and classification studies.
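
    The reference-selection idea above can be sketched in a few lines: decompose the data, score each component against the reference variable, and keep the top scorers. The toy data, the use of scikit-learn's PCA, and the squared-correlation score below are assumptions for illustration, not the authors' exact implementation.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    X = rng.normal(size=(40, 500))       # 40 scans x 500 voxels (toy data)
    y = rng.integers(0, 2, size=40)      # reference: e.g. patient vs control

    pca = PCA(n_components=20)
    scores = pca.fit_transform(X)        # subject scores on each component

    # Proxy for discriminant power: squared correlation with the reference
    power = np.array([np.corrcoef(scores[:, k], y)[0, 1] ** 2
                      for k in range(scores.shape[1])])
    best = np.argsort(power)[::-1][:5]   # keep the 5 most discriminant components
    X_reduced = scores[:, best]          # input for subsequent discriminant analysis
    print("selected components:", best)
    ```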

  9. Mapping Copper and Lead Concentrations at Abandoned Mine Areas Using Element Analysis Data from ICP–AES and Portable XRF Instruments: A Comparative Study

    PubMed Central

    Lee, Hyeongyu; Choi, Yosoon; Suh, Jangwon; Lee, Seung-Ho

    2016-01-01

    Understanding the spatial variation of potentially toxic trace elements (PTEs) in soil is necessary to identify proper measures for preventing soil contamination at both operating and abandoned mining areas. Many studies have been conducted worldwide to explore the spatial variation of PTEs and to create soil contamination maps using geostatistical methods. However, they generally depend only on inductively coupled plasma atomic emission spectrometry (ICP–AES) analysis data; therefore, such studies are limited by insufficient input data owing to the disadvantages of ICP–AES analysis, such as its costly operation and the lengthy period required for analysis. To overcome this limitation, this study used both ICP–AES and portable X-ray fluorescence (PXRF) analysis data, the latter with relatively low accuracy, for mapping copper and lead concentrations at a section of the Busan abandoned mine in Korea and compared the prediction performances of four different approaches: ordinary kriging applied to (1) the ICP–AES analysis data, (2) the PXRF analysis data, and (3) both the ICP–AES and transformed PXRF analysis data (the PXRF data being transformed using the correlation between the ICP–AES and PXRF analyses), and (4) co-kriging applied to the ICP–AES (primary variable) and PXRF analysis data (secondary variable). Their results were compared using an independent validation data set. The results obtained in this case study showed that applying ordinary kriging to both the ICP–AES and transformed PXRF analysis data is the most accurate approach when considering the spatial distribution of copper and lead contaminants in the soil and the estimation errors at the 11 validation sampling points. Therefore, when generating soil contamination maps for an abandoned mine, it is beneficial to use the proposed approach, which incorporates the advantageous aspects of both the ICP–AES and PXRF analysis data. PMID:27043594

  10. Mapping Copper and Lead Concentrations at Abandoned Mine Areas Using Element Analysis Data from ICP-AES and Portable XRF Instruments: A Comparative Study.

    PubMed

    Lee, Hyeongyu; Choi, Yosoon; Suh, Jangwon; Lee, Seung-Ho

    2016-03-30

    Understanding the spatial variation of potentially toxic trace elements (PTEs) in soil is necessary to identify proper measures for preventing soil contamination at both operating and abandoned mining areas. Many studies have been conducted worldwide to explore the spatial variation of PTEs and to create soil contamination maps using geostatistical methods. However, they generally depend only on inductively coupled plasma atomic emission spectrometry (ICP-AES) analysis data; therefore, such studies are limited by insufficient input data owing to the disadvantages of ICP-AES analysis, such as its costly operation and the lengthy period required for analysis. To overcome this limitation, this study used both ICP-AES and portable X-ray fluorescence (PXRF) analysis data, the latter with relatively low accuracy, for mapping copper and lead concentrations at a section of the Busan abandoned mine in Korea and compared the prediction performances of four different approaches: ordinary kriging applied to (1) the ICP-AES analysis data, (2) the PXRF analysis data, and (3) both the ICP-AES and transformed PXRF analysis data (the PXRF data being transformed using the correlation between the ICP-AES and PXRF analyses), and (4) co-kriging applied to the ICP-AES (primary variable) and PXRF analysis data (secondary variable). Their results were compared using an independent validation data set. The results obtained in this case study showed that applying ordinary kriging to both the ICP-AES and transformed PXRF analysis data is the most accurate approach when considering the spatial distribution of copper and lead contaminants in the soil and the estimation errors at the 11 validation sampling points. Therefore, when generating soil contamination maps for an abandoned mine, it is beneficial to use the proposed approach, which incorporates the advantageous aspects of both the ICP-AES and PXRF analysis data.
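
    The best-performing approach (ordinary kriging on ICP-AES plus transformed PXRF data) can be sketched as below, assuming the pykrige package; the coordinates, concentrations, and linear calibration coefficients are synthetic stand-ins, since the study's regression of co-located sample pairs is not reproduced here.

    ```python
    import numpy as np
    from pykrige.ok import OrdinaryKriging  # assumes pykrige is installed

    rng = np.random.default_rng(1)
    xy_icp = rng.uniform(0, 100, size=(15, 2))    # sparse, accurate samples
    cu_icp = rng.lognormal(4, 0.5, 15)
    xy_pxrf = rng.uniform(0, 100, size=(60, 2))   # dense, less accurate samples
    cu_pxrf = 0.8 * rng.lognormal(4, 0.5, 60)

    # Linear calibration PXRF -> ICP-AES scale; in practice the coefficients
    # come from regressing co-located ICP-AES/PXRF sample pairs
    slope, intercept = 1.25, 0.0
    cu_pxrf_t = slope * cu_pxrf + intercept

    x = np.concatenate([xy_icp[:, 0], xy_pxrf[:, 0]])
    y = np.concatenate([xy_icp[:, 1], xy_pxrf[:, 1]])
    z = np.concatenate([cu_icp, cu_pxrf_t])

    ok = OrdinaryKriging(x, y, z, variogram_model="spherical")
    grid = np.linspace(0, 100, 50)
    cu_map, cu_var = ok.execute("grid", grid, grid)  # kriged Cu surface + variance
    ```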

  11. Learner Performance in Multimedia Learning Arrangements: An Analysis across Instructional Approaches

    ERIC Educational Resources Information Center

    Eysink, Tessa H. S.; de Jong, Ton; Berthold, Kirsten; Kolloffel, Bas; Opfermann, Maria; Wouters, Pieter

    2009-01-01

    In this study, the authors compared four multimedia learning arrangements differing in instructional approach on effectiveness and efficiency for learning: (a) hypermedia learning, (b) observational learning, (c) self-explanation-based learning, and (d) inquiry learning. The approaches all advocate learners' active attitude toward the learning…

  12. Aerodynamic design and analysis of small horizontal axis wind turbine blades

    NASA Astrophysics Data System (ADS)

    Tang, Xinzi

    This work investigates the aerodynamic design and analysis of small horizontal axis wind turbine blades via the blade element momentum (BEM) based approach and the computational fluid dynamics (CFD) based approach. From this research, it is possible to draw a series of detailed guidelines on small wind turbine blade design and analysis. The research also provides a platform for further comprehensive study using these two approaches. The wake induction corrections and stall corrections of the BEM method were examined through a case study of the NREL/NASA Phase VI wind turbine. A hybrid stall correction model was proposed to analyse wind turbine power performance. The proposed model shows improvement in power prediction for the validation case, compared with the existing stall correction models. The effects of the key rotor parameters of a small wind turbine as well as the blade chord and twist angle distributions on power performance were investigated through two typical wind turbines, i.e. a fixed-pitch variable-speed (FPVS) wind turbine and a fixed-pitch fixed-speed (FPFS) wind turbine. An engineering blade design and analysis code was developed in MATLAB to accommodate aerodynamic design and analysis of the blades. The linearisation for radial profiles of blade chord and twist angle for the FPFS wind turbine blade design was discussed. Results show that the proposed linearisation approach leads to reduced manufacturing cost and higher annual energy production (AEP), with minimal effects on the low wind speed performance. Comparative studies of mesh and turbulence models in 2D and 3D CFD modelling were conducted. The CFD predicted lift and drag coefficients of the airfoil S809 were compared with wind tunnel test data, and the 3D CFD modelling method of the NREL/NASA Phase VI wind turbine was validated against measurements. Airfoil aerodynamic characterisation and wind turbine power performance as well as 3D flow details were studied. The detailed flow characteristics from the CFD modelling are quantitatively comparable to the measurements, such as blade surface pressure distribution and integrated forces and moments. It is confirmed that the CFD approach is able to provide a more detailed qualitative and quantitative analysis for wind turbine airfoils and rotors.
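
    The BEM iteration at the heart of such a design code can be illustrated for a single blade element. The sketch below assumes constant lift and drag coefficients in place of tabulated airfoil data and omits the wake-induction and stall corrections discussed in the thesis; all numbers are illustrative.

    ```python
    import numpy as np

    V, omega, r = 7.0, 6.0, 2.0          # wind speed (m/s), rotor speed (rad/s), radius (m)
    B, chord = 3, 0.25                   # blade count, local chord (m)
    sigma = B * chord / (2 * np.pi * r)  # local solidity
    Cl, Cd = 1.0, 0.01                   # assumed constant section coefficients

    a, ap = 0.3, 0.0                     # axial / tangential induction guesses
    for _ in range(100):
        phi = np.arctan2((1 - a) * V, (1 + ap) * omega * r)   # inflow angle
        Cn = Cl * np.cos(phi) + Cd * np.sin(phi)              # normal coefficient
        Ct = Cl * np.sin(phi) - Cd * np.cos(phi)              # tangential coefficient
        a_new = 1.0 / (4 * np.sin(phi) ** 2 / (sigma * Cn) + 1)
        ap_new = 1.0 / (4 * np.sin(phi) * np.cos(phi) / (sigma * Ct) - 1)
        if abs(a_new - a) < 1e-6 and abs(ap_new - ap) < 1e-6:
            break
        a, ap = a_new, ap_new

    print(f"a = {a:.3f}, a' = {ap:.3f}, inflow angle = {np.degrees(phi):.1f} deg")
    ```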

  13. A Comparative Analysis of Method Books for Class Jazz Instruction

    ERIC Educational Resources Information Center

    Watson, Kevin E.

    2017-01-01

    The purpose of this study was to analyze and compare instructional topics and teaching approaches included in selected class method books for jazz pedagogy through content analysis methodology. Frequency counts for the number of pages devoted to each defined instructional content category were compiled and percentages of pages allotted to each…

  14. Case Problems for Problem-Based Pedagogical Approaches: A Comparative Analysis

    ERIC Educational Resources Information Center

    Dabbagh, Nada; Dass, Susan

    2013-01-01

    A comparative analysis of 51 case problems used in five problem-based pedagogical models was conducted to examine whether there are differences in their characteristics and the implications of such differences on the selection and generation of ill-structured case problems. The five pedagogical models were: situated learning, goal-based scenario,…

  15. Dealing with Complex Causality in Realist Synthesis: The Promise of Qualitative Comparative Analysis

    ERIC Educational Resources Information Center

    Sager, Fritz; Andereggen, Celine

    2012-01-01

    In this article, the authors state two arguments: first, that the four categories of context, politics, polity, and policy make an adequate framework for systematic review being both exhaustive and parsimonious; second, that the method of qualitative comparative analysis (QCA) is an appropriate methodical approach for gaining realistic results…

  16. Learning Competences in Open Mobile Environments: A Comparative Analysis between Formal and Non-Formal Spaces

    ERIC Educational Resources Information Center

    Figaredo, Daniel Domínguez; Miravalles, Paz Trillo

    2014-01-01

    As a result of the increasing use of mobile devices in education, new approaches to define the learning competences in the field of digitally mediated learning have emerged. This paper examines these approaches, using data obtained from empirical research with a group of Spanish university students. The analysis is focused on the experiences of…

  17. Realist identification of group-level latent variables for perinatal social epidemiology theory building.

    PubMed

    Eastwood, John Graeme; Jalaludin, Bin Badrudin; Kemp, Lynn Ann; Phung, Hai Ngoc

    2014-01-01

    We have previously reported in this journal on an ecological study of perinatal depressive symptoms in South Western Sydney. In that article, we briefly reported on a factor analysis that was utilized to identify empirical indicators for analysis. In this article, we report on the mixed method approach that was used to identify those latent variables. Social epidemiology has been slow to embrace a latent variable approach to the study of social, political, economic, and cultural structures and mechanisms, partly for philosophical reasons. Critical realist ontology and epistemology have been advocated as an appropriate methodological approach to both theory building and theory testing in the health sciences. We describe here an emergent mixed method approach that uses qualitative methods to identify latent constructs followed by factor analysis using empirical indicators chosen to measure identified qualitative codes. Comparative analysis of the findings is reported together with a limited description of realist approaches to abstract reasoning.

  18. Microbiome and Culture Based Analysis of Chronic Rhinosinusitis Compared to Healthy Sinus Mucosa.

    PubMed

    Koeller, Kerstin; Herlemann, Daniel P R; Schuldt, Tobias; Ovari, Attila; Guder, Ellen; Podbielski, Andreas; Kreikemeyer, Bernd; Olzowy, Bernhard

    2018-01-01

    The role of bacteria in chronic rhinosinusitis (CRS) is still not well understood. Whole microbiome analysis adds new aspects to our current understanding that is mainly based on isolated bacteria. It is still unclear how the results of microbiome analysis and the classical culture-based approaches interrelate. To address this, middle meatus swabs and tissue samples were obtained during sinus surgery in 5 patients with CRS with nasal polyps (CRSwNP), 5 patients with diffuse CRS without nasal polyps (CRSsNP), 5 patients with unilateral purulent maxillary CRS (upm CRS) and 3 patients with healthy sinus mucosa. Swabs were cultured, and associated bacteria were identified. Additionally, parts of each tissue sample also underwent culture approaches, and in parallel DNA was extracted for 16S rRNA gene amplicon-based microbiome analysis. From tissue samples 4.2 ± 1.2 distinct species per patient were cultured, from swabs 5.4 ± 1.6. The most frequently cultured species from the swabs were Propionibacterium acnes, Staphylococcus epidermidis, Corynebacterium spp. and Staphylococcus aureus. The 16S rRNA gene analysis revealed no clear differentiation of the bacterial community of healthy compared to CRS samples of unilateral purulent maxillary CRS and CRSwNP. However, the bacterial community of CRSsNP differed significantly from the healthy controls. In the CRSsNP samples Flavobacterium, Pseudomonas, Pedobacter, Porphyromonas, Stenotrophomonas, and Brevundimonas were significantly enriched compared to the healthy controls. Species isolated from culture did not generally correspond with the most abundant genera in microbiome analysis. Only Fusobacteria, Parvimonas, and Prevotella found in 2 unilateral purulent maxillary CRS samples by the cultivation-dependent approach were also found in the cultivation-independent approach in high abundance, suggesting a classic infectious pathogenesis of odontogenic origin in these two specific cases. Alterations of the bacterial community might be a more crucial factor for the development of CRSsNP compared to CRSwNP. Further studies are needed to investigate the relation between bacterial community characteristics and the development of CRSsNP.

  19. Microbiome and Culture Based Analysis of Chronic Rhinosinusitis Compared to Healthy Sinus Mucosa

    PubMed Central

    Koeller, Kerstin; Herlemann, Daniel P. R.; Schuldt, Tobias; Ovari, Attila; Guder, Ellen; Podbielski, Andreas; Kreikemeyer, Bernd; Olzowy, Bernhard

    2018-01-01

    The role of bacteria in chronic rhinosinusitis (CRS) is still not well understood. Whole microbiome analysis adds new aspects to our current understanding that is mainly based on isolated bacteria. It is still unclear how the results of microbiome analysis and the classical culture-based approaches interrelate. To address this, middle meatus swabs and tissue samples were obtained during sinus surgery in 5 patients with CRS with nasal polyps (CRSwNP), 5 patients with diffuse CRS without nasal polyps (CRSsNP), 5 patients with unilateral purulent maxillary CRS (upm CRS) and 3 patients with healthy sinus mucosa. Swabs were cultured, and associated bacteria were identified. Additionally, parts of each tissue sample also underwent culture approaches, and in parallel DNA was extracted for 16S rRNA gene amplicon-based microbiome analysis. From tissue samples 4.2 ± 1.2 distinct species per patient were cultured, from swabs 5.4 ± 1.6. The most frequently cultured species from the swabs were Propionibacterium acnes, Staphylococcus epidermidis, Corynebacterium spp. and Staphylococcus aureus. The 16S rRNA gene analysis revealed no clear differentiation of the bacterial community of healthy compared to CRS samples of unilateral purulent maxillary CRS and CRSwNP. However, the bacterial community of CRSsNP differed significantly from the healthy controls. In the CRSsNP samples Flavobacterium, Pseudomonas, Pedobacter, Porphyromonas, Stenotrophomonas, and Brevundimonas were significantly enriched compared to the healthy controls. Species isolated from culture did not generally correspond with the most abundant genera in microbiome analysis. Only Fusobacteria, Parvimonas, and Prevotella found in 2 unilateral purulent maxillary CRS samples by the cultivation-dependent approach were also found in the cultivation-independent approach in high abundance, suggesting a classic infectious pathogenesis of odontogenic origin in these two specific cases. Alterations of the bacterial community might be a more crucial factor for the development of CRSsNP compared to CRSwNP. Further studies are needed to investigate the relation between bacterial community characteristics and the development of CRSsNP. PMID:29755418

  20. Bayesian Factor Analysis as a Variable Selection Problem: Alternative Priors and Consequences

    PubMed Central

    Lu, Zhao-Hua; Chow, Sy-Miin; Loken, Eric

    2016-01-01

    Factor analysis is a popular statistical technique for multivariate data analysis. Developments in the structural equation modeling framework have enabled the use of hybrid confirmatory/exploratory approaches in which factor loading structures can be explored relatively flexibly within a confirmatory factor analysis (CFA) framework. Recently, a Bayesian structural equation modeling (BSEM) approach (Muthén & Asparouhov, 2012) has been proposed as a way to explore the presence of cross-loadings in CFA models. We show that the issue of determining factor loading patterns may be formulated as a Bayesian variable selection problem in which Muthén and Asparouhov’s approach can be regarded as a BSEM approach with ridge regression prior (BSEM-RP). We propose another Bayesian approach, denoted herein as the Bayesian structural equation modeling with spike and slab prior (BSEM-SSP), which serves as a one-stage alternative to the BSEM-RP. We review the theoretical advantages and disadvantages of both approaches and compare their empirical performance relative to two modification indices-based approaches and exploratory factor analysis with target rotation. A teacher stress scale data set (Byrne, 2012; Pettegrew & Wolf, 1982) is used to demonstrate our approach. PMID:27314566

  1. Environmental Barcoding: A Next-Generation Sequencing Approach for Biomonitoring Applications Using River Benthos

    PubMed Central

    Hajibabaei, Mehrdad; Shokralla, Shadi; Zhou, Xin; Singer, Gregory A. C.; Baird, Donald J.

    2011-01-01

    Timely and accurate biodiversity analysis poses an ongoing challenge for the success of biomonitoring programs. Morphology-based identification of bioindicator taxa is time-consuming and rarely supports species-level resolution, especially for immature life stages. Much work has been done in the past decade to develop alternative approaches for biodiversity analysis using DNA sequence-based approaches such as molecular phylogenetics and DNA barcoding. Ongoing assembly of DNA barcode reference libraries will provide the basis for a DNA-based identification system. The use of recently introduced next-generation sequencing (NGS) approaches in biodiversity science has the potential to further extend the application of DNA information for routine biomonitoring applications to an unprecedented scale. Here we demonstrate the feasibility of using 454 massively parallel pyrosequencing for species-level analysis of freshwater benthic macroinvertebrate taxa commonly used for biomonitoring. We designed our experiments in order to directly compare morphology-based, Sanger sequencing DNA barcoding, and next-generation environmental barcoding approaches. Our results show the ability of 454 pyrosequencing of mini-barcodes to accurately identify all species with more than 1% abundance in the pooled mixture. Although the approach failed to identify 6 rare species in the mixture, the presence of sequences from 9 species that were not represented by individuals in the mixture provides evidence that DNA-based analysis may yet provide a valuable approach in finding rare species in bulk environmental samples. We further demonstrate the application of the environmental barcoding approach by comparing benthic macroinvertebrates from an urban region to those obtained from a conservation area. Although considerable effort will be required to robustly optimize NGS tools to identify species from bulk environmental samples, our results indicate the potential of an environmental barcoding approach for biomonitoring programs. PMID:21533287

  2. Comparing Indirect Effects in Different Groups in Single-Group and Multi-Group Structural Equation Models

    PubMed Central

    Ryu, Ehri; Cheong, Jeewon

    2017-01-01

    In this article, we evaluated the performance of statistical methods in single-group and multi-group analysis approaches for testing group difference in indirect effects and for testing simple indirect effects in each group. We also investigated whether the performance of the methods in the single-group approach was affected when the assumption of equal variance was not satisfied. The assumption was critical for the performance of the two methods in the single-group analysis: the method using a product term for testing the group difference in a single path coefficient, and the Wald test for testing the group difference in the indirect effect. Bootstrap confidence intervals in the single-group approach and all methods in the multi-group approach were not affected by the violation of the assumption. We compared the performance of the methods and provided recommendations. PMID:28553248
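
    The bootstrap confidence interval, the one method found robust across both approaches, can be sketched for a single-group indirect effect a*b in a simple X -> M -> Y mediation model. The data are toy stand-ins, and the second regression omits X for brevity, which a full mediation model would include.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n = 200
    X = rng.normal(size=n)
    M = 0.5 * X + rng.normal(size=n)   # true a = 0.5
    Y = 0.4 * M + rng.normal(size=n)   # true b = 0.4

    def indirect(idx):
        a = np.polyfit(X[idx], M[idx], 1)[0]  # slope of M on X
        b = np.polyfit(M[idx], Y[idx], 1)[0]  # slope of Y on M (X omitted for brevity)
        return a * b

    boot = np.array([indirect(rng.integers(0, n, n)) for _ in range(2000)])
    lo, hi = np.percentile(boot, [2.5, 97.5])
    print(f"indirect effect 95% bootstrap CI: ({lo:.3f}, {hi:.3f})")
    ```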

  3. Self-Editing: On the Relation Between behavioral and Psycholinguistic Approaches

    PubMed Central

    Kimberly Epting, L; Critchfield, Thomas S

    2006-01-01

    In Skinner's (1957) conceptual analysis, the process of self-editing is integral to the dynamic complexities of multiply determined verbal behavior, but it has generated little in the way of an experimental analysis. The majority of scientific work on self-editing has taken place within linguistics and cognitive psycholinguistics. Here we compare and contrast behavioral and cognitive psycholinguistic approaches to self-editing, highlighting points of contact that can be identified despite fundamental differences in theoretical styles. We conclude that the two approaches are not mutually exclusive on all dimensions, and suggest that a consideration of cognitive psycholinguistic research may help to spur an experimental analysis of self-editing from a behavioral perspective. PMID:22478464

  4. Hybrid Wavelet De-noising and Rank-Set Pair Analysis approach for forecasting hydro-meteorological time series

    NASA Astrophysics Data System (ADS)

    WANG, D.; Wang, Y.; Zeng, X.

    2017-12-01

    Accurate, fast forecasting of hydro-meteorological time series is presently a major challenge in drought and flood mitigation. This paper proposes a hybrid approach, Wavelet De-noising (WD) and Rank-Set Pair Analysis (RSPA), that takes full advantage of a combination of the two approaches to improve forecasts of hydro-meteorological time series. WD allows decomposition and reconstruction of a time series by the wavelet transform, and hence separation of the noise from the original series. RSPA, a more reliable and efficient version of Set Pair Analysis, is integrated with WD to form the hybrid WD-RSPA approach. Two types of hydro-meteorological data sets with different characteristics and different levels of human influences at some representative stations are used to illustrate the WD-RSPA approach. The approach is also compared to three other generic methods: the conventional Auto Regressive Integrated Moving Average (ARIMA) method, Artificial Neural Networks (ANNs) (BP-error Back Propagation, MLP-Multilayer Perceptron and RBF-Radial Basis Function), and RSPA alone. Nine error metrics are used to evaluate the model performance. The results show that WD-RSPA is accurate, feasible, and effective. In particular, WD-RSPA is found to be the best among the various generic methods compared in this paper, even when the extreme events are included within a time series.

  5. [Efficacy of racecadotril vs. smectite, probiotics or zinc as an integral part of treatment of acute diarrhea in children under five years: A meta-analysis of multiple treatments].

    PubMed

    Gutiérrez-Castrellón, Pedro; Ortíz-Hernández, Anna Alejandra; Llamosas-Gallardo, Beatriz; Acosta-Bastidas, Mario A; Jiménez-Gutiérrez, Carlos; Diaz-García, Luisa; Anzo-Osorio, Anahí; Estevez-Jiménez, Juliana; Jiménez-Escobar, Irma; Vidal-Vázquez, Rosa Patricia

    2015-01-01

    Despite major advances in treatment, acute diarrhea continues to be a public health problem in children under five years. There is no systematic approach to treatment, and most evidence is assembled comparing active treatment vs. placebo. Systematic review of evidence on the efficacy of adjuvants for treatment of acute diarrhea through a network meta-analysis. A systematic search of multiple databases was performed for clinical trials related to the use of racecadotril, smectite, Lactobacillus GG, Lactobacillus reuteri, Saccharomyces boulardii and zinc as adjuvants in acute diarrhea. The primary endpoint was duration of diarrhea. Information is displayed through network meta-analysis. The superiority of each adjuvant was analyzed by the SUCRA (surface under the cumulative ranking curve) approach. Network meta-analysis showed racecadotril was better when compared with placebo and the other adjuvants. SUCRA analysis ranked racecadotril as the first option, followed by smectite and Lactobacillus reuteri. Considering a strategic decision-making approach, network meta-analysis allows us to establish the therapeutic superiority of racecadotril as an adjunct for the comprehensive management of acute diarrhea in children aged less than five years.
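
    The SUCRA ranking used above reduces to a small computation once rank probabilities are available: SUCRA is the mean cumulative rank probability over the first k-1 ranks. The probability matrix below is invented for illustration and does not reproduce the study's estimates.

    ```python
    import numpy as np

    # Rows = treatments, columns = probability of rank 1st..3rd (best first)
    p_rank = np.array([
        [0.70, 0.20, 0.10],   # racecadotril (illustrative)
        [0.20, 0.55, 0.25],   # smectite
        [0.10, 0.25, 0.65],   # Lactobacillus reuteri
    ])
    cum = np.cumsum(p_rank, axis=1)[:, :-1]  # cumulative over ranks 1..k-1
    sucra = cum.mean(axis=1)                 # SUCRA in [0, 1]; higher is better
    for name, s in zip(["racecadotril", "smectite", "L. reuteri"], sucra):
        print(f"{name}: SUCRA = {s:.2f}")
    ```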

  6. Augmenting Qualitative Text Analysis with Natural Language Processing: Methodological Study.

    PubMed

    Guetterman, Timothy C; Chang, Tammy; DeJonckheere, Melissa; Basu, Tanmay; Scruggs, Elizabeth; Vydiswaran, V G Vinod

    2018-06-29

    Qualitative research methods are increasingly being used across disciplines because of their ability to help investigators understand the perspectives of participants in their own words. However, qualitative analysis is a laborious and resource-intensive process. To achieve depth, researchers are limited to smaller sample sizes when analyzing text data. One potential method to address this concern is natural language processing (NLP). Qualitative text analysis involves researchers reading data, assigning code labels, and iteratively developing findings; NLP has the potential to automate part of this process. Unfortunately, little methodological research has been done to compare automatic coding using NLP techniques and qualitative coding, which is critical to establish the viability of NLP as a useful, rigorous analysis procedure. The purpose of this study was to compare the utility of a traditional qualitative text analysis, an NLP analysis, and an augmented approach that combines qualitative and NLP methods. We conducted a 2-arm cross-over experiment to compare qualitative and NLP approaches to analyze data generated through 2 text message (short message service) survey questions, one about prescription drugs and the other about police interactions, sent to youth aged 14-24 years. We randomly assigned a question to each of the 2 experienced qualitative analysis teams for independent coding and analysis before receiving NLP results. A third team separately conducted NLP analysis of the same 2 questions. We examined the results of our analyses to compare (1) the similarity of findings derived, (2) the quality of inferences generated, and (3) the time spent in analysis. The qualitative-only analysis for the drug question (n=58) yielded 4 major findings, whereas the NLP analysis yielded 3 findings that missed contextual elements. The qualitative and NLP-augmented analysis was the most comprehensive. For the police question (n=68), the qualitative-only analysis yielded 4 primary findings and the NLP-only analysis yielded 4 slightly different findings. Again, the augmented qualitative and NLP analysis was the most comprehensive and produced the highest quality inferences, increasing our depth of understanding (ie, details and frequencies). In terms of time, the NLP-only approach was quicker than the qualitative-only approach for the drug (120 vs 270 minutes) and police (40 vs 270 minutes) questions. An approach beginning with qualitative analysis followed by qualitative- or NLP-augmented analysis took longer than one beginning with NLP for both the drug (450 vs 240 minutes) and police (390 vs 220 minutes) questions. NLP provides both a foundation to code qualitatively more quickly and a method to validate qualitative findings. NLP methods were able to identify major themes found with traditional qualitative analysis but were not useful in identifying nuances. Traditional qualitative text analysis added important details and context. ©Timothy C Guetterman, Tammy Chang, Melissa DeJonckheere, Tanmay Basu, Elizabeth Scruggs, VG Vinod Vydiswaran. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 29.06.2018.
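
    The abstract does not specify the study's NLP pipeline, so the sketch below uses TF-IDF weighting plus NMF topic modeling as one plausible stand-in for surfacing candidate themes that analysts would then refine qualitatively; the example responses are invented.

    ```python
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.decomposition import NMF

    responses = [
        "I worry about side effects of prescription drugs",
        "Police stopped me near school for no reason",
        "My doctor explained the medication risks clearly",
        "Officers were respectful during the traffic stop",
    ]
    vec = TfidfVectorizer(stop_words="english")
    X = vec.fit_transform(responses)

    nmf = NMF(n_components=2, random_state=0)
    doc_topic = nmf.fit_transform(X)            # document-topic weights
    terms = vec.get_feature_names_out()
    for k, comp in enumerate(nmf.components_):
        top = [terms[i] for i in comp.argsort()[::-1][:4]]
        print(f"candidate theme {k}: {top}")    # seeds for qualitative coding
    ```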

  7. Global analysis of the yeast lipidome by quantitative shotgun mass spectrometry.

    PubMed

    Ejsing, Christer S; Sampaio, Julio L; Surendranath, Vineeth; Duchoslav, Eva; Ekroos, Kim; Klemm, Robin W; Simons, Kai; Shevchenko, Andrej

    2009-02-17

    Although the transcriptome, proteome, and interactome of several eukaryotic model organisms have been described in detail, lipidomes remain relatively uncharacterized. Using Saccharomyces cerevisiae as an example, we demonstrate that automated shotgun lipidomics analysis enabled lipidome-wide absolute quantification of individual molecular lipid species by streamlined processing of a single sample of only 2 million yeast cells. By comparative lipidomics, we achieved the absolute quantification of 250 molecular lipid species covering 21 major lipid classes. This analysis provided approximately 95% coverage of the yeast lipidome achieved with 125-fold improvement in sensitivity compared with previous approaches. Comparative lipidomics demonstrated that growth temperature and defects in lipid biosynthesis induce ripple effects throughout the molecular composition of the yeast lipidome. This work serves as a resource for molecular characterization of eukaryotic lipidomes, and establishes shotgun lipidomics as a powerful platform for complementing biochemical studies and other systems-level approaches.

  8. Improved statistical fluctuation analysis for measurement-device-independent quantum key distribution with four-intensity decoy-state method.

    PubMed

    Mao, Chen-Chen; Zhou, Xing-Yu; Zhu, Jian-Rong; Zhang, Chun-Hui; Zhang, Chun-Mei; Wang, Qin

    2018-05-14

    Recently, Zhang et al. [Phys. Rev. A 95, 012333 (2017)] developed a new approach to estimate the failure probability for the decoy-state BB84 QKD system when taking the finite-size key effect into account, which offers security comparable to the Chernoff bound while resulting in an improved key rate and transmission distance. Based on Zhang et al.'s work, we now extend this approach to the case of measurement-device-independent quantum key distribution (MDI-QKD), and for the first time implement it on the four-intensity decoy-state MDI-QKD system. Moreover, by utilizing joint constraints and collective error-estimation techniques, we can markedly increase the performance of practical MDI-QKD systems compared with either three- or four-intensity decoy-state MDI-QKD using Chernoff bound analysis, and achieve a much higher security level than approaches applying Gaussian approximation analysis.
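
    Finite-size fluctuation analysis of this kind bounds an unknown expectation from an observed count with a stated failure probability. The sketch below uses a simple Hoeffding-style deviation for illustration, not the refined Chernoff-bound treatment of Zhang et al.; the counts are invented.

    ```python
    import numpy as np

    def hoeffding_interval(observed, n, eps=1e-10):
        """Two-sided bound on the true rate given `observed` successes in n trials."""
        delta = np.sqrt(np.log(2.0 / eps) / (2.0 * n))
        rate = observed / n
        return max(rate - delta, 0.0), min(rate + delta, 1.0)

    # e.g. bounding a detection gain from 5.2e4 clicks in 1e8 pulses
    lo, hi = hoeffding_interval(observed=5.2e4, n=1e8)
    print(f"gain bounded in [{lo:.3e}, {hi:.3e}] with failure probability 1e-10")
    ```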

  9. ALTERNATIVE FUTURES ANALYSIS: A FRAMEWORK FOR COMMUNITY DECISION-MAKING

    EPA Science Inventory

    Alternative futures analysis is an assessment approach designed to inform community decisions about land and water use. We conducted an alternative futures analysis in Oregon's Willamette River Basin. Three alternative future landscapes for the year 2050 were depicted and compare...

  10. A hybrid wavelet de-noising and Rank-Set Pair Analysis approach for forecasting hydro-meteorological time series.

    PubMed

    Wang, Dong; Borthwick, Alistair G; He, Handan; Wang, Yuankun; Zhu, Jieyu; Lu, Yuan; Xu, Pengcheng; Zeng, Xiankui; Wu, Jichun; Wang, Lachun; Zou, Xinqing; Liu, Jiufu; Zou, Ying; He, Ruimin

    2018-01-01

    Accurate, fast forecasting of hydro-meteorological time series is presently a major challenge in drought and flood mitigation. This paper proposes a hybrid approach, wavelet de-noising (WD) and Rank-Set Pair Analysis (RSPA), that takes full advantage of a combination of the two approaches to improve forecasts of hydro-meteorological time series. WD allows decomposition and reconstruction of a time series by the wavelet transform, and hence separation of the noise from the original series. RSPA, a more reliable and efficient version of Set Pair Analysis, is integrated with WD to form the hybrid WD-RSPA approach. Two types of hydro-meteorological data sets with different characteristics and different levels of human influences at some representative stations are used to illustrate the WD-RSPA approach. The approach is also compared to three other generic methods: the conventional Auto Regressive Integrated Moving Average (ARIMA) method, Artificial Neural Networks (ANNs) (BP-error Back Propagation, MLP-Multilayer Perceptron and RBF-Radial Basis Function), and RSPA alone. Nine error metrics are used to evaluate the model performance. Compared to the three other generic methods, the WD-RSPA model invariably produced smaller error measures, which means that its forecasting capability is better than that of the other models. The results show that WD-RSPA is accurate, feasible, and effective. In particular, WD-RSPA is found to be the best among the various generic methods compared in this paper, even when extreme events are included within a time series. Copyright © 2017 Elsevier Inc. All rights reserved.
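
    The WD stage of the hybrid approach is standard wavelet thresholding, sketched below with the PyWavelets package; the RSPA forecasting stage is not reproduced. The db4 wavelet, decomposition level, and universal-threshold rule are assumptions, not necessarily the authors' settings.

    ```python
    import numpy as np
    import pywt  # PyWavelets

    rng = np.random.default_rng(2)
    t = np.linspace(0, 4, 512)
    series = np.sin(2 * np.pi * t) + 0.4 * rng.normal(size=t.size)  # noisy toy series

    coeffs = pywt.wavedec(series, "db4", level=4)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745     # noise estimate via MAD
    thresh = sigma * np.sqrt(2 * np.log(series.size))  # universal threshold
    denoised_coeffs = [coeffs[0]] + [
        pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]
    ]
    denoised = pywt.waverec(denoised_coeffs, "db4")    # input for the forecasting stage
    ```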

  11. Validation of a robust proteomic analysis carried out on formalin-fixed paraffin-embedded tissues of the pancreas obtained from mouse and human.

    PubMed

    Kojima, Kyoko; Bowersock, Gregory J; Kojima, Chinatsu; Klug, Christopher A; Grizzle, William E; Mobley, James A

    2012-11-01

    A number of reports have recently emerged with focus on extraction of proteins from formalin-fixed paraffin-embedded (FFPE) tissues for MS analysis; however, reproducibility and robustness as compared to flash-frozen controls is generally overlooked. The goal of this study was to identify and validate a practical and highly robust approach for the proteomics analysis of FFPE tissues. FFPE and matched frozen pancreatic tissues obtained from mice (n = 8) were analyzed using 1D-nanoLC-MS(MS)² following workup with commercially available kits. The chosen approach for FFPE tissues was found to be highly comparable to that of frozen. In addition, the total number of unique peptides identified between the two groups was highly similar, with 958 identified for FFPE and 1070 identified for frozen, with protein identifications that corresponded by approximately 80%. This approach was then applied to archived human FFPE pancreatic cancer specimens (n = 11) as compared to uninvolved tissues (n = 8), where 47 potential pancreatic ductal adenocarcinoma markers were identified as significantly increased, of which 28 were previously reported. Further, these proteins share strongly overlapping pathway associations to pancreatic cancer that include estrogen receptor α. Together, these data support the validation of an approach for the proteomic analysis of FFPE tissues that is straightforward and highly robust, which can also be effectively applied toward translational studies of disease. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Vibrations Detection in Industrial Pumps Based on Spectral Analysis to Increase Their Efficiency

    NASA Astrophysics Data System (ADS)

    Rachid, Belhadef; Hafaifa, Ahmed; Boumehraz, Mohamed

    2016-03-01

    Spectral analysis is the key tool for the study of vibration signals in rotating machinery. In this work, vibration analysis applied to the conditional preventive maintenance of such machines is proposed, addressing problems related to the detection of vibrations in the components of these machines. The vibration signal of a centrifugal pump was processed to demonstrate the benefits of the proposed approach. The results present the estimation of the pump vibration signal using the Fourier transform technique, compared with spectral analysis methods based on the Prony approach.
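
    The Fourier-based side of that comparison can be sketched with a Welch power-spectrum estimate of a synthetic pump vibration signal; the sampling rate and frequency components are invented, and the Prony-based estimator is not reproduced here.

    ```python
    import numpy as np
    from scipy.signal import welch

    fs = 5000.0                                  # sampling rate, Hz (assumed)
    t = np.arange(0, 2.0, 1 / fs)
    rng = np.random.default_rng(4)
    vib = (np.sin(2 * np.pi * 147 * t)           # shaft-related component (toy)
           + 0.3 * np.sin(2 * np.pi * 735 * t)   # vane-passing harmonic (toy)
           + 0.2 * rng.normal(size=t.size))      # broadband noise

    f, pxx = welch(vib, fs=fs, nperseg=2048)     # averaged periodogram
    print(f"dominant component near {f[np.argmax(pxx)]:.0f} Hz")
    ```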

  13. Comparing digital data processing techniques for surface mine and reclamation monitoring

    NASA Technical Reports Server (NTRS)

    Witt, R. G.; Bly, B. G.; Campbell, W. J.; Bloemer, H. H. L.; Brumfield, J. O.

    1982-01-01

    The results of three techniques used for processing Landsat digital data are compared for their utility in delineating areas of surface mining and subsequent reclamation. An unsupervised clustering algorithm (ISOCLS), a maximum-likelihood classifier (CLASFY), and a hybrid approach utilizing canonical analysis (ISOCLS/KLTRANS/ISOCLS) were compared by means of a detailed accuracy assessment with aerial photography at NASA's Goddard Space Flight Center. Results show that the hybrid approach was superior to the traditional techniques in distinguishing strip mined and reclaimed areas.

  14. [Approach and complications associated with suburethral synthetic slings in women: Systematic review and meta-analysis].

    PubMed

    Biardeau, X; Zanaty, M; Aoun, F; Benbouzid, S; Peyronnet, B

    2016-03-01

    We aim to assess the complications associated with the different approaches used in female suburethral sling surgery. We performed a search on Medline using the following keywords: "suburethral slings", "complications", "safety" and "randomized". Only randomized clinical trials including women and reporting intra- and postoperative complications associated with the retropubic (RP) approach, TOT and/or TVT-O were included. The meta-analysis was conducted using the Review Manager (RevMan 5.3) software delivered by the Cochrane Library. Out of 176 articles, 23 were included in the synthesis. Risks of bladder perforation during surgery (60/1482 vs 5/1479; OR=6.44; 95% CI [3.32-12.50]) and postoperative urinary retention (48/1160 vs 24/1159; OR=1.93; 95% CI [1.26-3.12]) were significantly higher with the RP approach, when compared with the transobturator (TO) approach (TOT or TVT-O). Conversely, the risk of prolonged postoperative pain was significantly lower after the RP approach, when compared with the TO approach (24/1156 vs 69/1149; OR=0.36; 95% CI [0.23-0.56]). Risks of intraoperative urethral injury, postoperative erosion and de novo overactive bladder were comparable between the two approaches. Data comparing TOT with TVT-O were scarce and did not allow us to draw conclusions about the complications associated with each technique. The RP approach was associated with a significant risk of bladder perforation and postoperative urinary retention. The TO approach was associated with a higher risk of prolonged postoperative pain. Copyright © 2015 Elsevier Masson SAS. All rights reserved.

  15. Approaches to the simulation of unconfined flow and perched groundwater flow in MODFLOW

    USGS Publications Warehouse

    Bedekar, Vivek; Niswonger, Richard G.; Kipp, Kenneth; Panday, Sorab; Tonkin, Matthew

    2012-01-01

    Various approaches have been proposed to manage the nonlinearities associated with the unconfined flow equation and to simulate perched groundwater conditions using the MODFLOW family of codes. The approaches comprise a variety of numerical techniques to prevent dry cells from becoming inactive and to achieve a stable solution focused on formulations of the unconfined, partially-saturated, groundwater flow equation. Keeping dry cells active avoids a discontinuous head solution which in turn improves the effectiveness of parameter estimation software that relies on continuous derivatives. Most approaches implement an upstream weighting of intercell conductance and Newton-Raphson linearization to obtain robust convergence. In this study, several published approaches were implemented in a stepwise manner into MODFLOW for comparative analysis. First, a comparative analysis of the methods is presented using synthetic examples that create convergence issues or difficulty in handling perched conditions with the more common dry-cell simulation capabilities of MODFLOW. Next, a field-scale three-dimensional simulation is presented to examine the stability and performance of the discussed approaches in larger, practical, simulation settings.

  16. Screening for single nucleotide variants, small indels and exon deletions with a next-generation sequencing based gene panel approach for Usher syndrome

    PubMed Central

    Krawitz, Peter M; Schiska, Daniela; Krüger, Ulrike; Appelt, Sandra; Heinrich, Verena; Parkhomchuk, Dmitri; Timmermann, Bernd; Millan, Jose M; Robinson, Peter N; Mundlos, Stefan; Hecht, Jochen; Gross, Manfred

    2014-01-01

    Usher syndrome is an autosomal recessive disorder characterized by both deafness and blindness. For the three clinical subtypes of Usher syndrome, causal mutations in altogether 12 genes and a modifier gene have been identified. Due to the genetic heterogeneity of Usher syndrome, molecular analysis is well suited to a comprehensive and parallelized analysis of all known genes by next-generation sequencing (NGS) approaches. We describe here the targeted enrichment and deep sequencing of exons of the Usher genes and compare the costs and workload of this approach with those of Sanger sequencing. We also present a bioinformatics analysis pipeline that allows us to detect single-nucleotide variants, short insertions and deletions, as well as copy number variations of one or more exons, from the same sequence data. Additionally, we present a flexible in silico gene panel for the analysis of sequence variants, in which newly identified genes can easily be included. We applied this approach to a cohort of 44 Usher patients and detected biallelic pathogenic mutations in 35 individuals and monoallelic mutations in eight individuals of our cohort. Thirty-nine of the sequence variants, including two heterozygous deletions comprising several exons of USH2A, have not been reported so far. Our NGS-based approach allowed us to assess single-nucleotide variants, small indels, and whole exon deletions in a single test. The described diagnostic approach is fast and cost-effective with a high molecular diagnostic yield. PMID:25333064

  17. Screening for single nucleotide variants, small indels and exon deletions with a next-generation sequencing based gene panel approach for Usher syndrome.

    PubMed

    Krawitz, Peter M; Schiska, Daniela; Krüger, Ulrike; Appelt, Sandra; Heinrich, Verena; Parkhomchuk, Dmitri; Timmermann, Bernd; Millan, Jose M; Robinson, Peter N; Mundlos, Stefan; Hecht, Jochen; Gross, Manfred

    2014-09-01

    Usher syndrome is an autosomal recessive disorder characterized by both deafness and blindness. For the three clinical subtypes of Usher syndrome, causal mutations in altogether 12 genes and a modifier gene have been identified. Due to the genetic heterogeneity of Usher syndrome, molecular analysis is well suited to a comprehensive and parallelized analysis of all known genes by next-generation sequencing (NGS) approaches. We describe here the targeted enrichment and deep sequencing of exons of the Usher genes and compare the costs and workload of this approach with those of Sanger sequencing. We also present a bioinformatics analysis pipeline that allows us to detect single-nucleotide variants, short insertions and deletions, as well as copy number variations of one or more exons, from the same sequence data. Additionally, we present a flexible in silico gene panel for the analysis of sequence variants, in which newly identified genes can easily be included. We applied this approach to a cohort of 44 Usher patients and detected biallelic pathogenic mutations in 35 individuals and monoallelic mutations in eight individuals of our cohort. Thirty-nine of the sequence variants, including two heterozygous deletions comprising several exons of USH2A, have not been reported so far. Our NGS-based approach allowed us to assess single-nucleotide variants, small indels, and whole exon deletions in a single test. The described diagnostic approach is fast and cost-effective with a high molecular diagnostic yield.

  18. Translating Knowledge through Blended Learning: A Comparative Analysis of Face-to-Face and Blended Learning Methods

    ERIC Educational Resources Information Center

    Golden, Thomas P.; Karpur, Arun

    2012-01-01

    This study is a comparative analysis of the impact of traditional face-to-face training contrasted with a blended learning approach, as it relates to improving skills, knowledge and attitudes for enhancing practices for achieving improved employment outcomes for individuals with disabilities. The study included two intervention groups: one…

  19. Approaching German Culture: A Tentative Analysis

    ERIC Educational Resources Information Center

    Tinsley, Royal; Woloshin, David

    1974-01-01

    A comparative analysis of the five universal problems of cultural orientation: 1) human nature, 2) social relations, 3) man and nature, 4) time, 5) space, as they are reflected in German and American culture. (PP)

  20. Comparison of reverse transcription-quantitative polymerase chain reaction methods and platforms for single cell gene expression analysis.

    PubMed

    Fox, Bridget C; Devonshire, Alison S; Baradez, Marc-Olivier; Marshall, Damian; Foy, Carole A

    2012-08-15

    Single cell gene expression analysis can provide insights into development and disease progression by profiling individual cellular responses as opposed to reporting the global average of a population. Reverse transcription-quantitative polymerase chain reaction (RT-qPCR) is the "gold standard" for the quantification of gene expression levels; however, the technical performance of kits and platforms aimed at single cell analysis has not been fully defined in terms of sensitivity and assay comparability. We compared three kits using purification columns (PicoPure) or direct lysis (CellsDirect and Cells-to-CT) combined with a one- or two-step RT-qPCR approach using dilutions of cells and RNA standards to the single cell level. Single cell-level messenger RNA (mRNA) analysis was possible using all three methods, although the precision, linearity, and effect of lysis buffer and cell background differed depending on the approach used. The impact of using a microfluidic qPCR platform versus a standard instrument was investigated for potential variability introduced by preamplification of template or scaling down of the qPCR to nanoliter volumes using laser-dissected single cell samples. The two approaches were found to be comparable. These studies show that accurate gene expression analysis is achievable at the single cell level and highlight the importance of well-validated experimental procedures for low-level mRNA analysis. Copyright © 2012 Elsevier Inc. All rights reserved.
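
    Although the abstract concerns kit and platform comparability rather than downstream analysis, the usual way such RT-qPCR readouts are turned into expression estimates is the 2^-ddCt method, sketched below with invented Ct values and an assumed ~100% amplification efficiency.

    ```python
    def ddct_fold_change(ct_target_sample, ct_ref_sample,
                         ct_target_control, ct_ref_control):
        """Fold change of a target gene in sample vs control, normalized to a
        reference gene, assuming ~100% amplification efficiency."""
        d_sample = ct_target_sample - ct_ref_sample
        d_control = ct_target_control - ct_ref_control
        return 2.0 ** -(d_sample - d_control)

    fc = ddct_fold_change(24.1, 18.0, 26.5, 18.2)  # Ct values are invented
    print(f"fold change = {fc:.2f}")               # >1 means up-regulated in the sample
    ```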

  1. Principal Angle Enrichment Analysis (PAEA): Dimensionally Reduced Multivariate Gene Set Enrichment Analysis Tool

    PubMed Central

    Clark, Neil R.; Szymkiewicz, Maciej; Wang, Zichen; Monteiro, Caroline D.; Jones, Matthew R.; Ma’ayan, Avi

    2016-01-01

    Gene set analysis of differential expression, which identifies collectively differentially expressed gene sets, has become an important tool for biology. The power of this approach lies in its reduction of the dimensionality of the statistical problem and its incorporation of biological interpretation by construction. Many approaches to gene set analysis have been proposed, but benchmarking their performance in the setting of real biological data is difficult due to the lack of a gold standard. In a previously published work we proposed a geometrical approach to differential expression which performed highly in benchmarking tests and compared well to the most popular methods of differential gene expression. As reported, this approach has a natural extension to gene set analysis which we call Principal Angle Enrichment Analysis (PAEA). PAEA employs dimensionality reduction and a multivariate approach for gene set enrichment analysis. However, the performance of this method has not been assessed nor its implementation as a web-based tool. Here we describe new benchmarking protocols for gene set analysis methods and find that PAEA performs highly. The PAEA method is implemented as a user-friendly web-based tool, which contains 70 gene set libraries and is freely available to the community. PMID:26848405

  2. Principal Angle Enrichment Analysis (PAEA): Dimensionally Reduced Multivariate Gene Set Enrichment Analysis Tool.

    PubMed

    Clark, Neil R; Szymkiewicz, Maciej; Wang, Zichen; Monteiro, Caroline D; Jones, Matthew R; Ma'ayan, Avi

    2015-11-01

    Gene set analysis of differential expression, which identifies collectively differentially expressed gene sets, has become an important tool for biology. The power of this approach lies in its reduction of the dimensionality of the statistical problem and its incorporation of biological interpretation by construction. Many approaches to gene set analysis have been proposed, but benchmarking their performance in the setting of real biological data is difficult due to the lack of a gold standard. In a previously published work we proposed a geometrical approach to differential expression which performed highly in benchmarking tests and compared well to the most popular methods of differential gene expression. As reported, this approach has a natural extension to gene set analysis which we call Principal Angle Enrichment Analysis (PAEA). PAEA employs dimensionality reduction and a multivariate approach for gene set enrichment analysis. However, the performance of this method has not been assessed nor its implementation as a web-based tool. Here we describe new benchmarking protocols for gene set analysis methods and find that PAEA performs highly. The PAEA method is implemented as a user-friendly web-based tool, which contains 70 gene set libraries and is freely available to the community.

  3. Meta-analysis of individual registry results enhances international registry collaboration.

    PubMed

    Paxton, Elizabeth W; Mohaddes, Maziar; Laaksonen, Inari; Lorimer, Michelle; Graves, Stephen E; Malchau, Henrik; Namba, Robert S; Kärrholm, John; Rolfson, Ola; Cafri, Guy

    2018-03-28

    Background and purpose - Although common in medical research, meta-analysis has not been widely adopted in registry collaborations. A meta-analytic approach in which each registry conducts a standardized analysis on its own data followed by a meta-analysis to calculate a weighted average of the estimates allows collaboration without sharing patient-level data. The value of meta-analysis as an alternative to individual patient data analysis is illustrated in this study by comparing the risk of revision of porous tantalum cups versus other uncemented cups in primary total hip arthroplasties from Sweden, Australia, and a US registry (2003-2015). Patients and methods - For both individual patient data analysis and meta-analysis approaches a Cox proportional hazard model was fit for time to revision, comparing porous tantalum (n = 23,201) with other uncemented cups (n = 128,321). Covariates included age, sex, diagnosis, head size, and stem fixation. In the meta-analysis approach, treatment effect size (i.e., Cox model hazard ratio) was calculated within each registry and a weighted average for the individual registries' estimates was calculated. Results - Patient-level data analysis and meta-analytic approaches yielded the same results with the porous tantalum cups having a higher risk of revision than other uncemented cups (HR (95% CI) 1.6 (1.4-1.7) and HR (95% CI) 1.5 (1.4-1.7), respectively). Adding the US cohort to the meta-analysis led to greater generalizability, increased precision of the treatment effect, and similar findings (HR (95% CI) 1.6 (1.4-1.7)) with increased risk of porous tantalum cups. Interpretation - The meta-analytic technique is a viable option to address privacy, security, and data ownership concerns allowing more expansive registry collaboration, greater generalizability, and increased precision of treatment effects.
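
    The aggregation step described above is a fixed-effect, inverse-variance weighted average of the registry-level log hazard ratios, which requires no patient-level data exchange. The sketch below uses invented estimates and standard errors, not the registries' actual Cox model output.

    ```python
    import numpy as np

    registries = {                    # name: (log hazard ratio, standard error)
        "Sweden":    (np.log(1.55), 0.06),
        "Australia": (np.log(1.62), 0.07),
        "US":        (np.log(1.48), 0.09),
    }
    lhr = np.array([b for b, _ in registries.values()])
    w = np.array([1.0 / se**2 for _, se in registries.values()])  # inverse variance

    pooled = np.sum(w * lhr) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    lo, hi = np.exp(pooled - 1.96 * se), np.exp(pooled + 1.96 * se)
    print(f"pooled HR = {np.exp(pooled):.2f} (95% CI {lo:.2f}-{hi:.2f})")
    ```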

  4. Modeling energy/economy interactions for conservation and renewable energy-policy analysis

    NASA Astrophysics Data System (ADS)

    Groncki, P. J.

    Energy policy, its implications for policy analysis, and the associated methodological tools are discussed. The evolution of one methodological approach is reported, covering the combined modeling system, the evolution of its component models in response to changing analytic needs, and the development of the integrated framework. The analyses performed over the past several years are summarized. The current philosophy behind energy policy is discussed and compared to recent history. Implications for current policy analysis and methodological approaches are drawn.

  5. Order-crossing removal in Gabor order tracking by independent component analysis

    NASA Astrophysics Data System (ADS)

    Guo, Yu; Tan, Kok Kiong

    2009-08-01

    Order-crossing problems in Gabor order tracking (GOT) of rotating machinery often occur when noise due to power-frequency interference, local structure resonance, etc., is prominent in applications. They can render the analysis results and the waveform-reconstruction tasks in GOT inaccurate or even meaningless. An approach is proposed in this paper to address the order-crossing problem by independent component analysis (ICA). With the approach, accurate order analysis results can be obtained and the waveforms of the order components of interest can be reconstructed or extracted from the recorded noisy data series. In addition, the ambiguities (permutation and scaling) of ICA results are also solved with the approach. The approach is amenable to applications in condition monitoring and fault diagnosis of rotating machinery. The evaluation of the approach is presented in detail based on simulations and an experiment on a rotor test rig. The results obtained using the proposed approach are compared with those obtained using the standard GOT. The comparison shows that the presented approach is more effective to solve order-crossing problems in GOT.
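
    The separation idea can be illustrated with scikit-learn's FastICA on a synthetic two-sensor mixture of a speed-dependent order and a fixed power-frequency interference; the signals and mixing matrix are invented, and the permutation/scaling ambiguity the authors address still has to be resolved after unmixing.

    ```python
    import numpy as np
    from sklearn.decomposition import FastICA

    t = np.linspace(0, 1, 4000)
    order = np.sin(2 * np.pi * (30 * t + 40 * t**2))  # order sweeping with shaft speed
    mains = 0.8 * np.sin(2 * np.pi * 50 * t)          # 50 Hz interference
    S = np.c_[order, mains]
    A = np.array([[1.0, 0.7],                         # mixing observed at two sensors
                  [0.6, 1.0]])
    X = S @ A.T

    ica = FastICA(n_components=2, random_state=0)
    recovered = ica.fit_transform(X)  # columns: estimated sources, order/scale ambiguous
    ```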

  6. Comparing Laser Interferometry and Atom Interferometry Approaches to Space-Based Gravitational-Wave Measurement

    NASA Technical Reports Server (NTRS)

    Baker, John; Thorpe, Ira

    2012-01-01

    Thoroughly studied classic space-based gravitational-wave mission concepts such as the Laser Interferometer Space Antenna (LISA) are based on laser-interferometry techniques. Ongoing developments in atom-interferometry techniques have spurred recently proposed alternative mission concepts. These different approaches can be understood on a common footing. We present a comparative analysis of how each type of instrument responds to some of the noise sources that may limit gravitational-wave mission concepts. Sensitivity to laser frequency instability is essentially the same for either approach. Spacecraft acceleration reference stability sensitivities are different, allowing smaller spacecraft separations in the atom interferometry approach, but acceleration noise requirements are nonetheless similar. Each approach has distinct additional measurement noise issues.

  7. Work schedule manager gap analysis : assessing the future training needs of work schedule managers using a strategic job analysis approach.

    DOT National Transportation Integrated Search

    2010-05-01

    This report documents the results of a strategic job analysis that examined the job tasks and knowledge, skills, abilities, and other characteristics (KSAOs) needed to perform the job of a work schedule manager. The strategic job analysis compared in...

  9. A Comparison of Component and Factor Patterns: A Monte Carlo Approach.

    ERIC Educational Resources Information Center

    Velicer, Wayne F.; And Others

    1982-01-01

    Factor analysis, image analysis, and principal component analysis are compared with respect to the factor patterns they would produce under various conditions. The general conclusion is that the three methods produce equivalent results. (Author/JKS)

  10. Waveform Similarity Analysis: A Simple Template Comparing Approach for Detecting and Quantifying Noisy Evoked Compound Action Potentials.

    PubMed

    Potas, Jason Robert; de Castro, Newton Gonçalves; Maddess, Ted; de Souza, Marcio Nogueira

    2015-01-01

    Experimental electrophysiological assessment of evoked responses from regenerating nerves is challenging due to the typical complex response of events dispersed over various latencies and poor signal-to-noise ratio. Our objective was to automate the detection of compound action potential events and derive their latencies and magnitudes using a simple cross-correlation template comparison approach. For this, we developed an algorithm called Waveform Similarity Analysis. To test the algorithm, challenging signals were generated in vivo by stimulating sural and sciatic nerves, whilst recording evoked potentials at the sciatic nerve and tibialis anterior muscle, respectively, in animals recovering from sciatic nerve transection. Our template for the algorithm was generated based on responses evoked from the intact side. We also simulated noisy signals and examined the output of the Waveform Similarity Analysis algorithm with imperfect templates. Signals were detected and quantified using Waveform Similarity Analysis, which was compared to event detection, latency and magnitude measurements of the same signals performed by a trained observer, a process we called Trained Eye Analysis. The Waveform Similarity Analysis algorithm could successfully detect and quantify simple or complex responses from nerve and muscle compound action potentials of intact or regenerated nerves. Incorrectly specifying the template outperformed Trained Eye Analysis for predicting signal amplitude, but produced consistent latency errors for the simulated signals examined. Compared to the trained eye, Waveform Similarity Analysis is automatic, objective, does not rely on the observer to identify and/or measure peaks, and can detect small clustered events even when signal-to-noise ratio is poor. Waveform Similarity Analysis provides a simple, reliable and convenient approach to quantify latencies and magnitudes of complex waveforms and therefore serves as a useful tool for studying evoked compound action potentials in neural regeneration studies.
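
    The template-comparison core of such a detector reduces to a normalized cross-correlation: slide the template across the recording, take the correlation peak as the event latency, and a least-squares scale factor as its magnitude. The sketch below shows this generic mechanism only; it is not the published Waveform Similarity Analysis algorithm.

```python
import numpy as np

def detect_event(signal, template, fs):
    """Find the best template match; return latency (s), magnitude, similarity."""
    template = template - template.mean()
    n = len(template)
    best_r, best_i = -np.inf, 0
    for i in range(len(signal) - n + 1):
        seg = signal[i:i + n] - signal[i:i + n].mean()
        denom = np.linalg.norm(seg) * np.linalg.norm(template)
        r = seg @ template / denom if denom > 0 else 0.0
        if r > best_r:
            best_r, best_i = r, i
    seg = signal[best_i:best_i + n]
    scale = seg @ template / (template @ template)  # least-squares amplitude
    return best_i / fs, scale, best_r

# Toy usage: a scaled, noisy copy of the template embedded at 25 ms
fs = 10_000
template = np.sin(np.linspace(0, np.pi, 40))
signal = 0.1 * np.random.default_rng(1).standard_normal(1000)
signal[250:290] += 0.8 * template
print(detect_event(signal, template, fs))
```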

  11. Waveform Similarity Analysis: A Simple Template Comparing Approach for Detecting and Quantifying Noisy Evoked Compound Action Potentials

    PubMed Central

    Potas, Jason Robert; de Castro, Newton Gonçalves; Maddess, Ted; de Souza, Marcio Nogueira

    2015-01-01

    Experimental electrophysiological assessment of evoked responses from regenerating nerves is challenging due to the typical complex response of events dispersed over various latencies and poor signal-to-noise ratio. Our objective was to automate the detection of compound action potential events and derive their latencies and magnitudes using a simple cross-correlation template comparison approach. For this, we developed an algorithm called Waveform Similarity Analysis. To test the algorithm, challenging signals were generated in vivo by stimulating sural and sciatic nerves, whilst recording evoked potentials at the sciatic nerve and tibialis anterior muscle, respectively, in animals recovering from sciatic nerve transection. Our template for the algorithm was generated based on responses evoked from the intact side. We also simulated noisy signals and examined the output of the Waveform Similarity Analysis algorithm with imperfect templates. Signals were detected and quantified using Waveform Similarity Analysis, which was compared to event detection, latency and magnitude measurements of the same signals performed by a trained observer, a process we called Trained Eye Analysis. The Waveform Similarity Analysis algorithm could successfully detect and quantify simple or complex responses from nerve and muscle compound action potentials of intact or regenerated nerves. Incorrectly specifying the template outperformed Trained Eye Analysis for predicting signal amplitude, but produced consistent latency errors for the simulated signals examined. Compared to the trained eye, Waveform Similarity Analysis is automatic, objective, does not rely on the observer to identify and/or measure peaks, and can detect small clustered events even when signal-to-noise ratio is poor. Waveform Similarity Analysis provides a simple, reliable and convenient approach to quantify latencies and magnitudes of complex waveforms and therefore serves as a useful tool for studying evoked compound action potentials in neural regeneration studies. PMID:26325291

  12. Aligning Collaborative and Culturally Responsive Evaluation Approaches

    ERIC Educational Resources Information Center

    Askew, Karyl; Beverly, Monifa Green; Jay, Michelle L.

    2012-01-01

    The authors, three African-American women trained as collaborative evaluators, offer a comparative analysis of collaborative evaluation (O'Sullivan, 2004) and culturally responsive evaluation approaches (Frierson, Hood, & Hughes, 2002; Kirkhart & Hopson, 2010). Collaborative evaluation techniques immerse evaluators in the cultural milieu…

  13. Comparing positive and negative reinforcement: A fantasy experiment.

    PubMed

    Nevin, John A; Mandell, Charlotte

    2017-01-01

    We propose quantitative experimental approaches to the question of whether positive and negative reinforcement are functionally different, and discuss scientific and ethical concerns that would arise if these approaches were pursued. © 2017 Society for the Experimental Analysis of Behavior.

  14. Laparoscopic anterior versus endoscopic posterior approach for adrenalectomy: a shift to a new golden standard?

    PubMed

    Vrielink, O M; Wevers, K P; Kist, J W; Borel Rinkes, I H M; Hemmer, P H J; Vriens, M R; de Vries, J; Kruijff, S

    2017-08-01

    There has been an increased utilization of the posterior retroperitoneal approach (PRA) for adrenalectomy alongside the "classic" laparoscopic transabdominal technique (LTA). The aim of this study was to compare both procedures based on outcome variables at various ranges of tumor size. A retrospective analysis was performed on 204 laparoscopic transabdominal (UMC Groningen) and 57 retroperitoneal (UMC Utrecht) adrenalectomies between 1998 and 2013. We applied a univariate and multivariate regression analysis. Mann-Whitney and chi-squared tests were used to compare outcome variables between both approaches. Both mean operation time and median blood loss were significantly lower in the PRA group with 102.1 (SD 33.5) vs. 173.3 (SD 59.1) minutes (p < 0.001) and 0 (0-200) vs. 50 (0-1000) milliliters (p < 0.001), respectively. The shorter operation time in PRA was independent of tumor size. Complication rates were higher in the LTA (19.1%) compared to PRA (8.8%). There was no significant difference in recovery time between both approaches. Application of the PRA decreases operation time, blood loss, and complication rates compared to LTA. This might encourage institutions that use the LTA to start using PRA in patients with adrenal tumors, independent of tumor size.

  15. Analysis of two different surgical approaches for fractures of the mandibular condyle.

    PubMed

    Kumaran, S; Thambiah, L J

    2012-01-01

    Fractures of the condyle account for one third of all mandibular fractures. The different surgical approaches to the condyle described hitherto testify to the advantages and disadvantages of the various techniques used for approaching the condyle in such fractures. We describe and compare two such surgical techniques in this study. The aim of this study is to compare the outcomes of treating condylar fractures by two different surgical techniques: the mini retromandibular approach and the preauricular approach. A prospective study of 31 patients who had suffered mandibular condylar fractures was carried out. Of these, 26 patients had unilateral condylar fractures and 5 patients had bilateral fractures. Further, 19 of these patients were treated by the mini retromandibular approach and 12 by the preauricular approach. The treated patients were followed up and evaluated for a minimum period of 1 year and assessed for parameters such as maximum mouth opening, lateral movement on the fractured side, mandibular movements such as protrusion, dental occlusion, scar formation, facial nerve weakness, salivary fistula formation and time taken for completion of the surgical procedure. The t-test was used for statistical analysis of the data obtained in the study. Dental occlusion was restored in all cases, and good anatomical reduction was achieved. The mean operating time was longer in the preauricular approach (63.53 ± 18.12 minutes SD) than in the mini retromandibular approach (45.22 ± 18.86 minutes SD). Scar formation was satisfactory in almost all cases.

  16. Comparative analysis of aging policy reforms in Argentina, Chile, Costa Rica, and Mexico.

    PubMed

    Calvo, Esteban; Berho, Maureen; Roqué, Mónica; Amaro, Juan Sebastián; Morales, Fernando; Rivera, Emiliana; Gutiérrez Robledo, Luis Miguel F; López, Elizabeth Caro; Canals, Bernardita; Kornfeld, Rosa

    2018-04-16

    This investigation uses case studies and comparative analysis to review and analyze aging policy in Argentina, Chile, Costa Rica, and Mexico, and uncovers similarities and relevant trends in the substance of historical and current aging policy across countries. Initial charity-based approaches to poverty and illness have been gradually replaced by a rights-based approach considering broader notions of well-being, and recent reforms emphasize the need for national, intersectoral, evidence-based policy. The results of this study have implications for understanding aging policy in Latin America from a welfare regime and policymakers' perspective, identifying priorities for intervention, and informing policy reforms in developing countries worldwide.

  17. A modal approach to the prediction of the sound reduction index

    NASA Astrophysics Data System (ADS)

    Tisseyre, Alain; Courné, Cécile; Buzzy, Thomas; Moulinier, André

    2003-04-01

    The calculation of the sound reduction index in modal analysis is presented in a general way; different possible approaches are described. These calculations are done in two steps: a vibratory study to determine the transverse displacement of the plate, and a study of radiation. The specificity of orthotropic plates is presented. This study led to the implementation of a calculation algorithm. Initial hypotheses are indicated, as well as results obtained for various plates or partitions. The modal analysis results are then compared with those of the Cremer-Sewell approach.

  18. Analysis of the Temperature and Strain-Rate Dependences of Strain Hardening

    NASA Astrophysics Data System (ADS)

    Kreyca, Johannes; Kozeschnik, Ernst

    2018-01-01

    A classical constitutive modeling-based Ansatz for the impact of thermal activation on the stress-strain response of metallic materials is compared with the state parameter-based Kocks-Mecking model. The predicted functional dependencies suggest that, in the first approach, only the dislocation storage mechanism is a thermally activated process, whereas, in the second approach, only the mechanism of dynamic recovery is. In contradiction to each of these individual approaches, our analysis and comparison with experimental evidence shows that thermal activation contributes both to dislocation generation and annihilation.
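
    For reference, the state-parameter model referred to here is usually written in terms of a single dislocation density ρ. A textbook Kocks-Mecking sketch (the standard form, not necessarily the authors' specific parameterization) is

```latex
\frac{\mathrm{d}\rho}{\mathrm{d}\varepsilon} \;=\; k_1\sqrt{\rho} \;-\; k_2(\dot{\varepsilon}, T)\,\rho,
\qquad
\sigma \;=\; \sigma_0 + \alpha M G b \sqrt{\rho},
```

    where the k1 term describes dislocation storage, the k2 term (dependent on strain rate and temperature) describes dynamic recovery, and the Taylor relation converts ρ to flow stress. The paper's point is that experimental evidence requires thermal activation in both terms, not in only one.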

  19. Comparative shotgun proteomics using spectral count data and quasi-likelihood modeling.

    PubMed

    Li, Ming; Gray, William; Zhang, Haixia; Chung, Christine H; Billheimer, Dean; Yarbrough, Wendell G; Liebler, Daniel C; Shyr, Yu; Slebos, Robbert J C

    2010-08-06

    Shotgun proteomics provides the most powerful analytical platform for global inventory of complex proteomes using liquid chromatography-tandem mass spectrometry (LC-MS/MS) and allows a global analysis of protein changes. Nevertheless, sampling of complex proteomes by current shotgun proteomics platforms is incomplete, and this contributes to variability in assessment of peptide and protein inventories by spectral counting approaches. Thus, shotgun proteomics data pose challenges in comparing proteomes from different biological states. We developed an analysis strategy using quasi-likelihood Generalized Linear Modeling (GLM), included in a graphical interface software package (QuasiTel) that reads standard output from protein assemblies created by IDPicker, an HTML-based user interface to query shotgun proteomic data sets. This approach was compared to four other statistical analysis strategies: Student t test, Wilcoxon rank test, Fisher's Exact test, and Poisson-based GLM. We analyzed the performance of these tests to identify differences in protein levels based on spectral counts in a shotgun data set in which equimolar amounts of 48 human proteins were spiked at different levels into whole yeast lysates. Both GLM approaches and the Fisher Exact test performed adequately, each with their unique limitations. We subsequently compared the proteomes of normal tonsil epithelium and HNSCC using this approach and identified 86 proteins with differential spectral counts between normal tonsil epithelium and HNSCC. We selected 18 proteins from this comparison for verification of protein levels between the individual normal and tumor tissues using liquid chromatography-multiple reaction monitoring mass spectrometry (LC-MRM-MS). This analysis confirmed the magnitude and direction of the protein expression differences in all 6 proteins for which reliable data could be obtained. Our analysis demonstrates that shotgun proteomic data sets from different tissue phenotypes are sufficiently rich in quantitative information and that statistically significant differences in proteins spectral counts reflect the underlying biology of the samples.
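
    The quasi-likelihood modeling of spectral counts can be sketched as an overdispersed Poisson GLM whose dispersion is estimated from the Pearson chi-square statistic. The snippet below is a generic statsmodels illustration on fabricated counts, not the QuasiTel implementation.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Fabricated spectral counts for one protein: 6 runs per group
counts = np.r_[rng.poisson(20, 6), rng.poisson(35, 6)]
group = np.r_[np.zeros(6), np.ones(6)]
totals = rng.poisson(50_000, 12)          # total spectra per run (offset)

X = sm.add_constant(group)
model = sm.GLM(counts, X, family=sm.families.Poisson(),
               offset=np.log(totals))
# scale='X2' estimates the dispersion from Pearson chi-square, turning
# the Poisson fit into a quasi-Poisson (quasi-likelihood) fit
result = model.fit(scale='X2')
print(result.params, result.pvalues)
```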

  20. Comparative Shotgun Proteomics Using Spectral Count Data and Quasi-Likelihood Modeling

    PubMed Central

    2010-01-01

    Shotgun proteomics provides the most powerful analytical platform for global inventory of complex proteomes using liquid chromatography−tandem mass spectrometry (LC−MS/MS) and allows a global analysis of protein changes. Nevertheless, sampling of complex proteomes by current shotgun proteomics platforms is incomplete, and this contributes to variability in assessment of peptide and protein inventories by spectral counting approaches. Thus, shotgun proteomics data pose challenges in comparing proteomes from different biological states. We developed an analysis strategy using quasi-likelihood Generalized Linear Modeling (GLM), included in a graphical interface software package (QuasiTel) that reads standard output from protein assemblies created by IDPicker, an HTML-based user interface to query shotgun proteomic data sets. This approach was compared to four other statistical analysis strategies: Student t test, Wilcoxon rank test, Fisher’s Exact test, and Poisson-based GLM. We analyzed the performance of these tests to identify differences in protein levels based on spectral counts in a shotgun data set in which equimolar amounts of 48 human proteins were spiked at different levels into whole yeast lysates. Both GLM approaches and the Fisher Exact test performed adequately, each with their unique limitations. We subsequently compared the proteomes of normal tonsil epithelium and HNSCC using this approach and identified 86 proteins with differential spectral counts between normal tonsil epithelium and HNSCC. We selected 18 proteins from this comparison for verification of protein levels between the individual normal and tumor tissues using liquid chromatography−multiple reaction monitoring mass spectrometry (LC−MRM-MS). This analysis confirmed the magnitude and direction of the protein expression differences in all 6 proteins for which reliable data could be obtained. Our analysis demonstrates that shotgun proteomic data sets from different tissue phenotypes are sufficiently rich in quantitative information and that statistically significant differences in proteins spectral counts reflect the underlying biology of the samples. PMID:20586475

  1. Multivariate Analysis of Schools and Educational Policy.

    ERIC Educational Resources Information Center

    Kiesling, Herbert J.

    This report describes a multivariate analysis technique that approaches the problems of educational production function analysis by (1) using comparable measures of output across large experiments, (2) accounting systematically for differences in socioeconomic background, and (3) treating the school as a complete system in which different…

  2. Estimating Effectiveness of the Control of Violence and Socioeconomic Development in Colombia: An Application of Dynamic Data Envelopment Analysis and Data Panel Approach

    ERIC Educational Resources Information Center

    Poveda, Alexander Cotte

    2012-01-01

    This paper develops an index to evaluate the level of effectiveness of the control of violence based on the data envelopment analysis approach. The index is used to examine the grade of effectiveness of the control of violence at the level of Colombian departments between 1993 and 2007. Comparing the results across Colombian departments, we find…
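
    The DEA machinery behind such an index can be sketched with the input-oriented CCR multiplier model solved as a linear program: for each decision-making unit, find the most favorable output/input weights subject to no unit exceeding efficiency 1. The data below are invented, and the paper's dynamic DEA formulation is more involved than this basic model.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of unit o.

    X: (n_units, n_inputs), Y: (n_units, n_outputs). Decision variables
    z = [u, v]; maximize u.y_o s.t. v.x_o = 1 and u.y_j - v.x_j <= 0.
    """
    n, m = X.shape
    s = Y.shape[1]
    c = np.r_[-Y[o], np.zeros(m)]              # linprog minimizes, so -u.y_o
    A_ub = np.c_[Y, -X]                        # u.y_j - v.x_j <= 0 for all j
    b_ub = np.zeros(n)
    A_eq = np.r_[np.zeros(s), X[o]][None, :]   # v.x_o = 1 (normalization)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=(0, None))
    return -res.fun                            # efficiency score in (0, 1]

# Hypothetical units with 2 inputs and 1 output each
X = np.array([[4.0, 3.0], [7.0, 3.0], [8.0, 1.0], [4.0, 2.0]])
Y = np.ones((4, 1))
print([round(ccr_efficiency(X, Y, o), 3) for o in range(len(X))])
```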

  3. Enabling Design for Affordability: An Epoch-Era Analysis Approach

    DTIC Science & Technology

    2013-04-01

    [Front matter recovered from a garbled proceedings extract; only titles and authors survive:] "Analysis on the DoD Pre-Milestone B Acquisition Processes," Danielle Worger and Teresa Wu, Arizona State University; Eugene Rex Jalao, Arizona State...; "...Management Best Practices," Brandon Keller and J. Robert Wirthlin, Air Force Institute of Technology; "The RITE Approach to Agile Acquisition," Timothy Boyce...; "...Change," Kathryn Aten and John T. Dillard, Naval Postgraduate School; "A Comparative Assessment of the Navy's Future Naval Capabilities (FNC) Process."

  4. Numerical Analysis of Flood modeling of upper Citarum River under Extreme Flood Condition

    NASA Astrophysics Data System (ADS)

    Siregar, R. I.

    2018-02-01

    This paper focuses on numerical methods and computation for analysing flood parameters. Water level and flood discharge are the flood parameters solved by the numerical approach. The numerical methods applied in this paper to unsteady flow conditions have strengths and weaknesses; among their strengths, they are easily applied to cases with irregular flow boundaries. The study area is the upper Citarum watershed, Bandung, West Java. This paper uses a computational approach with Force2 programming and HEC-RAS to solve the flow problem in the upper Citarum River and to investigate and forecast extreme flood conditions. The numerical analysis is based on extreme flood events that have occurred in the upper Citarum watershed. The modeled water level and extreme flood discharge were compared with measurement data for validation. The inundation area of the flood that happened in 2010 is about 75.26 square kilometres. Comparing the two methods shows that the FEM analysis with the Force2 program agrees best with the validation data, with a Nash index of 0.84 against 0.76 for HEC-RAS for water level; for discharge, the Nash index is 0.80 using Force2 and 0.79 using HEC-RAS.
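
    The "Nash index" used for validation here is presumably the Nash-Sutcliffe model efficiency, which is compact enough to state as a one-function sketch:

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit; 0 means the model
    predicts no better than the mean of the observations."""
    observed = np.asarray(observed, float)
    simulated = np.asarray(simulated, float)
    return 1.0 - (np.sum((observed - simulated) ** 2)
                  / np.sum((observed - observed.mean()) ** 2))

# e.g. nash_sutcliffe(measured_stage, modeled_stage) -> 0.84 for Force2
```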

  5. Improving Iranian High School Students' Reading Comprehension Using the Tenets of Genre Analysis

    ERIC Educational Resources Information Center

    Adelnia, Rezvan; Salehi, Hadi

    2016-01-01

    This study investigates the impact of a genre-based approach on improving the reading ability of Iranian EFL learners. To this end, an attempt was made to compare the genre-based approach to teaching reading with traditional approaches. For this purpose, by administering the Oxford Quick Placement…

  6. Composite scores in comparative effectiveness research: counterbalancing parsimony and dimensionality in patient-reported outcomes.

    PubMed

    Schwartz, Carolyn E; Patrick, Donald L

    2014-07-01

    When planning a comparative effectiveness study comparing disease-modifying treatments, competing demands influence choice of outcomes. Current practice emphasizes parsimony, although understanding multidimensional treatment impact can help to personalize medical decision-making. We discuss both sides of this 'tug of war'. We discuss the assumptions, advantages and drawbacks of composite scores and multidimensional outcomes. We describe possible solutions to the multiple comparison problem, including conceptual hierarchy distinctions, statistical approaches, 'real-world' benchmarks of effectiveness and subgroup analysis. We conclude that comparative effectiveness research should consider multiple outcome dimensions and compare different approaches that fit the individual context of study objectives.

  7. A Comparative Meta-Analysis of 5E and Traditional Approaches in Turkey

    ERIC Educational Resources Information Center

    Anil, Özgür; Batdi, Veli

    2015-01-01

    The aim of this study is to compare the 5E learning model with traditional learning methods in terms of their effect on students' academic achievement, retention and attitude scores. In this context, the meta-analytic method known as the "analysis of analyses" was used and a review undertaken of the studies and theses (N = 14) executed…

  8. Replication Research in Comparative Genre Analysis in English for Academic Purposes

    ERIC Educational Resources Information Center

    Basturkmen, Helen

    2014-01-01

    In recent years a number of comparative studies based on an established approach to genre analysis have been published in the English for Academic Purposes (EAP) literature. Studies in this emerging strand of research typically aim to identify how the rhetorical structure of a particular genre (a text type) or part of a genre may vary across…

  9. A Comparative Analysis of Collaborative Leadership Skills Employed by Graduates of Cohort Based and Non-Cohort Based Doctoral Programs in Educational Leadership

    ERIC Educational Resources Information Center

    Breton Caminos, Michelle Evangeline

    2015-01-01

    This qualitative comparative case analysis investigates the leadership approaches of the graduates of two educational leadership doctoral programs in Upstate New York--one a cohort-modeled program, the other a non-cohort program--with specific attention to collaboration. Responses from participants indicate key differences in Engaging Communities,…

  10. How to Construct More Accurate Student Models: Comparing and Optimizing Knowledge Tracing and Performance Factor Analysis

    ERIC Educational Resources Information Center

    Gong, Yue; Beck, Joseph E.; Heffernan, Neil T.

    2011-01-01

    Student modeling is a fundamental concept applicable to a variety of intelligent tutoring systems (ITS). However, there is not a lot of practical guidance on how to construct and train such models. This paper compares two approaches for student modeling, Knowledge Tracing (KT) and Performance Factors Analysis (PFA), by evaluating their predictive…
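
    Of the two models compared, PFA is the more compact to sketch: a logistic regression on a per-skill intercept plus counts of the student's prior successes and failures on that skill. Below is a minimal illustration on fabricated response logs; the feature encoding (and the shared success/failure weights) are simplifying assumptions, not the paper's exact setup.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Fabricated log: for each response, (skill id, prior successes/failures)
n, n_skills = 500, 3
skill = rng.integers(0, n_skills, n)
succ = rng.integers(0, 10, n)
fail = rng.integers(0, 10, n)
logit = -0.5 + 0.2 * skill + 0.3 * succ - 0.2 * fail
y = rng.random(n) < 1 / (1 + np.exp(-logit))   # simulated correctness

# PFA features: one-hot skill ease (beta_k) + success/failure counts
X = np.c_[np.eye(n_skills)[skill], succ, fail]
pfa = LogisticRegression(max_iter=1000).fit(X, y)
print(pfa.coef_)            # [beta_0..beta_2, gamma (success), rho (failure)]
```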

  11. Measuring What People Value: A Comparison of “Attitude” and “Preference” Surveys

    PubMed Central

    Phillips, Kathryn A; Johnson, F Reed; Maddala, Tara

    2002-01-01

    Objective To compare and contrast methods and findings from two approaches to valuation used in the same survey: measurement of “attitudes” using simple rankings and ratings versus measurement of “preferences” using conjoint analysis. Conjoint analysis, a stated preference method, involves comparing scenarios composed of attribute descriptions by ranking, rating, or choosing scenarios. We explore possible explanations for our findings using focus groups conducted after the quantitative survey. Methods A self-administered survey, measuring attitudes and preferences for HIV tests, was conducted at HIV testing sites in San Francisco in 1999–2000 (n = 365, response rate=96 percent). Attitudes were measured and analyzed using standard approaches. Conjoint analysis scenarios were developed using a fractional factorial design and results analyzed using random effects probit models. We examined how the results using the two approaches were both similar and different. Results We found that “attitudes” and “preferences” were generally consistent, but there were some important differences. Although rankings based on the attitude and conjoint analysis surveys were similar, closer examination revealed important differences in how respondents valued price and attributes with “halo” effects, variation in how attribute levels were valued, and apparent differences in decision-making processes. Conclusions To our knowledge, this is the first study to compare attitude surveys and conjoint analysis surveys and to explore the meaning of the results using post-hoc focus groups. Although the overall findings for attitudes and preferences were similar, the two approaches resulted in some different conclusions. Health researchers should consider the advantages and limitations of both methods when determining how to measure what people value. PMID:12546291

  12. Functional MRI Preprocessing in Lesioned Brains: Manual Versus Automated Region of Interest Analysis

    PubMed Central

    Garrison, Kathleen A.; Rogalsky, Corianne; Sheng, Tong; Liu, Brent; Damasio, Hanna; Winstein, Carolee J.; Aziz-Zadeh, Lisa S.

    2015-01-01

    Functional magnetic resonance imaging (fMRI) has significant potential in the study and treatment of neurological disorders and stroke. Region of interest (ROI) analysis in such studies allows for testing of strong a priori clinical hypotheses with improved statistical power. A commonly used automated approach to ROI analysis is to spatially normalize each participant’s structural brain image to a template brain image and define ROIs using an atlas. However, in studies of individuals with structural brain lesions, such as stroke, the gold standard approach may be to manually hand-draw ROIs on each participant’s non-normalized structural brain image. Automated approaches to ROI analysis are faster and more standardized, yet are susceptible to preprocessing error (e.g., normalization error) that can be greater in lesioned brains. The manual approach to ROI analysis has high demand for time and expertise, but may provide a more accurate estimate of brain response. In this study, commonly used automated and manual approaches to ROI analysis were directly compared by reanalyzing data from a previously published hypothesis-driven cognitive fMRI study, involving individuals with stroke. The ROI evaluated is the pars opercularis of the inferior frontal gyrus. Significant differences were identified in task-related effect size and percent-activated voxels in this ROI between the automated and manual approaches to ROI analysis. Task interactions, however, were consistent across ROI analysis approaches. These findings support the use of automated approaches to ROI analysis in studies of lesioned brains, provided they employ a task interaction design. PMID:26441816

  13. Comparative Analysis of Automatic Exudate Detection between Machine Learning and Traditional Approaches

    NASA Astrophysics Data System (ADS)

    Sopharak, Akara; Uyyanonvara, Bunyarit; Barman, Sarah; Williamson, Thomas

    To prevent blindness from diabetic retinopathy, periodic screening and early diagnosis are necessary. Due to the lack of expert ophthalmologists in rural areas, automated early detection of exudates (one visible sign of diabetic retinopathy) could help to reduce the incidence of blindness among diabetic patients. Traditional automatic exudate detection methods are based on specific parameter configurations, while machine learning approaches, which seem more flexible, may be computationally costly. A comparative analysis of traditional and machine learning methods for exudate detection, namely mathematical morphology, fuzzy c-means clustering, the naive Bayes classifier, the support vector machine and the nearest neighbor classifier, is presented. Detected exudates are validated against expert ophthalmologists' hand-drawn ground truths. The sensitivity, specificity, precision, accuracy and time complexity of each method are also compared.
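
    The machine-learning side of such a comparison can be sketched generically with scikit-learn, deriving the same sensitivity/specificity metrics from a confusion matrix. The data below are synthetic stand-ins for exudate/non-exudate pixel features, not retinal images.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import confusion_matrix

# Synthetic, imbalanced pixel-level features (exudates are the rare class)
X, y = make_classification(n_samples=2000, n_features=8, weights=[0.9],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for name, clf in [("naive Bayes", GaussianNB()),
                  ("SVM", SVC()),
                  ("nearest neighbor", KNeighborsClassifier())]:
    pred = clf.fit(X_tr, y_tr).predict(X_te)
    tn, fp, fn, tp = confusion_matrix(y_te, pred).ravel()
    print(f"{name}: sensitivity={tp / (tp + fn):.2f} "
          f"specificity={tn / (tn + fp):.2f}")
```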

  14. Hip joint center localisation: A biomechanical application to hip arthroplasty population

    PubMed Central

    Bouffard, Vicky; Begon, Mickael; Champagne, Annick; Farhadnia, Payam; Vendittoli, Pascal-André; Lavigne, Martin; Prince, François

    2012-01-01

    AIM: To determine hip joint center (HJC) location in a hip arthroplasty population, comparing predictive and functional approaches with radiographic measurements. METHODS: The distance between the HJC and the mid-pelvis was calculated and compared between the three approaches. The localisation error of the predictive and functional approaches was compared using the radiographic measurements as the reference. The operated leg was compared to the non-operated leg. RESULTS: A significant difference was found for the distance between the HJC and the mid-pelvis when comparing the predictive and functional methods. The functional method leads to fewer errors. A statistical difference was found for the localisation error between the predictive and functional methods; the functional method is twice as precise. CONCLUSION: Being more individualized, the functional method improves HJC localisation and should be used in three-dimensional gait analysis. PMID:22919569

  15. Discrete Fourier Transform Analysis in a Complex Vector Space

    NASA Technical Reports Server (NTRS)

    Dean, Bruce H.

    2009-01-01

    Alternative computational strategies for the Discrete Fourier Transform (DFT) have been developed using analysis of geometric manifolds. This approach provides a general framework for performing DFT calculations, and suggests a more efficient implementation of the DFT for applications using iterative transform methods, particularly phase retrieval. The DFT can thus be implemented using fewer operations when compared to the usual DFT counterpart. The software decreases the run time of the DFT in certain applications such as phase retrieval that iteratively call the DFT function. The algorithm exploits a special computational approach based on analysis of the DFT as a transformation in a complex vector space. As such, this approach has the potential to realize a DFT computation that approaches N operations versus N log(N) operations for the equivalent Fast Fourier Transform (FFT) calculation.

  16. Cluster analysis of quantitative parametric maps from DCE-MRI: application in evaluating heterogeneity of tumor response to antiangiogenic treatment.

    PubMed

    Longo, Dario Livio; Dastrù, Walter; Consolino, Lorena; Espak, Miklos; Arigoni, Maddalena; Cavallo, Federica; Aime, Silvio

    2015-07-01

    The objective of this study was to compare a clustering approach to conventional analysis methods for assessing changes in pharmacokinetic parameters obtained from dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) during antiangiogenic treatment in a breast cancer model. BALB/c mice bearing established transplantable her2+ tumors were treated with a DNA-based antiangiogenic vaccine or with an empty plasmid (untreated group). DCE-MRI was carried out by administering a dose of 0.05 mmol/kg of Gadocoletic acid trisodium salt, a Gd-based blood pool contrast agent (CA), at 1T. Changes in pharmacokinetic estimates (K(trans) and vp) over a nine-day interval were compared between treated and untreated groups in a voxel-by-voxel analysis. The tumor response to therapy was assessed by a clustering approach and compared with conventional summary statistics, with sub-regions analysis and with histogram analysis. Both the K(trans) and vp estimates, following blood-pool CA injection, showed marked and spatially heterogeneous changes with antiangiogenic treatment. Averaged values for the whole tumor region, as well as from the rim/core sub-regions analysis, were unable to assess the antiangiogenic response. Histogram analysis resulted in significant changes only in the vp estimates (p<0.05). The proposed clustering approach depicted marked changes in both the K(trans) and vp estimates, with significant spatial heterogeneity in vp maps in response to treatment (p<0.05), provided that DCE-MRI data are properly clustered in three or four sub-regions. This study demonstrated the value of cluster analysis applied to pharmacokinetic DCE-MRI parametric maps for assessing tumor response to antiangiogenic therapy. Copyright © 2015 Elsevier Inc. All rights reserved.

  17. Evaluation of Parallel Analysis Methods for Determining the Number of Factors

    ERIC Educational Resources Information Center

    Crawford, Aaron V.; Green, Samuel B.; Levy, Roy; Lo, Wen-Juo; Scott, Lietta; Svetina, Dubravka; Thompson, Marilyn S.

    2010-01-01

    Population and sample simulation approaches were used to compare the performance of parallel analysis using principal component analysis (PA-PCA) and parallel analysis using principal axis factoring (PA-PAF) to identify the number of underlying factors. Additionally, the accuracies of the mean eigenvalue and the 95th percentile eigenvalue criteria…
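
    Parallel analysis itself is straightforward to sketch: retain components whose observed eigenvalues exceed those from random data of the same dimensions, using either the mean or the 95th percentile of the random eigenvalues as the criterion. A minimal PA-PCA version:

```python
import numpy as np

def parallel_analysis(data, n_iter=100, percentile=95, seed=0):
    """PA-PCA: compare correlation-matrix eigenvalues of the data with
    those of random normal data of the same shape."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
    rand = np.empty((n_iter, p))
    for i in range(n_iter):
        r = rng.standard_normal((n, p))
        rand[i] = np.linalg.eigvalsh(np.corrcoef(r, rowvar=False))[::-1]
    threshold = np.percentile(rand, percentile, axis=0)
    return int(np.sum(obs > threshold))      # number of factors to retain

# e.g. parallel_analysis(test_scores) -> k retained components
```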

  18. A comparative analysis of alternative approaches for quantifying nonlinear dynamics in cardiovascular system.

    PubMed

    Chen, Yun; Yang, Hui

    2013-01-01

    Heart rate variability (HRV) analysis has emerged as an important research topic for evaluating autonomic cardiac function. However, traditional time- and frequency-domain analyses characterize and quantify only linear and stationary phenomena. In the present investigation, we made a comparative analysis of three alternative approaches (i.e., wavelet multifractal analysis, Lyapunov exponents and multiscale entropy analysis) for quantifying nonlinear dynamics in heart rate time series. Note that these extracted nonlinear features provide information about the nonlinear scaling behaviors and the complexity of cardiac systems. To evaluate the performance, we used 24-hour HRV recordings from 54 healthy subjects and 29 heart failure patients, available in PhysioNet. The three nonlinear methods are evaluated not only individually but also in combination using three classification algorithms, i.e., linear discriminant analysis, quadratic discriminant analysis and k-nearest neighbors. Experimental results show that the three nonlinear methods capture nonlinear dynamics from different perspectives and that the combined feature set achieves the best performance, i.e., sensitivity 97.7% and specificity 91.5%. Collectively, nonlinear HRV features are shown to have promise for identifying disorders in autonomic cardiovascular function.
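
    Of the three nonlinear features, multiscale entropy is the most compact to sketch: coarse-grain the series at each scale and compute sample entropy with a tolerance fixed from the original series. The defaults below (m = 2, r = 0.2·SD) are common conventions, assumed here rather than taken from the paper.

```python
import numpy as np

def sample_entropy(x, m, r):
    """Sample entropy with Chebyshev distance; r is an absolute tolerance.
    Pairwise distances are O(N^2) in memory: fine for short recordings."""
    x = np.asarray(x, float)
    def match_pairs(mm):
        emb = np.lib.stride_tricks.sliding_window_view(x, mm)
        d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=-1)
        return (np.sum(d <= r) - len(emb)) / 2.0   # exclude self-matches
    b, a = match_pairs(m), match_pairs(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def multiscale_entropy(x, max_scale=5, m=2):
    x = np.asarray(x, float)
    r = 0.2 * x.std()                       # tolerance fixed at scale 1
    values = []
    for s in range(1, max_scale + 1):
        n = len(x) // s
        coarse = x[:n * s].reshape(n, s).mean(axis=1)  # coarse-graining
        values.append(sample_entropy(coarse, m, r))
    return values

# e.g. multiscale_entropy(rr_intervals) -> one complexity value per scale
```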

  19. An Approach to the Comparative Analysis of the US and Soviet Economies.

    DTIC Science & Technology

    1980-09-01

    compare authority exercised in different forms is basic. Designing such a measuring system necessitates development of a weighting system to measure the...Proper file maintenance is difficult because of poor file organization, inadequately designed inflow of updating information, and manual methods...assessment. 5.1 Issue Areas for Comparative Analysis: comparative issues relate to a society's mobilization of resources and use of institutions to a

  20. Mediation analysis allowing for exposure-mediator interactions and causal interpretation: theoretical assumptions and implementation with SAS and SPSS macros

    PubMed Central

    Valeri, Linda; VanderWeele, Tyler J.

    2012-01-01

    Mediation analysis is a useful and widely employed approach to studies in the field of psychology and in the social and biomedical sciences. The contributions of this paper are several-fold. First we seek to bring the developments in mediation analysis for nonlinear models within the counterfactual framework to the psychology audience in an accessible format and compare the sorts of inferences about mediation that are possible in the presence of exposure-mediator interaction when using a counterfactual versus the standard statistical approach. Second, the work by VanderWeele and Vansteelandt (2009, 2010) is extended here to allow for dichotomous mediators and count outcomes. Third, we provide SAS and SPSS macros to implement all of these mediation analysis techniques automatically and we compare the types of inferences about mediation that are allowed by a variety of software macros. PMID:23379553

  1. Mediation analysis allowing for exposure-mediator interactions and causal interpretation: theoretical assumptions and implementation with SAS and SPSS macros.

    PubMed

    Valeri, Linda; Vanderweele, Tyler J

    2013-06-01

    Mediation analysis is a useful and widely employed approach to studies in the field of psychology and in the social and biomedical sciences. The contributions of this article are several-fold. First we seek to bring the developments in mediation analysis for nonlinear models within the counterfactual framework to the psychology audience in an accessible format and compare the sorts of inferences about mediation that are possible in the presence of exposure-mediator interaction when using a counterfactual versus the standard statistical approach. Second, the work by VanderWeele and Vansteelandt (2009, 2010) is extended here to allow for dichotomous mediators and count outcomes. Third, we provide SAS and SPSS macros to implement all of these mediation analysis techniques automatically, and we compare the types of inferences about mediation that are allowed by a variety of software macros. (PsycINFO Database Record (c) 2013 APA, all rights reserved).
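
    For readers wanting the closed forms such macros implement in the simplest setting: with a continuous outcome and mediator, outcome model E[Y|a,m,c] = θ0 + θ1·a + θ2·m + θ3·a·m + θ4'·c and mediator model E[M|a,c] = β0 + β1·a + β2'·c, the controlled direct, natural direct and natural indirect effects of changing exposure from a* to a are (standard results in this literature, stated here as a sketch):

```latex
\mathrm{CDE}(m) = (\theta_1 + \theta_3 m)(a - a^*), \\
\mathrm{NDE} = \bigl(\theta_1 + \theta_3(\beta_0 + \beta_1 a^* + \beta_2' c)\bigr)(a - a^*), \\
\mathrm{NIE} = (\theta_2 \beta_1 + \theta_3 \beta_1 a)(a - a^*).
```

    Note that the exposure-mediator interaction θ3 enters both the direct and indirect effects, which is why the counterfactual and standard statistical approaches can diverge when θ3 ≠ 0.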

  2. Global spectral graph wavelet signature for surface analysis of carpal bones

    NASA Astrophysics Data System (ADS)

    Masoumi, Majid; Rezaei, Mahsa; Ben Hamza, A.

    2018-02-01

    Quantitative shape comparison is a fundamental problem in computer vision, geometry processing and medical imaging. In this paper, we present a spectral graph wavelet approach for shape analysis of the carpal bones of the human wrist. We employ spectral graph wavelets to represent the cortical surface of a carpal bone via the spectral geometric analysis of the Laplace-Beltrami operator in the discrete domain. We propose a global spectral graph wavelet (GSGW) descriptor that is isometry-invariant, efficient to compute, and combines the advantages of both low-pass and band-pass filters. We perform experiments on shapes of the carpal bones of ten women and ten men from a publicly-available database of wrist bones. Using one-way multivariate analysis of variance (MANOVA) and permutation testing, we show through extensive experiments that the proposed GSGW framework gives a much better performance compared to the global point signature embedding approach for comparing shapes of the carpal bones across populations.

  3. Global spectral graph wavelet signature for surface analysis of carpal bones.

    PubMed

    Masoumi, Majid; Rezaei, Mahsa; Ben Hamza, A

    2018-02-05

    Quantitative shape comparison is a fundamental problem in computer vision, geometry processing and medical imaging. In this paper, we present a spectral graph wavelet approach for shape analysis of the carpal bones of the human wrist. We employ spectral graph wavelets to represent the cortical surface of a carpal bone via the spectral geometric analysis of the Laplace-Beltrami operator in the discrete domain. We propose a global spectral graph wavelet (GSGW) descriptor that is isometry-invariant, efficient to compute, and combines the advantages of both low-pass and band-pass filters. We perform experiments on shapes of the carpal bones of ten women and ten men from a publicly-available database of wrist bones. Using one-way multivariate analysis of variance (MANOVA) and permutation testing, we show through extensive experiments that the proposed GSGW framework gives a much better performance compared to the global point signature embedding approach for comparing shapes of the carpal bones across populations.
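
    The flavor of a spectral graph wavelet signature can be conveyed in a few lines: eigendecompose the graph Laplacian of the surface mesh and, at each vertex, record the band-pass filtered spectral energy g(s·λk)·φk(v)² across several scales. The kernel and normalization below are illustrative assumptions, not the exact GSGW construction.

```python
import numpy as np
from scipy.linalg import eigh
from scipy.sparse.csgraph import laplacian

def sgw_signature(adjacency, scales=(0.5, 1.0, 2.0), n_eigs=50):
    """Per-vertex spectral wavelet signature on a (dense, symmetric)
    mesh adjacency matrix; returns an (n_vertices, n_scales) array."""
    L = laplacian(adjacency, normed=True)
    lam, phi = eigh(L)                      # Laplacian eigenpairs
    lam, phi = lam[:n_eigs], phi[:, :n_eigs]
    g = lambda x: x * np.exp(-x)            # band-pass (Mexican-hat-like) kernel
    # signature[v, s] = sum_k g(s * lam_k) * phi_k(v)^2
    return np.stack([(g(s * lam) * phi**2).sum(axis=1) for s in scales],
                    axis=1)

# A global shape descriptor could then aggregate per-vertex signatures,
# e.g. by histogramming each scale's values before MANOVA-style testing.
```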

  4. Statistical Approaches to Adjusting Weights for Dependent Arms in Network Meta-analysis.

    PubMed

    Su, Yu-Xuan; Tu, Yu-Kang

    2018-05-22

    Network meta-analysis compares multiple treatments in terms of their efficacy and harm by including evidence from randomized controlled trials. Most clinical trials use a parallel design, where patients are randomly allocated to different treatments and receive only one treatment. However, some trials use within-person designs such as split-body, split-mouth and cross-over designs, where each patient may receive more than one treatment. Data from treatment arms within these trials are no longer independent, so the correlations between dependent arms need to be accounted for within the statistical analyses. Ignoring these correlations may result in incorrect conclusions. The main objective of this study is to develop statistical approaches to adjusting weights for dependent arms within special-design trials. In this study, we demonstrate the following three approaches: the data augmentation approach, the adjusting-variance approach, and the reducing-weight approach. These three methods can readily be applied in standard statistical tools such as R and Stata. An example of periodontal regeneration is used to demonstrate how these approaches can be undertaken and implemented within statistical software packages, and to compare results from the different approaches. The adjusting-variance approach can be implemented within the network package in Stata, while the reducing-weight approach requires computer programming to set up the within-study variance-covariance matrix. This article is protected by copyright. All rights reserved.

  5. How should health service organizations respond to diversity? A content analysis of six approaches.

    PubMed

    Seeleman, Conny; Essink-Bot, Marie-Louise; Stronks, Karien; Ingleby, David

    2015-11-16

    Health care organizations need to be responsive to the needs of increasingly diverse patient populations. We compared the contents of six publicly available approaches to organizational responsiveness to diversity. The central questions addressed in this paper are: what are the most consistently recommended issues for health care organizations to address in order to be responsive to the needs of diverse groups that differ from the majority population? How much consensus is there between various approaches? We purposively sampled six approaches from the US, Australia and Europe and used qualitative textual analysis to categorize the content of each approach into domains (conceptually distinct topic areas) and, within each domain, into dimensions (operationalizations). The resulting classification framework was used for comparative analysis of the content of the six approaches. We identified seven domains that were represented in most or all approaches: organizational commitment, empirical evidence on inequalities and needs, a competent and diverse workforce, ensuring access for all users, ensuring responsiveness in care provision, fostering patient and community participation, and actively promoting responsiveness. Variations in the operationalization of these domains related to different scopes, contexts and types of diversity. For example, approaches that focus on ethnic diversity mostly provide recommendations to handle cultural and language differences; approaches that take an intersectional approach and broaden their target population to vulnerable groups in a more general sense also pay attention to factors such as socio-economic status and gender. Despite differences in labeling, there is a broad consensus about what health care organizations need to do in order to be responsive to patient diversity. This opens the way to full scale implementation of organizational responsiveness in healthcare and structured evaluation of its effectiveness in improving patient outcomes.

  6. A generalized least-squares framework for rare-variant analysis in family data.

    PubMed

    Li, Dalin; Rotter, Jerome I; Guo, Xiuqing

    2014-01-01

    Rare variants may, in part, explain some of the heritability missing from current genome-wide association studies. Many gene-based rare-variant analysis approaches proposed in recent years are aimed at population-based samples, although analysis strategies for family-based samples are clearly warranted, since the family-based design has the potential to enhance our ability to enrich for rare causal variants. We have recently developed the generalized least squares sequence kernel association test (GLS-SKAT) approach for rare-variant analyses in family samples, in which the kinship matrix computed from the high-dimensional genetic data is used to decorrelate the family structure. We then applied the SKAT-O approach for gene-/region-based inference on the decorrelated data. In this study, we applied this GLS-SKAT method to the systolic blood pressure data in the simulated family sample distributed by the Genetic Analysis Workshop 18. We compared the GLS-SKAT approach to the rare-variant analysis approach implemented in family-based association test-v1 and demonstrated that the GLS-SKAT approach provides superior power and good control of the type I error rate.
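
    The decorrelation step described here can be sketched as a Cholesky whitening transform: model the phenotype covariance as V = σg²·2K + σe²·I and premultiply the phenotype and covariates by L⁻¹, where V = LLᵀ, before the downstream kernel association test. The variance components are fixed below for brevity; in practice they would be estimated (e.g., by REML).

```python
import numpy as np
from scipy.linalg import cholesky, solve_triangular

def gls_decorrelate(y, X, kinship, sigma_g2=0.4, sigma_e2=0.6):
    """Whiten family-structured data prior to a rare-variant (SKAT-type)
    test. Covariance model: V = sigma_g2 * 2K + sigma_e2 * I."""
    n = len(y)
    V = sigma_g2 * 2.0 * kinship + sigma_e2 * np.eye(n)
    L = cholesky(V, lower=True)
    y_star = solve_triangular(L, y, lower=True)   # L^{-1} y
    X_star = solve_triangular(L, X, lower=True)   # L^{-1} X
    return y_star, X_star                         # now approximately i.i.d.
```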

  7. Complementarity and Area-Efficiency in the Prioritization of the Global Protected Area Network.

    PubMed

    Kullberg, Peter; Toivonen, Tuuli; Montesino Pouzols, Federico; Lehtomäki, Joona; Di Minin, Enrico; Moilanen, Atte

    2015-01-01

    Complementarity and cost-efficiency are widely used principles for protected area network design. Despite their wide use and robust theoretical underpinnings, their effects on the performance and patterns of priority areas are rarely studied in detail. Here we compare two approaches for identifying management priority areas inside the global protected area network: 1) a scoring-based approach, used in a recently published analysis, and 2) a spatial prioritization method, which accounts for complementarity and area-efficiency. Using the same IUCN species distribution data, the complementarity method found an equal-area set of priority areas with double the mean species-range coverage of the scoring-based approach. The complementarity set also had 72% more species with full ranges covered, and left entirely uncovered only half as many species as the scoring approach. Protected areas in our complementarity-based solution were on average smaller and geographically more scattered. The large difference between the two solutions highlights the need for critical thinking about the selected prioritization method. According to our analysis, accounting for complementarity and area-efficiency can lead to considerable improvements when setting management priorities for the global protected area network.
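
    The complementarity principle itself reduces to a greedy maximum-coverage loop: at each step, choose the site that adds the most not-yet-covered species per unit area. Production prioritization software is far more sophisticated; the sketch below, with invented inputs, shows only this core idea.

```python
import numpy as np

def greedy_complementarity(presence, areas, budget_area):
    """presence: (n_sites, n_species) 0/1 matrix; areas: per-site areas."""
    covered = np.zeros(presence.shape[1], dtype=bool)
    chosen, used = [], 0.0
    remaining = set(range(len(presence)))
    while remaining:
        # marginal gain per unit area for each candidate site
        best = max(remaining,
                   key=lambda s: (presence[s] & ~covered).sum() / areas[s])
        gain = (presence[best] & ~covered).sum()
        if gain == 0 or used + areas[best] > budget_area:
            break
        chosen.append(best)
        used += areas[best]
        covered |= presence[best].astype(bool)
        remaining.discard(best)
    return chosen, int(covered.sum())

# e.g. greedy_complementarity(species_by_site, site_areas, budget_area=100.0)
```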

  8. Bioequivalence evaluation of two brands of amoxicillin/clavulanic acid 250/125 mg combination tablets in healthy human volunteers: use of replicate design approach.

    PubMed

    Idkaidek, Nasir M; Al-Ghazawi, Ahmad; Najib, Naji M

    2004-12-01

    The purpose of this study was to apply a replicate design approach to a bioequivalence study of an amoxicillin/clavulanic acid combination following a 250/125 mg oral dose in 23 subjects, and to compare individual bioequivalence analysis with average bioequivalence. This was conducted as a 2-treatment, 2-sequence, 4-period crossover study. Average bioequivalence was shown, while the individual bioequivalence approach failed to show bioequivalence. In conclusion, compared with the average bioequivalence approach, the individual bioequivalence approach is a strong statistical tool for testing intra-subject variances as well as the subject-by-formulation interaction variance. copyright (c) 2004 John Wiley & Sons, Ltd.

  9. Evaluation of the use of nodal methods for MTR neutronic analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reitsma, F.; Mueller, E.Z.

    1997-08-01

    Although modern nodal methods are used extensively in the nuclear power industry, their use for research reactor analysis has been very limited. The suitability of nodal methods for material testing reactor analysis is investigated, with the emphasis on the modelling of the core region (fuel assemblies). The nodal approach's performance is compared with that of the traditional finite-difference fine mesh approach. The advantages of using nodal methods coupled with integrated cross section generation systems are highlighted, especially with respect to data preparation, simplicity of use and the possibility of performing a great variety of reactor calculations subject to strict time limitations such as are required for the RERTR program.

  10. A Focus on Problems of National Interest in the College General Chemistry Laboratory: The Effects of the Problem-Oriented Method Compared with Those of the Traditional Approach.

    ERIC Educational Resources Information Center

    Neman, Robert Lynn

    This study was designed to assess the effects of the problem-oriented method compared to those of the traditional approach in general chemistry at the college level. The problem-oriented course included topics such as air and water pollution, drug addiction and analysis, tetraethyl-lead additives, insecticides in the environment, and recycling of…

  11. Laparoendoscopic single-site surgery varicocelectomy versus conventional laparoscopic varicocele ligation: A meta-analysis

    PubMed Central

    Li, Mingchao; Wang, Zhengyun

    2016-01-01

    Objective To perform a meta-analysis of data from available published studies comparing laparoendoscopic single-site surgery varicocelectomy (LESSV) with conventional transperitoneal laparoscopic varicocele ligation. Methods A comprehensive data search was performed in PubMed and Embase to identify randomized controlled trials and comparative studies that compared the two surgical approaches for the treatment of varicoceles. Results Six studies were included in the meta-analysis. LESSV required a significantly longer operative time than conventional laparoscopic varicocelectomy but was associated with significantly less postoperative pain at 6 h and 24 h, a shorter recovery time and greater patient satisfaction with the cosmetic outcome. There was no difference between the two surgical approaches in terms of postoperative semen quality or the incidence of complications. Conclusion These data suggest that LESSV offers a well tolerated and efficient alternative to conventional laparoscopic varicocelectomy, with less pain, a shorter recovery time and better cosmetic satisfaction. Further well-designed studies are required to confirm these findings and update the results of this meta-analysis. PMID:27688686

  12. Assessing commercial opportunities for autologous and allogeneic cell-based products.

    PubMed

    Smith, Devyn M

    2012-09-01

    The two primary cell sources used to produce cell-based therapies are autologous (self-derived) and allogeneic (derived from a donor). This analysis attempts to compare and contrast the two approaches in order to understand whether there is an emerging preference in the market. While the current clinical trials underway are slightly biased to autologous approaches, it is clear that both cell-based approaches are being aggressively pursued. This analysis also breaks down the commercial advantages of each cell-based approach, comparing both cost of goods and the ideal indication type for each. While allogeneic therapies have considerable advantages over autologous therapies, they do have a distinct disadvantage regarding potential immunogenicity. The introduction of the hybrid autologous business model provides the ability for autologous-based therapies to mitigate some of the advantages that allogeneic cell-based therapies enjoy, including cost of goods. Finally, two case studies are presented that demonstrate that there is sufficient space for both autologous and allogeneic cell-based therapies within a single disease area.

  13. Sinus tarsi approach (STA) versus extensile lateral approach (ELA) for treatment of closed displaced intra-articular calcaneal fractures (DIACF): A meta-analysis.

    PubMed

    Bai, L; Hou, Y-L; Lin, G-H; Zhang, X; Liu, G-Q; Yu, B

    2018-04-01

    The relative merits of the sinus tarsi approach (STA) and the extensile lateral approach (ELA) for treatment of closed displaced intra-articular calcaneal fractures (DIACF) are still debated; our aim was to compare the two. A thorough search was carried out in the MEDLINE, EMBASE and Cochrane Library databases from inception to December 2016. Only prospective or retrospective comparative studies were selected for this meta-analysis. Two independent reviewers conducted the literature search, data extraction and quality assessment. The primary outcomes were anatomical restoration and prevalence of complications. Secondary outcomes included operation time and functional recovery. Four randomized controlled trials involving 326 patients and three cohort studies involving 206 patients were included. The STA technique for DIACF led to a decline in both operation time and incidence of complications. There were no significant differences between the groups in American Orthopedic Foot and Ankle Society scores, nor in changes in Böhler angle. This meta-analysis suggests that the STA technique may reduce operation time and the incidence of complications. In conclusion, the STA technique is arguably the optimal choice for DIACF. Copyright © 2018 Elsevier Masson SAS. All rights reserved.

  14. Comparing top-down and bottom-up costing approaches for economic evaluation within social welfare.

    PubMed

    Olsson, Tina M

    2011-10-01

    This study compares two approaches to the estimation of social welfare intervention costs, one "top-down" and the other "bottom-up", for a group of social welfare clients with severe problem behavior participating in a randomized trial. Intervention costs incurred over a two-year period were compared by intervention category (foster care placement, institutional placement, mentorship services, individual support services and structured support services), estimation method (price, micro costing, average cost) and treatment group (intervention, control). Analyses are based upon 2007 costs for 156 individuals receiving 404 interventions. Overall, both approaches were found to produce reliable estimates of intervention costs at the group level but not at the individual level. As the choice of approach can greatly affect the estimate of the mean difference, adjustment based on estimation approach should be incorporated into sensitivity analyses. Analysts must take care in assessing the purpose and perspective of the analysis when choosing a costing approach for use within economic evaluation.

  15. Comparative genomic analysis identified a mutation related to enhanced heterologous protein production in the filamentous fungus Aspergillus oryzae.

    PubMed

    Jin, Feng-Jie; Katayama, Takuya; Maruyama, Jun-Ichi; Kitamoto, Katsuhiko

    2016-11-01

    Genomic mapping of mutations using next-generation sequencing technologies has facilitated the identification of genes contributing to fundamental biological processes, including human diseases. However, few studies have used this approach to identify mutations contributing to heterologous protein production in industrial strains of filamentous fungi, such as Aspergillus oryzae. In a screening of A. oryzae strains that hyper-produce human lysozyme (HLY), we previously isolated an AUT1 mutant that showed higher production of various heterologous proteins; however, the underlying factors contributing to the increased heterologous protein production remained unclear. Here, using a comparative genomic approach performed with whole-genome sequences, we attempted to identify the genes responsible for the high-level production of heterologous proteins in the AUT1 mutant. The comparative sequence analysis led to the detection of a gene (AO090120000003), designated autA, which was predicted to encode an unknown cytoplasmic protein containing an alpha/beta-hydrolase fold domain. Mutation or deletion of autA was associated with higher production levels of HLY. Specifically, the HLY yields of the autA mutant and deletion strains were twofold higher than that of the control strain during the early stages of cultivation. Taken together, these results indicate that combining classical mutagenesis approaches with comparative genomic analysis facilitates the identification of novel genes involved in heterologous protein production in filamentous fungi.
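    The core of such a comparative genomic screen can be reduced to a set comparison of variant calls between the mutant and parental genomes. The minimal sketch below assumes a simplified tab-separated variant format and hypothetical file names; real pipelines would work from VCF files produced by a variant caller.

```python
# Minimal sketch: locate candidate causative mutations by set comparison of
# variant calls from two whole-genome sequenced strains. File names and the
# simplified VCF-like format are hypothetical.

def load_variants(path):
    """Return {(chrom, pos, ref, alt)} from a tab-separated variant file."""
    variants = set()
    with open(path) as fh:
        for line in fh:
            if line.startswith("#"):
                continue
            chrom, pos, ref, alt = line.rstrip("\n").split("\t")[:4]
            variants.add((chrom, int(pos), ref, alt))
    return variants

parent = load_variants("parent_strain.tsv")
mutant = load_variants("aut1_mutant.tsv")

# Variants present in the hyper-producing mutant but absent from the parent
# are candidates for the phenotype-causing mutation.
candidates = sorted(mutant - parent)
for chrom, pos, ref, alt in candidates:
    print(f"{chrom}:{pos} {ref}>{alt}")
```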

  16. Feature-level sentiment analysis by using comparative domain corpora

    NASA Astrophysics Data System (ADS)

    Quan, Changqin; Ren, Fuji

    2016-06-01

    Feature-level sentiment analysis (SA) provides more fine-grained SA of specific opinion targets and has a wide range of applications in e-business. This study proposes an approach to feature-level SA based on comparative domain corpora. The proposed approach makes use of word associations for domain-specific feature extraction. First, we assign a similarity score to each candidate feature to denote the extent of its similarity to a domain. Then we identify domain features based on their similarity scores on different comparative domain corpora. After that, dependency grammar and a general sentiment lexicon are applied to extract and expand feature-oriented opinion words. Lastly, the semantic orientation of a domain-specific feature is determined based on the feature-oriented opinion lexicons. In evaluation, we compare the proposed method with several state-of-the-art methods (including unsupervised and semi-supervised ones) using a standard product review test collection. The experimental results demonstrate the effectiveness of using comparative domain corpora.
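    A minimal sketch of the underlying idea follows: candidate features are scored by their relative frequency in the target domain versus a comparative corpus, and polarity is then assigned from co-occurring lexicon words. The corpora, lexicon, and smoothing constant are toy assumptions, and the paper's dependency-grammar expansion step is omitted.

```python
from collections import Counter

# Hypothetical tokenized corpora: the target domain and a comparative domain.
camera_reviews = [["battery", "life", "is", "great"], ["lens", "is", "sharp"]]
hotel_reviews  = [["room", "is", "clean"], ["staff", "is", "great"]]

def freq(corpus):
    counts = Counter(tok for doc in corpus for tok in doc)
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

target, other = freq(camera_reviews), freq(hotel_reviews)

# Domain-association score: relative frequency in the target domain versus
# the comparative domain (small constant avoids division by zero).
def domain_score(word):
    return target.get(word, 0.0) / (other.get(word, 0.0) + 1e-6)

candidates = ["battery", "lens", "room", "great"]
features = [w for w in candidates if domain_score(w) > 1.0]

# Orientation: a tiny opinion lexicon stands in for the general sentiment
# lexicon plus dependency-based expansion used in the paper.
lexicon = {"great": 1, "sharp": 1, "blurry": -1}
for f in features:
    score = sum(lexicon.get(tok, 0)
                for doc in camera_reviews if f in doc for tok in doc)
    print(f, "positive" if score > 0 else "negative" if score < 0 else "neutral")
```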

  17. Comparative study of landslides susceptibility mapping methods: Multi-Criteria Decision Making (MCDM) and Artificial Neural Network (ANN)

    NASA Astrophysics Data System (ADS)

    Salleh, S. A.; Rahman, A. S. A. Abd; Othman, A. N.; Mohd, W. M. N. Wan

    2018-02-01

    As different approaches produce different results, it is crucial to determine which methods are accurate in order to analyze such events properly. This research aims to compare the Rank Reciprocal (MCDM) and Artificial Neural Network (ANN) analysis techniques in determining zones susceptible to landslide hazard. The study is based on data obtained from various sources, such as the local authority Dewan Bandaraya Kuala Lumpur (DBKL), Jabatan Kerja Raya (JKR) and other agencies. The data were analysed and processed using ArcGIS. The results were compared by quantifying the risk ranking and area differential, and were also compared with the zonation map classified by DBKL. The results suggested that the ANN method gives better accuracy than MCDM, with an accuracy assessment 18.18% higher than that of the MCDM approach. This indicates that ANN provides more reliable results, probably owing to its ability to learn from the environment, thus portraying a realistic and accurate result.
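    For reference, the Rank Reciprocal scheme used on the MCDM side assigns each criterion a weight proportional to the reciprocal of its rank; a cell's susceptibility index is then a weighted overlay of normalized factor scores. The criteria and scores below are hypothetical.

```python
# Rank Reciprocal weighting (a simple MCDM scheme): criteria ranked 1..n get
# weights proportional to 1/rank. Criterion names and scores are hypothetical.

criteria_ranks = {"slope": 1, "lithology": 2, "rainfall": 3, "land_use": 4}

inv = {c: 1.0 / r for c, r in criteria_ranks.items()}
total = sum(inv.values())
weights = {c: v / total for c, v in inv.items()}
print(weights)  # slope 0.48, lithology 0.24, rainfall 0.16, land_use 0.12

# Susceptibility index of one map cell: weighted sum of normalized factor
# scores (0..1), as in a weighted-overlay GIS analysis.
cell_scores = {"slope": 0.9, "lithology": 0.5, "rainfall": 0.7, "land_use": 0.2}
index = sum(weights[c] * cell_scores[c] for c in weights)
print(f"susceptibility index: {index:.3f}")
```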

  18. Metabolic De-Isotoping for Improved LC-MS Characterization of Modified RNAs

    NASA Astrophysics Data System (ADS)

    Wetzel, Collin; Li, Siwei; Limbach, Patrick A.

    2014-07-01

    Mapping, sequencing, and quantifying individual noncoding ribonucleic acids (ncRNAs), including post-transcriptionally modified nucleosides, by mass spectrometry is a challenge that often requires rigorous sample preparation prior to analysis. Previously, we have described a simplified method for the comparative analysis of RNA digests (CARD) that is applicable to relatively complex mixtures of ncRNAs. In the CARD approach for transfer RNA (tRNA) analysis, two complete sets of digestion products from total tRNA are compared using the enzymatic incorporation of 16O/18O isotopic labels. This approach allows one to rapidly screen total tRNAs from gene deletion mutants or comparatively sequence total tRNA from two related bacterial organisms. However, data analysis can be challenging because of convoluted mass spectra arising from the natural 13C and 15N isotopes present in the ribonuclease-digested tRNA samples. Here, we demonstrate that culturing in 12C-enriched/13C-depleted media significantly reduces the isotope patterns that must be interpreted during the CARD experiment. Improvements in data quality yield a 35% improvement in detection of tRNA digestion products that can be uniquely assigned to particular tRNAs. These mass spectral improvements lead to a significant reduction in data processing attributable to the ease of spectral identification of labeled digestion products and will enable improvements in the relative quantification of modified RNAs by the 16O/18O differential labeling approach.

  19. Low-Cost Propellant Launch to LEO from a Tethered Balloon - Economic and Thermal Analysis

    NASA Technical Reports Server (NTRS)

    Wilcox, Brian H.; Schneider, Evan G.; Vaughan, David A.; Hall, Jeffrey L.

    2010-01-01

    This paper provides new analysis of the economics of low-cost propellant launch coupled with dry hardware re-use, and of the thermal control of the liquid hydrogen once on-orbit. One conclusion is that this approach enables an overall reduction in the cost per mission by as much as a factor of five as compared to current approaches for human exploration of the moon, Mars, and near-Earth asteroids.

  20. A Comparative Study of Learning Strategies Used by Romanian and Hungarian Preuniversity Students in Science Learning

    ERIC Educational Resources Information Center

    Lingvay, Mónika; Timofte, Roxana S.; Ciascai, Liliana; Predescu, Constantin

    2015-01-01

    Development of pupils' deep learning approach is an important goal of education nowadays, considering that a deep learning approach mediates conceptual understanding and transfer. The different performance of Romanian and Hungarian pupils on PISA tests prompted us to commence a study analyzing the learning approaches employed by these pupils.…

  1. Manage Work Better to Better Manage Human Resources: A Comparative Study of Two Approaches to Job Analysis.

    ERIC Educational Resources Information Center

    Clifford, James P.

    1996-01-01

    The Position Classification Questionnaire (PCQ) and task inventory method (TI) were used to study the same work functions. PCQ divided functions into 16 classes for wage/salary determination. TI divided them into 28 classifications for training purposes. The two approaches were considerably different in their approach to how work should be…

  2. Determination of apparent coupling factors for adhesive bonded acrylic plates using SEAL approach

    NASA Astrophysics Data System (ADS)

    Pankaj, Achuthan. C.; Shivaprasad, M. V.; Murigendrappa, S. M.

    2018-04-01

    Apparent coupling loss factors (CLF) and velocity responses have been computed for two lap-joined adhesive-bonded plates using finite element analysis and an experimental statistical energy analysis-like (SEAL) approach. A finite element model of the plates was created using ANSYS software. The statistical energy parameters were computed using the velocity responses obtained from a harmonic forced-excitation analysis. Experiments were carried out for two different cases of adhesive bonded joints, and the results were compared with the apparent coupling factors and velocity responses obtained from finite element analysis. The results signify the importance of modeling adhesive bonded joints when computing apparent coupling factors, and their further use in computing energies and velocity responses with a statistical energy analysis-like approach.

  3. Phenotypic mapping of metabolic profiles using self-organizing maps of high-dimensional mass spectrometry data.

    PubMed

    Goodwin, Cody R; Sherrod, Stacy D; Marasco, Christina C; Bachmann, Brian O; Schramm-Sapyta, Nicole; Wikswo, John P; McLean, John A

    2014-07-01

    A metabolic system is composed of inherently interconnected metabolic precursors, intermediates, and products. The analysis of untargeted metabolomics data has conventionally been performed through the use of comparative statistics or multivariate statistical analysis-based approaches; however, each falls short in representing the related nature of metabolic perturbations. Herein, we describe a complementary method for the analysis of large metabolite inventories using a data-driven approach based upon a self-organizing map algorithm. This workflow allows for the unsupervised clustering, and subsequent prioritization of, correlated features through Gestalt comparisons of metabolic heat maps. We describe this methodology in detail, including a comparison to conventional metabolomics approaches, and demonstrate the application of this method to the analysis of the metabolic repercussions of prolonged cocaine exposure in rat sera profiles.
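    A minimal self-organizing map can be implemented in a few lines of NumPy; the sketch below illustrates the general algorithm (best-matching unit search, decaying Gaussian neighborhood update) on synthetic feature intensities, and is not the authors' implementation.

```python
import numpy as np

# Minimal self-organizing map for clustering metabolite feature intensities;
# a sketch of the general algorithm, not the workflow described above.
rng = np.random.default_rng(0)
data = rng.normal(size=(200, 12))          # 200 samples x 12 metabolite features
data = (data - data.mean(0)) / data.std(0) # autoscale features

rows, cols, dim = 8, 8, data.shape[1]
weights = rng.normal(size=(rows, cols, dim))
grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"), -1)

n_iter, lr0, sigma0 = 2000, 0.5, 3.0
for t in range(n_iter):
    x = data[rng.integers(len(data))]
    # Best-matching unit: node whose weight vector is closest to the sample.
    d = ((weights - x) ** 2).sum(-1)
    bmu = np.unravel_index(d.argmin(), d.shape)
    # Decaying learning rate and neighborhood radius.
    frac = t / n_iter
    lr, sigma = lr0 * (1 - frac), sigma0 * (1 - frac) + 0.5
    # Gaussian neighborhood pulls nearby nodes toward the sample.
    g = np.exp(-((grid - np.array(bmu)) ** 2).sum(-1) / (2 * sigma ** 2))
    weights += lr * g[..., None] * (x - weights)

# Map each sample to its BMU; co-located samples form clusters ("heat map").
bmus = [np.unravel_index(((weights - s) ** 2).sum(-1).argmin(), (rows, cols))
        for s in data]
print(bmus[:5])
```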

  4. Methods for Mediation Analysis with Missing Data

    ERIC Educational Resources Information Center

    Zhang, Zhiyong; Wang, Lijuan

    2013-01-01

    Despite wide applications of both mediation models and missing data techniques, formal discussion of mediation analysis with missing data is still rare. We introduce and compare four approaches to dealing with missing data in mediation analysis including list wise deletion, pairwise deletion, multiple imputation (MI), and a two-stage maximum…

  5. Comparative Analysis of the Approaches Used By Prospective Music Teachers in Turkey in Practicing the Piano Works of Contemporary Turkish Composers

    ERIC Educational Resources Information Center

    Sönmezöz, Feyza

    2015-01-01

    This study determined how well prospective music teachers in the Departments of Music Education in Turkey recognize the piano works of contemporary Turkish composers, the importance they attach to practicing these works, and the difficulty they experience in playing them. Furthermore, this study performed a comparative analysis of the opinions of the prospective…

  6. Environmental Justice analysis in Hydraulic Fracturing Analysis, June 13, 2011

    EPA Pesticide Factsheets

    This planning document describes the quality assurance/quality control activities and technical requirements that will be used during the research study, which uses an index-based approach to compare a nationally representative set of hydraulically fractured well sites.

  7. Advantages and limitations of classic and 3D QSAR approaches in nano-QSAR studies based on biological activity of fullerene derivatives

    DOE PAGES

    Jagiello, Karolina; Grzonkowska, Monika; Swirog, Marta; ...

    2016-08-29

    In this contribution, the advantages and limitations of two computational techniques that can be used for the investigation of nanoparticle activity and toxicity are briefly summarized: classic nano-QSAR (Quantitative Structure–Activity Relationships employed for nanomaterials) and 3D nano-QSAR (three-dimensional Quantitative Structure–Activity Relationships, such as Comparative Molecular Field Analysis, CoMFA, and Comparative Molecular Similarity Indices Analysis, CoMSIA, employed for nanomaterials). Both approaches were compared according to selected criteria, including efficiency, type of experimental data, class of nanomaterials, time required for calculations, computational cost, and difficulties in interpretation. Taking into account the advantages and limitations of each method, we provide recommendations for nano-QSAR modellers and QSAR model users to help them determine a proper and efficient methodology to investigate the biological activity of nanoparticles, in order to describe the underlying interactions in the most reliable and useful manner.

  8. Advantages and limitations of classic and 3D QSAR approaches in nano-QSAR studies based on biological activity of fullerene derivatives

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jagiello, Karolina; Grzonkowska, Monika; Swirog, Marta

    In this contribution, the advantages and limitations of two computational techniques that can be used for the investigation of nanoparticle activity and toxicity are briefly summarized: classic nano-QSAR (Quantitative Structure–Activity Relationships employed for nanomaterials) and 3D nano-QSAR (three-dimensional Quantitative Structure–Activity Relationships, such as Comparative Molecular Field Analysis, CoMFA, and Comparative Molecular Similarity Indices Analysis, CoMSIA, employed for nanomaterials). Both approaches were compared according to selected criteria, including efficiency, type of experimental data, class of nanomaterials, time required for calculations, computational cost, and difficulties in interpretation. Taking into account the advantages and limitations of each method, we provide recommendations for nano-QSAR modellers and QSAR model users to help them determine a proper and efficient methodology to investigate the biological activity of nanoparticles, in order to describe the underlying interactions in the most reliable and useful manner.

  9. Rural water supply and related services in developing countries — Comparative analysis of several approaches

    NASA Astrophysics Data System (ADS)

    Bajard, Y.; Draper, M.; Viens, P.

    1981-05-01

    The paper presents a comparative analysis of several approaches, both possible and actually used, for joint action by local institutions and foreign aid in the field of water supply and related services, such as sanitation, for villages and small rural agglomerations (market towns, etc.) in developing countries. The comparative analysis is based on examples of actual programmes in this field. The authors have participated in most of the programmes selected as examples, at various levels and in various capacities, from conception to design, implementation and/or evaluation (i.e., rural development programmes in Ivory Coast, Ghana (upper region), Benin and Ethiopia). The authors were not involved in other examples, such as water supply and/or sanitation for small urban centres in Benin, Ivory Coast, etc.; they have, however, witnessed them directly and have therefore obtained first-hand information on their organization, execution and results. Several typical examples of actual projects are briefly defined and characterized. The paper then compares, in a clinical fashion, the advantages and drawbacks of the approaches taken in the various examples presented. Finally, the paper proposes a recommendation for a realistic approach to joint action between local/domestic and foreign financing/assistance agencies and executing bodies (consultants, contractors) in the field of rural water supply, sanitation and, more generally, health improvement. This line of approach is defined in terms of a logical framework, i.e., goals, purposes, outputs and inputs at the various stages of the project, up to actual evaluation of execution and, where possible, impact, together with a description of practical indicators for the two types of evaluation. Particular attention is given to the problems of technological choice, in view of the constraints imposed by the natural environment and by human and social patterns, as well as by the institutions and the economy. Another important point considered by the paper is the problem of information, education and support to users for the introduction, implementation, operation and maintenance of technical developments at village level. Conclusions are drawn as to the relative advantages of this approach over the "classical" approach and its replicability.

  10. Propensity Score Analysis Comparing Videothoracoscopic Lobectomy With Thoracotomy: A French Nationwide Study.

    PubMed

    Pagès, Pierre-Benoit; Delpy, Jean-Philippe; Orsini, Bastien; Gossot, Dominique; Baste, Jean-Marc; Thomas, Pascal; Dahan, Marcel; Bernard, Alain

    2016-04-01

    Video-assisted thoracoscopic surgery (VATS) lobectomy has recently become the recommended approach for stage I non-small cell lung cancer. However, these guidelines are not based on any large randomized controlled trial. Our study used propensity scores and a sensitivity analysis to compare VATS lobectomy with open thoracotomy. From 2005 to 2012, 24,811 patients (95.1%) were operated on by open thoracotomy and 1,278 (4.9%) by VATS. The end points were 30-day postoperative death, postoperative complications, hospital stay, overall survival, and disease-free survival. Two propensity score analyses were performed, matching and inverse probability of treatment weighting, along with one sensitivity analysis to unmask potential hidden bias. A subgroup analysis was performed to compare "high-risk" with "low-risk" patients. Results are reported as odds ratios or hazard ratios with their 95% confidence intervals. Postoperative death was not significantly reduced by VATS in any analysis. Concerning postoperative complications, VATS significantly decreased the occurrence of atelectasis and pneumopathy with both analysis methods, but there were no differences in the occurrence of other postoperative complications. VATS did not provide a benefit for high-risk patients. The VATS approach decreased the hospital length of stay, with estimates ranging from -2.4 days (95% confidence interval, -3 to -1.7 days) to -4.68 days (95% confidence interval, -8.5 to 0.9 days) depending on the method. Overall survival and disease-free survival were not influenced by the surgical approach. The sensitivity analysis showed potential biases. The results must be interpreted carefully because of the differences observed according to the propensity score method used. A multicenter randomized controlled trial is necessary to limit the biases. Copyright © 2016 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.
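    Of the two propensity score analyses mentioned, inverse probability of treatment weighting is straightforward to sketch: model the probability of receiving the treatment from baseline covariates, then weight each patient by the inverse probability of the treatment actually received. The data, covariates, and model below are simulated stand-ins, not the study's data.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Sketch of inverse-probability-of-treatment weighting on simulated data.
rng = np.random.default_rng(1)
n = 1000
df = pd.DataFrame({
    "age": rng.normal(65, 10, n),
    "fev1": rng.normal(80, 15, n),
})
# Simulated treatment assignment (1 = VATS) depending on covariates.
logit = -8 + 0.08 * df["age"] + 0.02 * df["fev1"]
df["vats"] = rng.random(n) < 1 / (1 + np.exp(-logit))

# 1) Model the probability of receiving VATS given baseline covariates.
ps_model = LogisticRegression().fit(df[["age", "fev1"]], df["vats"])
ps = ps_model.predict_proba(df[["age", "fev1"]])[:, 1]

# 2) Weight each patient by the inverse probability of the treatment received.
w = np.where(df["vats"], 1 / ps, 1 / (1 - ps))

# Weighted covariate means should now be balanced across the two groups.
for col in ["age", "fev1"]:
    m1 = np.average(df.loc[df["vats"], col], weights=w[df["vats"]])
    m0 = np.average(df.loc[~df["vats"], col], weights=w[~df["vats"]])
    print(col, round(m1, 2), round(m0, 2))
```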

  11. Low Thrust Orbital Maneuvers Using Ion Propulsion

    NASA Astrophysics Data System (ADS)

    Ramesh, Eric

    2011-10-01

    Low-thrust maneuver options, such as electric propulsion, offer specific challenges within mission-level Modeling, Simulation, and Analysis (MS&A) tools. This project seeks to transition techniques for simulating low-thrust maneuvers from detailed engineering level simulations such as AGI's Satellite ToolKit (STK) Astrogator to mission level simulations such as the System Effectiveness Analysis Simulation (SEAS). Our project goals are as follows: A) Assess different low-thrust options to achieve various orbital changes; B) Compare such approaches to more conventional, high-thrust profiles; C) Compare computational cost and accuracy of various approaches to calculate and simulate low-thrust maneuvers; D) Recommend methods for implementing low-thrust maneuvers in high-level mission simulations; E) prototype recommended solutions.
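    For a first-order feel for such comparisons, Edelbaum's approximation gives the delta-v of a continuous low-thrust spiral between circular orbits; the sketch below applies it with illustrative numbers and is not drawn from the project's tools.

```python
import math

# First-order sizing of a continuous low-thrust transfer between circular
# orbits using Edelbaum's approximation (a standard textbook estimate; not
# taken from the project described above). All numbers are illustrative.
MU = 398600.4418  # km^3/s^2, Earth gravitational parameter

def circ_speed(r_km):
    return math.sqrt(MU / r_km)

def edelbaum_dv(r0_km, r1_km, di_deg):
    """Delta-v (km/s) for a slow spiral with combined inclination change."""
    v0, v1 = circ_speed(r0_km), circ_speed(r1_km)
    di = math.radians(di_deg)
    return math.sqrt(v0**2 - 2 * v0 * v1 * math.cos(math.pi / 2 * di) + v1**2)

dv = edelbaum_dv(6678.0, 42164.0, 0.0)   # ~300 km LEO to GEO radius, no plane change
t_days = dv / 2e-7 / 86400               # burn time at 2e-7 km/s^2 thrust accel
print(f"delta-v = {dv:.3f} km/s, transfer time ~ {t_days:.0f} days")
```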

  12. Numerical Uncertainty Analysis for Computational Fluid Dynamics using Student T Distribution -- Application of CFD Uncertainty Analysis Compared to Exact Analytical Solution

    NASA Technical Reports Server (NTRS)

    Groves, Curtis E.; Ilie, marcel; Shallhorn, Paul A.

    2014-01-01

    Computational Fluid Dynamics (CFD) is the standard numerical tool used by Fluid Dynamists to estimate solutions to many problems in academia, government, and industry. CFD is known to have errors and uncertainties and there is no universally adopted method to estimate such quantities. This paper describes an approach to estimate CFD uncertainties strictly numerically using inputs and the Student-T distribution. The approach is compared to an exact analytical solution of fully developed, laminar flow between infinite, stationary plates. It is shown that treating all CFD input parameters as oscillatory uncertainty terms coupled with the Student-T distribution can encompass the exact solution.
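    The basic interval construction is standard: with n CFD runs over perturbed inputs, the uncertainty half-width is the sample standard deviation scaled by the Student-t coverage factor. The run values below are hypothetical, and the paper's specific treatment of inputs as oscillatory uncertainty terms is not reproduced.

```python
import numpy as np
from scipy import stats

# Sketch of a Student-t uncertainty band from a small set of CFD runs in
# which input parameters were perturbed; the sample values are hypothetical.
pressure_drops = np.array([101.2, 99.8, 100.5, 102.1, 100.9])  # Pa, 5 CFD runs

n = len(pressure_drops)
mean = pressure_drops.mean()
s = pressure_drops.std(ddof=1)          # sample standard deviation
t = stats.t.ppf(0.975, df=n - 1)        # 95% two-sided coverage factor

half_width = t * s / np.sqrt(n)
print(f"estimate: {mean:.2f} +/- {half_width:.2f} Pa (95% CI)")
```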

  13. Data analysis strategies for reducing the influence of the bias in cross-cultural research.

    PubMed

    Sindik, Josko

    2012-03-01

    In cross-cultural research, researchers have to adjust the constructs and associated measurement instruments that have been developed in one culture and then imported for use in another culture. Importing concepts from other cultures is often simply reduced to language adjustment of the content in the items of the measurement instruments that define a certain (psychological) construct. In the context of cross-cultural research, test bias can be defined as a generic term for all nuisance factors that threaten the validity of cross-cultural comparisons. Bias can be an indicator that instrument scores based on the same items measure different traits and characteristics across different cultural groups. To reduce construct, method and item bias, the researcher can consider these strategies: (1) simply comparing average results in certain measuring instruments; (2) comparing only the reliability of certain dimensions of the measurement instruments, applied to the "target" and "source" samples of participants, i.e. from different cultures; (3) comparing the "framed" factor structure (fixed number of factors) of the measurement instruments, applied to the samples from the "target" and "source" cultures, using an explorative factor analysis strategy on separate samples; (4) comparing the complete constructs ("unframed" factor analysis, i.e. unlimited number of factors) in relation to their best psychometric properties and interpretability (best suited to certain cultures, applying an explorative strategy of factor analysis); or (5) checking the similarity of the constructs in the samples from different cultures (using a structural equation modeling approach). Each approach has its advantages and shortcomings, which are discussed.
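    For strategies (3)-(5), a common quantitative check on whether factor structures replicate across cultures is Tucker's congruence coefficient between corresponding loading vectors; the loading matrices in the sketch below are hypothetical.

```python
import numpy as np

# Tucker's congruence coefficient, a standard index for comparing factor
# loading patterns across cultural groups. The loadings are hypothetical.
def tucker_phi(x, y):
    """Congruence between two loading vectors (values near 1 = similar)."""
    return np.dot(x, y) / np.sqrt(np.dot(x, x) * np.dot(y, y))

loadings_source = np.array([[0.8, 0.1], [0.7, 0.2], [0.1, 0.9], [0.2, 0.8]])
loadings_target = np.array([[0.7, 0.2], [0.8, 0.1], [0.2, 0.7], [0.1, 0.9]])

for k in range(loadings_source.shape[1]):
    phi = tucker_phi(loadings_source[:, k], loadings_target[:, k])
    print(f"factor {k + 1}: phi = {phi:.3f}")
```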

  14. Are Sex Effects on Ethical Decision-Making Fake or Real? A Meta-Analysis on the Contaminating Role of Social Desirability Response Bias.

    PubMed

    Yang, Jianfeng; Ming, Xiaodong; Wang, Zhen; Adams, Susan M

    2017-02-01

    A meta-analysis of 143 studies was conducted to explore how the social desirability response bias may influence sex effects on ratings on measures of ethical decision-making. Women rated themselves as more ethical than did men; however, this sex effect on ethical decision-making was no longer significant when social desirability response bias was controlled. The indirect questioning approach was compared with the direct measurement approach for effectiveness in controlling social desirability response bias. The indirect questioning approach was found to be more effective.

  15. Analysis of Advanced Modular Power Systems (AMPS) for Deep Space Exploration

    NASA Technical Reports Server (NTRS)

    Oeftering, Richard; Soeder, James F.; Beach, Ray

    2014-01-01

    The Advanced Modular Power Systems (AMPS) project is developing a modular approach to spacecraft power systems for exploration beyond Earth orbit. AMPS is intended to meet the need to reduce the cost of design, development, test, and integration, as well as the operational logistics cost of supporting exploration missions. AMPS seeks to establish modular power building blocks with standardized electrical, mechanical, thermal and data interfaces that can be applied across multiple exploration vehicles. The presentation discusses the results of a cost analysis that compares the cost of the modular approach against a traditional non-modular approach.

  16. Systematic text condensation: a strategy for qualitative analysis.

    PubMed

    Malterud, Kirsti

    2012-12-01

    To present background, principles, and procedures for a strategy for qualitative analysis called systematic text condensation and discuss this approach compared with related strategies. Giorgi's psychological phenomenological analysis is the point of departure and inspiration for systematic text condensation. The basic elements of Giorgi's method and the elaboration of these in systematic text condensation are presented, followed by a detailed description of procedures for analysis according to systematic text condensation. Finally, similarities and differences compared with other frequently applied methods for qualitative analysis are identified, as the foundation of a discussion of strengths and limitations of systematic text condensation. Systematic text condensation is a descriptive and explorative method for thematic cross-case analysis of different types of qualitative data, such as interview studies, observational studies, and analysis of written texts. The method represents a pragmatic approach, although inspired by phenomenological ideas, and various theoretical frameworks can be applied. The procedure consists of the following steps: 1) total impression - from chaos to themes; 2) identifying and sorting meaning units - from themes to codes; 3) condensation - from code to meaning; 4) synthesizing - from condensation to descriptions and concepts. Similarities and differences comparing systematic text condensation with other frequently applied qualitative methods regarding thematic analysis, theoretical methodological framework, analysis procedures, and taxonomy are discussed. Systematic text condensation is a strategy for analysis developed from traditions shared by most of the methods for analysis of qualitative data. The method offers the novice researcher a process of intersubjectivity, reflexivity, and feasibility, while maintaining a responsible level of methodological rigour.

  17. Current concepts in cleft care: A multicenter analysis.

    PubMed

    Thiele, Oliver C; Kreppel, Matthias; Dunsche, Anton; Eckardt, Andre M; Ehrenfeld, Michael; Fleiner, Bernd; Gaßling, Volker; Gehrke, Gerd; Gerressen, Marcus; Gosau, Martin; Gröbe, Alexander; Haßfeld, Stefan; Heiland, Max; Hoffmeister, Bodo; Hölzle, Frank; Klein, Cornelius; Krüger, Maximilian; Kübler, Alexander C; Kübler, Norbert R; Kuttenberger, Johannes J; Landes, Constantin; Lauer, Günter; Martini, Markus; Merholz, Erich T; Mischkowski, Robert A; Al-Nawas, Bilal; Nkenke, Emeka; Piesold, Jörn U; Pradel, Winnie; Rasse, Michael; Rachwalski, Martin; Reich, Rudolf H; Rothamel, Daniel; Rustemeyer, Jan; Scheer, Martin; Schliephake, Henning; Schmelzeisen, Rainer; Schramm, Alexander; Schupp, Wiebke; Spitzer, Wolfgang J; Stocker, Erwin; Stoll, Christian; Terheyden, Hendrik; Voigt, Alexander; Wagner, Wilfried; Weingart, Dieter; Werkmeister, Richard; Wiltfang, Jörg; Ziegler, Christoph M; Zöller, Joachim E

    2018-04-01

    The current surgical techniques used in cleft repair are well established, but different centers use different approaches. To determine the best treatment for patients, a multi-center comparative study is required. In this study, we surveyed all craniofacial departments registered with the German Society of Maxillofacial Surgery to determine which cleft repair techniques are currently in use. Our findings revealed much variation in cleft repair between different centers. Although most centers did use a two-stage approach, the operative techniques and timing of lip and palate closure were different in every center. This shows that a retrospective comparative analysis of patient outcome between the participating centers is not possible and illustrates the need for prospective comparative studies to establish the optimal technique for reconstructive cleft surgery. Copyright © 2018 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.

  18. Ice Accretion Modeling using an Eulerian Approach for Droplet Impingement

    NASA Technical Reports Server (NTRS)

    Kim, Joe Woong; Garza, Dennis P.; Sankar, Lakshmi N.; Kreeger, Richard E.

    2012-01-01

    A three-dimensional Eulerian analysis has been developed for modeling droplet impingement on lifting bodies. The Eulerian model solves the conservation equations of mass and momentum to obtain the droplet flow field properties on the same mesh used in CFD simulations. For complex configurations such as a full rotorcraft, the Eulerian approach is more efficient because the Lagrangian approach would require a significant amount of seeding for accurate estimates of collection efficiency. Simulations are done for various benchmark cases such as the NACA0012 airfoil, MS317 airfoil and oscillating SC2110 airfoil to illustrate its use. The present results are compared with results from the Lagrangian approach used in an industry standard analysis called LEWICE.

  19. Two-tiered design analysis of a radiator for a solar dynamic powered Stirling engine

    NASA Technical Reports Server (NTRS)

    Hainley, Donald C.

    1989-01-01

    Two separate design approaches for a pumped loop radiator used to transfer heat from the cold end of a solar dynamic powered Stirling engine are described. The first approach uses a standard method to determine radiator requirements to meet specified end of mission conditions. Trade-off studies conducted for the analysis are included. Justification of this concept within the specified parameters of the analysis is provided. The second design approach determines the life performance of the radiator/Stirling system. In this approach, the system performance was altered by reducing the radiator heat transfer area. Performance effects and equilibrium points were determined as radiator segments were removed. This simulates the effect of loss of radiator sections due to micro-meteoroid and space debris penetration. The two designs were compared on the basis of overall system requirements and goals.


  1. Comparing child protective investigation performance between law enforcement agencies and child welfare agencies.

    PubMed

    Jordan, Neil; Yampolskaya, Svetlana; Gustafson, Mara; Armstrong, Mary; McNeish, Roxann; Vargo, Amy

    2011-01-01

    This study examines the comparative effectiveness of using law enforcement agencies for child protective investigation (CPI), in contrast with the traditional approach of CPI conducted by the public child welfare agency. The analysis uses 2006-2007 data from a natural experiment conducted in Florida to show modest differences in performance and cost-efficiency between the two approaches to CPI. These findings may have implications for other states considering outsourcing CPI to law enforcement.

  2. QF-PCR as a substitute for karyotyping of cytotrophoblast for the analysis of chorionic villi: advantages and limitations from a cytogenetic retrospective audit of 44,727 first-trimester prenatal diagnoses.

    PubMed

    Grati, Francesca R; Malvestiti, Francesca; Grimi, Beatrice; Gaetani, Elisa; Di Meco, Anna Maria; Trotta, Anna; Liuti, Rosaria; Chinetti, Sara; Dulcetti, Francesca; Ruggeri, Anna Maria; Agrati, Cristina; Frascoli, Giuditta; Milani, Silvia; De Toffol, Simona; Martinoni, Lorenza; Paganini, Silvia; Marcato, Livia; Maggi, Federico; Simoni, Giuseppe

    2013-05-01

    Karyotyping on chorionic villous samples (CVS) includes the analysis of both cytotrophoblast (STC) and mesenchyme (LTC). This approach requires complex laboratory organization and trained technicians. The introduction of quantitative fluorescent polymerase chain reaction (QF-PCR) instead of conventional karyotyping in low-risk pregnancies opened the way to its application in CVS analysis. Discordant QF-PCR and CVS cytogenetic results have been reported, and strategies for CVS analysis were introduced to minimize this risk. The possibility of substituting QF-PCR for the STC analysis has also been reported. The aim of this study is to evaluate the benefits and limitations of the QF-PCR + LTC approach compared with the traditional STC + LTC method and to quantify the associated risks of false results. This study is based on a retrospective cytogenetic audit of CVS results (n = 44 727) generated by the STC + LTC analytic approach. False-negative risks related to true fetal mosaicism type IV, imprinting syndromes and maternal contamination in LTC were calculated. Compared with STC + LTC, the QF-PCR + LTC approach is associated with a cumulative false-negative risk of ~1/3100-1/4400. Costs and reporting time of STC in a high-throughput cytogenetic lab are similar to those of a CE-IVD marked QF-PCR analysis. These results should be clearly highlighted in pre-test counseling and extensively discussed with the couple prior to testing for informed consent. © 2013 John Wiley & Sons, Ltd.

  3. Comparing Networks from a Data Analysis Perspective

    NASA Astrophysics Data System (ADS)

    Li, Wei; Yang, Jing-Yu

    To probe network characteristics, the two predominant ways of comparing networks are global property statistics and subgraph enumeration. However, the former carries limited information and the latter is computationally expensive. Here, we present an approach to comparing networks from the perspective of data analysis. Initially, the approach projects each node of the original network as a high-dimensional data point, so that the network is seen as a cloud of data points. The dispersion information of the principal component analysis (PCA) projection of the generated data clouds can then be used to distinguish networks. We applied this node projection method to yeast protein-protein interaction networks and Internet Autonomous System networks, two types of networks with several similar high-level properties. The method can efficiently distinguish one from the other. Identical results for different datasets from independent sources also indicate that the method is a robust and universal framework.
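    A minimal version of the node-projection step can be sketched as follows: each node's adjacency row is treated as a data point, and the normalized eigenvalue spectrum of the point cloud's covariance summarizes its dispersion over principal components. The random networks below are placeholders, not the datasets used in the paper.

```python
import numpy as np

# Sketch of the node-projection idea: treat each node's row of the adjacency
# matrix as a high-dimensional point, then compare how the variance of the
# resulting point cloud spreads over principal components.
def pca_dispersion(adj):
    x = adj - adj.mean(axis=0)                 # center the "data cloud"
    cov = x.T @ x / (len(x) - 1)
    eigvals = np.linalg.eigvalsh(cov)[::-1]    # variances along PCs, descending
    return eigvals / eigvals.sum()             # normalized dispersion profile

rng = np.random.default_rng(2)
# Two hypothetical random networks with equal size and density.
a = (rng.random((100, 100)) < 0.05).astype(float)
b = (rng.random((100, 100)) < 0.05).astype(float)
a = np.triu(a, 1) + np.triu(a, 1).T            # symmetrize (undirected graph)
b = np.triu(b, 1) + np.triu(b, 1).T

da, db = pca_dispersion(a), pca_dispersion(b)
print("top-5 variance shares:", np.round(da[:5], 3), np.round(db[:5], 3))
```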

  4. Strategy for reliable strain measurement in InAs/GaAs materials from high-resolution Z-contrast STEM images

    NASA Astrophysics Data System (ADS)

    Vatanparast, Maryam; Vullum, Per Erik; Nord, Magnus; Zuo, Jian-Min; Reenaas, Turid W.; Holmestad, Randi

    2017-09-01

    Geometric phase analysis (GPA), a fast and simple Fourier-space method for strain analysis, can give useful information on accumulated strain and defect propagation in multiple layers of semiconductors, including quantum dot materials. In this work, GPA has been applied to high-resolution Z-contrast scanning transmission electron microscopy (STEM) images. Strain maps determined from different g vectors of these images are compared to each other in order to assess the accuracy of the GPA technique. The SmartAlign tool has been used to improve the STEM image quality and thereby obtain more reliable results. Strain maps from template matching, a real-space approach, are compared with strain maps from GPA, and it is argued that real-space analysis is a better approach than GPA for aberration-corrected STEM images.

  5. A comparative study of multivariable robustness analysis methods as applied to integrated flight and propulsion control

    NASA Technical Reports Server (NTRS)

    Schierman, John D.; Lovell, T. A.; Schmidt, David K.

    1993-01-01

    Three multivariable robustness analysis methods are compared and contrasted. The focus of the analysis is on system stability and performance robustness to uncertainty in the coupling dynamics between two interacting subsystems. Of particular interest are interacting airframe and engine subsystems, and an example airframe/engine vehicle configuration is utilized in the demonstration of these approaches. The singular value (SV) and structured singular value (SSV) analysis methods are compared to a method especially well suited for analysis of robustness to uncertainties in subsystem interactions. This approach is referred to here as the interacting subsystem (IS) analysis method. This method has been used previously to analyze airframe/engine systems, emphasizing the study of stability robustness. However, performance robustness is also investigated here, and a new measure of allowable uncertainty for acceptable performance robustness is introduced. The IS methodology does not require plant uncertainty models to measure the robustness of the system, and is shown to yield valuable information regarding the effects of subsystem interactions. In contrast, the SV and SSV methods allow for the evaluation of the robustness of the system to particular models of uncertainty, and do not directly indicate how the airframe (engine) subsystem interacts with the engine (airframe) subsystem.

  6. [The safety and effect of transhepatic hilar approach for the treatment of bismuth type Ⅲ and Ⅳ hilar cholangiocarcinoma].

    PubMed

    He, M; Wang, H L; Yan, J Y; Xu, S W; Chen, W; Wang, J

    2018-05-01

    Objective: To compare the efficiency of the transhepatic hilar approach and the conventional approach for the surgical treatment of Bismuth type Ⅲ and Ⅳ hilar cholangiocarcinoma. Methods: There were 42 consecutive patients with hilar cholangiocarcinoma of Bismuth type Ⅲ and Ⅳ who underwent surgical treatment at the Department of Biliary-Pancreatic Surgery, Ren Ji Hospital, School of Medicine, Shanghai Jiao Tong University from January 2008 to December 2013. The transhepatic hilar approach was used in 19 patients and the conventional approach was performed in 23 patients. There were no differences in clinical parameters between the two groups (all P>0.05). The t-test was used to analyze the measurement data, and the χ² test was used to analyze the count data. Kaplan-Meier analysis was used to analyze the survival period. Multivariate Cox regression analysis was used to analyze prognostic factors. Results: Among the 19 patients who underwent the transhepatic hilar approach, 3 patients had their operative planning changed after reevaluation with the hepatic hilus exposed. Intraoperative blood loss was 300 (250-400) ml in the transhepatic hilar approach group, significantly less than in the conventional approach group, 800 (450-1,300) ml (t=4.276, P=0.001); meanwhile, the R0 resection rate was significantly higher in the transhepatic hilar approach group than in the conventional approach group (89.4% vs 52.2%; χ²=6.773, P=0.009), and the 3-year and 5-year cumulative survival rates were better in the transhepatic hilar approach group than in the conventional approach group (63.2% vs 47.8% and 26.3% vs 0; χ²=66.363 and 127.185, P=0.000). On univariate analysis, transhepatic hilar approach, intraoperative blood loss, intraoperative blood transfusion, R0 resection and lymph node metastasis were significant risk factors for patient survival (all P<0.05). On multivariate analysis, use of the transhepatic hilar approach, intraoperative blood loss, R0 resection and lymph node metastasis were significant independent risk factors for patient survival (all P<0.05). Conclusion: The transhepatic hilar approach is the preferred technique for the surgical treatment of hilar cholangiocarcinoma because it can improve the accuracy of surgical planning, the safety of the operation, the R0 resection rate and the survival rate compared with the conventional approach.

  7. Repressing the effects of variable speed harmonic orders in operational modal analysis

    NASA Astrophysics Data System (ADS)

    Randall, R. B.; Coats, M. D.; Smith, W. A.

    2016-10-01

    Discrete frequency components such as machine shaft orders can disrupt the operation of normal Operational Modal Analysis (OMA) algorithms. With constant-speed machines, they have been removed using time synchronous averaging (TSA). This paper compares two approaches for varying-speed machines. In one method, signals are transformed into the order domain and, after the removal of shaft-speed-related components by a cepstral notching method, are transformed back to the time domain to allow normal OMA. In the other, simpler approach, an exponential shortpass lifter is applied directly to the time domain cepstrum to enhance the modal information at the expense of other disturbances. For simulated gear signals with speed variations of both ±5% and ±15%, the simpler approach was found to give better results. The TSA method is shown not to work in either case. The paper compares the results with those obtained using stationary random excitation.
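    The simpler of the two methods can be sketched directly: compute the real cepstrum, multiply it by an exponentially decaying shortpass lifter, and return to a smoothed log spectrum in which modal content is enhanced. The signal, time constant, and lifter form below are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

# Sketch of an exponential shortpass lifter: keep low-quefrency (modal)
# content of the real cepstrum, suppress later rahmonics from shaft orders.
fs = 1024
t = np.arange(0, 4, 1 / fs)
rng = np.random.default_rng(3)
# Hypothetical response: one decaying mode plus a strong periodic "order".
x = (np.exp(-2 * t) * np.sin(2 * np.pi * 60 * t)
     + 0.8 * np.sin(2 * np.pi * 25 * t)
     + 0.1 * rng.normal(size=t.size))

log_mag = np.log(np.abs(np.fft.rfft(x)) + 1e-12)
cep = np.fft.irfft(log_mag)                   # real cepstrum

n = cep.size
tau = 0.05                                    # lifter time constant (s), assumed
qdist = np.minimum(np.arange(n), n - np.arange(n)) / fs  # circular quefrency
lifter = np.exp(-qdist / tau)                 # exponential shortpass lifter
smoothed_log_mag = np.fft.rfft(cep * lifter).real

print(smoothed_log_mag[:5])                   # enhanced (modal) log spectrum
```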

  8. Autoregressive modeling for the spectral analysis of oceanographic data

    NASA Technical Reports Server (NTRS)

    Gangopadhyay, Avijit; Cornillon, Peter; Jackson, Leland B.

    1989-01-01

    Over the last decade there has been a dramatic increase in the number and volume of data sets useful for oceanographic studies. Many of these data sets consist of long temporal or spatial series derived from satellites and large-scale oceanographic experiments. These data sets are, however, often 'gappy' in space, irregular in time, and always of finite length. The conventional Fourier transform (FT) approach to spectral analysis is thus often inapplicable or, where applicable, provides questionable results. Here, through comparative analysis with the FT for different oceanographic data sets, the possibilities offered by autoregressive (AR) modeling for the spectral analysis of gappy, finite-length series are discussed. The applications demonstrate that as the length of the time series becomes shorter, the resolving power of the AR approach improves relative to that of the FT. For the longest data sets examined here, 98 points, the AR method performed only slightly better than the FT, but for the very short ones, 17 points, the AR method showed a dramatic improvement over the FT. The application of the AR method to a gappy time series, although a secondary concern of this manuscript, further underlines the value of this approach.
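    A minimal contrast between the two estimators is easy to reproduce: fit AR coefficients with the Yule-Walker equations and evaluate the resulting all-pole spectrum against the coarse periodogram of a deliberately short record. The synthetic 17-point series and model order below are illustrative.

```python
import numpy as np
from scipy.linalg import solve_toeplitz

# Yule-Walker AR spectral estimate versus the raw periodogram for a short
# series; the synthetic record below stands in for an oceanographic series.
rng = np.random.default_rng(4)
n = 17                                        # deliberately short record
t = np.arange(n)
x = np.sin(2 * np.pi * 0.18 * t) + 0.5 * rng.normal(size=n)
x = x - x.mean()

# Yule-Walker: solve the Toeplitz system R a = r for the AR(p) coefficients.
p = 4
r = np.correlate(x, x, "full")[n - 1:] / n    # biased autocovariance sequence
a = solve_toeplitz(r[:p], r[1:p + 1])
sigma2 = r[0] - a @ r[1:p + 1]                # innovation variance

freqs = np.linspace(0.0, 0.5, 256)
k = np.arange(1, p + 1)
denom = np.abs(1 - np.exp(-2j * np.pi * np.outer(freqs, k)) @ a) ** 2
ar_spectrum = sigma2 / denom                  # smooth all-pole estimate

periodogram = np.abs(np.fft.rfft(x)) ** 2 / n # only 9 coarse FT bins for n=17
print("AR spectral peak at f =", freqs[ar_spectrum.argmax()])
```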

  9. Spectral Unmixing Analysis of Time Series Landsat 8 Images

    NASA Astrophysics Data System (ADS)

    Zhuo, R.; Xu, L.; Peng, J.; Chen, Y.

    2018-05-01

    Temporal analysis of Landsat 8 images opens up new opportunities in the unmixing procedure. Although spectral analysis of time series Landsat imagery has its own advantages, it has rarely been studied. Nevertheless, using the temporal information can provide improved unmixing performance when compared to independent image analyses. Moreover, different land cover types may demonstrate different temporal patterns, which can aid their discrimination. Therefore, this letter presents time series K-P-Means, a new solution to the problem of unmixing time series Landsat imagery. The proposed approach obtains "purified" pixels in order to achieve optimal unmixing performance. Vertex component analysis (VCA) is used to extract endmembers for endmember initialization. First, nonnegative least squares (NNLS) is used to estimate abundance maps from the endmembers. Then, each estimated endmember is updated as the mean value of its "purified" pixels, i.e., the residual of the mixed pixel after excluding the contribution of all nondominant endmembers. Assembling the two main steps (abundance estimation and endmember update) into an iterative optimization framework yields the complete algorithm. Experiments using both simulated and real Landsat 8 images show that the proposed "joint unmixing" approach provides more accurate endmember and abundance estimates than the "separate unmixing" approach.
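    One abundance-estimation/endmember-update cycle in the spirit of this description can be sketched as follows; the data are synthetic, the initialization is a crude stand-in for VCA, and the "purified pixel" update is a plausible reading of the abstract rather than the authors' exact algorithm.

```python
import numpy as np
from scipy.optimize import nnls

# Sketch of alternating NNLS abundance estimation and a "purified pixel"
# endmember update, in the spirit of the description above. Synthetic data.
rng = np.random.default_rng(5)
bands, n_end, n_pix = 30, 3, 500
E_true = rng.random((bands, n_end))                  # endmember signatures
A_true = rng.dirichlet(np.ones(n_end), size=n_pix).T # abundances (sum to 1)
Y = E_true @ A_true + 0.01 * rng.normal(size=(bands, n_pix))

E = Y[:, rng.choice(n_pix, n_end, replace=False)]    # crude init (VCA stand-in)
for _ in range(10):
    # Step 1: nonnegative least-squares abundances, pixel by pixel.
    A = np.column_stack([nnls(E, Y[:, i])[0] for i in range(n_pix)])
    # Step 2: update each endmember from its "purified" pixels: remove the
    # contribution of all other endmembers, then average where it dominates.
    for k in range(n_end):
        residual = Y - E @ A + np.outer(E[:, k], A[k])
        dom = A.argmax(axis=0) == k                  # pixels dominated by k
        if dom.any():
            E[:, k] = (residual[:, dom] / np.maximum(A[k, dom], 1e-6)).mean(axis=1)

print(np.round(A[:, :3], 2))                         # first three abundance vectors
```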

  10. Comparing multiple competing interventions in the absence of randomized trials using clinical risk-benefit analysis

    PubMed Central

    2012-01-01

    Background: To demonstrate the use of risk-benefit analysis for comparing multiple competing interventions in the absence of randomized trials, we applied this approach to the evaluation of five anticoagulants to prevent thrombosis in patients undergoing orthopedic surgery. Methods: Using a cost-effectiveness approach from a clinical perspective (i.e. risk-benefit analysis), we compared thromboprophylaxis with warfarin, low molecular weight heparin, unfractionated heparin, fondaparinux or ximelagatran in patients undergoing major orthopedic surgery, with sub-analyses according to surgery type. Proportions and variances of events defining risk (major bleeding) and benefit (thrombosis averted) were obtained through a meta-analysis and used to define beta distributions. Monte Carlo simulations were conducted and used to calculate incremental risks, benefits, and risk-benefit ratios. Finally, net clinical benefit was calculated for all replications across a range of risk-benefit acceptability thresholds, with a reference range obtained by estimating the case fatality rate, the ratio of thrombosis to bleeding. Results: The analysis showed that, compared to placebo, ximelagatran was superior to the other options, but final results were influenced by type of surgery: ximelagatran was superior in total knee replacement but not in total hip replacement. Conclusions: Using simulation and economic techniques we demonstrate a method that allows comparison of multiple competing interventions in the absence of randomized trials with multiple arms, by determining the option with the best risk-benefit profile. It can be helpful in clinical decision making since it incorporates risk, benefit, and personal risk acceptance. PMID:22233221
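    A compact sketch of the simulation machinery: event proportions are drawn from Beta distributions parameterized by event counts, incremental benefit and risk are computed per replication, and net clinical benefit is summarized at an acceptability threshold. The counts and threshold below are invented, not the meta-analysis results.

```python
import numpy as np

# Monte Carlo sketch of the risk-benefit comparison described above.
rng = np.random.default_rng(6)
n_sim = 100_000

# (events, n) per option for thrombosis and major bleeding -- invented counts.
options = {
    "warfarin":     {"dvt": (45, 500), "bleed": (10, 500)},
    "ximelagatran": {"dvt": (30, 500), "bleed": (13, 500)},
}

draws = {}
for name, d in options.items():
    draws[name] = {
        ev: rng.beta(e + 1, n - e + 1, n_sim)   # Beta draw per event proportion
        for ev, (e, n) in d.items()
    }

# Incremental benefit (thromboses averted) and risk (extra bleeds) vs warfarin.
d_benefit = draws["warfarin"]["dvt"] - draws["ximelagatran"]["dvt"]
d_risk = draws["ximelagatran"]["bleed"] - draws["warfarin"]["bleed"]

lam = 0.5  # acceptability threshold: one extra bleed "costs" 0.5 averted
           # thromboses; the paper scans a range of such thresholds
ncb = d_benefit - lam * d_risk
print("mean incremental benefit:", d_benefit.mean().round(4))
print("P(net clinical benefit > 0):", (ncb > 0).mean().round(3))
```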

  11. Training Needs Analysis: Weaknesses in the Conventional Approach.

    ERIC Educational Resources Information Center

    Leat, Michael James; Lovel, Murray Jack

    1997-01-01

    Identification of the training and development needs of administrative support staff is not aided by conventional performance appraisal, which measures summary or comparative effectiveness. Meaningful diagnostic evaluation integrates three levels of analysis (organization, task, and individual), using behavioral expectation scales. (SK)

  12. A Neuro-Fuzzy Approach in the Classification of Students' Academic Performance

    PubMed Central

    2013-01-01

    Classifying the student academic performance with high accuracy facilitates admission decisions and enhances educational services at educational institutions. The purpose of this paper is to present a neuro-fuzzy approach for classifying students into different groups. The neuro-fuzzy classifier used previous exam results and other related factors as input variables and labeled students based on their expected academic performance. The results showed that the proposed approach achieved a high accuracy. The results were also compared with those obtained from other well-known classification approaches, including support vector machine, Naive Bayes, neural network, and decision tree approaches. The comparative analysis indicated that the neuro-fuzzy approach performed better than the others. It is expected that this work may be used to support student admission procedures and to strengthen the services of educational institutions. PMID:24302928

  13. A neuro-fuzzy approach in the classification of students' academic performance.

    PubMed

    Do, Quang Hung; Chen, Jeng-Fung

    2013-01-01

    Classifying the student academic performance with high accuracy facilitates admission decisions and enhances educational services at educational institutions. The purpose of this paper is to present a neuro-fuzzy approach for classifying students into different groups. The neuro-fuzzy classifier used previous exam results and other related factors as input variables and labeled students based on their expected academic performance. The results showed that the proposed approach achieved a high accuracy. The results were also compared with those obtained from other well-known classification approaches, including support vector machine, Naive Bayes, neural network, and decision tree approaches. The comparative analysis indicated that the neuro-fuzzy approach performed better than the others. It is expected that this work may be used to support student admission procedures and to strengthen the services of educational institutions.

  14. Combined statistical analyses for long-term stability data with multiple storage conditions: a simulation study.

    PubMed

    Almalik, Osama; Nijhuis, Michiel B; van den Heuvel, Edwin R

    2014-01-01

    Shelf-life estimation usually requires that at least three registration batches are tested for stability at multiple storage conditions. The shelf-life estimates are often obtained by linear regression analysis per storage condition, an approach implicitly suggested by ICH guideline Q1E. A linear regression analysis combining all data from multiple storage conditions was recently proposed in the literature when variances are homogeneous across storage conditions. The combined analysis is expected to perform better than the separate analysis per storage condition, since pooling data would lead to an improved estimate of the variation and higher numbers of degrees of freedom, but this is not evident for shelf-life estimation. Indeed, the two approaches treat the observed initial batch results, the intercepts in the model, and poolability of batches differently, which may eliminate or reduce the expected advantage of the combined approach with respect to the separate approach. Therefore, a simulation study was performed to compare the distribution of simulated shelf-life estimates on several characteristics between the two approaches and to quantify the difference in shelf-life estimates. In general, the combined statistical analysis does estimate the true shelf life more consistently and precisely than the analysis per storage condition, but it did not outperform the separate analysis in all circumstances.
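    The contrast between the two analyses can be sketched with simulated stability data: separate per-condition regressions each carry few residual degrees of freedom, while a combined model with condition-specific slopes pools the residual variance. Shelf-life extrapolation itself and batch-poolability testing are omitted for brevity; the data and column names are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated registration-batch stability data at two storage conditions.
rng = np.random.default_rng(7)
rows = []
for cond, slope in [("25C/60%RH", -0.20), ("30C/65%RH", -0.35)]:
    for batch in ["A", "B", "C"]:
        for month in [0, 3, 6, 9, 12, 18, 24]:
            assay = 100 + slope * month + rng.normal(0, 0.4)
            rows.append((cond, batch, month, assay))
df = pd.DataFrame(rows, columns=["cond", "batch", "month", "assay"])

# Separate analysis: one regression per storage condition.
for cond, sub in df.groupby("cond"):
    fit = smf.ols("assay ~ month", data=sub).fit()
    print(cond, "slope:", round(fit.params["month"], 3),
          "df_resid:", int(fit.df_resid))

# Combined analysis: condition-specific slopes, pooled residual variance,
# and correspondingly more residual degrees of freedom.
combined = smf.ols("assay ~ month * C(cond)", data=df).fit()
print("combined df_resid:", int(combined.df_resid))
```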

  15. A new metaphor for projection-based visual analysis and data exploration

    NASA Astrophysics Data System (ADS)

    Schreck, Tobias; Panse, Christian

    2007-01-01

    In many important application domains such as Business and Finance, Process Monitoring, and Security, huge and quickly increasing volumes of complex data are collected. Strong efforts are underway developing automatic and interactive analysis tools for mining useful information from these data repositories. Many data analysis algorithms require an appropriate definition of similarity (or distance) between data instances to allow meaningful clustering, classification, and retrieval, among other analysis tasks. Projection-based data visualization is highly interesting (a) for visual discrimination analysis of a data set within a given similarity definition, and (b) for comparative analysis of similarity characteristics of a given data set represented by different similarity definitions. We introduce an intuitive and effective novel approach for projection-based similarity visualization for interactive discrimination analysis, data exploration, and visual evaluation of metric space effectiveness. The approach is based on the convex hull metaphor for visually aggregating sets of points in projected space, and it can be used with a variety of different projection techniques. The effectiveness of the approach is demonstrated by application on two well-known data sets. Statistical evidence supporting the validity of the hull metaphor is presented. We advocate the hull-based approach over the standard symbol-based approach to projection visualization, as it allows a more effective perception of similarity relationships and class distribution characteristics.
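    A minimal rendering of the hull metaphor: project the items to 2-D (here via PCA) and draw one filled convex hull per class rather than relying on individual symbols alone. The data and classes below are synthetic placeholders.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.spatial import ConvexHull

# Sketch of the hull metaphor: one convex hull per class in projected space.
rng = np.random.default_rng(8)
X = np.vstack([rng.normal(m, 1.0, size=(60, 10)) for m in (0.0, 2.5)])
labels = np.repeat([0, 1], 60)

Xc = X - X.mean(axis=0)
_, _, vt = np.linalg.svd(Xc, full_matrices=False)
proj = Xc @ vt[:2].T                       # 2-D PCA projection

for cls, color in [(0, "tab:blue"), (1, "tab:orange")]:
    pts = proj[labels == cls]
    hull = ConvexHull(pts)
    poly = pts[hull.vertices]              # hull vertices in drawing order
    plt.fill(poly[:, 0], poly[:, 1], alpha=0.3, color=color)
    plt.plot(pts[:, 0], pts[:, 1], ".", color=color, ms=3)
plt.title("Convex hulls aggregate each class in projected space")
plt.show()
```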

  16. A two-step approach for the analysis of hybrids in comparative social policy analysis: a nuanced typology of childcare between policies and regimes.

    PubMed

    Ciccia, Rossella

    2017-01-01

    Typologies have represented an important tool for the development of comparative social policy research and continue to be widely used in spite of growing criticism of their ability to capture the complexity of welfare states and their internal heterogeneity. In particular, debates have focused on the presence of hybrid cases and the existence of distinct cross-national patterns of variation across areas of social policy. There is growing awareness of these issues, but empirical research often still relies on methodologies aimed at classifying countries into a limited number of unambiguous types. This article proposes a two-step approach based on fuzzy-set ideal type analysis for the systematic analysis of hybrids at the level of both policies (step 1) and policy configurations, or combinations of policies (step 2). The approach is demonstrated using the case of childcare policies in European economies. In the first step, parental leave policies are analysed using three methods (direct, indirect and combinatory) to identify and describe specific hybrid forms at the level of policy analysis. In the second step, the analysis moves on to investigate the relationship between parental leave and childcare services. The analysis clearly shows that many countries display characteristics normally associated with different types (hybrids and sub-types). This two-step approach therefore demonstrates that disaggregated and aggregated analyses are equally important to account for hybrid welfare forms and to make sense of the tensions and incongruences within and between policies.
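    The fuzzy-set machinery behind such an analysis is compact: each case holds graded memberships in elementary sets, and membership in an ideal type is the minimum across the required sets, with 1 − x used for negated sets. The countries, dimensions, and scores below are hypothetical illustrations, not the article's data.

```python
# Sketch of fuzzy-set ideal type analysis: ideal-type membership is the
# minimum (logical AND) over set memberships, with 1 - x for negated sets.
cases = {
    # country: (generous leave, well-paid leave, extensive services) -- invented
    "SE": (0.9, 0.8, 0.9),
    "DE": (0.8, 0.7, 0.4),
    "UK": (0.3, 0.2, 0.5),
}

def ideal_type(leave, paid, services, profile):
    """profile: tuple of True (set present) / False (set absent) per dimension."""
    vals = [m if want else 1 - m
            for m, want in zip((leave, paid, services), profile)]
    return min(vals)

profiles = {
    "comprehensive":     (True, True, True),
    "leave-dominated":   (True, True, False),
    "service-dominated": (False, False, True),
}
for country, scores in cases.items():
    memberships = {p: round(ideal_type(*scores, profiles[p]), 2) for p in profiles}
    best = max(memberships, key=memberships.get)
    # Memberships spread across profiles (none clearly > 0.5) flag a hybrid.
    print(country, memberships, "->", best)
```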

  17. Robotic approach mitigates perioperative morbidity in obese patients following pancreaticoduodenectomy.

    PubMed

    Girgis, Mark D; Zenati, Mazen S; Steve, Jennifer; Bartlett, David L; Zureikat, Amer; Zeh, Herbert J; Hogg, Melissa E

    2017-02-01

    The aim was to evaluate the impact of obesity on perioperative outcomes in patients undergoing robotic pancreaticoduodenectomy (RPD) compared to open pancreaticoduodenectomy (OPD). A retrospective review of all pancreaticoduodenectomies from 9/2011 to 4/2015 was performed. Obesity was defined as body mass index (BMI) > 30 kg/m². Of 474 pancreaticoduodenectomies performed: RPD = 213 (45%) and OPD = 261 (55%). A total of 145 (31%) patients were obese (70 RPD, 75 OPD). Obese patients had increased EBL (p = 0.03), pancreatic fistula (B&C; p = 0.077), and wound infection (p = 0.068) compared to the non-obese. For obese patients, RPD had decreased OR time (p = 0.0003), EBL (p < 0.001), and wound infection (p = 0.001) with no difference in Clavien ≥3 complications, margins, LOS or 30-day mortality compared with OPD. In multivariate analysis, obesity was the strongest predictor of Clavien ≥3 (OR 1.6; p = 0.041) and wound infection if BMI > 35 kg/m² (OR 2.6; p = 0.03). The robotic approach was protective of Clavien ≥3 (OR 0.6; p = 0.03) on univariate analysis and wound infection (OR 0.3; p < 0.001) and grade B/C pancreatic fistula (OR 0.34; p < 0.001) on multivariate analysis. Obese patients are at risk for increased postoperative complications regardless of approach. However, the robotic approach mitigates some of the increased complication rate, while preserving other perioperative outcomes. Published by Elsevier Ltd.

  18. An approach to an analysis of the energy response of LiF-TLD to high energy electrons.

    PubMed

    Shiragai, A

    1977-05-01

    Responses of LiF-TLD to high energy electrons relative to 60Co gamma-rays were investigated experimentally and theoretically. The Burlin et al. theory, its modified version by Almond and McCray, and the Holt et al. semi-empirical theory were each examined against experiment. An approximate approach to the theoretical analysis of the energy response of LiF-TLD was attempted and compared with some experimental results.

  19. Methodological issues underlying multiple decrement life table analysis.

    PubMed

    Mode, C J; Avery, R C; Littman, G S; Potter, R G

    1977-02-01

    In this paper, the actuarial method of multiple decrement life table analysis of censored, longitudinal data is examined. The discussion is organized in terms of the first segment of usage of an intrauterine device. Weaknesses of the actuarial approach are pointed out, and an alternative approach, based on the classical model of competing risks, is proposed. Finally, the actuarial and the alternative method of analyzing censored data are compared, using data from the Taichung Medical Study on Intrauterine Devices.
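
    For readers unfamiliar with the mechanics, the actuarial multiple decrement computation looks roughly as follows. This is a minimal Python sketch using the standard half-interval censoring adjustment; the interval layout and the example counts are invented for illustration and are not the Taichung study data.

```python
import numpy as np

def multiple_decrement_table(n0, events_by_cause, censored):
    """Actuarial multiple decrement life table (illustrative sketch).

    events_by_cause: (intervals x causes) counts of decrements;
    censored: withdrawals per interval. Returns the cause-specific
    cumulative incidence at the end of each interval."""
    cif = np.zeros(events_by_cause.shape)
    surv, at_risk = 1.0, float(n0)
    for i, (d, c) in enumerate(zip(events_by_cause, censored)):
        n_eff = at_risk - c / 2.0            # half-interval censoring adjustment
        prev = cif[i - 1] if i > 0 else 0.0
        cif[i] = prev + surv * d / n_eff     # add this interval's decrements
        surv *= 1.0 - d.sum() / n_eff        # update overall continuation
        at_risk -= d.sum() + c
    return cif

# Example: 100 IUD users, two decrements (expulsion, removal), 3 intervals.
events = np.array([[4, 2], [3, 3], [2, 1]])
cens = np.array([5, 4, 6])
print(multiple_decrement_table(100, events, cens))
```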

  20. Nested association mapping of stem rust resistance in wheat using genotyping by sequencing

    USDA-ARS?s Scientific Manuscript database

    Nested association mapping is an approach to map trait loci in which families within populations are interconnected by a common parent. By implementing joint-linkage association analysis, this approach is able to map causative loci with higher power and resolution compared to biparental linkage mapp...

  1. Notes on a Political Theory of Educational Organizations.

    ERIC Educational Resources Information Center

    Bacharach, Samuel B.

    This essay reviews major trends in methodological and theoretical approaches to the study of organizations since the mid-sixties and espouses the political analysis of organizations, a position representing a middle ground between comparative structuralism and the loosely coupled systems approach. This position emphasizes micropolitics as well as…

  2. Approaches to answering critical CER questions.

    PubMed

    Kinnier, Christine V; Chung, Jeanette W; Bilimoria, Karl Y

    2015-01-01

    While randomized controlled trials (RCTs) are the gold standard for research, many research questions cannot be ethically and practically answered using an RCT. Comparative effectiveness research (CER) techniques are often better suited than RCTs to address the effects of an intervention under routine care conditions, an outcome otherwise known as effectiveness. CER research techniques covered in this section include: effectiveness-oriented experimental studies such as pragmatic trials and cluster randomized trials, treatment response heterogeneity, observational and database studies including adjustment techniques such as sensitivity analysis and propensity score analysis, systematic reviews and meta-analysis, decision analysis, and cost effectiveness analysis. Each section describes the technique and covers the strengths and weaknesses of the approach.
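
    As one concrete example of the adjustment techniques mentioned, a toy propensity-score matching analysis might look like the sketch below (Python with scikit-learn assumed; 1:1 nearest-neighbor matching with replacement, purely illustrative of the idea rather than a full CER workflow).

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def ps_matched_effect(X, treated, outcome):
    """Fit a treatment model, match each treated unit to the control
    with the closest propensity score (with replacement), and compare
    mean outcomes between the matched groups."""
    ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
    t_idx = np.flatnonzero(treated == 1)
    c_idx = np.flatnonzero(treated == 0)
    matched = [c_idx[np.argmin(np.abs(ps[c_idx] - ps[i]))] for i in t_idx]
    return outcome[t_idx].mean() - outcome[matched].mean()
```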

  3. Critical Factors Analysis for Offshore Software Development Success by Structural Equation Modeling

    NASA Astrophysics Data System (ADS)

    Wada, Yoshihisa; Tsuji, Hiroshi

    To analyze the success/failure factors in offshore software development services using structural equation modeling, this paper proposes to follow two approaches together: domain-knowledge-based heuristic analysis and factor-analysis-based rational analysis. The former serves to generate and verify hypotheses for finding factors and causalities. The latter serves to verify factors introduced by theory, building the model without heuristics. Applying the proposed combined approach to the questionnaire responses of skilled project managers, this paper found that vendor properties have a stronger causal influence on success than software and project properties.

  4. Development of a Conservative Model Validation Approach for Reliable Analysis

    DTIC Science & Technology

    2015-01-01

    CIE 2015 August 2-5, 2015, Boston, Massachusetts, USA [DRAFT] DETC2015-46982 DEVELOPMENT OF A CONSERVATIVE MODEL VALIDATION APPROACH FOR RELIABLE...obtain a conservative simulation model for reliable design even with limited experimental data. Very little research has taken into account the...3, the proposed conservative model validation is briefly compared to the conventional model validation approach. Section 4 describes how to account

  5. Comparing the Performance of Improved Classify-Analyze Approaches For Distal Outcomes in Latent Profile Analysis

    PubMed Central

    Dziak, John J.; Bray, Bethany C.; Zhang, Jieting; Zhang, Minqiang; Lanza, Stephanie T.

    2016-01-01

    Several approaches are available for estimating the relationship of latent class membership to distal outcomes in latent profile analysis (LPA). A three-step approach is commonly used, but has problems with estimation bias and confidence interval coverage. Proposed improvements include the correction method of Bolck, Croon, and Hagenaars (BCH; 2004), Vermunt's (2010) maximum likelihood (ML) approach, and the inclusive three-step approach of Bray, Lanza, and Tan (2015). These methods have been studied in the related case of latent class analysis (LCA) with categorical indicators, but are less well studied for LPA with continuous indicators. We investigated the performance of these approaches in LPA with normally distributed indicators, under different conditions of distal outcome distribution, class measurement quality, relative latent class size, and strength of association between latent class and the distal outcome. The modified BCH implemented in Latent GOLD had excellent performance. The maximum likelihood and inclusive approaches were not robust to violations of distributional assumptions. These findings broadly agree with and extend the results presented by Bakk and Vermunt (2016) in the context of LCA with categorical indicators. PMID:28630602

  6. An EM-based semi-parametric mixture model approach to the regression analysis of competing-risks data.

    PubMed

    Ng, S K; McLachlan, G J

    2003-04-15

    We consider a mixture model approach to the regression analysis of competing-risks data. Attention is focused on inference concerning the effects of factors on both the probability of occurrence and the hazard rate conditional on each of the failure types. These two quantities are specified in the mixture model using the logistic model and the proportional hazards model, respectively. We propose a semi-parametric mixture method to estimate the logistic and regression coefficients jointly, whereby the component-baseline hazard functions are completely unspecified. Estimation is by maximum likelihood on the basis of the full likelihood, implemented via an expectation-conditional maximization (ECM) algorithm. Simulation studies are performed to compare the performance of the proposed semi-parametric method with a fully parametric mixture approach. The results show that when the component-baseline hazard is monotonic increasing, the semi-parametric and fully parametric mixture approaches are comparable for mildly and moderately censored samples. When the component-baseline hazard is not monotonic increasing, the semi-parametric method consistently provides less biased estimates than a fully parametric approach and is comparable in efficiency in the estimation of the parameters for all levels of censoring. The methods are illustrated using a real data set of prostate cancer patients treated with different dosages of the drug diethylstilbestrol. Copyright 2003 John Wiley & Sons, Ltd.

  7. Comparison of approaches for mobile document image analysis using server supported smartphones

    NASA Astrophysics Data System (ADS)

    Ozarslan, Suleyman; Eren, P. Erhan

    2014-03-01

    With the recent advances in mobile technologies, new capabilities are emerging, such as mobile document image analysis. However, mobile phones are still less powerful than servers, and they have some resource limitations. One approach to overcoming these limitations is performing the resource-intensive processes of the application on remote servers. In mobile document image analysis, the most resource-consuming process is the Optical Character Recognition (OCR) process, which is used to extract text from images captured by mobile phones. In this study, our goal is to compare the in-phone and the remote-server processing approaches for mobile document image analysis in order to explore their trade-offs. In the in-phone approach, all processes required for mobile document image analysis run on the mobile phone. On the other hand, in the remote-server approach, the core OCR process runs on the remote server and the other processes run on the mobile phone. Results of the experiments show that the remote-server approach is considerably faster than the in-phone approach in terms of OCR time, but adds extra delays such as network delay. Since compression and downscaling of images significantly reduce file sizes and extra delays, the remote-server approach overall outperforms the in-phone approach in terms of the selected speed and correct-recognition metrics, provided the gain in OCR time compensates for the extra delays. According to the results of the experiments, using the most preferable settings, the remote-server approach performs better than the in-phone approach in terms of speed and acceptable correct-recognition metrics.
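
    The trade-off reduces to a simple timing model. The Python sketch below uses placeholder component timings, not measurements from the study, to make the break-even condition explicit.

```python
def in_phone_total(t_capture, t_phone_ocr):
    """End-to-end time when everything runs on the phone."""
    return t_capture + t_phone_ocr

def remote_server_total(t_capture, t_compress, t_network, t_server_ocr):
    """End-to-end time when the core OCR runs on the remote server."""
    return t_capture + t_compress + t_network + t_server_ocr

# The remote approach wins exactly when the OCR speedup exceeds the
# extra delays it introduces:
#   t_phone_ocr - t_server_ocr > t_compress + t_network
```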

  8. [Development of new approaches for objective reproduction of dental tissue characteristics in the preparation of highly aesthetic restorations].

    PubMed

    Makeeva, I M; Moskalev, E E; Kuz'ko, E I

    2010-01-01

    A new method of color quality control based on spectrophotometry has been developed for dental restoration. A comparative analysis of the quality of subjective color control by trained and untrained observers was performed. Based on a comparative analysis of the results of subjective color control and spectrophotometry, the maximum allowed color difference was set at dE=2.8.

  9. A theoretical-experimental methodology for assessing the sensitivity of biomedical spectral imaging platforms, assays, and analysis methods.

    PubMed

    Leavesley, Silas J; Sweat, Brenner; Abbott, Caitlyn; Favreau, Peter; Rich, Thomas C

    2018-01-01

    Spectral imaging technologies have been used for many years by the remote sensing community. More recently, these approaches have been applied to biomedical problems, where they have shown great promise. However, biomedical spectral imaging has been complicated by the high variance of biological data and the reduced ability to construct test scenarios with fixed ground truths. Hence, it has been difficult to objectively assess and compare biomedical spectral imaging assays and technologies. Here, we present a standardized methodology that allows assessment of the performance of biomedical spectral imaging equipment, assays, and analysis algorithms. This methodology incorporates real experimental data and a theoretical sensitivity analysis, preserving the variability present in biomedical image data. We demonstrate that this approach can be applied in several ways: to compare the effectiveness of spectral analysis algorithms, to compare the response of different imaging platforms, and to assess the level of target signature required to achieve a desired performance. Results indicate that it is possible to compare even very different hardware platforms using this methodology. Future applications could include a range of optimization tasks, such as maximizing detection sensitivity or acquisition speed, providing high utility for investigators ranging from design engineers to biomedical scientists. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Comparative Human Health Risk Analysis of Coastal Community Water and Waste Service Options

    EPA Science Inventory

    As a pilot approach to describe adverse human health effects from alternative decentralized community water systems compared to conventional centralized services (business-as-usual [BAU]), selected chemical and microbial hazards were assessed using disability adjusted life years ...

  11. A hybrid wavelet analysis-cloud model data-extending approach for meteorologic and hydrologic time series

    NASA Astrophysics Data System (ADS)

    Wang, Dong; Ding, Hao; Singh, Vijay P.; Shang, Xiaosan; Liu, Dengfeng; Wang, Yuankun; Zeng, Xiankui; Wu, Jichun; Wang, Lachun; Zou, Xinqing

    2015-05-01

    For scientific and sustainable management of water resources, hydrologic and meteorologic data series often need to be extended. This paper proposes a hybrid approach, named WA-CM (wavelet analysis-cloud model), for data series extension. Wavelet analysis has time-frequency localization features, earning it the name "mathematics microscope," that allow hydrologic and meteorologic series to be decomposed and reconstructed by wavelet transform. The cloud model is a mathematical representation of fuzziness and randomness and has strong robustness for uncertain data. The WA-CM approach first employs the wavelet transform to decompose the measured nonstationary series and then uses the cloud model to develop an extension model for each decomposition-layer series. The final extension is obtained by summing the extensions of all layers. Two kinds of meteorologic and hydrologic data sets with different characteristics and different influence of human activity, from six (three pairs of) representative stations, are used to illustrate the WA-CM approach. The approach is also compared with four other methods: the conventional correlation extension method, the Kendall-Theil robust line method, the artificial neural network method (back propagation, multilayer perceptron, and radial basis function), and the single cloud model method. To evaluate the model performance completely and thoroughly, five measures are used: relative error, mean relative error, standard deviation of relative error, root mean square error, and the Theil inequality coefficient. Results show that the WA-CM approach is effective, feasible, and accurate, and it outperforms the four other methods compared. The theory employed and the approach developed here can be applied to the extension of data in other areas as well.
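
    The decompose-extend-sum structure can be sketched as follows, assuming Python with PyWavelets. The cloud model itself is a specialized fuzzy-stochastic representation that is not reproduced here; a mean-centered AR(1) forecaster stands in as a placeholder for the per-layer extension model.

```python
import numpy as np
import pywt

def wa_extend(x, n_ahead, wavelet="db4", level=3):
    """Decompose a series, extend each layer, and sum the extensions.
    The per-layer model here is a mean-centered AR(1) placeholder."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    extension = np.zeros(n_ahead)
    for i in range(len(coeffs)):
        # Reconstruct layer i alone at full signal length.
        sel = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
        comp = pywt.waverec(sel, wavelet)[: len(x)]
        m = comp.mean()
        z = comp - m
        phi = (z[:-1] @ z[1:]) / (z[:-1] @ z[:-1])   # AR(1) coefficient
        last = z[-1]
        for k in range(n_ahead):
            last *= phi
            extension[k] += last + m                 # sum layer forecasts
    return extension
```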

  12. Human Capital Development: Comparative Analysis of BRICs

    ERIC Educational Resources Information Center

    Ardichvili, Alexandre; Zavyalova, Elena; Minina, Vera

    2012-01-01

    Purpose: The goal of this article is to conduct macro-level analysis of human capital (HC) development strategies, pursued by four countries commonly referred to as BRICs (Brazil, Russia, India, and China). Design/methodology/approach: This analysis is based on comparisons of macro indices of human capital and innovativeness of the economy and a…

  13. A Historical Analysis of Primary Mathematics Curricula in Terms of Teaching Principles

    ERIC Educational Resources Information Center

    Ozmantar, Mehmet Fatih

    2017-01-01

    This study carries out a comparative analysis of primary mathematics curricula put into practice during Turkish Republican period. The data for this study are composed of official curricula documents which are examined in terms of teaching principles. The study adopts a qualitative approach and employs document analysis method. The official…

  14. Detailed Debunking of Denial

    NASA Astrophysics Data System (ADS)

    Enting, I. G.; Abraham, J. P.

    2012-12-01

    The disinformation campaign against climate science has been compared to a guerilla war whose tactics undermine the traditional checks and balances of science. One comprehensive approach has been to produce archives of generic responses, such as the websites of RealClimate and SkepticalScience. We review our experiences with an alternative approach of detailed responses to a small number of high-profile cases. Our particular examples were Professor Ian Plimer and Christopher Monckton, the Third Viscount Monckton of Brenchley, each of whom has been taken seriously by political leaders in our respective countries. We relate our experiences to comparable examples such as John Mashey's analysis of the Wegman report and the formal complaints about Lomborg's "Skeptical Environmentalist" and Durkin's "Great Global Warming Swindle". Our two examples used contrasting formats: an on-line video of a lecture vs an evolving compendium of misrepresentations. Additionally, our approaches differed in emphasis. The analysis of Monckton concentrated on the misrepresentation of the science, while the analysis of Plimer concentrated on departures from accepted scientific practice: fabrication of data, misrepresentation of cited sources and unattributed use of the work of others. Benefits of an evolving compendium were the ability to incorporate contributions from members of the public who had identified additional errors and the scope for addressing new aspects as they came to public attention. 'Detailed debunking' gives non-specialists a reference point for distinguishing non-science when engaging in public debate.

  15. Sequential sentinel SNP Regional Association Plots (SSS-RAP): an approach for testing independence of SNP association signals using meta-analysis data.

    PubMed

    Zheng, Jie; Gaunt, Tom R; Day, Ian N M

    2013-01-01

    Genome-Wide Association Studies (GWAS) frequently incorporate meta-analysis within their framework. However, conditional analysis of individual-level data, which is an established approach for fine mapping of causal sites, is often precluded where only group-level summary data are available for analysis. Here, we present a numerical and graphical approach, the "sequential sentinel SNP regional association plot" (SSS-RAP), which estimates regression coefficients (beta) with their standard errors using the meta-analysis summary results directly. Under an additive model, typical for genes with small effect, the effect for a sentinel SNP can be transformed to the predicted effect for a possibly dependent SNP through a 2×2 two-SNP haplotype table. The approach assumes Hardy-Weinberg equilibrium for test SNPs. SSS-RAP is available as a Web-tool (http://apps.biocompute.org.uk/sssrap/sssrap.cgi). To develop and illustrate SSS-RAP we analyzed lipid and ECG trait data from the British Women's Heart and Health Study (BWHHS), evaluated a meta-analysis for an ECG trait, and presented several simulations. We compared results with existing approaches such as model selection methods and conditional analysis. Generally, findings were consistent. SSS-RAP represents a tool for testing the independence of SNP association signals using meta-analysis data, and is also a convenient approach based on biological principles for fine mapping with group-level summary data. © 2012 Blackwell Publishing Ltd/University College London.
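
    The core transformation (sentinel effect to predicted effect at a possibly dependent SNP) follows from regression algebra under the additive and Hardy-Weinberg assumptions. The Python sketch below is an illustrative re-derivation, not the authors' web tool.

```python
def predicted_beta(beta_a, p_a, p_b, p_ab):
    """Predict the marginal effect at test SNP B implied by sentinel SNP A.

    beta_a: effect of the sentinel SNP; p_a, p_b: effect-allele frequencies;
    p_ab: frequency of the haplotype carrying both effect alleles.
    Under additivity and HWE, cov(A, B) = 2D and var(B) = 2 p_b (1 - p_b),
    so beta_b = beta_a * D / (p_b * (1 - p_b))."""
    d = p_ab - p_a * p_b        # LD coefficient D from the 2x2 haplotype table
    return beta_a * d / (p_b * (1.0 - p_b))
```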

  16. A review of accuracy assessment for object-based image analysis: From per-pixel to per-polygon approaches

    NASA Astrophysics Data System (ADS)

    Ye, Su; Pontius, Robert Gilmore; Rakshit, Rahul

    2018-07-01

    Object-based image analysis (OBIA) has gained widespread popularity for creating maps from remotely sensed data. Researchers routinely claim that OBIA procedures outperform pixel-based procedures; however, it is not immediately obvious how to evaluate the degree to which an OBIA map matches reference information in a manner that accounts for the fact that the OBIA map consists of objects that vary in size and shape. Our study reviews 209 journal articles concerning OBIA published between 2003 and 2017. We focus on the three stages of accuracy assessment: (1) sampling design, (2) response design and (3) accuracy analysis. First, we report the literature's overall characteristics concerning OBIA accuracy assessment. Simple random sampling was the most used method among probability sampling strategies, slightly ahead of stratified sampling. Office-interpreted remotely sensed data were the dominant reference source. The literature reported accuracies ranging from 42% to 96%, with an average of 85%. A third of the articles failed to give sufficient information concerning accuracy methodology, such as the sampling scheme and sample size. We found few studies that focused specifically on the accuracy of the segmentation. Second, we identify a recent increase of OBIA articles using per-polygon approaches rather than per-pixel approaches for accuracy assessment. We clarify the impacts of the per-pixel versus the per-polygon approaches on sampling, response design and accuracy analysis, respectively. Our review defines the technical and methodological needs in the current per-polygon approaches, such as polygon-based sampling, analysis of mixed polygons, matching of mapped with reference polygons and assessment of segmentation accuracy. Our review summarizes and discusses the current issues in object-based accuracy assessment to provide guidance for improved accuracy assessments for OBIA.
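
    Numerically, the per-polygon versus per-pixel distinction reduces to whether polygons are weighted equally or by area. A small sketch with hypothetical polygons:

```python
import numpy as np

# Five toy polygons: mapped class, reference class, and area in pixels.
mapped = np.array([1, 1, 2, 2, 3])
ref    = np.array([1, 2, 2, 2, 3])
area   = np.array([120, 30, 500, 60, 90])

correct = mapped == ref
per_polygon_acc = correct.mean()                     # each polygon counts once
per_pixel_acc = (correct * area).sum() / area.sum()  # polygons weighted by area
print(per_polygon_acc, per_pixel_acc)                # 0.8 vs ~0.96
```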

  17. Finding Tropical Cyclones on a Cloud Computing Cluster: Using Parallel Virtualization for Large-Scale Climate Simulation Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hasenkamp, Daren; Sim, Alexander; Wehner, Michael

    Extensive computing power has been used to tackle issues such as climate change, fusion energy, and other pressing scientific challenges. These computations produce a tremendous amount of data; however, many of the data analysis programs currently run on only a single processor. In this work, we explore the possibility of using the emerging cloud computing platform to parallelize such sequential data analysis tasks. As a proof of concept, we wrap a program for analyzing trends of tropical cyclones in a set of virtual machines (VMs). This approach allows the user to keep their familiar data analysis environment in the VMs, while we provide the coordination and data transfer services to ensure the necessary input and output are directed to the desired locations. This work extensively exercises the networking capability of the cloud computing systems and has revealed a number of weaknesses in the current cloud system software. In our tests, we are able to scale the parallel data analysis job to a modest number of VMs and achieve a speedup that is comparable to running the same analysis task using MPI. However, compared to MPI-based parallelization, the cloud-based approach has a number of advantages. The cloud-based approach is more flexible because the VMs can capture arbitrary software dependencies without requiring the user to rewrite their programs. The cloud-based approach is also more resilient to failure: as long as a single VM is running it can make progress, whereas the whole analysis job fails as soon as one MPI node fails. In short, this initial work demonstrates that a cloud computing system is a viable platform for distributed scientific data analyses traditionally conducted on dedicated supercomputing systems.

  18. Unfolding dimension and the search for functional markers in the human electroencephalogram

    NASA Astrophysics Data System (ADS)

    Dünki, Rudolf M.; Schmid, Gary Bruno

    1998-02-01

    A biparametric approach to dimensional analysis in terms of a so-called "unfolding dimension" is introduced to explore the extent to which the human EEG can be described by stable features characteristic of an individual despite the well-known problems of intraindividual variability. Our analysis comprises an EEG data set recorded from healthy individuals over a time span of 5 years. The outcome is shown to be comparable to advanced linear methods of spectral analysis with regard to intraindividual specificity and stability over time. Such linear methods have not yet proven to be specific to the EEG of different brain states. Thus we have also investigated the specificity of our biparametric approach by comparing the mental states of schizophrenic psychosis and remission, i.e., illness versus full recovery. A difference between the EEG in psychosis and in remission became apparent within recordings taken at rest with eyes closed and no stimulated or requested mental activity. Hence our approach distinguishes these functional brain states even in the absence of an active or intentional stimulus. This sheds a different light upon theories of schizophrenia as an information-processing disturbance of the brain.

  19. Point-based and model-based geolocation analysis of airborne laser scanning data

    NASA Astrophysics Data System (ADS)

    Sefercik, Umut Gunes; Buyuksalih, Gurcan; Jacobsen, Karsten; Alkan, Mehmet

    2017-01-01

    Airborne laser scanning (ALS) is one of the most effective remote sensing technologies providing precise three-dimensional (3-D) dense point clouds. A large-size ALS digital surface model (DSM) covering the whole Istanbul province was analyzed by point-based and model-based comprehensive statistical approaches. Point-based analysis was performed using checkpoints on flat areas. Model-based approaches were implemented in two steps: strip-to-strip comparison of overlapping ALS DSMs individually in three subareas, and comparison of the merged ALS DSMs with terrestrial laser scanning (TLS) DSMs in four other subareas. In the model-based approach, the standard deviation of height and the normalized median absolute deviation were used as accuracy indicators, combined with the dependency on terrain inclination. The results demonstrate that terrain roughness has a strong impact on the vertical accuracy of ALS DSMs. The relative horizontal shifts determined, and partially improved, by merging the overlapping strips and by comparison of the ALS with the TLS data were found not to be negligible. The analysis of the ALS DSM in relation to the TLS DSM allowed us to determine the characteristics of the DSM in detail.
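
    Of the two accuracy indicators used, the normalized median absolute deviation is the less familiar; its standard definition is easily computed, as in this short Python sketch:

```python
import numpy as np

def nmad(dh):
    """Normalized median absolute deviation of DSM height differences.
    The factor 1.4826 makes NMAD comparable to the standard deviation
    for normally distributed errors, while staying robust to outliers."""
    med = np.median(dh)
    return 1.4826 * np.median(np.abs(dh - med))
```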

  20. Open versus robotic-assisted transabdominal preperitoneal (R-TAPP) inguinal hernia repair: a multicenter matched analysis of clinical outcomes.

    PubMed

    Gamagami, R; Dickens, E; Gonzalez, A; D'Amico, L; Richardson, C; Rabaza, J; Kolachalam, R

    2018-04-26

    To compare the perioperative outcomes of initial, consecutive robotic-assisted transabdominal preperitoneal (R-TAPP) inguinal hernia repair (IHR) cases with consecutive open cases completed by the same surgeons. Multicenter, retrospective, comparative study of perioperative results from open and robotic IHR using standard univariate and multivariate regression analyses for propensity score matched (1:1) cohorts. Seven general surgeons at six institutions contributed 602 consecutive open IHR and 652 consecutive R-TAPP IHR cases. Baseline patient characteristics in the unmatched groups were similar with the exception of previous abdominal surgery, and all baseline characteristics were comparable in the matched cohorts. In matched analyses, postoperative complications prior to discharge were comparable. However, from post discharge through 30 days, fewer patients experienced complications in the R-TAPP group than in the open group [4.3% vs 7.7% (p = 0.047)]. The R-TAPP group had no reoperations from post discharge through 30 days of follow-up, compared with five patients (1.1%) in the open group (p = 0.062). Multivariate logistic regression analysis demonstrated that patient age > 65 years and the open approach were risk factors for complications within 30 days post discharge in the matched group [age > 65 years: odds ratio (OR) = 3.33 (95% CI 1.89, 5.87; p < 0.0001); open approach: OR = 1.89 (95% CI 1.05, 3.38; p = 0.031)]. In this matched analysis, R-TAPP provided similar postoperative complications prior to discharge and a lower rate of postoperative complications through 30 days compared to open repair. R-TAPP is a promising and reproducible approach, and may facilitate adoption of minimally invasive repairs of inguinal hernias.

  1. Is the SMART approach better than other treatment approaches for prevention of asthma exacerbations? A meta-analysis.

    PubMed

    Agarwal, R; Khan, A; Aggarwal, A N; Gupta, D

    2009-12-01

    The combination of inhaled corticosteroids (ICS) and long-acting beta2 agonists (LABA) has been used in a single inhaler for both maintenance and reliever therapy in asthma, the SMART approach. The administration of additional ICS with each reliever inhalation in response to symptoms is expected to provide better control of airway inflammation. The aim of this meta-analysis was to evaluate the efficacy and safety of the SMART approach versus other approaches in the management of asthma in preventing asthma exacerbations. We searched the MEDLINE and EMBASE databases for studies that reported exacerbations in the SMART group versus the control group. We calculated the odds ratio (OR) and 95% confidence intervals (CI) to assess the exacerbations in the two groups and pooled the results using a random-effects model. Our search yielded eight studies. The use of the SMART approach compared to a fixed-dose ICS-LABA combination significantly decreased the odds of a severe exacerbation (OR 0.65; 95% CI, 0.53-0.80) and of a severe exacerbation requiring hospitalization/ER treatment (OR 0.69; 95% CI, 0.58-0.83). The use of the SMART approach compared to fixed-dose ICS also significantly decreased the odds of a severe exacerbation (OR 0.52; 95% CI, 0.45-0.61) and of a severe exacerbation requiring medical intervention (OR 0.52; 95% CI, 0.42-0.65). The occurrence of adverse events was similar in the two groups. There was some evidence of statistical heterogeneity. The SMART approach using formoterol-budesonide is superior in preventing exacerbations when compared to traditional therapy with fixed-dose ICS or an ICS-LABA combination, without any increase in adverse events.
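
    The pooling step described (odds ratios combined under a random-effects model) corresponds to the standard DerSimonian-Laird computation, sketched here in Python for illustration; the paper does not publish code, and the inputs below are whatever per-study ORs and 95% CIs one has extracted.

```python
import numpy as np

def pool_or_random_effects(ors, ci_low, ci_high):
    """DerSimonian-Laird random-effects pooling of odds ratios,
    with 95% CIs assumed on the OR scale.
    Returns the pooled OR and its 95% CI."""
    y = np.log(ors)                                  # log odds ratios
    se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)
    w = 1.0 / se**2                                  # fixed-effect weights
    y_fixed = np.sum(w * y) / w.sum()
    q = np.sum(w * (y - y_fixed) ** 2)               # heterogeneity statistic
    c = w.sum() - np.sum(w**2) / w.sum()
    tau2 = max(0.0, (q - (len(y) - 1)) / c)          # between-study variance
    w_re = 1.0 / (se**2 + tau2)                      # random-effects weights
    y_re = np.sum(w_re * y) / w_re.sum()
    se_re = np.sqrt(1.0 / w_re.sum())
    return np.exp([y_re, y_re - 1.96 * se_re, y_re + 1.96 * se_re])
```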

  2. Diagnostic performance of an automated analysis software for the diagnosis of Alzheimer’s dementia with 18F FDG PET

    PubMed Central

    Partovi, Sasan; Yuh, Roger; Pirozzi, Sara; Lu, Ziang; Couturier, Spencer; Grosse, Ulrich; Schluchter, Mark D; Nelson, Aaron; Jones, Robert; O’Donnell, James K; Faulhaber, Peter

    2017-01-01

    The objective of this study was to assess the ability of a quantitative software-aided approach to improve the diagnostic accuracy of 18F FDG PET for Alzheimer’s dementia over visual analysis alone. Twenty normal subjects (M:F-12:8; mean age 80.6 years) and twenty mild AD subjects (M:F-12:8; mean age 70.6 years) with 18F FDG PET scans were obtained from the ADNI database. Three blinded readers interpreted these PET images first using a visual qualitative approach and then using a quantitative software-aided approach. Images were classified on two five-point scales based on normal/abnormal (1-definitely normal; 5-definitely abnormal) and presence of AD (1-definitely not AD; 5-definitely AD). Diagnostic sensitivity, specificity, and accuracy for both approaches were compared based on the aforementioned scales. The sensitivity, specificity, and accuracy for the normal vs. abnormal readings of all readers combined were higher when comparing the software-aided vs. visual approach (sensitivity 0.93 vs. 0.83 P = 0.0466; specificity 0.85 vs. 0.60 P = 0.0005; accuracy 0.89 vs. 0.72 P<0.0001). The specificity and accuracy for absence vs. presence of AD of all readers combined were higher when comparing the software-aided vs. visual approach (specificity 0.90 vs. 0.70 P = 0.0008; accuracy 0.81 vs. 0.72 P = 0.0356). Sensitivities of the software-aided and visual approaches did not differ significantly (0.72 vs. 0.73 P = 0.74). The quantitative software-aided approach appears to improve the performance of 18F FDG PET for the diagnosis of mild AD. It may be helpful for experienced 18F FDG PET readers analyzing challenging cases. PMID:28123864

  3. Classification of motor imagery tasks for BCI with multiresolution analysis and multiobjective feature selection.

    PubMed

    Ortega, Julio; Asensio-Cubero, Javier; Gan, John Q; Ortiz, Andrés

    2016-07-15

    Brain-computer interfacing (BCI) applications based on the classification of electroencephalographic (EEG) signals require solving high-dimensional pattern classification problems with such a relatively small number of training patterns that curse-of-dimensionality problems usually arise. Multiresolution analysis (MRA) has useful properties for signal analysis in both the temporal and spectral domains, and has been broadly used in the BCI field. However, MRA usually increases the dimensionality of the input data. Therefore, some approach to feature selection or feature dimensionality reduction should be considered for improving the performance of MRA-based BCI. This paper investigates feature selection in MRA-based frameworks for BCI. Several wrapper approaches to evolutionary multiobjective feature selection are proposed with different structures of classifiers. They are evaluated by comparison with baseline methods using sparse representation of features or no feature selection. The statistical analysis, performed by applying the Kolmogorov-Smirnov and Kruskal-Wallis tests to the means of the Kappa values evaluated using the test patterns in each approach, demonstrated some advantages of the proposed approaches. In comparison with the baseline MRA approach used in previous studies, the proposed evolutionary multiobjective feature selection approaches provide similar or even better classification performance, with a significant reduction in the number of features that need to be computed.

  4. Building the Evidence Base for Decision-making in Cancer Genomic Medicine Using Comparative Effectiveness Research

    PubMed Central

    Goddard, Katrina A.B.; Knaus, William A.; Whitlock, Evelyn; Lyman, Gary H.; Feigelson, Heather Spencer; Schully, Sheri D.; Ramsey, Scott; Tunis, Sean; Freedman, Andrew N.; Khoury, Muin J.; Veenstra, David L.

    2013-01-01

    Background The clinical utility is uncertain for many cancer genomic applications. Comparative effectiveness research (CER) can provide evidence to clarify this uncertainty. Objectives To identify approaches to help stakeholders make evidence-based decisions, and to describe potential challenges and opportunities using CER to produce evidence-based guidance. Methods We identified general CER approaches for genomic applications through literature review, the authors’ experiences, and lessons learned from a recent, seven-site CER initiative in cancer genomic medicine. Case studies illustrate the use of CER approaches. Results Evidence generation and synthesis approaches include comparative observational and randomized trials, patient reported outcomes, decision modeling, and economic analysis. We identified significant challenges to conducting CER in cancer genomics: the rapid pace of innovation, the lack of regulation, the limited evidence for clinical utility, and the beliefs that genomic tests could have personal utility without having clinical utility. Opportunities to capitalize on CER methods in cancer genomics include improvements in the conduct of evidence synthesis, stakeholder engagement, increasing the number of comparative studies, and developing approaches to inform clinical guidelines and research prioritization. Conclusions CER offers a variety of methodological approaches to address stakeholders’ needs. Innovative approaches are needed to ensure an effective translation of genomic discoveries. PMID:22516979

  5. A generalized procedure for analyzing sustained and dynamic vocal fold vibrations from laryngeal high-speed videos using phonovibrograms.

    PubMed

    Unger, Jakob; Schuster, Maria; Hecker, Dietmar J; Schick, Bernhard; Lohscheller, Jörg

    2016-01-01

    This work presents a computer-based approach to analyze the two-dimensional vocal fold dynamics of endoscopic high-speed videos, and constitutes an extension and generalization of a previously proposed wavelet-based procedure. While most approaches aim at analyzing sustained phonation conditions, the proposed method allows for a clinically adequate analysis of both dynamic as well as sustained phonation paradigms. The analysis procedure is based on a spatio-temporal visualization technique, the phonovibrogram, that facilitates the documentation of the visible laryngeal dynamics. From the phonovibrogram, a low-dimensional set of features is computed using a principal component analysis strategy that quantifies the type of vibration patterns, irregularity, lateral symmetry and synchronicity as a function of time. Two different test bench data sets are used to validate the approach: (I) 150 healthy and pathologic subjects examined during sustained phonation. (II) 20 healthy and pathologic subjects that were examined twice: during sustained phonation and during a glissando from a low to a higher fundamental frequency. In order to assess the discriminative power of the extracted features, a Support Vector Machine is trained to distinguish between physiologic and pathologic vibrations. The results for sustained phonation sequences are compared to the previous approach. Finally, the classification performance of the stationary analysis procedure is compared to the transient analysis of the glissando maneuver. For the first test bench the proposed procedure outperformed the previous approach (proposed feature set: accuracy: 91.3%, sensitivity: 80%, specificity: 97%; previous approach: accuracy: 89.3%, sensitivity: 76%, specificity: 96%). Comparing the classification performance on the second test bench further corroborates that analyzing transient paradigms provides clear additional diagnostic value (glissando maneuver: accuracy: 90%, sensitivity: 100%, specificity: 80%; sustained phonation: accuracy: 75%, sensitivity: 80%, specificity: 70%). The incorporation of parameters describing the temporal evolution of vocal fold vibration clearly improves the automatic identification of pathologic vibration patterns. Furthermore, incorporating a dynamic phonation paradigm provides additional valuable information about the underlying laryngeal dynamics that cannot be derived from sustained conditions. The proposed generalized approach provides a better overall classification performance than the previous approach, and hence constitutes a new advantageous tool for an improved clinical diagnosis of voice disorders. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. A program to form a multidisciplinary data base and analysis for dynamic systems

    NASA Technical Reports Server (NTRS)

    Taylor, L. W.; Suit, W. T.; Mayo, M. H.

    1984-01-01

    Diverse sets of experimental data and analysis programs have been assembled for the purpose of facilitating research in systems identification, parameter estimation and state estimation techniques. The data base analysis programs are organized to make it easy to compare alternative approaches. Additional data and alternative forms of analysis will be included as they become available.

  7. Exploratory graph analysis: A new approach for estimating the number of dimensions in psychological research

    PubMed Central

    Golino, Hudson F.; Epskamp, Sacha

    2017-01-01

    The estimation of the correct number of dimensions is a long-standing problem in psychometrics. Several methods have been proposed, such as parallel analysis (PA), Kaiser-Guttman's eigenvalue-greater-than-one rule, the multiple average partial procedure (MAP), maximum-likelihood approaches that use fit indices such as BIC and EBIC, and the less used and studied approach called very simple structure (VSS). In the present paper a new approach to estimating the number of dimensions is introduced and compared via simulation to the traditional techniques noted above. The approach proposed here is called exploratory graph analysis (EGA), since it is based on the graphical lasso with the regularization parameter specified using EBIC. The number of dimensions is determined using walktrap, a random-walk algorithm used to identify communities in networks. In total, 32,000 data sets were simulated to fit known factor structures, with the data sets varying across different criteria: number of factors (2 and 4), number of items (5 and 10), sample size (100, 500, 1000 and 5000) and correlation between factors (orthogonal, .20, .50 and .70), resulting in 64 different conditions. For each condition, 500 data sets were simulated using lavaan. The results show that EGA performs comparably to parallel analysis, BIC, EBIC and the Kaiser-Guttman rule in a number of situations, especially when the number of factors was two. However, EGA was the only technique able to correctly estimate the number of dimensions in the four-factor structure when the correlation between factors was .70, showing an accuracy of 100% for a sample size of 5,000 observations. Finally, EGA was used to estimate the number of factors in a real dataset, in order to compare its performance with the other six techniques tested in the simulation study. PMID:28594839
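
    The EGA pipeline can be approximated in Python with scikit-learn and python-igraph. In this sketch, cross-validated graphical lasso stands in for the EBIC-based penalty selection the authors use, so it is an approximation of the method rather than a port of it.

```python
import numpy as np
from sklearn.covariance import GraphicalLassoCV
import igraph as ig

def ega_dimension_count(X):
    """Estimate the number of dimensions: fit a sparse Gaussian graphical
    model, build a weighted network of absolute partial correlations,
    and count the walktrap communities."""
    prec = GraphicalLassoCV().fit(X).precision_
    d = np.sqrt(np.diag(prec))
    pcor = -prec / np.outer(d, d)            # partial correlation matrix
    np.fill_diagonal(pcor, 0.0)
    g = ig.Graph.Weighted_Adjacency(np.abs(pcor).tolist(),
                                    mode="undirected", loops=False)
    return len(g.community_walktrap(weights="weight").as_clustering())
```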

  9. A comparative analysis of the cryo-compression and cryo-adsorption hydrogen storage methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Petitpas, G; Benard, P; Klebanoff, L E

    2014-07-01

    While conventional low-pressure LH₂ dewars have existed for decades, advanced methods of cryogenic hydrogen storage have recently been developed. These advanced methods are cryo-compression and cryo-adsorption hydrogen storage, which operate best in the temperature range 30–100 K. We present a comparative analysis of both approaches for cryogenic hydrogen storage, examining how pressure and/or sorbent materials are used to effectively increase onboard H₂ density and dormancy. We start by reviewing some basic aspects of LH₂ properties and conventional means of storing it. From there we describe the cryo-compression and cryo-adsorption hydrogen storage methods, and then explore the relationship between them, clarifying the materials science and physics of the two approaches in trying to solve the same hydrogen storage task (~5–8 kg H₂, typical of light duty vehicles). Assuming that the balance of plant and the available volume for the storage system in the vehicle are identical for both approaches, the comparison focuses on how the respective storage capacities, vessel weight and dormancy vary as a function of temperature, pressure and type of cryo-adsorption material (especially powder MOF-5 and MIL-101). By performing a comparative analysis, we clarify the science of each approach individually, identify the regimes where the attributes of each can be maximized, elucidate the properties of these systems during refueling, and probe the possible benefits of a combined "hybrid" system with both cryo-adsorption and cryo-compression phenomena operating at the same time. In addition, the relationships found between onboard H₂ capacity, pressure vessel and/or sorbent mass, and dormancy as a function of rated pressure, type of sorbent material and fueling conditions are useful as general design guidelines in future engineering efforts using these two hydrogen storage approaches.

  10. Supraorbital Versus Endoscopic Endonasal Approaches for Olfactory Groove Meningiomas: A Cost-Minimization Study.

    PubMed

    Gandhoke, Gurpreet S; Pease, Matthew; Smith, Kenneth J; Sekula, Raymond F

    2017-09-01

    To perform a cost-minimization study comparing the supraorbital and endoscopic endonasal (EEA) approaches, with or without craniotomy, for the resection of olfactory groove meningiomas (OGMs). We built a decision tree using probabilities of gross total resection (GTR) and cerebrospinal fluid (CSF) leak rates with the supraorbital approach versus EEA with and without additional craniotomy. The cost (not charge or reimbursement) at each "stem" of this decision tree for both surgical options was obtained from our hospital's finance department. After a base case calculation, we applied plausible ranges to all parameters and carried out multiple 1-way sensitivity analyses. Probabilistic sensitivity analyses confirmed our results. The probabilities of GTR (0.8) and CSF leak (0.2) for the supraorbital craniotomy were obtained from our series of 5 patients who underwent a supraorbital approach for the resection of an OGM. The mean tumor volume was 54.6 cm³ (range, 17-94.2 cm³). Literature-reported rates of GTR (0.6) and CSF leak (0.3) with EEA were applied to our economic analysis. Supraorbital craniotomy was the preferred strategy, with an expected value of $29,423, compared with an EEA cost of $83,838. On multiple 1-way sensitivity analyses, supraorbital craniotomy remained the preferred strategy, with a minimum cost savings of $46,000 and a maximum savings of $64,000. Probabilistic sensitivity analysis found the lowest cost difference between the 2 surgical options to be $37,431. Compared with EEA, supraorbital craniotomy provides substantial cost savings in the treatment of OGMs. Given the potential differences in effectiveness between approaches, a cost-effectiveness analysis should be undertaken. Copyright © 2017 Elsevier Inc. All rights reserved.
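
    The decision-tree arithmetic is easy to reproduce. The Python sketch below takes the GTR and CSF-leak probabilities from the abstract, but the stem costs are invented placeholders; the hospital's actual cost figures are not given in the abstract.

```python
def expected_cost(p_gtr, p_leak, c_surgery, c_retreat, c_leak):
    """Expected cost of one strategy in a simple two-branch decision tree:
    surgery cost, plus re-treatment if gross total resection fails,
    plus CSF-leak management if a leak occurs."""
    return c_surgery + (1 - p_gtr) * c_retreat + p_leak * c_leak

# Probabilities from the abstract; costs are hypothetical placeholders.
supraorbital = expected_cost(0.8, 0.2, 20_000, 25_000, 15_000)
eea          = expected_cost(0.6, 0.3, 45_000, 50_000, 25_000)
print(supraorbital, eea)
```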

  11. A time-series method for automated measurement of changes in mitotic and interphase duration from time-lapse movies.

    PubMed

    Sigoillot, Frederic D; Huckins, Jeremy F; Li, Fuhai; Zhou, Xiaobo; Wong, Stephen T C; King, Randall W

    2011-01-01

    Automated time-lapse microscopy can visualize the proliferation of large numbers of individual cells, enabling accurate measurement of the frequency of cell division and the duration of interphase and mitosis. However, extraction of quantitative information by manual inspection of time-lapse movies is too time-consuming to be useful for analysis of large experiments. Here we present an automated time-series approach that can measure changes in the duration of mitosis and interphase in individual cells expressing fluorescent histone 2B. The approach requires analysis of only two features: nuclear area and average intensity. Compared to supervised learning approaches, this method reduces processing time and does not require generation of training data sets. We demonstrate that this method is as sensitive as manual analysis in identifying small changes in interphase or mitotic duration induced by drug or siRNA treatment. This approach should facilitate automated analysis of high-throughput time-lapse data sets to identify small molecules or gene products that influence the timing of cell division.
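
    A much-simplified version of the two-feature idea is sketched below in Python; the thresholds are invented for illustration, and the paper's actual time-series method is considerably more involved.

```python
import numpy as np

def mitotic_mask(area, intensity, area_drop=0.6, intensity_rise=1.3):
    """Flag frames as mitotic when the nuclear area falls and the mean
    H2B fluorescence rises relative to the cell's own median baseline
    (chromosome condensation shrinks and brightens the nucleus)."""
    return ((area < area_drop * np.median(area)) &
            (intensity > intensity_rise * np.median(intensity)))

# Mitotic duration = flagged-frame count times the imaging interval.
```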

  12. Analysis of rubber supply in Sri Lanka

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hartley, M.J.; Nerlove, M.; Peters, R.K. Jr.

    1987-11-01

    An analysis of the supply response for perennial crops is undertaken for rubber in Sri Lanka, focusing on the uprooting-replanting decision and disaggregating the typical reduced-form supply response equation into several structural relationships. This approach is compared and contrasted with Dowling's analysis of supply response for rubber in Thailand, which is based upon a sophisticated reduced-form supply function developed by Wickens and Greenfield for Brazilian coffee. Because the uprooting-replanting decision is central to understanding rubber supply response in Sri Lanka and for other perennial crops where replanting activities dominate new planting, the standard approaches do not adequately capture supply response.

  13. Analysis and Long-Term Follow-Up of the Surgical Treatment of Children With Craniopharyngioma.

    PubMed

    Cheng, Jing; Shao, Qiang; Pan, Zhiyong; You, Jin

    2016-11-01

    To investigate the relationship between the operative approach, clinical pathological factors, and the curative effect of surgical treatment in patients with craniopharyngioma, and to provide a theoretical basis for determining the prognosis and reducing the recurrence rate during long-term postoperative follow-up in children. This was a retrospective analysis of the clinical data of 92 children who underwent surgical treatment in our department from January 2005 to May 2011. Long-term follow-up ranged from 12 months to 8 years. The pterional approach was used in 49 patients, the interhemispheric approach in 20 patients, the corpus callosum approach in 16 patients, and the butterfly approach in 7 patients. Pathological classification was performed by hematoxylin and eosin staining of the pathological tissues and evaluated according to the surgical approach, MRI calcification status, calcification type, pathological type, whether radiotherapy was performed, postoperative recurrence, and death. With the pterional approach, near-total resection was achieved in 46 patients (93.9%), with the lowest recurrence rate. When the operative approaches and postoperative recurrence rates were compared, the difference was statistically significant (P < 0.05). For the comparison of operative approach and postoperative mortality, the difference was not statistically significant (P > 0.05). There was no significant difference between the MRI classification and the postoperative recurrence rate (P > 0.05). Comparing the degree of tumor calcification with the postoperative recurrence and mortality rates, the differences were statistically significant (P < 0.05). The postoperative recurrence and mortality rates of adamantinomatous craniopharyngioma and squamous papillary craniopharyngioma in the 2 groups were compared, and the differences were statistically significant (P < 0.05). Postoperative adjuvant radiotherapy was also compared against the postoperative recurrence rate and mortality; the differences were statistically significant (P < 0.05). The main influences on tumor recurrence are the choice of surgical approach and the degree of calcification. The recurrence rate of adamantinomatous craniopharyngioma is higher, which could be because invasive growth of craniopharyngioma occurs only with the adamantinomatous type. Postoperative radiotherapy can significantly prolong the time to recurrence and reduce the mortality rate of patients with craniopharyngioma.

  14. Comparative evaluation of saliva collection methods for proteome analysis.

    PubMed

    Golatowski, Claas; Salazar, Manuela Gesell; Dhople, Vishnu Mukund; Hammer, Elke; Kocher, Thomas; Jehmlich, Nico; Völker, Uwe

    2013-04-18

    Saliva collection devices are widely used for large-scale screening approaches. This study was designed to compare the suitability of three different whole-saliva collection approaches for subsequent proteome analyses. From 9 young healthy volunteers (4 women and 5 men), saliva samples were collected either unstimulated by passive drooling or stimulated using paraffin gum or a Salivette® (cotton swab). Saliva volume, protein concentration and salivary protein patterns were analyzed comparatively. Samples collected using paraffin gum showed the highest saliva volume (4.1±1.5 ml), followed by Salivette® collection (1.8±0.4 ml) and drooling (1.0±0.4 ml). Saliva protein concentrations (average 1145 μg/ml) showed no significant differences between the three sampling schemes. Each collection approach facilitated the identification of about 160 proteins (≥2 distinct peptides) per subject, but collection-method-dependent variations in protein composition were observed. Passive drooling, paraffin gum and the Salivette® each allow similar coverage of the whole-saliva proteome, but the specific proteins observed depended on the collection approach. Thus, only one type of collection device should be used for quantitative proteome analysis within one experiment, especially when performing large-scale cross-sectional or multi-centric studies. Copyright © 2013 Elsevier B.V. All rights reserved.

  15. Translating person-centered care into practice: A comparative analysis of motivational interviewing, illness-integration support, and guided self-determination.

    PubMed

    Zoffmann, Vibeke; Hörnsten, Åsa; Storbækken, Solveig; Graue, Marit; Rasmussen, Bodil; Wahl, Astrid; Kirkevold, Marit

    2016-03-01

    Person-centred care [PCC] can engage people in living well with a chronic condition. However, translating PCC into practice is challenging. We aimed to compare the translational potentials of three approaches: motivational interviewing [MI], illness integration support [IIS] and guided self-determination [GSD]. The comparative analysis included eight components: (1) philosophical origin; (2) development in the original clinical setting; (3) theoretical underpinnings; (4) overarching goal and supportive processes; (5) general principles, strategies or tools for engaging people; (6) health care professionals' background and training; (7) fidelity assessment; and (8) reported effects. Although all approaches promote autonomous motivation, they differ in other ways. Their original settings explain why IIS and GSD strive for life-illness integration, whereas MI focuses on managing ambivalence. IIS and GSD were based on grounded theories, while MI was developed intuitively. All apply processes and strategies to advance professionals' communication skills and engagement; GSD includes context-specific reflection sheets. All offer training programs; MI and GSD include fidelity tools. Each approach has a primary application: MI, when ambivalence threatens positive change; IIS, when integrating newly diagnosed chronic conditions; and GSD, when problem solving is difficult or deadlocked. Professionals must critically consider the context in their choice of approach. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  16. Intercultural Counselling and Assessment: Global Perspectives.

    ERIC Educational Resources Information Center

    Samuda, Ronald J., Ed.; Wolfgang, Aaron, Ed.

    This book concerning the need for more appropriate approaches to intercultural counseling in counselor training includes these articles: (1) "Comparative Immigration Patterns in the U.S., Australia and Canada: Social and Educational Implications" (R. J. Samuda); (2) "Theories of Counselling: A Comparative Analysis" (C. E.…

  17. Shear-wave velocity profiling according to three alternative approaches: A comparative case study

    NASA Astrophysics Data System (ADS)

    Dal Moro, G.; Keller, L.; Al-Arifi, N. S.; Moustafa, S. S. R.

    2016-11-01

    This paper compares three different methodologies that can be used to analyze surface-wave propagation and thereby obtain the vertical shear-wave velocity (VS) profile. The three presented methods (currently still quite unconventional) differ in both field procedures and data processing. The first methodology is an evolution of the classical Multi-channel Analysis of Surface Waves (MASW), here accomplished by jointly considering Rayleigh and Love waves (analyzed according to the Full Velocity Spectrum approach) and the Horizontal-to-Vertical Spectral Ratio (HVSR). The second method is based on the joint analysis of the HVSR curve together with the Rayleigh-wave dispersion determined via Miniature Array Analysis of Microtremors (MAAM), a passive methodology that relies on a small number (4 to 6) of vertical geophones deployed along a small circle (for common near-surface applications the radius usually ranges from 0.6 to 5 m). Finally, the third considered approach is based on active data acquired by a single 3-component geophone and relies on the joint inversion of the group-velocity spectra of the radial and vertical components of the Rayleigh waves, together with the Radial-to-Vertical Spectral Ratio (RVSR). The results of the analyses performed with these approaches (completely different both in terms of field procedures and data analysis) are extremely consistent, thus mutually validating their performance. Pros and cons of each approach are summarized both in terms of computational aspects and with respect to practical considerations regarding the specific character of the pertinent field procedures.

  18. An opportunity cost approach to sample size calculation in cost-effectiveness analysis.

    PubMed

    Gafni, A; Walter, S D; Birch, S; Sendi, P

    2008-01-01

    The inclusion of economic evaluations as part of clinical trials has led to concerns about the adequacy of trial sample size to support such analysis. The analytical tool of cost-effectiveness analysis is the incremental cost-effectiveness ratio (ICER), which is compared with a threshold value (lambda) as a method to determine the efficiency of a health-care intervention. Accordingly, many of the methods suggested for calculating the sample size requirements for the economic component of clinical trials are based on the properties of the ICER. However, use of the ICER and a threshold value as a basis for determining efficiency has been shown to be inconsistent with the economic concept of opportunity cost. As a result, the validity of the ICER-based approaches to sample size calculations can be challenged. Alternative methods for determining improvements in efficiency that do not depend upon ICER values have been presented in the literature. In this paper, we develop an opportunity cost approach to calculating sample size for economic evaluations alongside clinical trials, and illustrate the approach using a numerical example. We compare the sample size requirement of the opportunity cost method with the ICER threshold method. In general, either method may yield the larger required sample size. However, the opportunity cost approach, although simple to use, has additional data requirements. We believe that the additional data requirements represent a small price to pay for being able to perform an analysis consistent with both the concept of opportunity cost and the problem faced by decision makers. Copyright (c) 2007 John Wiley & Sons, Ltd.
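
    For reference, the decision rule the abstract critiques can be stated compactly (standard cost-effectiveness notation, not reproduced from the paper): a new intervention is deemed efficient when its ICER falls below the threshold λ.

    ```latex
    \[
    \mathrm{ICER} \;=\; \frac{C_{\text{new}} - C_{\text{old}}}{E_{\text{new}} - E_{\text{old}}} \;<\; \lambda
    \]
    ```

    The opportunity-cost objection is that λ is not an observed price: funding the new intervention displaces other programs, so a fixed threshold need not reflect what is actually given up.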

  19. Analysis of ballistic transport in nanoscale devices by using an accelerated finite element contact block reduction approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, H.; Li, G., E-mail: gli@clemson.edu

    2014-08-28

    An accelerated Finite Element Contact Block Reduction (FECBR) approach is presented for computational analysis of ballistic transport in nanoscale electronic devices with arbitrary geometry and unstructured mesh. A finite element formulation is developed for the theoretical CBR/Poisson model. The FECBR approach is accelerated through eigen-pair reduction, lead mode space projection, and component mode synthesis techniques. The accelerated FECBR is applied to perform quantum mechanical ballistic transport analysis of a DG-MOSFET with taper-shaped extensions and a DG-MOSFET with Si/SiO₂ interface roughness. The computed electrical transport properties of the devices obtained from the accelerated FECBR approach, and the associated computational cost as a function of system degrees of freedom, are compared with those obtained from the original CBR and direct inversion methods. The performance of the accelerated FECBR in both its accuracy and efficiency is demonstrated.

  20. Complete Hamiltonian analysis of cosmological perturbations at all orders

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nandi, Debottam; Shankaranarayanan, S., E-mail: debottam@iisertvm.ac.in, E-mail: shanki@iisertvm.ac.in

    2016-06-01

    In this work, we present a consistent Hamiltonian analysis of cosmological perturbations at all orders. To make the procedure transparent, we consider a simple model, resolve the 'gauge-fixing' issues, and extend the analysis to scalar field models, showing that our approach can be applied to any order of perturbation for any first order derivative fields. In the case of Galilean scalar fields, our procedure can extract constrained relations at all orders in perturbations, showing that there are no extra degrees of freedom due to the presence of higher time derivatives of the field in the Lagrangian. We compare and contrast our approach to the Lagrangian approach (Chen et al. [2006]) for extracting higher order correlations, and show that our approach is efficient and robust and can be applied to any model of gravity and matter fields without invoking the slow-roll approximation.

  1. A Spiritually-based approach to breast cancer awareness: Cognitive response analysis of communication effectiveness

    PubMed Central

    Holt, Cheryl L.; Lee, Crystal; Wright, Katrina

    2017-01-01

    The purpose of this study was to compare the communication effectiveness of a spiritually-based approach to breast cancer early detection education with a secular approach, among African American women, by conducting a cognitive response analysis. A total of 108 women from six Alabama churches were randomly assigned by church to receive a spiritually-based or secular educational booklet discussing breast cancer early detection. Based on the Elaboration Likelihood Model (Petty & Cacioppo, 1981), after reading the booklets, participants were asked to complete a thought-listing task, writing down any thoughts they experienced and rating them as positive, negative, or neutral. Two independent coders then used five dimensions to code participants' thoughts. Compared with the secular booklet, the spiritually-based booklet resulted in significantly more thoughts involving personal connection, self-assessment, and spiritually-based responses. These results suggest that a spiritually-based approach to breast cancer awareness may be more effective than the secular approach because it led women to more actively process the message, stimulating central route processing. The incorporation of spiritually-based content into church-based breast cancer education could be a promising health communication approach for African American women. PMID:18443989

  2. Comparing multiple imputation methods for systematically missing subject-level data.

    PubMed

    Kline, David; Andridge, Rebecca; Kaizar, Eloise

    2017-06-01

    When conducting research synthesis, the studies to be combined often do not measure the same set of variables, which creates missing data. When the studies to combine are longitudinal, missing data can occur at the observation level (time-varying) or the subject level (non-time-varying). Traditionally, the focus of missing data methods for longitudinal data has been on missing observation-level variables. In this paper, we focus on missing subject-level variables and compare two multiple imputation approaches: a joint modeling approach and a sequential conditional modeling approach. We find the joint modeling approach to be preferable to the sequential conditional approach, except when the covariance structure of the repeated outcome for each individual has homogeneous variance and exchangeable correlation. Specifically, the regression coefficient estimates from an analysis incorporating imputed values based on the sequential conditional method are attenuated and less efficient than those from the joint method. Remarkably, the estimates from the sequential conditional method are often less efficient than a complete case analysis, which, in the context of research synthesis, implies that we lose efficiency by combining studies. Copyright © 2015 John Wiley & Sons, Ltd.
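
    As a concrete illustration of the sequential conditional strategy compared in this record, here is a minimal sketch using scikit-learn's IterativeImputer (a chained-equations-style imputer); the joint-modeling alternative would instead fit one multivariate model to all variables at once. The data and variable names are invented for illustration.

    ```python
    import numpy as np
    from sklearn.experimental import enable_iterative_imputer  # noqa: F401
    from sklearn.impute import IterativeImputer

    rng = np.random.default_rng(0)
    # Toy "research synthesis": rows are subjects pooled across studies; the
    # second column (a subject-level covariate) is systematically missing
    # for all subjects from study B.
    X = rng.normal(size=(200, 3))
    X[:, 1] += 0.5 * X[:, 0]                    # correlation the imputer can exploit
    study_b = rng.random(200) < 0.5
    X_missing = X.copy()
    X_missing[study_b, 1] = np.nan              # systematically missing covariate

    # Sequential conditional imputation: each incomplete variable is regressed
    # on the others, cycling until convergence.  For proper multiple imputation,
    # repeat with several random_state values and pool the analyses.
    imputer = IterativeImputer(max_iter=10, sample_posterior=True, random_state=0)
    X_imputed = imputer.fit_transform(X_missing)
    print(X_imputed[study_b, 1][:5])            # imputed values for study B
    ```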

  3. Comparative Study of Wing Lift Distribution Analysis for High Altitude Long Endurance (HALE) Unmanned Aerial Vehicle

    NASA Astrophysics Data System (ADS)

    Silitonga, Faber Y.; Agoes Moelyadi, M.

    2018-04-01

    The development of High Altitude Long Endurance (HALE) Unmanned Aerial Vehicles (UAVs) has emerged for both civil and military purposes. Their ability to operate at high altitude with long endurance is important in supporting maritime applications. A preliminary analysis of the wing lift distribution of a HALE UAV is presented to give decisive consideration for its early development, ensuring that the generated lift is enough to compensate for the aircraft's weight. A theoretical approach using Prandtl's non-linear lifting-line theory is compared with a modern numerical approach using Computational Fluid Dynamics (CFD). The wing lift distributions calculated by both methods are compared to study their reliability. The HALE UAV ITB has a high-aspect-ratio wing and is analyzed at the cruise flight condition. The results indicate differences between the non-linear lifting-line and CFD methods.
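
    As a sketch of the lifting-line side of such a comparison: the classical (linear) Prandtl method reduces to a small linear system for the Fourier coefficients of the spanwise circulation; the paper's non-linear variant iterates on the sectional lift curve, and CFD replaces the model entirely. All numbers below are illustrative, not the HALE UAV ITB geometry.

    ```python
    import numpy as np

    # Prandtl's (linear) lifting-line via Fourier collocation for an untwisted,
    # rectangular wing with a symmetric airfoil (zero-lift angle = 0).
    b, c, a0 = 20.0, 1.0, 2 * np.pi          # span [m], chord [m], lift slope [1/rad]
    alpha = np.radians(5.0)                  # geometric angle of attack
    AR = b / c                               # aspect ratio (high, as for HALE wings)

    N = 20                                   # Fourier terms / spanwise stations
    theta = np.pi * np.arange(1, N + 1) / (N + 1)   # collocation angles in (0, pi)
    n = np.arange(1, N + 1)

    # Monoplane equation:
    #   alpha = (2b/(a0 c)) * sum A_n sin(n t) + sum n A_n sin(n t)/sin(t)
    M = (2 * b / (a0 * c)) * np.sin(np.outer(theta, n)) \
        + n * np.sin(np.outer(theta, n)) / np.sin(theta)[:, None]
    A = np.linalg.solve(M, np.full(N, alpha))

    CL = np.pi * AR * A[0]                   # total lift coefficient from A_1
    gamma = 2 * b * np.sin(np.outer(theta, n)) @ A   # Gamma/V_inf per station
    print(f"CL = {CL:.3f}, peak Gamma/V = {gamma.max():.3f}")
    ```

    The computed CL (for a given flight speed and wing area) is what would be checked against the weight-compensation requirement and against the CFD lift distribution.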

  4. Community, State, and Federal Approaches to Cumulative Risk Assessment: Challenges and Opportunities for Integration

    PubMed Central

    Barzyk, Timothy M.; Wilson, Sacoby; Wilson, Anthony

    2015-01-01

    Community, state, and federal approaches to conventional and cumulative risk assessment (CRA) were described and compared to assess similarities and differences, and develop recommendations for a consistent CRA approach, acceptable across each level as a rigorous scientific methodology, including partnership formation and solution development as necessary practices. Community, state, and federal examples were described and then summarized based on their adherence to CRA principles of: (1) planning, scoping, and problem formulation; (2) risk analysis and ranking; and (3) risk characterization, interpretation, and management. While each application shared the common goal of protecting human health and the environment, they adopted different approaches to achieve this. For a specific project-level analysis of a particular place or instance, this may be acceptable, but to ensure long-term applicability and transferability to other projects, recommendations for developing a consistent approach to CRA are provided. This approach would draw from best practices, risk assessment and decision analysis sciences, and historical lessons learned to provide results in a manner understandable to and accepted by all entities. This approach is intended to provide a common ground around which to develop CRA methods and approaches that can be followed at all levels. PMID:25918910

  5. Evaluation of hierarchical agglomerative cluster analysis methods for discrimination of primary biological aerosol

    NASA Astrophysics Data System (ADS)

    Crawford, I.; Ruske, S.; Topping, D. O.; Gallagher, M. W.

    2015-07-01

    In this paper we present improved methods for discriminating and quantifying Primary Biological Aerosol Particles (PBAP) by applying hierarchical agglomerative cluster analysis to multi-parameter ultraviolet light-induced fluorescence (UV-LIF) spectrometer data. The methods employed in this study can be applied to data sets in excess of 1×10⁶ points on a desktop computer, allowing for each fluorescent particle in a dataset to be explicitly clustered. This reduces the potential for misattribution found in subsampling and comparative attribution methods used in previous approaches, improving our capacity to discriminate and quantify PBAP meta-classes. We evaluate the performance of several hierarchical agglomerative cluster analysis linkages and data normalisation methods using laboratory samples of known particle types and an ambient dataset. Fluorescent and non-fluorescent polystyrene latex spheres were sampled with a Wideband Integrated Bioaerosol Spectrometer (WIBS-4), where the optical size, asymmetry factor and fluorescent measurements were used as inputs to the analysis package. It was found that the Ward linkage with z-score or range normalisation performed best, correctly attributing 98% and 98.1% of the data points, respectively. The best performing methods were applied to the BEACHON-RoMBAS ambient dataset, where it was found that the z-score and range normalisation methods yield similar results, with each method producing clusters representative of fungal spores and bacterial aerosol, consistent with previous results. The z-score result was compared to clusters generated with previous approaches (WIBS AnalysiS Program, WASP), where we observe that the subsampling and comparative attribution method employed by WASP results in the overestimation of the fungal spore concentration by a factor of 1.5 and the underestimation of the bacterial aerosol concentration by a factor of 5. We suggest that this is likely due to errors arising from misattribution due to poor centroid definition and failure to assign particles to a cluster as a result of the subsampling and comparative attribution method employed by WASP. The methods used here allow for the entire fluorescent population of particles to be analysed, yielding an explicit cluster attribution for each particle, improving cluster centroid definition and our capacity to discriminate and quantify PBAP meta-classes compared to previous approaches.
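
    A minimal sketch of the winning recipe reported here (z-score normalisation followed by Ward-linkage hierarchical agglomerative clustering), using SciPy on synthetic WIBS-like features; the study's own implementation handles far larger datasets than a naive linkage call comfortably can.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    # Synthetic stand-in for WIBS-type data: columns mimic optical size,
    # asymmetry factor, and a fluorescence channel (not the BEACHON data).
    rng = np.random.default_rng(1)
    particles = np.vstack([
        rng.normal([2.0, 10.0, 50.0], 0.5, size=(500, 3)),   # "fungal-like"
        rng.normal([1.0, 30.0, 5.0], 0.5, size=(500, 3)),    # "bacterial-like"
    ])

    z = (particles - particles.mean(axis=0)) / particles.std(axis=0)  # z-score
    Z = linkage(z, method="ward")                    # agglomerative tree
    labels = fcluster(Z, t=2, criterion="maxclust")  # cut into two meta-classes
    print(np.bincount(labels)[1:])                   # cluster sizes
    ```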

  6. Comparison of posterior retroperitoneal and transabdominal lateral approaches in robotic adrenalectomy: an analysis of 200 cases.

    PubMed

    Kahramangil, Bora; Berber, Eren

    2018-04-01

    Although numerous studies have been published on robotic adrenalectomy (RA) in the literature, none has compared the posterior retroperitoneal (PR) and transabdominal lateral (TL) approaches. The aim of this study was to compare the outcomes of robotic PR and TL adrenalectomy. This is a retrospective analysis of a prospectively maintained database. Between September 2008 and January 2017, perioperative outcomes of patients undergoing RA through PR and TL approaches were recorded into an IRB-approved database. Clinical and perioperative parameters were compared using Student's t test, Wilcoxon rank-sum test, and χ² test. Multivariate regression analysis was performed to determine factors associated with total operative time. 188 patients underwent 200 RAs. 110 patients were operated on through the TL approach and 78 through the PR approach. Overall, the conversion rate to open was 2.5% and 90-day morbidity 4.8%. The perioperative outcomes of the TL and PR approaches were similar regarding estimated blood loss, rate of conversion to open, length of hospital stay, and 90-day morbidity. The PR approach resulted in a shorter mean ± SD total operative time (136.3 ± 38.7 vs. 154.6 ± 48.4 min; p = 0.005) and a lower visual analog scale pain score on postoperative day 1 (4.3 ± 2.5 vs. 5.4 ± 2.4; p = 0.001). After excluding tumors larger than 6 cm operated through the TL approach, the difference in operative times persisted (136.3 ± 38.7 vs. 153.7 ± 45.7 min; p = 0.009). On multivariate regression analysis, increasing BMI and the TL approach were associated with longer total operative time. This study shows that the robotic PR and TL approaches are equally safe and efficacious. With experience, shorter operative time and less postoperative pain can be achieved with the PR technique. This supports the preferential utilization of the PR approach in high-volume centers with enough experience.

  7. Addressing Informatics Barriers to Conducting Observational Comparative Effectiveness Research: A Comparative Case Analysis

    ERIC Educational Resources Information Center

    Boone, Christopher P. D.

    2013-01-01

    Background: The U.S. health care system has been under immense scrutiny for ever-increasing costs and poor health outcomes for its patients. Comparative Effectiveness Research (CER) has emerged as a generally accepted practice by providers, policy makers, and scientists as an approach to identify the most clinical- and cost-effective interventions…

  8. Automatic Error Analysis Using Intervals

    ERIC Educational Resources Information Center

    Rothwell, E. J.; Cloud, M. J.

    2012-01-01

    A technique for automatic error analysis using interval mathematics is introduced. A comparison to standard error propagation methods shows that in cases involving complicated formulas, the interval approach gives comparable error estimates with much less effort. Several examples are considered, and numerical errors are computed using the INTLAB…
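
    The core idea lends itself to a small sketch: carry [lo, hi] bounds through a formula so the final interval width bounds the accumulated numerical error. The article uses INTLAB (MATLAB); the standalone Python class below is our own illustration, and it ignores directed rounding, which a fully rigorous implementation would need.

    ```python
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Interval:
        lo: float
        hi: float

        def __add__(self, other):
            return Interval(self.lo + other.lo, self.hi + other.hi)

        def __sub__(self, other):
            return Interval(self.lo - other.hi, self.hi - other.lo)

        def __mul__(self, other):
            p = [self.lo * other.lo, self.lo * other.hi,
                 self.hi * other.lo, self.hi * other.hi]
            return Interval(min(p), max(p))

        @property
        def width(self):
            return self.hi - self.lo

    # Measurements 3.0 +/- 0.1 and 2.0 +/- 0.05 propagated through x*y - x:
    x, y = Interval(2.9, 3.1), Interval(1.95, 2.05)
    z = x * y - x
    print(z, z.width)   # the final width is an automatic error bound
    ```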

  9. Estimating residential price elasticity of demand for water: A contingent valuation approach

    NASA Astrophysics Data System (ADS)

    Thomas, John F.; Syme, Geoffrey J.

    1988-11-01

    Residential households in Perth, Western Australia have access to privately extracted groundwater as well as a public mains water supply, which has been charged through a two-part block tariff. A contingent valuation approach is developed to estimate the price elasticity of demand for public supply. Results are compared with those of a multivariate time series analysis. Validation tests for the contingent approach are proposed, based on a comparison of predicted behaviors following hypothesized price changes with relevant independent data. Properly conducted, the contingent approach appears to be reliable, applicable where the available data do not favor regression analysis, and a fruitful source of information about social, technical, and behavioral responses to change in the price of water.
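
    The target quantity is the standard (arc) price elasticity of demand; in the contingent valuation setting, Q₂ is the consumption respondents state they would choose under a hypothesized price P₂ (notation ours, not the paper's):

    ```latex
    \[
    \varepsilon \;=\; \frac{\%\,\Delta Q}{\%\,\Delta P}
    \;\approx\; \frac{(Q_2 - Q_1)/\bar{Q}}{(P_2 - P_1)/\bar{P}},
    \qquad \bar{Q} = \tfrac{Q_1 + Q_2}{2},\quad \bar{P} = \tfrac{P_1 + P_2}{2}
    \]
    ```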

  10. Development Context Driven Change Awareness and Analysis Framework

    NASA Technical Reports Server (NTRS)

    Sarma, Anita; Branchaud, Josh; Dwyer, Matthew B.; Person, Suzette; Rungta, Neha

    2014-01-01

    Recent work on workspace monitoring allows conflict prediction early in the development process; however, these approaches mostly use syntactic differencing techniques to compare different program versions. In contrast, traditional change-impact analysis techniques analyze related versions of the program only after the code has been checked into the master repository. We propose a novel approach, DeCAF (Development Context Analysis Framework), that leverages the development context to scope a change impact analysis technique. The goal is to characterize the impact of each developer on other developers in the team. There are various client applications, such as task prioritization, early conflict detection, and providing advice on testing, that can benefit from such a characterization. The DeCAF framework leverages information from the development context to bound the iDiSE change impact analysis technique to analyze only the parts of the code base that are of interest. Bounding the analysis can enable DeCAF to efficiently compute the impact of changes using a combination of program dependence and symbolic execution based approaches.

  11. Development Context Driven Change Awareness and Analysis Framework

    NASA Technical Reports Server (NTRS)

    Sarma, Anita; Branchaud, Josh; Dwyer, Matthew B.; Person, Suzette; Rungta, Neha; Wang, Yurong; Elbaum, Sebastian

    2014-01-01

    Recent work on workspace monitoring allows conflict prediction early in the development process; however, these approaches mostly use syntactic differencing techniques to compare different program versions. In contrast, traditional change-impact analysis techniques analyze related versions of the program only after the code has been checked into the master repository. We propose a novel approach, DeCAF (Development Context Analysis Framework), that leverages the development context to scope a change impact analysis technique. The goal is to characterize the impact of each developer on other developers in the team. There are various client applications, such as task prioritization, early conflict detection, and providing advice on testing, that can benefit from such a characterization. The DeCAF framework leverages information from the development context to bound the iDiSE change impact analysis technique to analyze only the parts of the code base that are of interest. Bounding the analysis can enable DeCAF to efficiently compute the impact of changes using a combination of program dependence and symbolic execution based approaches.

  12. Multilingual Sentiment Analysis: State of the Art and Independent Comparison of Techniques.

    PubMed

    Dashtipour, Kia; Poria, Soujanya; Hussain, Amir; Cambria, Erik; Hawalah, Ahmad Y A; Gelbukh, Alexander; Zhou, Qiang

    With the advent of the Internet, people actively express their opinions about products, services, events, political parties, etc., in social media, blogs, and website comments. The amount of research work on sentiment analysis is growing explosively. However, the majority of research efforts are devoted to English-language data, while a great share of information is available in other languages. We present a state-of-the-art review on multilingual sentiment analysis. More importantly, we compare our own implementation of existing approaches on common data. Precision observed in our experiments is typically lower than that reported by the original authors, which we attribute to the lack of detail in the original presentation of those approaches. Thus, we compare the existing works by what they really offer to the reader, including whether they allow for accurate implementation and for reliable reproduction of the reported results.

  13. Quantile regression in the presence of monotone missingness with sensitivity analysis

    PubMed Central

    Liu, Minzhao; Daniels, Michael J.; Perri, Michael G.

    2016-01-01

    In this paper, we develop methods for longitudinal quantile regression when there is monotone missingness. In particular, we propose pattern mixture models with a constraint that provides a straightforward interpretation of the marginal quantile regression parameters. Our approach allows sensitivity analysis, which is an essential component of inference for incomplete data. To facilitate computation of the likelihood, we propose a novel way to obtain analytic forms for the required integrals. We conduct simulations to examine the robustness of our approach to modeling assumptions and compare its performance to competing approaches. The model is applied to data from a recent clinical trial on weight management. PMID:26041008

  14. A Critical Comparison of Transformation and Deep Approach Theories of Learning

    ERIC Educational Resources Information Center

    Howie, Peter; Bagnall, Richard

    2015-01-01

    This paper reports a critical comparative analysis of two popular and significant theories of adult learning: the transformation and deep approach theories of learning. These theories are operative in different educational sectors, are significant in each, and may be seen as touching on similar concerns with learning…

  15. Planetary quarantine

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The overall objective is to identify those areas of future missions which will be impacted by planetary quarantine (PQ) constraints. The objective of the phase being described was to develop an approach for using decision theory in performing a PQ analysis for a Mariner Jupiter Uranus Mission and to compare it with the traditional approach used for other missions.

  16. Analysis of Employment Quality of Chinese Vocational and Technical College Graduates

    ERIC Educational Resources Information Center

    Po, Yang; Jianru, Guo; Yinan, Jin

    2015-01-01

    The employment quality of college graduates is a recent topic of heated discussion in China. Given the differences in the talent development goals of academic and vocational institutions and in individual job search approaches, this research compares the differences between the job search approaches and actual employment outcomes of graduates of…

  17. A comparison of a modified sequential oral sensory approach to an applied behavior-analytic approach in the treatment of food selectivity in children with autism spectrum disorder.

    PubMed

    Peterson, Kathryn M; Piazza, Cathleen C; Volkert, Valerie M

    2016-09-01

    Treatments of pediatric feeding disorders based on applied behavior analysis (ABA) have the most empirical support in the research literature (Volkert & Piazza, 2012); however, professionals often recommend, and caregivers often use, treatments that have limited empirical support. In the current investigation, we compared a modified sequential oral sensory approach (M-SOS; Benson, Parke, Gannon, & Muñoz, 2013) to an ABA approach for the treatment of the food selectivity of 6 children with autism. We randomly assigned 3 children to ABA and 3 children to M-SOS and compared the effects of treatment in a multiple baseline design across novel, healthy target foods. We used a multielement design to assess treatment generalization. Consumption of target foods increased for children who received ABA, but not for children who received M-SOS. We subsequently implemented ABA with the children for whom M-SOS was not effective and observed a potential treatment generalization effect during ABA when M-SOS preceded ABA. © 2016 Society for the Experimental Analysis of Behavior.

  18. Identifying functional reorganization of spelling networks: an individual peak probability comparison approach

    PubMed Central

    Purcell, Jeremy J.; Rapp, Brenda

    2013-01-01

    Previous research has shown that damage to the neural substrates of orthographic processing can lead to functional reorganization during reading (Tsapkini et al., 2011); in this research we ask if the same is true for spelling. To examine the functional reorganization of spelling networks we present a novel three-stage Individual Peak Probability Comparison (IPPC) analysis approach for comparing the activation patterns obtained during fMRI of spelling in a single brain-damaged individual with dysgraphia to those obtained in a set of non-impaired control participants. The first analysis stage characterizes the convergence in activations across non-impaired control participants by applying a technique typically used for characterizing activations across studies: Activation Likelihood Estimate (ALE) (Turkeltaub et al., 2002). This method was used to identify locations that have a high likelihood of yielding activation peaks in the non-impaired participants. The second stage provides a characterization of the degree to which the brain-damaged individual's activations correspond to the group pattern identified in Stage 1. This involves performing a Mahalanobis distance statistics analysis (Tsapkini et al., 2011) that compares each of a control group's peak activation locations to the nearest peak generated by the brain-damaged individual. The third stage evaluates the extent to which the brain-damaged individual's peaks are atypical relative to the range of individual variation among the control participants. This IPPC analysis allows for a quantifiable, statistically sound method for comparing an individual's activation pattern to the patterns observed in a control group and, thus, provides a valuable tool for identifying functional reorganization in a brain-damaged individual with impaired spelling. Furthermore, this approach can be applied more generally to compare any individual's activation pattern with that of a set of other individuals. PMID:24399981
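
    The distance computation at the heart of the second stage is easy to sketch: score how far an individual's activation peak lies from a control group's peaks in Mahalanobis units. The coordinates below are invented MNI-like triplets, not data from the study.

    ```python
    import numpy as np
    from scipy.spatial.distance import mahalanobis

    rng = np.random.default_rng(2)
    # 20 control peaks around a hypothetical spelling-related locus
    control_peaks = rng.normal([-44.0, -58.0, -12.0], [6.0, 7.0, 5.0], size=(20, 3))
    patient_peak = np.array([-30.0, -70.0, -4.0])

    mu = control_peaks.mean(axis=0)
    VI = np.linalg.inv(np.cov(control_peaks, rowvar=False))  # inverse covariance
    d = mahalanobis(patient_peak, mu, VI)
    print(f"Mahalanobis distance = {d:.2f}")  # large d suggests an atypical peak
    ```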

  19. Modeling and Analysis of Space Based Transceivers

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.; Liebetreu, John; Moore, Michael S.; Price, Jeremy C.; Abbott, Ben

    2005-01-01

    This paper presents the tool chain, methodology, and initial results of a study to provide a thorough, objective, and quantitative analysis of the design alternatives for space Software Defined Radio (SDR) transceivers. The approach taken was to develop a set of models and tools for describing communications requirements, the algorithm resource requirements, the available hardware, and the alternative software architectures, and generate analysis data necessary to compare alternative designs. The Space Transceiver Analysis Tool (STAT) was developed to help users identify and select representative designs, calculate the analysis data, and perform a comparative analysis of the representative designs. The tool allows the design space to be searched quickly while permitting incremental refinement in regions of higher payoff.

  20. Modeling and Analysis of Space Based Transceivers

    NASA Technical Reports Server (NTRS)

    Moore, Michael S.; Price, Jeremy C.; Abbott, Ben; Liebetreu, John; Reinhart, Richard C.; Kacpura, Thomas J.

    2007-01-01

    This paper presents the tool chain, methodology, and initial results of a study to provide a thorough, objective, and quantitative analysis of the design alternatives for space Software Defined Radio (SDR) transceivers. The approach taken was to develop a set of models and tools for describing communications requirements, the algorithm resource requirements, the available hardware, and the alternative software architectures, and generate analysis data necessary to compare alternative designs. The Space Transceiver Analysis Tool (STAT) was developed to help users identify and select representative designs, calculate the analysis data, and perform a comparative analysis of the representative designs. The tool allows the design space to be searched quickly while permitting incremental refinement in regions of higher payoff.

  1. An Efficient Soft Set-Based Approach for Conflict Analysis

    PubMed Central

    Sutoyo, Edi; Mungad, Mungad; Hamid, Suraya; Herawan, Tutut

    2016-01-01

    Conflict analysis has been used as an important tool in economic, business, governmental and political disputes, games, management negotiations, military operations, etc. Many formal mathematical models have been proposed to handle conflict situations, and one of the most popular is rough set theory. With its ability to handle vagueness in conflict data sets, rough set theory has been used successfully. However, computational time is still an issue when determining the certainty, coverage, and strength of conflict situations. In this paper, we present an alternative approach to handling conflict situations, based on ideas from soft set theory. The novelty of the proposed approach is that, unlike rough set theory, which uses decision rules, it is based on the concept of co-occurrence of parameters in soft set theory. We illustrate the proposed approach by means of a tutorial example of voting analysis in conflict situations. Furthermore, we apply the proposed approach to a real-world dataset of political conflict in the Indonesian Parliament. We show that the proposed approach achieves up to 3.9% lower computational time than rough set theory. PMID:26928627

  2. An Efficient Soft Set-Based Approach for Conflict Analysis.

    PubMed

    Sutoyo, Edi; Mungad, Mungad; Hamid, Suraya; Herawan, Tutut

    2016-01-01

    Conflict analysis has been used as an important tool in economic, business, governmental and political disputes, games, management negotiations, military operations, etc. Many formal mathematical models have been proposed to handle conflict situations, and one of the most popular is rough set theory. With its ability to handle vagueness in conflict data sets, rough set theory has been used successfully. However, computational time is still an issue when determining the certainty, coverage, and strength of conflict situations. In this paper, we present an alternative approach to handling conflict situations, based on ideas from soft set theory. The novelty of the proposed approach is that, unlike rough set theory, which uses decision rules, it is based on the concept of co-occurrence of parameters in soft set theory. We illustrate the proposed approach by means of a tutorial example of voting analysis in conflict situations. Furthermore, we apply the proposed approach to a real-world dataset of political conflict in the Indonesian Parliament. We show that the proposed approach achieves up to 3.9% lower computational time than rough set theory.
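
    A loose sketch of the flavor of this style of analysis (not the authors' algorithm): represent the parliament as a Boolean agent-by-issue table and use co-occurrence counts of parameter values, rather than induced decision rules, to flag allied versus conflicting pairs. The thresholds and data below are invented.

    ```python
    import numpy as np

    # Rows = parties, columns = issues; 1 = supports, 0 = opposes.
    votes = np.array([
        [1, 0, 1, 1, 0],   # party A
        [1, 0, 1, 0, 0],   # party B
        [0, 1, 0, 0, 1],   # party C
    ])

    n_parties, n_issues = votes.shape
    # Co-occurrence: on how many issues does each pair cast the same vote?
    agree = (votes[:, None, :] == votes[None, :, :]).sum(axis=2) / n_issues
    for i in range(n_parties):
        for j in range(i + 1, n_parties):
            label = "allied" if agree[i, j] >= 0.6 else "conflict"
            print(f"parties {i} and {j}: agreement {agree[i, j]:.2f} -> {label}")
    ```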

  3. An application of Chan-Vese method used to determine the ROI area in CT lung screening

    NASA Astrophysics Data System (ADS)

    Prokop, Paweł; Surtel, Wojciech

    2016-09-01

    The article presents two approaches to determining the ROI area in CT lung screening. The first approach is based on a classic method of framing the image to determine the ROI, using the MaZda tool. The second approach is based on segmentation of the CT images of the lungs, reducing the redundant information in the image. Among active contour methods, the Chan-Vese method was chosen. To determine the effectiveness of each approach, the resulting ROI textures were analyzed and texture features were extracted, using the MaZda tool. The results were compared and presented in the form of radar graphs. The second approach proved effective and appropriate, and it is consequently used for further analysis of CT images in the computer-aided diagnosis of sarcoidosis.
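
    A minimal sketch of the segmentation step using scikit-image's chan_vese (assuming a recent scikit-image; a bundled sample image stands in for the CT lung data, and the parameter values are illustrative):

    ```python
    import numpy as np
    from skimage import data, img_as_float
    from skimage.segmentation import chan_vese

    # Chan-Vese level-set segmentation: evolve a contour that splits the
    # image into two regions of roughly constant intensity; the resulting
    # mask can serve as the ROI.
    image = img_as_float(data.camera())
    roi_mask = chan_vese(image, mu=0.25, lambda1=1.0, lambda2=1.0,
                         tol=1e-3, max_num_iter=200, dt=0.5,
                         init_level_set="checkerboard")
    print(roi_mask.shape, roi_mask.mean())   # fraction of pixels inside the ROI
    ```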

  4. Eco-Systemic Analysis of Anorexia Nervosa.

    ERIC Educational Resources Information Center

    Sheppy, Margarette I.; And Others

    1988-01-01

    Tested eco-systemic approach to understanding of anorexia nervosa. Compared 30 anorexics and parents to 34 matched control subjects and parents. Found that, compared to controls, families of anorexics were less supportive, helpful, and committed to each other. Family interactions perceived by anorexics were characterized by overprotective,…

  5. Social Networks and Mourning: A Comparative Approach.

    ERIC Educational Resources Information Center

    Rubin, Nissan

    1990-01-01

    Suggests using social network theory to explain varieties of mourning behavior in different societies. Compares participation in funeral ceremonies of members of different social circles in American society and Israeli kibbutz. Concludes that results demonstrated validity of concepts deriving from social network analysis in study of bereavement,…

  6. MultiMetEval: Comparative and Multi-Objective Analysis of Genome-Scale Metabolic Models

    PubMed Central

    Gevorgyan, Albert; Kierzek, Andrzej M.; Breitling, Rainer; Takano, Eriko

    2012-01-01

    Comparative metabolic modelling is emerging as a novel field, supported by the development of reliable and standardized approaches for constructing genome-scale metabolic models in high throughput. New software solutions are needed to allow efficient comparative analysis of multiple models in the context of multiple cellular objectives. Here, we present the user-friendly software framework Multi-Metabolic Evaluator (MultiMetEval), built upon SurreyFBA, which allows the user to compose collections of metabolic models that together can be subjected to flux balance analysis. Additionally, MultiMetEval implements functionalities for multi-objective analysis by calculating the Pareto front between two cellular objectives. Using a previously generated dataset of 38 actinobacterial genome-scale metabolic models, we show how these approaches can lead to exciting novel insights. Firstly, after incorporating several pathways for the biosynthesis of natural products into each of these models, comparative flux balance analysis predicted that species like Streptomyces that harbour the highest diversity of secondary metabolite biosynthetic gene clusters in their genomes do not necessarily have the metabolic network topology most suitable for compound overproduction. Secondly, multi-objective analysis of biomass production and natural product biosynthesis in these actinobacteria shows that the well-studied occurrence of discrete metabolic switches during the change of cellular objectives is inherent to their metabolic network architecture. Comparative and multi-objective modelling can lead to insights that could not be obtained by normal flux balance analyses. MultiMetEval provides a powerful platform that makes these analyses straightforward for biologists. Sources and binaries of MultiMetEval are freely available from https://github.com/PiotrZakrzewski/MetEval/downloads. PMID:23272111
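
    The computation underneath such tools is flux balance analysis: a linear program maximising an objective flux subject to steady-state mass balance and flux bounds. A toy three-reaction sketch with SciPy (the network is invented, not one of the actinobacterial models):

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Toy network:
    #   R1: -> A        (uptake)
    #   R2: A -> B      (conversion)
    #   R3: B ->        (objective, e.g. biomass or a natural product)
    S = np.array([
        [1, -1,  0],    # metabolite A balance
        [0,  1, -1],    # metabolite B balance
    ])
    bounds = [(0, 10), (0, 10), (0, 10)]      # flux bounds for R1..R3
    c = np.array([0, 0, -1.0])                # linprog minimises, so -v3

    res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
    print("optimal fluxes:", res.x)           # expect [10, 10, 10]
    ```

    A Pareto front between two objectives, as described for MultiMetEval, can then be traced by fixing one objective at a sweep of values (an epsilon-constraint) while optimising the other.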

  7. Analysis of Slug Tests in Formations of High Hydraulic Conductivity

    USGS Publications Warehouse

    Butler, J.J.; Garnett, E.J.; Healey, J.M.

    2003-01-01

    A new procedure is presented for the analysis of slug tests performed in partially penetrating wells in formations of high hydraulic conductivity. This approach is a simple, spreadsheet-based implementation of existing models that can be used for analysis of tests from confined or unconfined aquifers. Field examples of tests exhibiting oscillatory and nonoscillatory behavior are used to illustrate the procedure and to compare results with estimates obtained using alternative approaches. The procedure is considerably simpler than recently proposed methods for this hydrogeologic setting. Although the simplifications required by the approach can introduce error into hydraulic-conductivity estimates, this additional error becomes negligible when appropriate measures are taken in the field. These measures are summarized in a set of practical field guidelines for slug tests in highly permeable aquifers.

  8. Analysis of slug tests in formations of high hydraulic conductivity.

    PubMed

    Butler, James J; Garnett, Elizabeth J; Healey, John M

    2003-01-01

    A new procedure is presented for the analysis of slug tests performed in partially penetrating wells in formations of high hydraulic conductivity. This approach is a simple, spreadsheet-based implementation of existing models that can be used for analysis of tests from confined or unconfined aquifers. Field examples of tests exhibiting oscillatory and nonoscillatory behavior are used to illustrate the procedure and to compare results with estimates obtained using alternative approaches. The procedure is considerably simpler than recently proposed methods for this hydrogeologic setting. Although the simplifications required by the approach can introduce error into hydraulic-conductivity estimates, this additional error becomes negligible when appropriate measures are taken in the field. These measures are summarized in a set of practical field guidelines for slug tests in highly permeable aquifers.

  9. Power flow analysis of two coupled plates with arbitrary characteristics

    NASA Technical Reports Server (NTRS)

    Cuschieri, J. M.

    1990-01-01

    In the last progress report (Feb. 1988), some results were presented for a parametric analysis of the vibrational power flow between two coupled plate structures using the mobility power flow approach. The results reported then were for changes in the structural parameters of the two plates, but with the two plates identical in their structural characteristics. Herein, this limitation is removed. The vibrational power input and output are evaluated for different values of the structural damping loss factor for the source and receiver plates. In performing this parametric analysis, the source plate characteristics are kept constant. The purpose of this parametric analysis is to determine the most critical parameters that influence the flow of vibrational power from the source plate to the receiver plate. In the case of the structural damping parametric analysis, the influence of changes in the source plate damping is also investigated. The results obtained from the mobility power flow approach are compared to results obtained using a statistical energy analysis (SEA) approach. The significance of the power flow results is discussed, together with a comparison between the SEA results and the mobility power flow results. Furthermore, the benefits derived from using the mobility power flow approach are examined.

  10. Seeking unique and common biological themes in multiple gene lists or datasets: pathway pattern extraction pipeline for pathway-level comparative analysis.

    PubMed

    Yi, Ming; Mudunuri, Uma; Che, Anney; Stephens, Robert M

    2009-06-29

    One of the challenges in the analysis of microarray data is to integrate and compare the selected (e.g., differential) gene lists from multiple experiments for common or unique underlying biological themes. A common way to approach this problem is to extract common genes from these gene lists and then subject these genes to enrichment analysis to reveal the underlying biology. However, the capacity of this approach is largely restricted by the limited number of common genes shared by datasets from multiple experiments, which could be caused by the complexity of the biological system itself. We now introduce a new Pathway Pattern Extraction Pipeline (PPEP), which extends the existing WPS application by providing a new pathway-level comparative analysis scheme. To facilitate comparing and correlating results from different studies and sources, PPEP contains new interfaces that allow evaluation of the pathway-level enrichment patterns across multiple gene lists. As an exploratory tool, this analysis pipeline may help reveal the underlying biological themes at both the pathway and gene levels. The analysis scheme provided by PPEP begins with multiple gene lists, which may be derived from different studies in terms of the biological contexts, applied technologies, or methodologies. These lists are then subjected to pathway-level comparative analysis for extraction of pathway-level patterns. This analysis pipeline helps to explore the commonality or uniqueness of these lists at the level of pathways or biological processes from different but relevant biological systems using a combination of statistical enrichment measurements, pathway-level pattern extraction, and graphical display of the relationships of genes and their associated pathways as Gene-Term Association Networks (GTANs) within the WPS platform. As a proof of concept, we have used the new method to analyze many datasets from our collaborators as well as some public microarray datasets. This tool provides a new pathway-level analysis scheme for integrative and comparative analysis of data derived from different but relevant systems. The tool is freely available as a Pathway Pattern Extraction Pipeline implemented in our existing software package WPS, which can be obtained at http://www.abcc.ncifcrf.gov/wps/wps_index.php.
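
    The enrichment measurement underlying such pipelines is typically a hypergeometric tail test per gene list per pathway; whether PPEP uses exactly this statistic is not stated in the record, so the sketch below is generic and all counts are invented.

    ```python
    from scipy.stats import hypergeom

    N = 20000   # genes in the background
    K = 150     # genes annotated to the pathway
    n = 300     # genes in the selected (e.g. differential) list
    k = 12      # overlap: selected genes that are in the pathway

    # P(X >= k): chance of an overlap this large under random sampling
    p_value = hypergeom.sf(k - 1, N, K, n)
    print(f"enrichment p = {p_value:.3g}")
    ```

    Repeating this per pathway and per gene list yields the matrix of enrichment scores from which pathway-level patterns can be extracted and compared across lists.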

  11. A hybrid-stress finite element approach for stress and vibration analysis in linear anisotropic elasticity

    NASA Technical Reports Server (NTRS)

    Oden, J. Tinsley; Fly, Gerald W.; Mahadevan, L.

    1987-01-01

    A hybrid stress finite element method is developed for accurate stress and vibration analysis of problems in linear anisotropic elasticity. A modified form of the Hellinger-Reissner principle is formulated for dynamic analysis, and an algorithm for the determination of the anisotropic elastic and compliance constants from experimental data is developed. These schemes were implemented in a finite element program for static and dynamic analysis of linear anisotropic two-dimensional elasticity problems. Specific numerical examples are considered to verify the accuracy of the hybrid stress approach and compare it with that of the standard displacement method, especially for highly anisotropic materials. It is found that the hybrid stress approach gives much better results than the displacement method. Preliminary work on extensions of this method to three-dimensional elasticity is discussed, and the stress shape functions necessary for this extension are included.

  12. TH-AB-201-10: Portal Dosimetry with Elekta iViewDose: Performance of the Simplified Commissioning Approach Versus Full Commissioning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kydonieos, M; Folgueras, A; Florescu, L

    2016-06-15

    Purpose: Elekta recently developed a solution for in-vivo EPID dosimetry (iViewDose, Elekta AB, Stockholm, Sweden) in conjunction with the Netherlands Cancer Institute (NKI). This uses a simplified commissioning approach via Template Commissioning Models (TCMs), consisting of a subset of linac-independent pre-defined parameters. This work compares the performance of iViewDose using a TCM commissioning approach with that corresponding to full commissioning. Additionally, the dose reconstruction based on the simplified commissioning approach is validated via independent dose measurements. Methods: Measurements were performed at the NKI on a VersaHD™ (Elekta AB, Stockholm, Sweden). Treatment plans were generated with Pinnacle 9.8 (Philips Medical Systems, Eindhoven, The Netherlands). A farmer chamber dose measurement and two EPID images were used to create a linac-specific commissioning model based on a TCM. A complete set of commissioning measurements was collected and a full commissioning model was created. The performance of iViewDose based on the two commissioning approaches was compared via a series of set-to-work tests in a slab phantom. In these tests, iViewDose reconstructs and compares EPID to TPS dose for square fields, IMRT and VMAT plans via global gamma analysis and isocentre dose difference. A clinical VMAT plan was delivered to a homogeneous Octavius 4D phantom (PTW, Freiburg, Germany). Dose was measured with the Octavius 1500 array and VeriSoft software was used for 3D dose reconstruction. EPID images were acquired. TCM-based iViewDose and 3D Octavius dose distributions were compared against the TPS. Results: For both the TCM-based and the full commissioning approaches, the pass rate, mean γ and dose difference were >97%, <0.5 and <2.5%, respectively. Equivalent gamma analysis results were obtained for iViewDose (TCM approach) and Octavius for a VMAT plan. Conclusion: iViewDose produces similar results with the simplified and full commissioning approaches. Good agreement is obtained between iViewDose (simplified approach) and the independent measurement tool. This research is funded by Elekta Limited.

  13. Diabetes Care Management Teams Did Not Reduce Utilization When Compared With Traditional Care: A Randomized Cluster Trial.

    PubMed

    Kearns, Patrick

    2017-10-01

    PURPOSE: Health services research evaluates redesign models for primary care. Care management is one alternative. Evaluation includes resource utilization as a criterion. We compare the impact of care-manager teams on resource utilization, for entire panels of patients and for the subset of patients with diabetes. DESIGN: Randomized, prospective, cohort study comparing change in utilization rates between groups, pre- and post-intervention. METHODOLOGY: Ten primary care physician panels in a safety-net setting. Ten physicians were randomized to either a care-management approach (Group 1) or a traditional approach (Group 2). Care managers focused on diabetes and the cardiovascular cluster of diseases. Analysis compared rates of hospitalization, 30-day readmission, emergency room visits, and urgent care visits. Analysis compared baseline rates to annual rates after a yearlong run-in, for entire panels and the subset of patients with diabetes. RESULTS: Resource utilization showed no statistically significant change between baseline and Year 3 (P=.79). Emergency room visits and hospital readmissions increased for both groups (P=.90), while hospital admissions and urgent care visits decreased (P=.73). Similarly, utilization was not significantly different for patients with diabetes (P=.69). CONCLUSIONS: A care-management team approach failed to improve resource utilization rates for entire panels and the subset of diabetic patients compared to traditional care. This reinforces the need for further evidentiary support for the care-management model's hypothesis in the safety net.

  14. Motion Cueing Algorithm Development: Piloted Performance Testing of the Cueing Algorithms

    NASA Technical Reports Server (NTRS)

    Houck, Jacob A. (Technical Monitor); Telban, Robert J.; Cardullo, Frank M.; Kelly, Lon C.

    2005-01-01

    The relative effectiveness in simulating aircraft maneuvers with both current and newly developed motion cueing algorithms was assessed with an eleven-subject piloted performance evaluation conducted on the NASA Langley Visual Motion Simulator (VMS). In addition to the current NASA adaptive algorithm, two new cueing algorithms were evaluated: the optimal algorithm and the nonlinear algorithm. The test maneuvers included a straight-in approach with a rotating wind vector, an offset approach with severe turbulence and an on/off lateral gust that occurs as the aircraft approaches the runway threshold, and a takeoff both with and without engine failure after liftoff. The maneuvers were executed with each cueing algorithm with added visual display delay conditions ranging from zero to 200 msec. Two methods, the quasi-objective NASA Task Load Index (TLX) and power spectral density analysis of pilot control, were used to assess pilot workload. Piloted performance parameters for the approach maneuvers, the vertical velocity upon touchdown and the runway touchdown position, were also analyzed but did not show any noticeable difference among the cueing algorithms. TLX analysis reveals, in most cases, less workload and variation among pilots with the nonlinear algorithm. Control input analysis shows that pilot-induced oscillations on a straight-in approach were less prevalent compared to the optimal algorithm. The augmented turbulence cues increased workload on an offset approach that the pilots deemed more realistic compared to the NASA adaptive algorithm. The takeoff with engine failure showed the least roll activity for the nonlinear algorithm, with the least rudder pedal activity for the optimal algorithm.
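
    The power-spectral-density analysis of pilot control mentioned here is straightforward to sketch with Welch's method; the synthetic stick signal below stands in for recorded control inputs, and the 0.5 Hz component is an arbitrary illustrative choice.

    ```python
    import numpy as np
    from scipy.signal import welch

    fs = 100.0                                 # sample rate [Hz]
    t = np.arange(0, 60, 1 / fs)
    # Synthetic control-inceptor trace: a 0.5 Hz oscillation plus noise
    stick = (0.3 * np.sin(2 * np.pi * 0.5 * t)
             + 0.05 * np.random.default_rng(3).normal(size=t.size))

    f, Pxx = welch(stick, fs=fs, nperseg=1024)   # PSD estimate
    print(f"peak at {f[np.argmax(Pxx)]:.2f} Hz") # expect ~0.5 Hz
    ```

    Concentrated energy at a fixed frequency in such a spectrum is one signature of pilot-induced oscillation, which is how control-input PSDs support workload comparisons across cueing algorithms.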

  15. Analysis of x-ray tomography data of an extruded low density styrenic foam: an image analysis study

    NASA Astrophysics Data System (ADS)

    Lin, Jui-Ching; Heeschen, William

    2016-10-01

    Extruded styrenic foams are low-density foams that are widely used for thermal insulation. It is difficult to precisely characterize the cell structure of low-density foams by traditional cross-section viewing due to the frailty of the cell walls. X-ray computed tomography (CT) is a non-destructive, three-dimensional structure characterization technique that has great potential for structure characterization of styrenic foams. Unfortunately, the intrinsic artifacts of the data and the artifacts generated during image reconstruction are often comparable in size and shape to the thin walls of the foam, making robust and reliable analysis of cell sizes challenging. We explored three different image processing methods to clean up artifacts in the reconstructed images, thus allowing quantitative three-dimensional determination of cell size in a low-density styrenic foam. Three image processing approaches - an intensity-based approach, an intensity-variance-based approach, and a machine-learning-based approach - were explored in this study, and the machine-learning image feature classification method was shown to be the best. Individual cells were segmented within the images after clean-up by the three different methods, and the cell sizes were measured and compared. Although the collected data did not yield enough measurements for good statistics on cell size, this can be resolved by measuring multiple samples or increasing the imaging field of view.

  16. Ortholog Identification and Comparative Analysis of Microbial Genomes Using MBGD and RECOG.

    PubMed

    Uchiyama, Ikuo

    2017-01-01

    Comparative genomics is becoming an essential approach for identification of genes associated with a specific function or phenotype. Here, we introduce the microbial genome database for comparative analysis (MBGD), which is a comprehensive ortholog database among the microbial genomes available so far. MBGD contains several precomputed ortholog tables including the standard ortholog table covering the entire taxonomic range and taxon-specific ortholog tables for various major taxa. In addition, MBGD allows the users to create an ortholog table within any specified set of genomes through dynamic calculations. In particular, MBGD has a "My MBGD" mode where users can upload their original genome sequences and incorporate them into orthology analysis. The created ortholog table can serve as the basis for various comparative analyses. Here, we describe the use of MBGD and briefly explain how to utilize the orthology information during comparative genome analysis in combination with the stand-alone comparative genomics software RECOG, focusing on the application to comparison of closely related microbial genomes.

  17. Methods for therapeutic trials in COPD: lessons from the TORCH trial.

    PubMed

    Keene, O N; Vestbo, J; Anderson, J A; Calverley, P M A; Celli, B; Ferguson, G T; Jenkins, C; Jones, P W

    2009-11-01

    The TORCH (Towards a Revolution in COPD Health) trial has highlighted some important issues in the design and analysis of long term trials in chronic obstructive pulmonary disease. These include collection of off-treatment exacerbation data, analysis of exacerbation rates and the effect of inclusion of patients receiving inhaled corticosteroids (ICS) prior to randomisation. When effective medications are available to patients who withdraw, inclusion of off-treatment data can mask important treatment effects on exacerbation rates. Analysis of on-treatment data avoids this bias but it needs to be combined with careful analysis of withdrawal patterns across treatments. The negative binomial model is currently the best approach to statistical analysis of exacerbation rates, while analysis of time to exacerbation can supplement this approach. In the TORCH trial, exacerbation rates were higher among patients with previous use of ICS compared to those with no prior use on all study treatments. Retrospective subgroup analysis suggests ICS reduced exacerbation rates compared with placebo, regardless of prior use of ICS before entry to the study. Factorial analysis provides an alternative analysis for trials with combinations of treatments, but assumes no interaction between treatments, an assumption which cannot be verified by a significance test. No definitive conclusions can yet be drawn on whether ICS treatment has an effect on mortality.
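
    A minimal sketch of the recommended exacerbation analysis (a negative binomial rate model with log follow-up time as an offset), on simulated data rather than the TORCH dataset; the effect size and dispersion value are arbitrary.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(4)
    n = 500
    treat = rng.integers(0, 2, n)              # 0 = control, 1 = active treatment
    years = rng.uniform(0.5, 3.0, n)           # on-treatment follow-up
    rate = np.exp(0.1 - 0.3 * treat)           # true exacerbations/year
    # Overdispersed counts with mean rate*years (negative binomial draw)
    counts = rng.negative_binomial(n=2, p=2 / (2 + rate * years))

    X = sm.add_constant(treat)
    model = sm.GLM(counts, X,
                   family=sm.families.NegativeBinomial(alpha=0.5),
                   offset=np.log(years))       # log exposure as offset
    res = model.fit()
    print(res.params)   # exp(treatment coef) estimates the rate ratio (~0.74)
    ```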

  18. Obesity as a risk factor for developing functional limitation among older adults: A conditional inference tree analysis.

    PubMed

    Cheng, Feon W; Gao, Xiang; Bao, Le; Mitchell, Diane C; Wood, Craig; Sliwinski, Martin J; Smiciklas-Wright, Helen; Still, Christopher D; Rolston, David D K; Jensen, Gordon L

    2017-07-01

    To examine the risk factors for developing functional decline and make probabilistic predictions, we used a tree-based method that allows higher-order polynomials and interactions of the risk factors. The conditional inference tree analysis, a data mining approach, was used to construct a risk stratification algorithm for developing functional limitation based on BMI and other potential risk factors for disability in 1,951 older adults without functional limitations at baseline (baseline age 73.1 ± 4.2 y). We also analyzed the data with multivariate stepwise logistic regression and compared the two approaches (e.g., cross-validation). Over a mean of 9.2 ± 1.7 years of follow-up, 221 individuals developed functional limitation. Higher BMI, age, and comorbidity were consistently identified as significant risk factors for functional decline using both methods. Based on these factors, individuals were stratified into four risk groups via the conditional inference tree analysis. Compared to the low-risk group, all other groups had a significantly higher risk of developing functional limitation. The odds ratio comparing the two extreme categories was 9.09 (95% confidence interval: 4.68, 17.6). Higher BMI, age, and comorbid disease were consistently identified as significant risk factors for functional decline among older individuals across all approaches and analyses. © 2017 The Obesity Society.
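
    The study fit conditional inference trees (available in R as partykit::ctree). As a rough analogue, the scikit-learn CART sketch below stratifies simulated subjects on the same three risk factors; CART splits on impurity rather than significance tests, so this illustrates tree-based risk stratification, not the authors' exact method. All data are simulated.

    ```python
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier, export_text

    rng = np.random.default_rng(5)
    n = 2000
    bmi = rng.normal(27, 5, n)
    age = rng.normal(73, 4, n)
    comorbid = rng.integers(0, 4, n)
    # Simulated risk of functional limitation rising with all three factors
    risk = 1 / (1 + np.exp(-(-6 + 0.08 * bmi + 0.04 * age + 0.4 * comorbid)))
    limitation = rng.random(n) < risk

    X = np.column_stack([bmi, age, comorbid])
    tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=100,
                                  random_state=0)
    tree.fit(X, limitation)
    # The printed tree defines the risk strata (leaves = risk groups)
    print(export_text(tree, feature_names=["bmi", "age", "comorbidity"]))
    ```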

  19. [Are there selection criteria between abdominal approach and vaginal route for genital prolapse surgical management?].

    PubMed

    Cour, F; Vidart, A

    2016-07-01

    The never-ending debate over the surgical approach to genital prolapse repair (abdominal versus vaginal route) is as passionate as ever. The available literature may support a multidisciplinary analysis of expert daily practice. Our purpose was to define selection criteria for the surgical approach, abdominal or vaginal, in the management of genital prolapse by reviewing the literature. We systematically reviewed the literature on the comparative anatomical and functional results of pelvic organ prolapse surgery by the vaginal or abdominal route. We were confronted with a lack of data in the literature, with few prospective randomized comparative studies. Many limitations were identified, such as small study populations, no description of sub-populations, and a multiplicity of surgical procedures. Moreover, the vaginal route was compared to sacral colpopexy by open abdominal approach, whereas laparoscopic sacrocolpopexy is now recommended. Only one prospective randomized comparative trial assessed laparoscopic sacrocolpopexy against the vaginal approach, and it used a mesh since withdrawn from the market. The lack of available randomized trials makes it impossible to define HAS-compliant guidelines on this topic. However, selection criteria for each surgical approach and technique were drawn from experts' advice. © 2016 Elsevier Masson SAS. All rights reserved.

  20. The added value of ordinal analysis in clinical trials: an example in traumatic brain injury.

    PubMed

    Roozenbeek, Bob; Lingsma, Hester F; Perel, Pablo; Edwards, Phil; Roberts, Ian; Murray, Gordon D; Maas, Andrew Ir; Steyerberg, Ewout W

    2011-01-01

    In clinical trials, ordinal outcome measures are often dichotomized into two categories. In traumatic brain injury (TBI), the 5-point Glasgow Outcome Scale (GOS) is collapsed into unfavourable versus favourable outcome. Simulation studies have shown that exploiting the ordinal nature of the GOS increases the chance of detecting treatment effects. The objective of this study was to quantify the benefits of ordinal analysis in the real-life situation of a large TBI trial. We used data from the CRASH trial that investigated the efficacy of corticosteroids in TBI patients (n = 9,554). We applied two techniques for ordinal analysis: proportional odds analysis and the sliding dichotomy approach, where the GOS is dichotomized at different cut-offs according to baseline prognostic risk. These approaches were compared to dichotomous analysis. The information density in each analysis was indicated by a Wald statistic. All analyses were adjusted for baseline characteristics. Dichotomous analysis of the six-month GOS showed a non-significant treatment effect (OR = 1.09, 95% CI 0.98 to 1.21, P = 0.096). Ordinal analysis with proportional odds regression or sliding dichotomy showed highly statistically significant treatment effects (OR = 1.15, 95% CI 1.06 to 1.25, P = 0.0007 and OR = 1.19, 95% CI 1.08 to 1.30, P = 0.0002), with 2.05-fold and 2.56-fold higher information density, respectively, compared to the dichotomous approach. Analysis of the CRASH trial data confirmed that ordinal analysis of outcome substantially increases statistical power. We expect these results to hold for other fields of critical care medicine that use ordinal outcome measures and recommend that future trials adopt ordinal analyses. This will permit detection of smaller treatment effects.
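
    A minimal sketch of the two strategies compared above, using statsmodels' OrderedModel for the proportional odds analysis and a plain logistic regression on the dichotomized outcome. The outcome scale, effect size, and sample size are invented, not the CRASH data:

```python
# Sketch: proportional odds (ordinal logistic) analysis versus a dichotomized
# outcome, on invented data with a 5-level ordinal outcome like the GOS.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(1)
n = 2000
treated = rng.integers(0, 2, n)
latent = 0.15 * treated + rng.logistic(size=n)         # small treatment effect
gos = np.digitize(latent, [-2, -1, 0, 1])              # 5 ordered categories 0..4
df = pd.DataFrame({"treated": treated, "gos": gos})

# Proportional odds model: one common odds ratio across all four cut-offs.
ordinal = OrderedModel(df["gos"], df[["treated"]], distr="logit")
ord_fit = ordinal.fit(method="bfgs", disp=False)
print("ordinal OR:", np.exp(ord_fit.params["treated"]))

# Dichotomized analysis discards the ordering information within each half.
df["favourable"] = (df["gos"] >= 3).astype(int)
binary = smf.logit("favourable ~ treated", data=df).fit(disp=False)
print("binary OR: ", np.exp(binary.params["treated"]))
```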

  1. Multilevel Analysis Methods for Partially Nested Cluster Randomized Trials

    ERIC Educational Resources Information Center

    Sanders, Elizabeth A.

    2011-01-01

    This paper explores multilevel modeling approaches for 2-group randomized experiments in which a treatment condition involving clusters of individuals is compared to a control condition involving only ungrouped individuals, otherwise known as partially nested cluster randomized designs (PNCRTs). Strategies for comparing groups from a PNCRT in the…

  2. The Cluster Sensitivity Index: A Basic Measure of Classification Robustness

    ERIC Educational Resources Information Center

    Hom, Willard C.

    2010-01-01

    Analysts of institutional performance have occasionally used a peer grouping approach in which they compared institutions only to other institutions with similar characteristics. Because analysts historically have used cluster analysis to define peer groups (i.e., the group of comparable institutions), the author proposes and demonstrates with…

  3. A Comparative Analysis of Teenagers Who Smoke Different Cigarette Brands.

    ERIC Educational Resources Information Center

    Enomoto, Carl E.

    2000-01-01

    Analyzes and compares the survey responses of teenagers who smoke different cigarette brands, specifically Marlboro, Camel, and Newport. Differences were seen across brands but teen smokers had similar opinions about quitting. Given the differences across brands, more flexible approaches may be needed to address teenage smoking. (Author/MKA)

  4. A data envelopment analysis approach to compare the environmental efficiency of energy technologies and countries

    USDA-ARS?s Scientific Manuscript database

    Due to increasing financial and environmental concerns, and governmental rules, regulations and incentives, alternative energy sources are soon expected to grow at a much faster pace than conventional sources of energy. However, the current body of research providing comparative decision making models fo...

  5. Percutaneous versus traditional and paraspinal posterior open approaches for treatment of thoracolumbar fractures without neurologic deficit: a meta-analysis.

    PubMed

    Sun, Xiang-Yao; Zhang, Xi-Nuo; Hai, Yong

    2017-05-01

    This study evaluated differences in outcome variables between percutaneous, traditional, and paraspinal posterior open approaches for traumatic thoracolumbar fractures without neurologic deficit. A systematic review was performed by searching PubMed, Cochrane, and Embase with the terms "thoracolumbar fractures", "lumbar fractures", "percutaneous", "minimally invasive", "open", "traditional", "posterior", "conventional", "pedicle screw", "sextant", and "clinical trial". The analysis was performed on individual patient data from all studies that met the selection criteria. Clinical outcomes were expressed as risk differences for dichotomous outcomes and mean differences for continuous outcomes, with 95% confidence intervals. Heterogeneity was assessed using the χ² test and the I² statistic. Four randomized controlled trials and 14 observational articles were included in this analysis. The percutaneous approach was associated with a better ODI score, less Cobb angle correction, less loss of Cobb angle correction, less postoperative VBA correction, and a lower infection rate compared with the open approach. The percutaneous approach was also associated with shorter operative duration, longer intraoperative fluoroscopy time, lower postoperative VAS, and a difference in postoperative VBH% in comparison with the traditional open approach. No significant difference was found in Cobb angle correction, postoperative VBA, VBA correction loss, postoperative VBH%, VBH correction loss, or pedicle screw misplacement between the percutaneous and open approaches. There was no significant difference in operative duration, intraoperative fluoroscopy, postoperative VAS, or postoperative VBH% between the percutaneous and paraspinal approaches. The functional and radiological outcomes of the percutaneous approach appear better than those of the open approach in the long term. Although the trans-muscular spatium approach belongs to the open fixation methods, it is strictly defined as a less invasive approach, providing less injury to the paraspinal muscles and a better reduction effect.
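
    As a sketch of the pooling and heterogeneity machinery named above (inverse-variance pooling of risk differences, Cochran's χ²-based Q, and I²), with invented study-level values rather than the trials in this meta-analysis:

```python
# Sketch: fixed-effect pooling of risk differences with Cochran's Q and I².
# Study values below are invented for illustration.
import numpy as np

rd = np.array([-0.05, -0.08, -0.02, -0.06])  # per-study risk differences
se = np.array([0.02, 0.03, 0.025, 0.02])     # their standard errors

w = 1 / se**2                                 # inverse-variance weights
pooled = np.sum(w * rd) / np.sum(w)
pooled_se = np.sqrt(1 / np.sum(w))

q = np.sum(w * (rd - pooled) ** 2)            # Cochran's Q (chi-squared, k-1 df)
df_ = len(rd) - 1
i2 = max(0.0, (q - df_) / q) * 100            # I² heterogeneity, in percent

print(f"pooled RD = {pooled:.3f} +/- {1.96 * pooled_se:.3f}, I² = {i2:.0f}%")
```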

  6. Two-dimensional statistical linear discriminant analysis for real-time robust vehicle-type recognition

    NASA Astrophysics Data System (ADS)

    Zafar, I.; Edirisinghe, E. A.; Acar, S.; Bez, H. E.

    2007-02-01

    Automatic vehicle Make and Model Recognition (MMR) systems provide useful performance enhancements to vehicle recognition systems that are solely based on Automatic License Plate Recognition (ALPR). Several car MMR systems have been proposed in the literature. However, these approaches are based on feature detection algorithms that can perform sub-optimally under adverse lighting and/or occlusion conditions. In this paper we propose a real-time, appearance-based car MMR approach using Two-Dimensional Linear Discriminant Analysis that is capable of addressing this limitation. We provide experimental results to analyse the proposed algorithm's robustness under varying illumination and occlusion conditions. We have shown that the best performance with the proposed 2D-LDA based car MMR approach is obtained when the eigenvectors of lower significance are ignored. For the given database of 200 car images of 25 different make-model classifications, a best accuracy of 91% was obtained with the 2D-LDA approach. We use a direct Principal Component Analysis (PCA) based approach as a benchmark to compare and contrast the performance of the proposed 2D-LDA approach to car MMR. We conclude that in general the 2D-LDA based algorithm surpasses the performance of the PCA based approach.
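
    The paper's 2D-LDA variant operates on image matrices directly; as a rough illustration of the underlying appearance-based pipeline (subspace projection followed by a simple classifier), the sketch below uses scikit-learn's standard PCA and LDA on a stand-in digits dataset rather than car images:

```python
# Sketch: LDA versus PCA as appearance-based recognizers (standard 1D variants,
# not the paper's 2D-LDA; the digits dataset stands in for vehicle images).
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

X, y = load_digits(return_X_y=True)   # 10 classes of 8x8 images
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

for name, reducer in [("PCA", PCA(n_components=30)),
                      ("LDA", LinearDiscriminantAnalysis(n_components=9))]:
    clf = make_pipeline(reducer, KNeighborsClassifier(n_neighbors=3)).fit(Xtr, ytr)
    print(name, f"accuracy: {clf.score(Xte, yte):.3f}")
```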

  7. Modeling the economic impact of medication adherence in type 2 diabetes: a theoretical approach.

    PubMed

    Cobden, David S; Niessen, Louis W; Rutten, Frans Fh; Redekop, W Ken

    2010-09-07

    While strong correlations exist between medication adherence and health economic outcomes in type 2 diabetes, current economic analyses do not adequately consider them. We propose a new approach to incorporating adherence in cost-effectiveness analysis. We describe a theoretical approach to incorporating the effect of adherence when estimating the long-term costs and effectiveness of an antidiabetic medication. This approach was applied in a Markov model which includes common diabetic health states. We compared two treatments using hypothetical patient cohorts: injectable insulin (IDM) and oral (OAD) medications. Two analyses were performed, one which ignored adherence (analysis 1) and one which incorporated it (analysis 2). Results from the two analyses were then compared to explore the extent to which adherence may impact incremental cost-effectiveness ratios. In both analyses, IDM was more costly and more effective than OAD. When adherence was ignored, IDM generated an incremental cost-effectiveness ratio of $12,097 per quality-adjusted life-year (QALY) gained versus OAD. Incorporating adherence resulted in a slightly higher ratio ($16,241/QALY). This increase was primarily due to better adherence with OAD than with IDM, and the higher direct medical costs of IDM. Incorporating medication adherence into economic analyses can meaningfully influence the estimated cost-effectiveness of type 2 diabetes treatments, and should therefore be considered in health care decision-making. Future work on the impact of adherence on health economic outcomes, and validation of different approaches to modeling adherence, is warranted.
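
    A toy version of the approach described above: a two-state Markov cohort model in which imperfect adherence dilutes the new treatment's effect, and the incremental cost-effectiveness ratio (ICER) is recomputed with and without adherence. All transition probabilities, costs, and utilities are invented and do not reproduce the paper's $12,097 and $16,241 figures:

```python
# Sketch: two-state Markov cohort model with an adherence multiplier
# on treatment effectiveness (all inputs invented for illustration).

def cohort_costs_qalys(p_progress, annual_cost, years=20, qaly=(0.8, 0.6)):
    """Track a cohort through 'stable' -> 'progressed'; progressed is absorbing."""
    stable = 1.0
    cost = qalys = 0.0
    for _ in range(years):
        cost += stable * annual_cost + (1 - stable) * 2000  # progressed-state cost
        qalys += stable * qaly[0] + (1 - stable) * qaly[1]
        stable *= 1 - p_progress
    return cost, qalys

def icer(p_base, p_new, cost_base, cost_new, adherence=1.0):
    # Imperfect adherence dilutes the new drug's risk reduction.
    p_eff = adherence * p_new + (1 - adherence) * p_base
    c0, q0 = cohort_costs_qalys(p_base, cost_base)
    c1, q1 = cohort_costs_qalys(p_eff, cost_new)
    return (c1 - c0) / (q1 - q0)

print("ignoring adherence: $%.0f/QALY" % icer(0.08, 0.05, 500, 1500))
print("with 80%% adherence: $%.0f/QALY" % icer(0.08, 0.05, 500, 1500, adherence=0.8))
```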

  8. Surrogate matrix and surrogate analyte approaches for definitive quantitation of endogenous biomolecules.

    PubMed

    Jones, Barry R; Schultz, Gary A; Eckstein, James A; Ackermann, Bradley L

    2012-10-01

    Quantitation of biomarkers by LC-MS/MS is complicated by the presence of endogenous analytes. This challenge is most commonly overcome by calibration using an authentic standard spiked into a surrogate matrix devoid of the target analyte. A second approach involves use of a stable-isotope-labeled standard as a surrogate analyte to allow calibration in the actual biological matrix. For both methods, parallelism between calibration standards and the target analyte in biological matrix must be demonstrated in order to ensure accurate quantitation. In this communication, the surrogate matrix and surrogate analyte approaches are compared for the analysis of five amino acids in human plasma: alanine, valine, methionine, leucine and isoleucine. In addition, methodology based on standard addition is introduced, which enables a robust examination of parallelism in both surrogate analyte and surrogate matrix methods prior to formal validation. Results from additional assays are presented to introduce the standard-addition methodology and to highlight the strengths and weaknesses of each approach. For the analysis of amino acids in human plasma, comparable precision and accuracy were obtained by the surrogate matrix and surrogate analyte methods. Both assays were well within tolerances prescribed by regulatory guidance for validation of xenobiotic assays. When stable-isotope-labeled standards are readily available, the surrogate analyte approach allows for facile method development. By comparison, the surrogate matrix method requires greater up-front method development; however, this deficit is offset by the long-term advantage of simplified sample analysis.
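
    A sketch of the standard-addition methodology introduced above: known amounts of analyte are spiked into the biological sample, a line is fitted to response versus added concentration, and the endogenous concentration is read off the magnitude of the x-intercept. All values below are invented:

```python
# Sketch: standard-addition quantitation of an endogenous analyte.
# The fitted line's x-intercept (its magnitude) is the endogenous level.
import numpy as np

added = np.array([0.0, 5.0, 10.0, 20.0])    # spiked concentration (uM, invented)
response = np.array([4.1, 6.0, 8.2, 12.1])  # instrument response (invented)

slope, intercept = np.polyfit(added, response, 1)
endogenous = intercept / slope               # |x-intercept| of the response line
print(f"estimated endogenous concentration: {endogenous:.2f} uM")
```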

  9. A Multiperspective Analysis on Developing and Maintaining Trust in Senior Student Affairs Leadership

    ERIC Educational Resources Information Center

    Ruthkosky, Philip J.

    2013-01-01

    This study examines senior student affairs leadership through the diverse lenses of subordinates, administrative peers, presidents, and senior student affairs officers (SSAOs). Guided by an interpretive paradigm, a qualitative methodology was employed consisting of a six-case comparative analysis and grounded theory approach. The findings provide…

  10. The Mathematical Analysis of Style: A Correlation-Based Approach.

    ERIC Educational Resources Information Center

    Oppenheim, Rosa

    1988-01-01

    Examines mathematical models of style analysis, focusing on the pattern in which literary characteristics occur. Describes an autoregressive integrated moving average model (ARIMA) for predicting sentence length in different works by the same author and comparable works by different authors. This technique is valuable in characterizing stylistic…

  11. Opportunities for Applied Behavior Analysis in the Total Quality Movement.

    ERIC Educational Resources Information Center

    Redmon, William K.

    1992-01-01

    This paper identifies critical components of recent organizational quality improvement programs and specifies how applied behavior analysis can contribute to quality technology. Statistical Process Control and Total Quality Management approaches are compared, and behavior analysts are urged to build their research base and market behavior change…

  12. On the Utility of Content Analysis in Author Attribution: "The Federalist."

    ERIC Educational Resources Information Center

    Martindale, Colin; McKenzie, Dean

    1995-01-01

    Compares the success of lexical statistics, content analysis, and function words in determining the true author of "The Federalist." The function word approach proved most successful in attributing the papers to James Madison. Lexical statistics contributed nothing, while content analytic measures resulted in some success. (MJP)

  13. A phasor approach analysis of multiphoton FLIM measurements of three-dimensional cell culture models

    NASA Astrophysics Data System (ADS)

    Lakner, P. H.; Möller, Y.; Olayioye, M. A.; Brucker, S. Y.; Schenke-Layland, K.; Monaghan, M. G.

    2016-03-01

    Fluorescence lifetime imaging microscopy (FLIM) is a useful approach to obtain information regarding the endogenous fluorophores present in biological samples. The concise evaluation of FLIM data requires the use of robust mathematical algorithms. In this study, we developed a user-friendly phasor approach for analyzing FLIM data and applied this method on three-dimensional (3D) Caco-2 models of polarized epithelial luminal cysts in a supporting extracellular matrix environment. These Caco-2 based models were treated with epidermal growth factor (EGF), to stimulate proliferation in order to determine if FLIM could detect such a change in cell behavior. Autofluorescence from nicotinamide adenine dinucleotide (phosphate) (NAD(P)H) in luminal Caco-2 cysts was stimulated by 2-photon laser excitation. Using a phasor approach, the lifetimes of involved fluorophores and their contribution were calculated with fewer initial assumptions when compared to multiexponential decay fitting. The phasor approach simplified FLIM data analysis, making it an interesting tool for non-experts in numerical data analysis. We observed that an increased proliferation stimulated by EGF led to a significant shift in fluorescence lifetime and a significant alteration of the phasor data shape. Our data demonstrates that multiphoton FLIM analysis with the phasor approach is a suitable method for the non-invasive analysis of 3D in vitro cell culture models qualifying this method for monitoring basic cellular features and the effect of external factors.
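
    For reference, the phasor coordinates used in this kind of analysis are the cosine and sine transforms of the decay normalized by its integral. The sketch below computes them for a synthetic single-exponential decay and checks them against the closed-form values; the 80 MHz repetition rate and 2.5 ns lifetime are illustrative, not the study's parameters:

```python
# Sketch: phasor coordinates of a fluorescence decay,
#   g = integral(I cos(wt)) / integral(I),  s = integral(I sin(wt)) / integral(I).
# For a single-exponential decay the point lies on the universal semicircle.
import numpy as np

omega = 2 * np.pi * 80e6            # 80 MHz laser repetition rate, rad/s (assumed)
tau = 2.5e-9                        # fluorescence lifetime, s (invented)
t = np.linspace(0, 12.5e-9, 512)    # one laser period
decay = np.exp(-t / tau)

g = np.sum(decay * np.cos(omega * t)) / np.sum(decay)
s = np.sum(decay * np.sin(omega * t)) / np.sum(decay)
wt = omega * tau
print(g, s)                                   # numerical phasor
print(1 / (1 + wt**2), wt / (1 + wt**2))      # closed-form values for comparison
```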

  14. Exploring the Use of Cost-Benefit Analysis to Compare Pharmaceutical Treatments for Menorrhagia.

    PubMed

    Sanghera, Sabina; Frew, Emma; Gupta, Janesh Kumar; Kai, Joe; Roberts, Tracy Elizabeth

    2015-09-01

    The extra-welfarist theoretical framework tends to focus on health-related quality of life, whilst the welfarist framework captures a wider notion of well-being. EQ-5D and SF-6D are commonly used to value outcomes in chronic conditions with episodic symptoms, such as heavy menstrual bleeding (clinically termed menorrhagia). Because of their narrow health focus and the condition's episodic nature, these measures may be unsuitable. A viable alternative measure is willingness to pay (WTP) from the welfarist framework. We explore the use of WTP in a preliminary cost-benefit analysis comparing pharmaceutical treatments for menorrhagia. A cost-benefit analysis was carried out based on an outcome of WTP. The analysis was based in the UK primary care setting over a 24-month time period, with a partial societal perspective. Ninety-nine women completed a WTP exercise from the ex-ante (pre-treatment/condition) perspective. Maximum average WTP values were elicited for two pharmaceutical treatments, the levonorgestrel-releasing intrauterine system (LNG-IUS) and oral treatment. Cost data were offset against WTP and the net present value derived for each treatment. Qualitative information explaining the WTP values was also collected. Oral treatment was indicated to be the most cost-beneficial intervention, costing £107 less than LNG-IUS and generating £7 more benefit. The mean incremental net present value for oral treatment compared with LNG-IUS was £113. The use of the WTP approach was acceptable, as very few protests and non-responses were observed. The preliminary cost-benefit analysis results recommend oral treatment as the first-line treatment for menorrhagia. The WTP approach is a feasible alternative to the conventional EQ-5D/SF-6D approaches and offers advantages by capturing benefits beyond health, which is particularly relevant in menorrhagia.

  15. Treatments of Missing Values in Large National Data Affect Conclusions: The Impact of Multiple Imputation on Arthroplasty Research.

    PubMed

    Ondeck, Nathaniel T; Fu, Michael C; Skrip, Laura A; McLynn, Ryan P; Su, Edwin P; Grauer, Jonathan N

    2018-03-01

    Despite the advantages of large, national datasets, one continuing concern is missing data values. Complete case analysis, where only cases with complete data are analyzed, is commonly used rather than more statistically rigorous approaches such as multiple imputation. This study characterizes the potential selection bias introduced by complete case analysis and compares the results of common regressions using both techniques following unicompartmental knee arthroplasty. Patients undergoing unicompartmental knee arthroplasty were extracted from the 2005 to 2015 National Surgical Quality Improvement Program. As examples, the demographics of patients with and without missing preoperative albumin and hematocrit values were compared. Missing data were then treated with both complete case analysis and multiple imputation (an approach that reproduces the variation and associations that would have been present in a full dataset), and the conclusions of common regressions for adverse outcomes were compared. A total of 6117 patients were included, of whom 56.7% were missing at least one value. Younger, female, and healthier patients were more likely to have missing preoperative albumin and hematocrit values. The use of complete case analysis removed 3467 patients from the study, whereas multiple imputation included all 6117 patients. The two methods of handling missing values led to differing associations of low preoperative laboratory values with commonly studied adverse outcomes. The use of complete case analysis can introduce selection bias and may lead to different conclusions in comparison with the statistically rigorous multiple imputation approach. Joint surgeons should consider the methods of handling missing values when interpreting arthroplasty research. Copyright © 2017 Elsevier Inc. All rights reserved.
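
    As a hedged sketch of the contrast drawn above, the snippet below builds a dataset in which missingness in a lab value depends on observed covariates, then compares a complete case logistic fit against statsmodels' MICE multiple imputation. Variable names and effect sizes are invented, not the NSQIP data:

```python
# Sketch: complete case analysis versus multiple imputation (MICE),
# with missingness that depends on an observed covariate (invented data).
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.imputation.mice import MICE, MICEData

rng = np.random.default_rng(2)
n = 1000
age = rng.normal(65, 8, n)
albumin = 4.0 - 0.01 * (age - 65) + rng.normal(0, 0.4, n)
p = 1 / (1 + np.exp(-(-2 - 0.8 * (albumin - 4))))
outcome = (rng.random(n) < p).astype(float)

df = pd.DataFrame({"age": age, "albumin": albumin, "outcome": outcome})
df.loc[rng.random(n) < 0.4 * (age < 65), "albumin"] = np.nan  # younger -> more missing

# Complete case analysis simply drops the rows with a missing albumin value.
cc = sm.Logit.from_formula("outcome ~ albumin + age", df.dropna()).fit(disp=False)
print("complete case albumin coefficient:", cc.params["albumin"])

# MICE refits the model across several imputed datasets and pools the results.
mice = MICE("outcome ~ albumin + age", sm.Logit, MICEData(df))
print(mice.fit(10, 10).summary())
```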

  16. Use of failure modes, effects, and criticality analysis to compare the vulnerabilities of laparoscopic versus open appendectomy.

    PubMed

    Guida, Edoardo; Rosati, Ubaldo; Pini Prato, Alessio; Avanzini, Stefano; Pio, Luca; Ghezzi, Michele; Jasonni, Vincenzo; Mattioli, Girolamo

    2015-06-01

    To assess the feasibility of applying failure modes, effects, and criticality analysis (FMECA) to surgery, and to use FMECA to compare the vulnerabilities of laparoscopic versus open appendectomy. The FMECA study was performed on each selected phase of appendectomy and on complication-related data collected during the period January 1, 2009, to December 31, 2010. The risk analysis phase was completed by evaluating the criticality index (CI) of each appendectomy-related failure mode (FM). The CI is calculated by multiplying the estimated frequency of occurrence (O) of the FM by the expected severity of the injury to the patient (S) and by the detectability (D) of the FM. In the first year of analysis (2009), 177 appendectomies were performed, 110 open and 67 laparoscopic. Eleven adverse events were related to open appendectomy: 1 bleeding (CI: 8) and 10 postoperative infections (CI: 32). Three adverse events related to the laparoscopic approach were recorded: 1 postoperative infection (CI: 8) and 2 incorrect extractions of the appendix through the umbilical port (CI: 6). In the second year of analysis (2010), 158 appendectomies were performed, 69 open and 89 laparoscopic. Four adverse events were related to open appendectomy: 1 incorrect management of the histological specimen (CI: 2), 1 dehiscence of the surgical wound (CI: 6), and 2 infections (CI: 6). No adverse events were recorded for the laparoscopic approach. FMECA helped the staff compare the 2 approaches through an accurate step-by-step analysis, highlighting that laparoscopic appendectomy is feasible and safe, and is associated with a lower incidence of infection and other complications, reduced length of hospital stay, and an apparently lower procedure-related risk.
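
    The criticality index arithmetic described above (CI = O × S × D) is simple enough to sketch directly. The occurrence, severity, and detectability scores below are illustrative stand-ins, not the study's actual scoring:

```python
# Sketch: ranking failure modes by criticality index, CI = O * S * D.
failure_modes = [
    # (description, occurrence O, severity S, detectability D) -- invented scores
    ("postoperative infection, open approach", 4, 4, 2),
    ("bleeding, open approach",                2, 4, 1),
    ("incorrect appendix extraction, lap.",    2, 3, 1),
]

ranked = sorted(failure_modes, key=lambda fm: fm[1] * fm[2] * fm[3], reverse=True)
for desc, o, s, d in ranked:
    print(f"CI = {o * s * d:>3}  {desc}")
```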

  17. Cost-effectiveness analysis of a system-based approach for managing neonatal jaundice and preventing kernicterus in Ontario.

    PubMed

    Xie, Bin; da Silva, Orlando; Zaric, Greg

    2012-01-01

    To evaluate the incremental cost-effectiveness of a system-based approach for the management of neonatal jaundice and the prevention of kernicterus in term and late-preterm (≥35 weeks) infants, compared with the traditional practice based on visual inspection and selected bilirubin testing. Two hypothetical cohorts of 150,000 term and late-preterm neonates were used to compare the costs and outcomes associated with the use of a system-based or traditional practice approach. Data for the evaluation were obtained from the case costing centre at a large teaching hospital in Ontario, supplemented by data from the literature. The per child cost for the system-based approach cohort was $176, compared with $173 in the traditional practice cohort. The higher cost associated with the system-based cohort reflects increased costs for predischarge screening and treatment and increased postdischarge follow-up visits. These costs are partially offset by reduced costs from fewer emergency room visits, hospital readmissions and kernicterus cases. Compared with the traditional approach, the cost to prevent one kernicterus case using the system-based approach was $570,496, the cost per life year gained was $26,279, and the cost per quality-adjusted life year gained was $65,698. The cost to prevent one kernicterus case using the system-based approach is much lower than previously reported in the literature.
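
    The abstract's own figures can be checked with a few lines of arithmetic: the incremental cost per cohort follows from the per-child costs, and dividing it by each reported cost-effectiveness ratio recovers the implied incremental outcomes:

```python
# Back-calculation from the figures reported above (rounding will not be exact).
cohort = 150_000
incremental_cost = cohort * (176 - 173)              # $450,000 per cohort

print("kernicterus cases prevented:", incremental_cost / 570_496)  # ~0.8
print("life years gained:          ", incremental_cost / 26_279)   # ~17
print("QALYs gained:               ", incremental_cost / 65_698)   # ~6.9
```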

  18. Cost-effectiveness analysis of a system-based approach for managing neonatal jaundice and preventing kernicterus in Ontario

    PubMed Central

    Xie, Bin; da Silva, Orlando; Zaric, Greg

    2012-01-01

    OBJECTIVE: To evaluate the incremental cost-effectiveness of a system-based approach for the management of neonatal jaundice and the prevention of kernicterus in term and late-preterm (≥35 weeks) infants, compared with the traditional practice based on visual inspection and selected bilirubin testing. STUDY DESIGN: Two hypothetical cohorts of 150,000 term and late-preterm neonates were used to compare the costs and outcomes associated with the use of a system-based or traditional practice approach. Data for the evaluation were obtained from the case costing centre at a large teaching hospital in Ontario, supplemented by data from the literature. RESULTS: The per child cost for the system-based approach cohort was $176, compared with $173 in the traditional practice cohort. The higher cost associated with the system-based cohort reflects increased costs for predischarge screening and treatment and increased postdischarge follow-up visits. These costs are partially offset by reduced costs from fewer emergency room visits, hospital readmissions and kernicterus cases. Compared with the traditional approach, the cost to prevent one kernicterus case using the system-based approach was $570,496, the cost per life year gained was $26,279, and the cost per quality-adjusted life year gained was $65,698. CONCLUSION: The cost to prevent one kernicterus case using the system-based approach is much lower than previously reported in the literature. PMID:23277747

  19. Classical, Generalizability, and Multifaceted Rasch Detection of Interrater Variability in Large, Sparse Data Sets.

    ERIC Educational Resources Information Center

    MacMillan, Peter D.

    2000-01-01

    Compared classical test theory (CTT), generalizability theory (GT), and multifaceted Rasch model (MFRM) approaches to detecting and correcting for rater variability using responses of 4,930 high school students graded by 3 raters on 9 scales. The MFRM approach identified far more raters as different than did the CTT analysis. GT and Rasch…

  20. Lightweight and Compostable Fiberboard for the Military

    DTIC Science & Technology

    2012-08-01

    individual sheets with compression molding methods. The second approach examined different biodegradable coatings for paper formation, which enhanced wet-strength properties of paper-based products. The third approach identified effective coated corrugated alternatives that exhibited comparable... fiberboard containers to different environmental conditions. Analysis of variance of compression data as a function of moisture, insert design and paper

  1. Comparison of the Relative Effectiveness of Different Kinds of Reinforcers: A PEM Approach

    ERIC Educational Resources Information Center

    Ma, Hsen-Hsing

    2009-01-01

    The purpose of the present study was to apply the percentage of data points exceeding the median of baseline phase (PEM) approach for a meta-analysis of single-case experiments to compare the relative effectiveness of different kinds of reinforcers used in behavior modification. Altogether 153 studies were located, which produced 1091 effect…

  2. Three Methods of Estimating a Model of Group Effects: A Comparison with Reference to School Effect Studies.

    ERIC Educational Resources Information Center

    Igra, Amnon

    1980-01-01

    Three methods of estimating a model of school effects are compared: ordinary least squares; an approach based on the analysis of covariance; and a residualized input-output approach. Results are presented using a matrix algebra formulation, and advantages of the first two methods are considered. (Author/GK)

  3. Analysis of ToxCast data for food-relevant compounds by comparison with in vivo data using the RISK21 approach

    EPA Science Inventory

    The ToxCast program has generated a wealth of in vitro high throughput screening data, and best approaches for the interpretation and use of these data remain undetermined. We present case studies comparing the ToxCast and in vivo toxicity data for two food contact substances us...

  4. Multiple constraint analysis of regional land-surface carbon flux

    Treesearch

    D.P. Turner; M. Göckede; B.E. Law; W.D. Ritts; W.B. Cohen; Z. Yang; T. Hudiburg; R. Kennedy; M. Duane

    2011-01-01

    We applied and compared bottom-up (process model-based) and top-down (atmospheric inversion-based) scaling approaches to evaluate the spatial and temporal patterns of net ecosystem production (NEP) over a 2.5 × 10⁵ km² area (the state of Oregon) in the western United States. Both approaches indicated a carbon sink over this...

  5. The New Approach to Sport Medicine: 3-D Reconstruction

    ERIC Educational Resources Information Center

    Ince, Alparslan

    2015-01-01

    The aim of this study is to present a new approach to sport medicine. Comparative analysis of the Vertebrae Lumbales was done in sedentary group and Muay Thai athletes. It was done by acquiring three dimensional (3-D) data and models through photogrammetric methods from the Multi-detector Computerized Tomography (MDCT) images of the Vertebrae…

  6. Comparing compound-specific and bulk stable nitrogen isotope trophic discrimination factors across multiple freshwater fish species and diets

    USDA-ARS?s Scientific Manuscript database

    The use of nitrogen stable isotopes for estimation of animal trophic position has become an indispensable approach in food web ecology. Compound-specific isotope analysis of amino acids is a new approach for estimating trophic position that may overcome key issues associated with nitrogen stable iso...

  7. What Makes the Difference? An Analysis of a Reading Intervention Programme Implemented in Rural Schools in Cambodia

    ERIC Educational Resources Information Center

    Courtney, Jane; Gravelle, Maggie

    2014-01-01

    This article compares the existing single-strategy approach towards the teaching of early literacy in schools in rural Cambodia with a multiple-strategy approach introduced as part of a reading intervention programme. Classroom observations, questionnaires and in-depth interviews with teachers were used to explore teachers' practices and…

  8. A Marker-Based Approach for the Automated Selection of a Single Segmentation from a Hierarchical Set of Image Segmentations

    NASA Technical Reports Server (NTRS)

    Tarabalka, Y.; Tilton, J. C.; Benediktsson, J. A.; Chanussot, J.

    2012-01-01

    The Hierarchical SEGmentation (HSEG) algorithm, which combines region object finding with region object clustering, has given good performance for multi- and hyperspectral image analysis. This technique produces at its output a hierarchical set of image segmentations. The automated selection of a single segmentation level is often necessary. We propose and investigate the use of automatically selected markers for this purpose. In this paper, a novel Marker-based HSEG (M-HSEG) method for spectral-spatial classification of hyperspectral images is proposed. Two classification-based approaches for automatic marker selection are adapted and compared for this purpose. Then, a novel constrained marker-based HSEG algorithm is applied, resulting in a spectral-spatial classification map. Three different implementations of the M-HSEG method are proposed and their performances in terms of classification accuracies are compared. The experimental results, presented for three hyperspectral airborne images, demonstrate that the proposed approach yields accurate segmentation and classification maps, and thus is attractive for remote sensing image analysis.

  9. Improving the analysis, storage and sharing of neuroimaging data using relational databases and distributed computing.

    PubMed

    Hasson, Uri; Skipper, Jeremy I; Wilde, Michael J; Nusbaum, Howard C; Small, Steven L

    2008-01-15

    The increasingly complex research questions addressed by neuroimaging research impose substantial demands on computational infrastructures. These infrastructures need to support management of massive amounts of data in a way that affords rapid and precise data analysis, to allow collaborative research, and to achieve these aims securely and with minimum management overhead. Here we present an approach that overcomes many current limitations in data analysis and data sharing. This approach is based on open source database management systems that support complex data queries as an integral part of data analysis, flexible data sharing, and parallel and distributed data processing using cluster computing and Grid computing resources. We assess the strengths of these approaches as compared to current frameworks based on storage of binary or text files. We then describe in detail the implementation of such a system and provide a concrete description of how it was used to enable a complex analysis of fMRI time series data.

  10. Improving the Analysis, Storage and Sharing of Neuroimaging Data using Relational Databases and Distributed Computing

    PubMed Central

    Hasson, Uri; Skipper, Jeremy I.; Wilde, Michael J.; Nusbaum, Howard C.; Small, Steven L.

    2007-01-01

    The increasingly complex research questions addressed by neuroimaging research impose substantial demands on computational infrastructures. These infrastructures need to support management of massive amounts of data in a way that affords rapid and precise data analysis, to allow collaborative research, and to achieve these aims securely and with minimum management overhead. Here we present an approach that overcomes many current limitations in data analysis and data sharing. This approach is based on open source database management systems that support complex data queries as an integral part of data analysis, flexible data sharing, and parallel and distributed data processing using cluster computing and Grid computing resources. We assess the strengths of these approaches as compared to current frameworks based on storage of binary or text files. We then describe in detail the implementation of such a system and provide a concrete description of how it was used to enable a complex analysis of fMRI time series data. PMID:17964812

  11. Evaluation of peak-picking algorithms for protein mass spectrometry.

    PubMed

    Bauer, Chris; Cramer, Rainer; Schuchhardt, Johannes

    2011-01-01

    Peak picking is an early key step in MS data analysis. We compare three commonly used approaches to peak picking and discuss their merits by means of statistical analysis. Methods investigated encompass signal-to-noise ratio, continuous wavelet transform, and a correlation-based approach using a Gaussian template. Functionality of the three methods is illustrated and discussed in a practical context using a mass spectral data set created with MALDI-TOF technology. Sensitivity and specificity are investigated using a manually defined reference set of peaks. As an additional criterion, the robustness of the three methods is assessed by a perturbation analysis and illustrated using ROC curves.
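
    As a hedged sketch of two of the three strategies compared above, the snippet applies a signal-to-noise threshold and SciPy's continuous wavelet transform peak finder to a synthetic two-peak spectrum (peak positions, widths, and noise level are invented):

```python
# Sketch: peak picking by S/N threshold versus continuous wavelet transform,
# on a synthetic spectrum with two Gaussian peaks plus noise.
import numpy as np
from scipy.signal import find_peaks, find_peaks_cwt

x = np.linspace(0, 100, 2000)
rng = np.random.default_rng(3)
spectrum = (np.exp(-(x - 30) ** 2 / 0.5)
            + 0.6 * np.exp(-(x - 62) ** 2 / 0.8)
            + rng.normal(0, 0.03, x.size))

# Robust noise estimate via the median absolute deviation.
noise = 1.4826 * np.median(np.abs(spectrum - np.median(spectrum)))
snr_peaks = find_peaks(spectrum, height=3 * noise, distance=20)[0]
cwt_peaks = find_peaks_cwt(spectrum, widths=np.arange(5, 40))

print("S/N threshold picks:", x[snr_peaks])
print("CWT picks:          ", x[cwt_peaks])
```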

  12. Lattice Independent Component Analysis for Mobile Robot Localization

    NASA Astrophysics Data System (ADS)

    Villaverde, Ivan; Fernandez-Gauna, Borja; Zulueta, Ekaitz

    This paper introduces an approach to appearance based mobile robot localization using Lattice Independent Component Analysis (LICA). The Endmember Induction Heuristic Algorithm (EIHA) is used to select a set of Strong Lattice Independent (SLI) vectors, which can be assumed to be Affine Independent, and therefore candidates to be the endmembers of the data. Selected endmembers are used to compute the linear unmixing of the robot's acquired images. The resulting mixing coefficients are used as feature vectors for view recognition through classification. We show on a sample path experiment that our approach can recognise the localization of the robot and we compare the results with the Independent Component Analysis (ICA).

  13. GRACE time-variable gravity field recovery using an improved energy balance approach

    NASA Astrophysics Data System (ADS)

    Shang, Kun; Guo, Junyi; Shum, C. K.; Dai, Chunli; Luo, Jia

    2015-12-01

    A new approach based on the energy conservation principle for satellite gravimetry missions has been developed; it yields more accurate estimation of in situ geopotential difference observables using K-band ranging (KBR) measurements from the Gravity Recovery and Climate Experiment (GRACE) twin-satellite mission. This new approach preserves more of the gravity information sensed by KBR range-rate measurements and reduces orbit error compared with previous energy balance methods. Results from analysis of 11 yr of GRACE data indicate that the resulting geopotential difference estimates agree well with predicted values from official Level 2 solutions, with a much higher correlation of 0.9, compared with the 0.5-0.8 reported by previously published energy balance studies. We demonstrate that our approach produces a time-variable gravity solution comparable with the Level 2 solutions. The regional GRACE temporal gravity solutions over Greenland reveal that a substantially higher temporal resolution is achievable at 10-d sampling compared with the official monthly solutions, but without compromising spatial resolution or requiring regularization or post-processing.

  14. Assessing differential expression in two-color microarrays: a resampling-based empirical Bayes approach.

    PubMed

    Li, Dongmei; Le Pape, Marc A; Parikh, Nisha I; Chen, Will X; Dye, Timothy D

    2013-01-01

    Microarrays are widely used for examining differential gene expression, identifying single nucleotide polymorphisms, and detecting methylation loci. Multiple testing methods in microarray data analysis aim at controlling both Type I and Type II error rates; however, real microarray data do not always fit their distribution assumptions. Smyth's ubiquitous parametric method, for example, inadequately accommodates violations of normality assumptions, resulting in inflated Type I error rates. The Significance Analysis of Microarrays, another widely used microarray data analysis method, is based on a permutation test and is robust to non-normally distributed data; however, its fold change criteria are problematic and can critically alter the conclusions of a study as a result of compositional changes of the control data set in the analysis. We propose a novel approach combining resampling with empirical Bayes methods: the Resampling-based empirical Bayes Methods. This approach not only reduces false discovery rates for non-normally distributed microarray data, but is also impervious to the fold change threshold, since no control data set selection is needed. Through simulation studies, sensitivities, specificities, total rejections, and false discovery rates are compared across Smyth's parametric method, the Significance Analysis of Microarrays, and the Resampling-based empirical Bayes Methods. Differences in false discovery rate control between the approaches are illustrated through a preterm delivery methylation study. The results show that the Resampling-based empirical Bayes Methods offer significantly higher specificity and lower false discovery rates compared to Smyth's parametric method when data are not normally distributed. The Resampling-based empirical Bayes Methods also offer higher statistical power than the Significance Analysis of Microarrays method when the proportion of significantly differentially expressed genes is large, for both normally and non-normally distributed data. Finally, the Resampling-based empirical Bayes Methods are generalizable to next-generation sequencing RNA-seq data analysis.

  15. Comparative study of the novel and conventional injection approach for inferior alveolar nerve block.

    PubMed

    Boonsiriseth, K; Sirintawat, N; Arunakul, K; Wongsirichat, N

    2013-07-01

    This study aimed to evaluate the efficacy of anesthesia obtained with a novel injection approach for inferior alveolar nerve block compared with the conventional injection approach. 40 patients in good health randomly received each of the two injection approaches of local anesthetic, one on each side of the mandible, at two separate appointments. A sharp probe and an electric pulp tester were used to test anesthesia before injection, after injection when the patients' sensation changed, and 5 min after injection. With the conventional inferior alveolar nerve block, positive aspiration and intravascular injection occurred in 5% of cases and neurovascular bundle injection in 7.5%, whereas neither occurred with the novel injection approach. A visual analog scale (VAS) pain assessment was used during injection and surgery. The significance level used in the statistical analysis was p<0.05. Comparing the novel injection approach with the conventional one, no significant difference was found in subjective onset, objective onset, operation time, duration of anesthesia, or VAS pain score during the operation, but the VAS pain score during injection was significantly different. The novel injection approach provided adequate anesthesia for inferior alveolar nerve block and caused less pain and greater safety during injection. Copyright © 2012 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.

  16. Bayesian correction for covariate measurement error: A frequentist evaluation and comparison with regression calibration.

    PubMed

    Bartlett, Jonathan W; Keogh, Ruth H

    2018-06-01

    Bayesian approaches for handling covariate measurement error are well established and yet arguably are still relatively little used by researchers. For some this is likely due to unfamiliarity or disagreement with the Bayesian inferential paradigm. For others a contributory factor is the inability of standard statistical packages to perform such Bayesian analyses. In this paper, we first give an overview of the Bayesian approach to handling covariate measurement error, and contrast it with regression calibration, arguably the most commonly adopted approach. We then argue why the Bayesian approach has a number of statistical advantages compared to regression calibration and demonstrate that implementing the Bayesian approach is usually quite feasible for the analyst. Next, we describe the closely related maximum likelihood and multiple imputation approaches and explain why we believe the Bayesian approach to generally be preferable. We then empirically compare the frequentist properties of regression calibration and the Bayesian approach through simulation studies. The flexibility of the Bayesian approach to handle both measurement error and missing data is then illustrated through an analysis of data from the Third National Health and Nutrition Examination Survey.
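
    A compact sketch of regression calibration, the comparator method discussed above: the error-prone measurement is replaced by its best linear prediction of the true covariate before the outcome model is fitted. The reliability ratio is taken as known here for brevity; in practice it must be estimated from replicate or validation data, and the Bayesian approach the authors favour would instead model the true covariate jointly with the outcome. All data are simulated:

```python
# Sketch: regression calibration for a covariate measured with error.
import numpy as np

rng = np.random.default_rng(5)
n = 5000
x = rng.normal(0, 1, n)                  # true covariate (unobserved in practice)
w = x + rng.normal(0, 0.8, n)            # error-prone measurement
y = 1.0 + 0.5 * x + rng.normal(0, 1, n)  # outcome; true slope is 0.5

naive = np.polyfit(w, y, 1)[0]           # attenuated slope from using w directly

# Replace w with E[X | W]; the reliability ratio lam = var(X)/var(W) is
# assumed known in this sketch (estimated from replicates in practice).
lam = np.var(x) / np.var(w)
x_hat = np.mean(w) + lam * (w - np.mean(w))
calibrated = np.polyfit(x_hat, y, 1)[0]

print(f"naive slope: {naive:.3f}, calibrated: {calibrated:.3f}, truth: 0.5")
```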

  17. Comparative spectral analysis of veterinary powder product by continuous wavelet and derivative transforms

    NASA Astrophysics Data System (ADS)

    Dinç, Erdal; Kanbur, Murat; Baleanu, Dumitru

    2007-10-01

    Comparative simultaneous determination of chlortetracycline and benzocaine in a commercial veterinary powder product was carried out by continuous wavelet transform (CWT) and classical derivative transform (classical derivative spectrophotometry). In this quantitative spectral analysis, the two proposed analytical methods do not require any chemical separation process. In the first step, several wavelet families were tested to find an optimal CWT for the overlapping signal processing of the analyzed compounds. Subsequently, we observed that the coiflets (COIF-CWT) method with dilation parameter a = 400 gives suitable results for this analytical application. For comparison, the classical derivative spectrophotometry (CDS) approach was also applied to the simultaneous quantitative resolution of the same analytical problem. Calibration functions were obtained by measuring the transform amplitudes corresponding to zero-crossing points for both the CWT and CDS methods. The utility of these two analytical approaches was verified by analyzing various synthetic mixtures of chlortetracycline and benzocaine, and they were applied to real samples of a veterinary powder formulation. The experimental results obtained with the COIF-CWT approach were statistically compared with those obtained by classical derivative spectrophotometry, and successful results were reported.

  18. Protein precipitation of diluted samples in SDS-containing buffer with acetone leads to higher protein recovery and reproducibility in comparison with TCA/acetone approach.

    PubMed

    Santa, Cátia; Anjo, Sandra I; Manadas, Bruno

    2016-07-01

    Proteomic approaches are extremely valuable in many fields of research, where mass spectrometry methods have gained an increasing interest, especially because of the ability to perform quantitative analysis. Nonetheless, sample preparation prior to mass spectrometry analysis is of the utmost importance. In this work, two protein precipitation approaches, widely used for cleaning and concentrating protein samples, were tested and compared in very diluted samples solubilized in a strong buffer (containing SDS). The amount of protein recovered after acetone and TCA/acetone precipitation was assessed, as well as the protein identification and relative quantification by SWATH-MS yields were compared with the results from the same sample without precipitation. From this study, it was possible to conclude that in the case of diluted samples in denaturing buffers, the use of cold acetone as precipitation protocol is more favourable than the use of TCA/acetone in terms of reproducibility in protein recovery and number of identified and quantified proteins. Furthermore, the reproducibility in relative quantification of the proteins is even higher in samples precipitated with acetone compared with the original sample. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Gastric-tube versus whole-stomach esophagectomy for esophageal cancer: A systematic review and meta-analysis.

    PubMed

    Zhang, Wenxiong; Yu, Dongliang; Peng, Jinhua; Xu, Jianjun; Wei, Yiping

    2017-01-01

    To conduct a systematic review and meta-analysis of studies comparing gastric-tube with whole-stomach esophagectomy for esophageal cancer, in order to determine the optimal surgical technique for esophagectomy. A comprehensive literature search was performed using PubMed, EMBASE, ScienceDirect, Ovid MEDLINE, Cochrane Library, Web of Science, Google Scholar, and Scopus. Clinical trials that compared the gastric-tube versus whole-stomach approach for esophageal cancer were selected. The clinical endpoints included anastomotic leakage, anastomotic stenosis, reflux esophagitis, pneumonia, delayed gastric emptying, and thoracic stomach syndrome. A total of 6 articles (1571 patients) were included. Compared to the whole-stomach approach, the gastric-tube approach was associated with a lower incidence of reflux esophagitis (95% confidence interval [CI]: 0.16 to 0.81, p = 0.01) and thoracic stomach syndrome (95% CI: 0.17 to 0.55, p < 0.0001). The rates of anastomotic leakage, anastomotic stenosis, pneumonia, and delayed gastric emptying did not significantly differ between the two groups. Gastric-tube esophagectomy is superior to the whole-stomach approach, as it is associated with a lower incidence of postoperative reflux esophagitis and thoracic stomach syndrome. Our findings must be validated in large-scale randomized controlled trials.

  20. Extracting archaeal populations from iron oxidizing systems

    NASA Astrophysics Data System (ADS)

    Whitmore, L. M.; Hutchison, J.; Chrisler, W.; Jay, Z.; Moran, J.; Inskeep, W.; Kreuzer, H.

    2013-12-01

    Unique environments in Yellowstone National Park offer exceptional conditions for studying microorganisms in extreme and constrained systems. However, samples from some extreme systems often contain inorganic components that pose complications during microbial and molecular analysis. Several archaeal species are found in acidic, geothermal ferric-oxyhydroxide mats; these species have been shown to adhere to mineral surfaces in flocculated colonies. For optimal microbial analysis, (microscopy, flow cytometry, genomic extractions, proteomic analysis, stable isotope analysis, and others), improved techniques are needed to better facilitate cell detachment and separation from mineral surfaces. As a requirement, these techniques must preserve cell structure while simultaneously minimizing organic carryover to downstream analysis. Several methods have been developed for removing sediments from mixed prokaryotic populations, including ultra-centrifugation, nycodenz gradient, sucrose cushions, and cell straining. In this study we conduct a comparative analysis of mechanisms used to detach archaeal cell populations from the mineral interface. Specifically, we evaluated mechanical and chemical approaches for cell separation and homogenization. Methods were compared using confocal microscopy, flow cytometry analyses, and real-time PCR detection. The methodology and approaches identified will be used to optimize biomass collection from environmental specimens or isolates grown with solid phases.

  1. Evaluation of methodology for the analysis of 'time-to-event' data in pharmacogenomic genome-wide association studies.

    PubMed

    Syed, Hamzah; Jorgensen, Andrea L; Morris, Andrew P

    2016-06-01

    To evaluate the power to detect associations between SNPs and time-to-event outcomes across a range of pharmacogenomic study designs while comparing alternative regression approaches. Simulations were conducted to compare Cox proportional hazards modeling accounting for censoring and logistic regression modeling of a dichotomized outcome at the end of the study. The Cox proportional hazards model was demonstrated to be more powerful than the logistic regression analysis. The difference in power between the approaches was highly dependent on the rate of censoring. Initial evaluation of single-nucleotide polymorphism association signals using computationally efficient software with dichotomized outcomes provides an effective screening tool for some design scenarios, and thus has important implications for the development of analytical protocols in pharmacogenomic studies.
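
    A minimal sketch of the two regression approaches compared above, fitting statsmodels' Cox proportional hazards model to censored times and a logistic regression to the dichotomized end-of-study outcome. Genotype frequencies, hazard, and censoring pattern are invented:

```python
# Sketch: Cox regression on censored event times versus logistic regression
# on a dichotomized end-of-study outcome (simulated pharmacogenomic data).
import numpy as np
import statsmodels.api as sm
from statsmodels.duration.hazard_regression import PHReg

rng = np.random.default_rng(4)
n = 1500
snp = rng.binomial(2, 0.3, n).astype(float)        # additive genotype coding 0/1/2
t_event = rng.exponential(1.0 / (0.1 * np.exp(0.3 * snp)))
t_censor = rng.uniform(0, 12, n)                   # heavy administrative censoring
time = np.minimum(t_event, t_censor)
status = (t_event <= t_censor).astype(float)       # 1 = event observed

# Cox model uses the censored event times directly.
cox = PHReg(time, snp.reshape(-1, 1), status=status).fit()
print("Cox log hazard ratio:", cox.params[0])

# Dichotomized analysis: event by end of study, censoring information discarded.
logit = sm.Logit(status, sm.add_constant(snp)).fit(disp=False)
print("logit log odds ratio:", logit.params[1])
```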

  2. Ocean wavenumber estimation from wave-resolving time series imagery

    USGS Publications Warehouse

    Plant, N.G.; Holland, K.T.; Haller, M.C.

    2008-01-01

    We review several approaches that have been used to estimate ocean surface gravity wavenumbers from wave-resolving remotely sensed image sequences. Two fundamentally different approaches that utilize these data exist. A power spectral density approach identifies wavenumbers where image intensity variance is maximized. Alternatively, a cross-spectral correlation approach identifies wavenumbers where intensity coherence is maximized. We develop a solution to the latter approach based on a tomographic analysis that utilizes a nonlinear inverse method. The solution is tolerant to noise and other forms of sampling deficiency and can be applied to arbitrary sampling patterns, as well as to full-frame imagery. The solution includes error predictions that can be used for data retrieval quality control and for evaluating sample designs. A quantitative analysis of the intrinsic resolution of the method indicates that the cross-spectral correlation fitting improves resolution by a factor of about ten compared with the power spectral density fitting approach. The resolution analysis also provides a rule of thumb for nearshore bathymetry retrievals: short-scale cross-shore patterns may be resolved if they are about ten times longer than the average water depth over the pattern. This guidance can be applied to sample design to constrain both the sensor array (image resolution) and the analysis array (tomographic resolution). © 2008 IEEE.
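
    The cross-spectral idea above can be reduced to a one-dimensional toy: for two pixel time series a known distance apart, the phase of the cross-spectrum at the dominant wave frequency gives the wavenumber component along their separation. The wave parameters below are invented:

```python
# Sketch: wavenumber from cross-spectral phase between two pixel time series.
import numpy as np

k_true, omega, dx = 0.08, 0.9, 25.0   # wavenumber (rad/m), frequency (rad/s), spacing (m)
t = np.arange(0, 600, 0.5)            # 10 min of imagery sampled at 2 Hz
rng = np.random.default_rng(6)
i1 = np.cos(omega * t) + 0.1 * rng.normal(size=t.size)
i2 = np.cos(omega * t - k_true * dx) + 0.1 * rng.normal(size=t.size)

# Cross-spectrum; its phase at the dominant frequency equals k * dx.
cross = np.fft.rfft(i1) * np.conj(np.fft.rfft(i2))
peak = np.argmax(np.abs(cross))
k_est = np.angle(cross[peak]) / dx
print(f"estimated k = {k_est:.4f} rad/m (true {k_true})")
```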

  3. Hemiarch versus total aortic arch replacement in acute type A dissection: a systematic review and meta-analysis

    PubMed Central

    Poon, Shi Sum; Theologou, Thomas; Harrington, Deborah; Kuduvalli, Manoj; Oo, Aung

    2016-01-01

    Background Despite recent advances in aortic surgery, acute type A aortic dissection remains a surgical emergency associated with high mortality and morbidity. Appropriate management is crucial to achieve satisfactory outcomes, but the optimal surgical approach is controversial. The present systematic review and meta-analysis sought to assess cumulative data from comparative studies of hemiarch versus total aortic arch replacement in patients with acute type A aortic dissection. Methods A systematic review of the literature was performed using six databases. Eligible studies included comparative studies of hemiarch versus total arch replacement reporting short, medium and long term outcomes. A meta-analysis was performed on eligible studies reporting outcomes of interest to quantify the effects of hemiarch replacement on mortality and morbidity risk compared to total arch replacement. Results Fourteen retrospective studies met the inclusion criteria and 2,221 patients were included in the final analysis. Pooled analysis showed that hemiarch replacement was associated with a lower risk of post-operative renal dialysis [risk ratio (RR) =0.72; 95% confidence interval (CI): 0.56–0.94; P=0.02; I²=0%]. There was no significant difference in in-hospital mortality between the two groups (RR =0.84; 95% CI: 0.65–1.09; P=0.20; I²=0%). Cardiopulmonary bypass, aortic cross clamp and circulatory arrest times were significantly longer in total arch replacement. During follow-up, no significant difference between the two operative approaches was reported in current studies in terms of aortic re-intervention and freedom from aortic reoperation. Conclusions Within the context of publication bias by high volume aortic centres and non-randomized data sets, there was no difference in mortality outcomes between the two groups. This analysis serves to demonstrate that, for those centres doing sufficient total aortic arch activity to allow for publication, excellent and equivalent outcomes are achievable. Longer term outcome data are, however, required before conclusions can be drawn on differences between the approaches. We do not advocate total arch as a primary approach by all centres and surgeons irrespective of patient characteristics, but rather a tailored approach based on surgeon and centre experience and patient presentation. PMID:27386403

  4. The enteral vs parenteral nutrition debate revisited.

    PubMed

    Thomson, Andrew

    2008-01-01

    Many trials and several meta-analyses have been devoted to comparing enteral with parenteral nutrition support. In this review, these studies are subjected to critical analysis with particular emphasis on their methodology and clinical relevance. Evidence is produced to suggest that the heterogeneous patient populations of the studies and the rigid approach taken to comparing different nutrition therapies inter alia render their conclusions highly questionable and of very doubtful clinical significance. An alternative approach to nutrition research is suggested in which strategies of nutrition support rather than fixed menus are compared. It is suggested that objective measures of intestinal function be evaluated more fully in patients requiring nonvolitional nutrition support, and these are briefly reviewed. In addition, a more scientific approach to evaluating the physiological effects of nutrition support, including chemical tagging and evaluation of muscle function, is recommended.

  5. Gradient-Based Aerodynamic Shape Optimization Using ADI Method for Large-Scale Problems

    NASA Technical Reports Server (NTRS)

    Pandya, Mohagna J.; Baysal, Oktay

    1997-01-01

    A gradient-based shape optimization methodology, intended for practical three-dimensional aerodynamic applications, has been developed. It is based on quasi-analytical sensitivities. The flow analysis is rendered by a fully implicit, finite volume formulation of the Euler equations. The aerodynamic sensitivity equation is solved using the alternating-direction-implicit (ADI) algorithm for memory efficiency. A flexible wing geometry model, based on surface parameterization and planform schedules, is utilized. The present methodology and its components have been tested via several comparisons. Initially, the flow analysis for a wing is compared with those obtained using an unfactored, preconditioned conjugate gradient approach (PCG), and an extensively validated CFD code. Then, the sensitivities computed with the present method have been compared with those obtained using the finite-difference and the PCG approaches. Effects of grid refinement and convergence tolerance on the analysis and shape optimization have been explored. Finally, the new procedure has been demonstrated in the design of a cranked arrow wing at Mach 2.4. Despite the expected increase in computational time, the results indicate that shape optimization problems that require large numbers of grid points can be resolved with a gradient-based approach.

  6. Comparative genomic analysis by microbial COGs self-attraction rate.

    PubMed

    Santoni, Daniele; Romano-Spica, Vincenzo

    2009-06-21

    Whole genome analysis provides new perspectives to determine phylogenetic relationships among microorganisms. The availability of whole nucleotide sequences allows different levels of comparison among genomes by several approaches. In this work, self-attraction rates were considered for each cluster of orthologous groups of proteins (COGs) class in order to analyse gene aggregation levels in physical maps. Phylogenetic relationships among microorganisms were obtained by comparing self-attraction coefficients. Eighteen-dimensional vectors were computed for a set of 168 completely sequenced microbial genomes (19 archaea, 149 bacteria). The components of the vector represent the aggregation rate of the genes belonging to each of the 18 COG classes. Genes involved in nonessential functions or related to environmental conditions showed the highest aggregation rates. In contrast, genes involved in basic cellular tasks showed a more uniform distribution along the genome, except for translation genes. The self-attraction clustering approach allowed classification of Proteobacteria, Bacilli and other species belonging to Firmicutes. Rearrangement and lateral gene transfer events may influence divergences from classical taxonomy. Each set of COG class aggregation values represents an intrinsic property of the microbial genome. This novel approach provides a new point of view for whole genome analysis and bacterial characterization.

  7. Advancing Alternative Analysis: Integration of Decision Science.

    PubMed

    Malloy, Timothy F; Zaunbrecher, Virginia M; Batteate, Christina M; Blake, Ann; Carroll, William F; Corbett, Charles J; Hansen, Steffen Foss; Lempert, Robert J; Linkov, Igor; McFadden, Roger; Moran, Kelly D; Olivetti, Elsa; Ostrom, Nancy K; Romero, Michelle; Schoenung, Julie M; Seager, Thomas P; Sinsheimer, Peter; Thayer, Kristina A

    2017-06-13

    Decision analysis, a systematic approach to solving complex problems, offers tools and frameworks to support decision making that are increasingly being applied to environmental challenges. Alternatives analysis is a method used in regulation and product design to identify, compare, and evaluate the safety and viability of potential substitutes for hazardous chemicals. We assessed whether decision science may assist the alternatives analysis decision maker in comparing alternatives across a range of metrics. A workshop was convened that included representatives from government, academia, business, and civil society, with experts in toxicology, decision science, alternatives assessment, engineering, and law and policy. Participants were divided into two groups and were prompted with targeted questions. Throughout the workshop, the groups periodically came together in plenary sessions to reflect on the other group's findings. We concluded that the further incorporation of decision science into alternatives analysis would advance the ability of companies and regulators to select alternatives to harmful ingredients and would also advance the science of decision analysis. We advance four recommendations: a) engaging in the systematic development and evaluation of decision approaches and tools; b) using case studies to advance the integration of decision analysis into alternatives analysis; c) supporting transdisciplinary research; and d) supporting education and outreach efforts. https://doi.org/10.1289/EHP483.

  8. A generalized estimating equations approach for resting-state functional MRI group analysis.

    PubMed

    D'Angelo, Gina M; Lazar, Nicole A; Eddy, William F; Morris, John C; Sheline, Yvette I

    2011-01-01

    An Alzheimer's fMRI study motivated us to evaluate inter-regional correlations between groups. The overall objective is to assess inter-regional correlations in the resting state, with no stimulus or task. We propose using a generalized estimating equation (GEE) transition model and a GEE marginal model to model the within-subject correlation for each region. Residuals calculated from the GEE models are used to correlate brain regions and assess between-group differences. The standard pooling approach, averaging Fisher-z-transformed correlations within groups under an assumption of temporal independence, is the typical way to compare group correlations. The GEE approaches and the standard Fisher-z pooling approach are demonstrated with an Alzheimer's disease (AD) connectivity study in a population of AD subjects and healthy control subjects. We also compare these methods using simulation studies and show that the transition model may have better statistical properties.
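
    For readers unfamiliar with the Fisher-z baseline mentioned above, here is a minimal sketch with synthetic time series; a real analysis would use residuals from the GEE models rather than raw series, and temporal independence is assumed as in the standard approach.

        # Per-subject inter-regional correlations, Fisher z-transformed, compared by t-test.
        import numpy as np
        from scipy.stats import ttest_ind

        rng = np.random.default_rng(0)

        def subject_r(rho, T=150):
            """Correlation between two simulated regional time series."""
            ts = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=T)
            return np.corrcoef(ts[:, 0], ts[:, 1])[0, 1]

        r_ad = [subject_r(0.3) for _ in range(20)]  # e.g., AD group
        r_hc = [subject_r(0.5) for _ in range(20)]  # healthy controls

        z_ad, z_hc = np.arctanh(r_ad), np.arctanh(r_hc)  # Fisher z-transform
        t, p = ttest_ind(z_ad, z_hc)
        print(f"t = {t:.2f}, p = {p:.4f}")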

  9. Data Intensive Analysis of Biomolecular Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Straatsma, TP; Soares, Thereza A.

    2007-12-01

    The advances in biomolecular modeling and simulation made possible by increasingly powerful high performance computing resources are extending molecular simulations to biologically more relevant system sizes and time scales. At the same time, advances in simulation methodologies are allowing more complex processes to be described more accurately. These developments make a systems approach to computational structural biology feasible, but this will require a focused emphasis on the comparative analysis of the increasing number of molecular simulations that are being carried out for biomolecular systems with more realistic models, multi-component environments, and for longer simulation times. Just as in the case of the analysis of the large data sources created by the new high-throughput experimental technologies, biomolecular computer simulations contribute to the progress in biology through comparative analysis. The continuing increase in available protein structures allows the comparative analysis of the role of structure and conformational flexibility in protein function, and is the foundation of the discipline of structural bioinformatics. This creates the opportunity to derive general findings from the comparative analysis of molecular dynamics simulations of a wide range of proteins, protein-protein complexes and other complex biological systems. Because of the importance of protein conformational dynamics for protein function, it is essential that the analysis of molecular trajectories is carried out using a novel, more integrative and systematic approach. We are developing a much needed rigorous computer science based framework for the efficient analysis of the increasingly large data sets resulting from molecular simulations. Such a suite of capabilities will also provide the required tools for access and analysis of a distributed library of generated trajectories. Our research is focusing on the following areas: (1) the development of an efficient analysis framework for very large scale trajectories on massively parallel architectures, (2) the development of novel methodologies that allow automated detection of events in these very large data sets, and (3) the efficient comparative analysis of multiple trajectories. The goal of the presented work is the development of new algorithms that will allow biomolecular simulation studies to become an integral tool to address the challenges of post-genomic biological research. The strategy to deliver the required data intensive computing applications that can effectively deal with the volume of simulation data that will become available is based on taking advantage of the capabilities offered by large globally addressable memory architectures. The first requirement is the design of a flexible underlying data structure for single large trajectories that will form an adaptable framework for a wide range of analysis capabilities. The typical approach to trajectory analysis is to process trajectories sequentially, time frame by time frame. This is the implementation found in molecular simulation codes such as NWChem, and it was designed this way to be able to run on workstation computers and other architectures whose aggregate memory would not allow entire trajectories to be held in core. The consequence of this approach is an I/O-dominated solution that scales very poorly on parallel machines.
    We are currently developing tools specifically intended for use on large scale machines with sufficient main memory that entire trajectories can be held in core. This greatly reduces the cost of I/O, as trajectories are read only once during the analysis. In our current Data Intensive Analysis (DIANA) implementation, each processor determines and skips to its entries within the trajectory, which typically will be spread over multiple files, and reads the appropriate frames independently from all other processors.
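
    The frame-parallel, in-core pattern described above can be sketched as follows; mpi4py is assumed, and load_frame() is a hypothetical reader standing in for DIANA's direct-seek trajectory access.

        # Each rank owns a contiguous block of frames and analyses it independently.
        from mpi4py import MPI
        import numpy as np

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()
        N_FRAMES, N_ATOMS = 10_000, 512

        def load_frame(i):
            # Hypothetical reader; a real one would seek directly to frame i
            # in the trajectory file(s), so each rank reads its frames only once.
            rng = np.random.default_rng(i)
            return rng.standard_normal((N_ATOMS, 3))

        per = (N_FRAMES + size - 1) // size            # static block decomposition
        lo, hi = rank * per, min((rank + 1) * per, N_FRAMES)

        # Per-frame analysis held in core (here: radius of gyration).
        local = [np.sqrt(((f - f.mean(0)) ** 2).sum(1).mean())
                 for f in (load_frame(i) for i in range(lo, hi))]

        all_rg = comm.gather(local, root=0)            # combine per-rank summaries
        if rank == 0:
            rg = np.concatenate(all_rg)
            print(f"mean radius of gyration over {rg.size} frames: {rg.mean():.3f}")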

  10. Data Intensive Analysis of Biomolecular Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Straatsma, TP

    2008-03-01

    The advances in biomolecular modeling and simulation made possible by increasingly powerful high performance computing resources are extending molecular simulations to biologically more relevant system sizes and time scales. At the same time, advances in simulation methodologies are allowing more complex processes to be described more accurately. These developments make a systems approach to computational structural biology feasible, but this will require a focused emphasis on the comparative analysis of the increasing number of molecular simulations that are being carried out for biomolecular systems with more realistic models, multi-component environments, and for longer simulation times. Just as in the case of the analysis of the large data sources created by the new high-throughput experimental technologies, biomolecular computer simulations contribute to the progress in biology through comparative analysis. The continuing increase in available protein structures allows the comparative analysis of the role of structure and conformational flexibility in protein function, and is the foundation of the discipline of structural bioinformatics. This creates the opportunity to derive general findings from the comparative analysis of molecular dynamics simulations of a wide range of proteins, protein-protein complexes and other complex biological systems. Because of the importance of protein conformational dynamics for protein function, it is essential that the analysis of molecular trajectories is carried out using a novel, more integrative and systematic approach. We are developing a much needed rigorous computer science based framework for the efficient analysis of the increasingly large data sets resulting from molecular simulations. Such a suite of capabilities will also provide the required tools for access and analysis of a distributed library of generated trajectories. Our research is focusing on the following areas: (1) the development of an efficient analysis framework for very large scale trajectories on massively parallel architectures, (2) the development of novel methodologies that allow automated detection of events in these very large data sets, and (3) the efficient comparative analysis of multiple trajectories. The goal of the presented work is the development of new algorithms that will allow biomolecular simulation studies to become an integral tool to address the challenges of post-genomic biological research. The strategy to deliver the required data intensive computing applications that can effectively deal with the volume of simulation data that will become available is based on taking advantage of the capabilities offered by large globally addressable memory architectures. The first requirement is the design of a flexible underlying data structure for single large trajectories that will form an adaptable framework for a wide range of analysis capabilities. The typical approach to trajectory analysis is to process trajectories sequentially, time frame by time frame. This is the implementation found in molecular simulation codes such as NWChem, and it was designed this way to be able to run on workstation computers and other architectures whose aggregate memory would not allow entire trajectories to be held in core. The consequence of this approach is an I/O-dominated solution that scales very poorly on parallel machines.
    We are currently developing tools specifically intended for use on large scale machines with sufficient main memory that entire trajectories can be held in core. This greatly reduces the cost of I/O, as trajectories are read only once during the analysis. In our current Data Intensive Analysis (DIANA) implementation, each processor determines and skips to its entries within the trajectory, which typically will be spread over multiple files, and reads the appropriate frames independently from all other processors.

  11. Comparison of Requirements for Composite Structures for Aircraft and Space Applications

    NASA Technical Reports Server (NTRS)

    Raju, Ivatury S.; Elliott, Kenny B.; Hampton, Roy W.; Knight, Norman F., Jr.; Aggarwal, Pravin; Engelstad, Stephen P.; Chang, James B.

    2010-01-01

    In this paper, the aircraft and space vehicle requirements for composite structures are compared. It is a valuable exercise to study composite structural design approaches used in the airframe industry, and to adopt methodology that is applicable to space vehicles. The missions, environments, analysis methods, analysis validation approaches, testing programs, build quantities, inspection, and maintenance procedures used by the airframe industry, in general, are not transferable to spaceflight hardware. Therefore, while the application of composite design approaches from other industries is appealing, many aspects cannot be directly utilized. Nevertheless, experiences and research for composite aircraft structures may be of use in unexpected arenas as space exploration technology develops, and so continued technology exchanges are encouraged.

  12. Evaluation of the Mg doping approach for Si mass fractionation correction on Nu Instruments MC-ICP Mass Spectrometers

    NASA Astrophysics Data System (ADS)

    Zhao, Ye; Hsieh, Yu-Te; Belshaw, Nick

    2015-04-01

    Silicon (Si) stable isotopes have been used in a broad range of geochemical and cosmochemical applications. A precise and accurate determination of Si isotopes is desirable to distinguish their small natural variations (< 0.2‰) in many of these studies. In the past decade, the advent of MC-ICP-MS has spurred a remarkable improvement in the precision and accuracy of Si isotopic analysis. Instrumental mass fractionation correction is one crucial aspect of the analysis of Si isotopes. Two options are currently available: the sample-standard bracketing approach and the Mg doping approach. However, there has been a debate over the validity of the Mg doping approach. Some studies (Cardinal et al., 2003; Engström et al., 2006) favoured it over the sample-standard bracketing approach, whereas others (e.g., De La Rocha, 2002) considered it unsuitable. This study investigates the Mg doping approach on both the Nu Plasma II and the Nu Plasma 1700. Experiments were performed in both wet and dry plasma modes, using a number of different combinations of cones. A range of different Mg to Si ratios as well as different matrices have been used in the experiments. A sample-standard bracketing approach has also been adopted for the Si mass fractionation correction for comparison with the Mg doping approach. By assessing the mass fractionation behaviours of both Si and Mg under different instrument settings, this study aims to identify the factors that may affect the Mg doping approach and answer some key questions in the debate.
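
    For orientation, the exponential-law mass bias correction that underlies Mg doping can be sketched as below; the reference ratio and isotope masses are approximate literature values and the measured ratios are invented, so this is illustrative rather than a reproduction of the study's procedure.

        # Derive the fractionation exponent from Mg, then apply it to 29Si/28Si.
        import numpy as np

        R_REF_MG = 0.12663                     # accepted 25Mg/24Mg (approximate)
        M_MG25, M_MG24 = 24.985837, 23.985042  # isotope masses (u)
        M_SI29, M_SI28 = 28.976495, 27.976927

        def correct_si(r_meas_si, r_meas_mg):
            """Exponential-law correction: beta estimated from Mg, applied to Si."""
            beta = np.log(R_REF_MG / r_meas_mg) / np.log(M_MG25 / M_MG24)
            return r_meas_si * (M_SI29 / M_SI28) ** beta

        # Example: instrument reads the Mg ratio 1% heavy; Si is corrected accordingly.
        print(correct_si(r_meas_si=0.0506, r_meas_mg=R_REF_MG * 1.01))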

  13. Impact of totally laparoscopic combined management of colorectal cancer with synchronous hepatic metastases on severity of complications: a propensity-score-based analysis.

    PubMed

    Ratti, Francesca; Catena, Marco; Di Palo, Saverio; Staudacher, Carlo; Aldrighetti, Luca

    2016-11-01

    Thanks to the widespread diffusion of minimally invasive approaches in both colorectal and hepatic surgery, interest in combined resections for colorectal cancer and synchronous liver metastases (SCLM) by a totally laparoscopic approach (TLA) has increased. The aim of this study was to compare the outcomes of combined resections for SCLM performed by TLA or by an open approach, in a propensity-score-based study. All 25 patients undergoing combined TLA for SCLM at San Raffaele Hospital in Milano were compared in a case-matched analysis with 25 of 91 patients undergoing a totally open approach (TOA group). Groups were matched with a 1:2 ratio using propensity scores based on covariates representing disease severity. The main endpoints were postoperative morbidity and long-term outcome. The Modified Accordion Severity Grading System was used to quantify complications. The groups were comparable in terms of patient and disease characteristics. The TLA group, as compared to the TOA group, had lower blood loss (350 vs 600 mL), shorter postoperative stay (9 vs 12 days), a lower postoperative morbidity index (0.14 vs 0.20) and a lower severity score for complicated patients (0.60 vs 0.85). Colonic anastomosis leakage had the highest fractional complication burden in both groups. In spite of comparable long-term overall survival, the TLA group had better recurrence-free survival. TLA for combined resections is feasible, and its indications can be widened to encompass a larger population of patients, given its benefits in terms of reduced overall risk and severity of complications, rapid functional recovery and favorable long-term outcomes.

  14. Automated model-based quantitative analysis of phantoms with spherical inserts in FDG PET scans.

    PubMed

    Ulrich, Ethan J; Sunderland, John J; Smith, Brian J; Mohiuddin, Imran; Parkhurst, Jessica; Plichta, Kristin A; Buatti, John M; Beichel, Reinhard R

    2018-01-01

    Quality control plays an increasingly important role in quantitative PET imaging and is typically performed using phantoms. The purpose of this work was to develop and validate a fully automated analysis method for two common PET/CT quality assurance phantoms: the NEMA NU-2 IQ and SNMMI/CTN oncology phantoms. The algorithm was designed to utilize only the PET scan, to enable the analysis of phantoms with thin-walled inserts. We introduce a model-based method for automated analysis of phantoms with spherical inserts. Models are first constructed for each type of phantom to be analyzed. A robust insert detection algorithm uses the model to locate all inserts inside the phantom. First, candidates for inserts are detected using a scale-space detection approach. Second, candidates are given an initial label using a score-based optimization algorithm. Third, a robust model fitting step aligns the phantom model to the initial labeling and fixes incorrect labels. Finally, the detected insert locations are refined and measurements are taken for each insert and several background regions. In addition, an approach for automated selection of NEMA and CTN phantom models is presented. The method was evaluated on a diverse set of 15 NEMA and 20 CTN phantom PET/CT scans. NEMA phantoms were filled with radioactive tracer solution at a 9.7:1 activity ratio over background, and CTN phantoms were filled at 4:1 and 2:1 activity ratios over background. For quantitative evaluation, an independent reference standard was generated by two experts using PET/CT scans of the phantoms. In addition, the automated approach was compared against manual analysis of the PET phantom scans by four experts, which represents the current clinical standard approach. The automated analysis method successfully detected and measured all inserts in all test phantom scans. It is a deterministic algorithm (zero variability), and the insert detection RMS error (i.e., bias) was 0.97, 1.12, and 1.48 mm for phantom activity ratios 9.7:1, 4:1, and 2:1, respectively. For all phantoms and at all contrast ratios, the average RMS error was significantly lower for the proposed automated method than for the manual analysis of the phantom scans. The uptake measurements produced by the automated method showed high correlation with the independent reference standard (R² ≥ 0.9987). In addition, the average computing time for the automated method was 30.6 s, significantly lower (P ≪ 0.001) than manual analysis (mean: 247.8 s). The proposed automated approach was found to have less error when measured against the independent reference than the manual approach. It can be easily adapted to other phantoms with spherical inserts. In addition, it eliminates inter- and intraoperator variability in PET phantom analysis and is significantly more time efficient, and therefore represents a promising approach to facilitate and simplify PET standardization and harmonization efforts. © 2017 American Association of Physicists in Medicine.
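
    The scale-space insert detection step has a close analogue in standard image libraries; the sketch below uses skimage's Laplacian-of-Gaussian blob detector on a synthetic 2-D slice, a simplified analogue of the paper's 3-D, model-based pipeline rather than the authors' code.

        # LoG blob detection of sphere-like inserts in a synthetic slice.
        import numpy as np
        from skimage.feature import blob_log

        rng = np.random.default_rng(0)
        img = rng.normal(1.0, 0.05, (128, 128))                    # background
        for r, c, rad in [(40, 40, 6), (64, 90, 9), (95, 50, 12)]:
            yy, xx = np.ogrid[:128, :128]
            img[(yy - r) ** 2 + (xx - c) ** 2 <= rad ** 2] += 4.0  # hot inserts, ~4:1 contrast

        # sigma relates to blob size by radius ~ sigma * sqrt(2) in 2-D.
        blobs = blob_log(img, min_sigma=3, max_sigma=12, num_sigma=10, threshold=0.3)
        for r, c, sigma in blobs:
            print(f"insert candidate at ({r:.0f}, {c:.0f}), radius ~ {sigma * np.sqrt(2):.1f} px")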

  15. Psychobiological operationalization of RDoC constructs: Methodological and conceptual opportunities and challenges.

    PubMed

    MacNamara, Annmarie; Phan, K Luan

    2016-03-01

    NIMH's Research Domain Criteria (RDoC) project seeks to advance the diagnosis, prevention, and treatment of mental disorders by promoting psychobiological research on dimensional constructs that might cut across traditional diagnostic boundaries (Kozak & Cuthbert, ). At the core of this approach is the notion that these dimensional constructs can be assessed across different units of analysis (e.g., genes, physiology, behavior), enriching the constructs and providing more complete explanations of clinical problems. While the conceptual aspects of RDoC have been discussed in several prior papers, its methodological aspects have received comparatively less attention. For example, how to integrate data from different units of analysis has been relatively unclear. Here, we discuss one means of psychobiologically operationalizing RDoC constructs across different units of analysis (the psychoneurometric approach; Yancey et al., ), highlighting ways in which this approach might be refined in future iterations. We conclude that there is much to be learned from this technique; however, greater attention to scale-development methods and to psychometrics will likely benefit this and other methodological approaches to combining measurements across multiple units of analysis. © 2016 Society for Psychophysiological Research.

  16. Malay sentiment analysis based on combined classification approaches and Senti-lexicon algorithm.

    PubMed

    Al-Saffar, Ahmed; Awang, Suryanti; Tao, Hai; Omar, Nazlia; Al-Saiagh, Wafaa; Al-Bared, Mohammed

    2018-01-01

    Sentiment analysis techniques are increasingly exploited to categorize opinion text into one or more predefined sentiment classes for the creation and automated maintenance of review-aggregation websites. In this paper, a Malay sentiment analysis classification model is proposed to improve classification performance based on semantic orientation and machine learning approaches. First, a total of 2,478 Malay sentiment-lexicon phrases and words are assigned synonyms and stored with the help of more than one native Malay speaker, and each entry's polarity is manually assigned a score. In addition, supervised machine learning approaches and the lexicon knowledge method are combined for Malay sentiment classification, evaluating thirteen features. Finally, three individual classifiers and a combined classifier are used to evaluate the classification accuracy. In the experiments, a wide range of comparative tests is conducted on a Malay Reviews Corpus (MRC), demonstrating that feature extraction improves the performance of Malay sentiment analysis based on the combined classification. The results, however, depend on three factors: the features, the number of features and the classification approach.
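
    A minimal sketch of the "combined classifier" idea follows, blending a toy lexicon score with a soft-voting ensemble over TF-IDF features; the words, polarities, and weighting rule are invented for illustration and are far simpler than the paper's thirteen-feature setup.

        # Lexicon score + soft-voting ensemble (toy data).
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.naive_bayes import MultinomialNB
        from sklearn.ensemble import VotingClassifier
        from sklearn.pipeline import make_pipeline

        docs = ["bagus sekali", "sangat buruk", "suka produk ini", "tidak suka"]
        y = [1, 0, 1, 0]  # 1 = positive, 0 = negative

        clf = VotingClassifier(
            estimators=[
                ("lr", make_pipeline(TfidfVectorizer(), LogisticRegression())),
                ("nb", make_pipeline(TfidfVectorizer(), MultinomialNB())),
            ],
            voting="soft",
        )
        clf.fit(docs, y)

        LEXICON = {"bagus": +1, "suka": +1, "buruk": -1, "tidak": -1}  # toy polarities
        def lexicon_score(text):
            return sum(LEXICON.get(w, 0) for w in text.split())

        # Blend model probability with lexicon polarity (illustrative 50/50 rule).
        text = "produk bagus"
        p = clf.predict_proba([text])[0, 1]
        print(0.5 * p + 0.5 * float(lexicon_score(text) > 0))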

  17. Malay sentiment analysis based on combined classification approaches and Senti-lexicon algorithm

    PubMed Central

    Awang, Suryanti; Tao, Hai; Omar, Nazlia; Al-Saiagh, Wafaa; Al-bared, Mohammed

    2018-01-01

    Sentiment analysis techniques are increasingly exploited to categorize opinion text into one or more predefined sentiment classes for the creation and automated maintenance of review-aggregation websites. In this paper, a Malay sentiment analysis classification model is proposed to improve classification performance based on semantic orientation and machine learning approaches. First, a total of 2,478 Malay sentiment-lexicon phrases and words are assigned synonyms and stored with the help of more than one native Malay speaker, and each entry's polarity is manually assigned a score. In addition, supervised machine learning approaches and the lexicon knowledge method are combined for Malay sentiment classification, evaluating thirteen features. Finally, three individual classifiers and a combined classifier are used to evaluate the classification accuracy. In the experiments, a wide range of comparative tests is conducted on a Malay Reviews Corpus (MRC), demonstrating that feature extraction improves the performance of Malay sentiment analysis based on the combined classification. The results, however, depend on three factors: the features, the number of features and the classification approach. PMID:29684036

  18. Support vector methods for survival analysis: a comparison between ranking and regression approaches.

    PubMed

    Van Belle, Vanya; Pelckmans, Kristiaan; Van Huffel, Sabine; Suykens, Johan A K

    2011-10-01

    To compare and evaluate ranking, regression and combined machine learning approaches for the analysis of survival data. The literature describes two approaches based on support vector machines to deal with censored observations. In the first approach the key idea is to rephrase the task as a ranking problem via the concordance index, a problem which can be solved efficiently in the context of structural risk minimization and convex optimization techniques. In the second approach, one uses a regression approach, dealing with censoring by means of inequality constraints. The goal of this paper is then twofold: (i) introducing a new model combining the ranking and regression strategies, which retains the link with existing survival models such as the proportional hazards model via transformation models; and (ii) comparing the three techniques on 6 clinical and 3 high-dimensional datasets and discussing the relevance of these techniques over classical approaches for survival data. We compare SVM-based survival models based on ranking constraints, based on regression constraints, and models based on both ranking and regression constraints. The performance of the models is compared by means of three different measures: (i) the concordance index, measuring the model's discriminating ability; (ii) the logrank test statistic, indicating whether patients with a prognostic index lower than the median prognostic index have significantly different survival than patients with a prognostic index higher than the median; and (iii) the hazard ratio after normalization to restrict the prognostic index between 0 and 1. Our results indicate a significantly better performance for models including regression constraints over models based only on ranking constraints. This work gives empirical evidence that SVM-based models using regression constraints perform significantly better than SVM-based models based on ranking constraints. Our experiments show comparable performance for methods including only regression constraints or both regression and ranking constraints on clinical data; on high-dimensional data, the former model performs better. However, this approach does not have a theoretical link with standard statistical models for survival data. This link can be made by means of transformation models when ranking constraints are included. Copyright © 2011 Elsevier B.V. All rights reserved.
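
    Since the concordance index is the paper's first performance measure, a naive implementation may help fix ideas; this O(n²) version with synthetic inputs is standard textbook material rather than the authors' code.

        # Concordance index with right-censoring (event=1 means observed failure).
        import numpy as np

        def concordance_index(time, event, risk):
            """Fraction of comparable pairs where the higher-risk subject fails first."""
            time, event, risk = map(np.asarray, (time, event, risk))
            num = den = 0.0
            for i in range(len(time)):
                if not event[i]:
                    continue                  # subject i needs an observed event
                for j in range(len(time)):
                    if time[i] < time[j]:     # pair (i, j) is comparable
                        den += 1
                        num += (risk[i] > risk[j]) + 0.5 * (risk[i] == risk[j])
            return num / den

        # Perfect ranking of risks gives C = 1.0; random risks give ~0.5.
        print(concordance_index([2, 4, 6], [1, 1, 1], [0.9, 0.5, 0.1]))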

  19. Comparison of connectivity analyses for resting state EEG data

    NASA Astrophysics Data System (ADS)

    Olejarczyk, Elzbieta; Marzetti, Laura; Pizzella, Vittorio; Zappasodi, Filippo

    2017-06-01

    Objective. In the present work, a nonlinear measure (transfer entropy, TE) was used in a multivariate approach for the analysis of effective connectivity in high-density resting-state EEG data in eyes-open and eyes-closed conditions. Advantages of the multivariate approach in comparison to the bivariate one were tested. Moreover, the multivariate TE was compared to an effective linear measure, the directed transfer function (DTF). Finally, the existence of a relationship between information transfer and the level of brain synchronization as measured by the phase locking value (PLV) was investigated. Approach. The comparison between the connectivity measures, i.e. bivariate versus multivariate TE, TE versus DTF, and TE versus PLV, was performed by means of statistical analysis of indexes based on graph theory. Main results. The multivariate approach is less sensitive to false indirect connections than the bivariate estimates. The multivariate TE differentiated better between eyes-closed and eyes-open conditions than DTF. Moreover, the multivariate TE evidenced nonlinear phenomena in information transfer that are not captured by DTF. We also showed that the target of information flow, in particular the frontal region, is an area of greater brain synchronization. Significance. Comparison of different connectivity analysis methods pointed to the advantages of nonlinear methods and indicated a relationship between the flow of information and the level of brain synchronization.
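
    The PLV used above has a compact definition: the modulus of the mean phase difference on the unit circle. A minimal sketch with synthetic signals (not the study's EEG pipeline) is shown below.

        # Phase locking value via the Hilbert analytic signal.
        import numpy as np
        from scipy.signal import hilbert

        fs = 256
        t = np.arange(0, 4, 1 / fs)
        rng = np.random.default_rng(1)
        x = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
        y = np.sin(2 * np.pi * 10 * t + 0.8) + 0.5 * rng.standard_normal(t.size)

        phx, phy = np.angle(hilbert(x)), np.angle(hilbert(y))
        plv = np.abs(np.mean(np.exp(1j * (phx - phy))))
        print(f"PLV = {plv:.2f}")  # near 1 for strong phase locking, near 0 for none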

  20. Multiple Imputation of a Randomly Censored Covariate Improves Logistic Regression Analysis.

    PubMed

    Atem, Folefac D; Qian, Jing; Maye, Jacqueline E; Johnson, Keith A; Betensky, Rebecca A

    2016-01-01

    Randomly censored covariates arise frequently in epidemiologic studies. The most commonly used methods, including complete-case analysis and single imputation or substitution, suffer from inefficiency and bias: they make strong parametric assumptions or they consider limit-of-detection censoring only. We employ multiple imputation, in conjunction with semi-parametric modeling of the censored covariate, to overcome these shortcomings and to facilitate robust estimation. We develop a multiple imputation approach for randomly censored covariates within the framework of a logistic regression model. We use the non-parametric estimate of the covariate distribution or the semiparametric Cox model estimate in the presence of additional covariates in the model. We evaluate this procedure in simulations, and compare its operating characteristics to those of the complete-case analysis and a survival regression approach. We apply the procedures to an Alzheimer's study of the association between amyloid positivity and maternal age of onset of dementia. Multiple imputation achieves lower standard errors and higher power than the complete-case approach under heavy and moderate censoring and is comparable under light censoring. The survival regression approach achieves the highest power among all procedures, but does not produce interpretable estimates of association. Multiple imputation offers a favorable alternative to complete-case analysis and ad hoc substitution methods in the presence of randomly censored covariates within the framework of logistic regression.
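
    To make the procedure concrete, here is a simplified sketch of multiple imputation for a right-censored covariate with Rubin's-rules pooling; the empirical-tail draw is a crude stand-in for the paper's nonparametric and semiparametric imputation models, and all data are simulated.

        # MI for a censored covariate in logistic regression (simplified).
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n, M = 500, 20
        x = rng.exponential(1.0, n)                          # true covariate
        y = rng.binomial(1, 1 / (1 + np.exp(-(x - 1))))      # outcome depends on x
        c = rng.exponential(1.5, n)                          # random censoring times
        obs, cens = np.minimum(x, c), x > c                  # observed value + indicator

        betas, variances = [], []
        pool = obs[~cens]
        for _ in range(M):
            xi = obs.copy()
            for i in np.where(cens)[0]:
                tail = pool[pool > obs[i]]                   # empirical tail above the censoring point
                xi[i] = rng.choice(tail) if tail.size else obs[i]
            fit = sm.Logit(y, sm.add_constant(xi)).fit(disp=0)
            betas.append(fit.params[1]); variances.append(fit.bse[1] ** 2)

        b = np.mean(betas)                                   # Rubin's rules
        T = np.mean(variances) + (1 + 1 / M) * np.var(betas, ddof=1)
        print(f"pooled log-odds slope = {b:.3f} +/- {np.sqrt(T):.3f}")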

  1. Bayesian approach for counting experiment statistics applied to a neutrino point source analysis

    NASA Astrophysics Data System (ADS)

    Bose, D.; Brayeur, L.; Casier, M.; de Vries, K. D.; Golup, G.; van Eijndhoven, N.

    2013-12-01

    In this paper we present a model-independent analysis method following Bayesian statistics to analyse data from a generic counting experiment, and apply it to the search for neutrinos from point sources. We discuss a test statistic defined within a Bayesian framework that will be used in the search for a signal. In case no signal is found, we derive an upper limit without the introduction of approximations. The Bayesian approach allows us to obtain the full probability density function for both the background and the signal rate, so we have direct access to any signal upper limit. The upper limit derivation compares directly with a frequentist approach and is robust in the case of low-count observations. Furthermore, it also allows one to account for previous upper limits obtained by other analyses via the concept of prior information, without the ad hoc application of trial factors. To investigate the validity of the presented Bayesian approach, we applied this method to the public IceCube 40-string configuration data for 10 nearby blazars and obtained a flux upper limit in agreement with the upper limits determined via a frequentist approach. Furthermore, the upper limit obtained compares well with the previously published result of IceCube using the same data set.
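
    The core of the counting-experiment posterior is compact enough to sketch: with observed counts n, known background b and a flat prior on the signal s, the posterior is proportional to the Poisson likelihood, and the upper limit is a posterior quantile. The numbers below are invented.

        # Bayesian 90% upper limit on a Poisson signal rate with known background.
        import numpy as np
        from scipy.stats import poisson

        n_obs, b = 3, 2.1                       # observed counts, expected background

        grid = np.linspace(0, 30, 3001)         # signal-rate grid
        post = poisson.pmf(n_obs, grid + b)     # likelihood x flat prior on s
        cdf = np.cumsum(post)
        cdf /= cdf[-1]
        s90 = grid[np.searchsorted(cdf, 0.90)]
        print(f"90% credible upper limit: s < {s90:.2f}")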

  2. Cost analysis of adjustments of the epidemiological surveillance system to mass gatherings.

    PubMed

    Zieliński, Andrzej

    2011-01-01

    The article deals with the problem of the economic analysis of public health activities at mass gatherings. After presenting an elementary review of basic economic approaches to cost analysis, the author analyzes the applicability of those methods to the planning of mass gatherings. Difficulties in the comparability of different events, and the lack of outcome data at the planning stage, make most economic approaches unsuitable for application during planning. Even the applicability of cost-minimization analysis may be limited to comparing the predicted costs of preconceived standards of epidemiological surveillance. Cost-effectiveness analysis performed ex post, after the event, when both costs and effects are known, may provide more information for the future selection of the most effective procedures.

  3. Differential analysis between somatic mutation and germline variation profiles reveals cancer-related genes.

    PubMed

    Przytycki, Pawel F; Singh, Mona

    2017-08-25

    A major aim of cancer genomics is to pinpoint which somatically mutated genes are involved in tumor initiation and progression. We introduce a new framework for uncovering cancer genes, differential mutation analysis, which compares the mutational profiles of genes across cancer genomes with their natural germline variation across healthy individuals. We present DiffMut, a fast and simple approach for differential mutation analysis, and demonstrate that it is more effective in discovering cancer genes than considerably more sophisticated approaches. We conclude that germline variation across healthy human genomes provides a powerful means for characterizing somatic mutation frequency and identifying cancer driver genes. DiffMut is available at https://github.com/Singh-Lab/Differential-Mutation-Analysis .

  4. DOT/NASA comparative assessment of Brayton engines for guideway vehicles and busses. Volume 2: Analysis and results

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Gas turbine engines were assessed for application to heavy-duty transportation. A summary of the assumptions, applications, and methods of analysis is included, along with a discussion of the approach taken, the technical program flow chart, and the weighting criteria used for performance evaluation. The various engines are compared on the bases of weight, performance, emissions and noise, technology status, and growth potential. The results of the engine screening phase and the conceptual design phase are presented.

  5. A Comparative Analysis between the Assessment Criteria Used to Assess Graduating Teachers at Rustaq College (Oman) and Griffith University (Australia) during the Teaching Practicum

    ERIC Educational Resources Information Center

    Al-Malki, Moza Abdullah; Weir, Katie

    2014-01-01

    This article reports the findings from a study that compares the assessment criteria used to measure pre-service teachers' professional competencies at Rustaq College of Applied Sciences in Oman, and at Griffith University in Queensland, Australia. The study adopts a discourse analytic approach to deconstruct and critically compare the assessment…

  6. Quantitative and qualitative analysis of the working area obtained by endoscope and microscope in pterional and orbitozigomatic approach to the basilar artery bifurcation using computed tomography based frameless stereotaxy: A cadaver study

    PubMed Central

    Filipce, Venko; Ammirati, Mario

    2015-01-01

    Objective: Basilar aneurysms are among the most complex and challenging pathologies for neurosurgeons to treat. Endoscopy is a recently rediscovered neurosurgical technique that could lend itself well to overcoming some of the vascular visualization challenges associated with this pathology. The purpose of this study was to quantify and compare the basilar artery (BA) bifurcation (tip of the basilar) working area afforded by the microscope and the endoscope using different approaches and image guidance. Materials and Methods: We performed a total of 9 dissections, including pterional (PT) and orbitozygomatic (OZ) approaches bilaterally, in five whole, fresh cadaver heads. We used computed tomography based image guidance for intraoperative navigation as well as for quantitative measurements. We estimated the working area of the tip of the basilar using both a rigid endoscope and an operating microscope. Operability was qualitatively assessed by the senior authors. Results: In microscopic exposure, the OZ approach provided a greater working area (160 ± 34.3 mm²) compared to the PT approach (129.8 ± 37.6 mm²) (P > 0.05). The working area in both PT and OZ approaches using 0° and 30° endoscopes was larger than that available using the microscope alone (P < 0.05). In the PT approach, both 0° and 30° endoscopes provided a working area greater than a microscopic OZ approach (P < 0.05) and an area comparable to the OZ endoscopic approach (P > 0.05). Conclusion: Integration of the endoscope and microscope in both PT and OZ approaches can provide significantly greater surgical exposure of the BA bifurcation compared to that afforded by the conventional approaches alone. PMID:25972933

  7. Comparison of suprapatellar and infrapatellar intramedullary nailing for tibial shaft fractures: a systematic review and meta-analysis.

    PubMed

    Yang, Liqing; Sun, Yuefeng; Li, Ge

    2018-06-14

    The optimal surgical approach for tibial shaft fractures remains controversial. We performed a meta-analysis of randomized controlled trials (RCTs) to compare the clinical efficacy and prognosis of infrapatellar versus suprapatellar intramedullary nailing in the treatment of tibial shaft fractures. PubMed, OVID, Embase, ScienceDirect, and Web of Science were searched up to December 2017 for comparative RCTs involving infrapatellar and suprapatellar intramedullary nailing in the treatment of tibial shaft fractures. Primary outcomes were blood loss, visual analog scale (VAS) score, range of motion, Lysholm knee scores, and fluoroscopy times. Secondary outcomes were length of hospital stay and postoperative complications. We assessed statistical heterogeneity for each outcome with the use of a standard χ² test and the I² statistic. The meta-analysis was undertaken using Stata 14.0. Four RCTs involving 293 participants were included in our study. The present meta-analysis indicated that there were significant differences between infrapatellar and suprapatellar intramedullary nailing regarding total blood loss, VAS scores, Lysholm knee scores, and fluoroscopy times. Suprapatellar intramedullary nailing significantly reduced total blood loss, postoperative knee pain, and fluoroscopy times compared to the infrapatellar approach, and was associated with improved Lysholm knee scores. High-quality RCTs are still required for further investigation.

  8. Implementing informative priors for heterogeneity in meta-analysis using meta-regression and pseudo data.

    PubMed

    Rhodes, Kirsty M; Turner, Rebecca M; White, Ian R; Jackson, Dan; Spiegelhalter, David J; Higgins, Julian P T

    2016-12-20

    Many meta-analyses combine results from only a small number of studies, a situation in which the between-study variance is imprecisely estimated when standard methods are applied. Bayesian meta-analysis allows incorporation of external evidence on heterogeneity, providing the potential for more robust inference on the effect size of interest. We present a method for performing Bayesian meta-analysis using data augmentation, in which we represent an informative conjugate prior for between-study variance by pseudo data and use meta-regression for estimation. To assist in this, we derive predictive inverse-gamma distributions for the between-study variance expected in future meta-analyses. These may serve as priors for heterogeneity in new meta-analyses. In a simulation study, we compare approximate Bayesian methods using meta-regression and pseudo data against fully Bayesian approaches based on importance sampling techniques and Markov chain Monte Carlo (MCMC). We compare the frequentist properties of these Bayesian methods with those of the commonly used frequentist DerSimonian and Laird procedure. The method is implemented in standard statistical software and provides a less complex alternative to standard MCMC approaches. An importance sampling approach produces almost identical results to standard MCMC approaches, and results obtained through meta-regression and pseudo data are very similar. On average, data augmentation provides closer results to MCMC, if implemented using restricted maximum likelihood estimation rather than DerSimonian and Laird or maximum likelihood estimation. The methods are applied to real datasets, and an extension to network meta-analysis is described. The proposed method facilitates Bayesian meta-analysis in a way that is accessible to applied researchers. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
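
    For contrast with the Bayesian machinery, the frequentist DerSimonian and Laird procedure the authors benchmark against fits in a few lines; the effect sizes and variances below are invented.

        # DerSimonian-Laird random-effects pooling.
        import numpy as np

        y = np.array([0.30, 0.10, 0.45, 0.25, 0.05])  # study effect estimates
        v = np.array([0.04, 0.05, 0.09, 0.06, 0.03])  # within-study variances

        w = 1 / v
        y_fixed = np.sum(w * y) / np.sum(w)
        Q = np.sum(w * (y - y_fixed) ** 2)            # Cochran's Q
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (Q - (len(y) - 1)) / c)       # between-study variance

        w_star = 1 / (v + tau2)
        mu = np.sum(w_star * y) / np.sum(w_star)
        se = np.sqrt(1 / np.sum(w_star))
        print(f"tau^2 = {tau2:.3f}, pooled effect = {mu:.3f} +/- {1.96 * se:.3f}")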

  9. Contemporary management of frontal sinus mucoceles: a meta-analysis.

    PubMed

    Courson, Andy M; Stankiewicz, James A; Lal, Devyani

    2014-02-01

    To analyze trends in the surgical management of frontal and fronto-ethmoid mucoceles through meta-analysis. Meta-analysis and case series. A systematic literature review on the surgical management of frontal and fronto-ethmoid mucoceles was conducted. Studies were divided into historical (1975-2001) and contemporary (2002-2012) groups. A meta-analysis of these studies was performed. The historical and contemporary cohorts were compared (surgical approach, recurrence, and complications). To study the evolution in surgical management, a senior surgeon's experience over 28 years was analyzed separately. Thirty-one studies were included for meta-analysis. The historical cohort included 425 mucoceles from 11 studies. The contemporary cohort included 542 mucoceles from 20 studies. More endoscopic techniques were used in the contemporary versus the historical cohort (53.9% vs. 24.7%; P < 0.001). In the authors' series, a higher percentage was treated endoscopically (82.8% of 122 mucoceles). Recurrence (P = 0.20) and major complication (P = 0.23) rates were similar between cohorts. Minor complication rates were more favorable for endoscopic techniques in both cohorts (P = 0.02 historical; P < 0.001 contemporary). In the historical cohort, higher recurrence was noted in the external group (P = 0.03). Results from endoscopic and open approaches are comparable. Although endoscopic techniques are being increasingly adopted, comparison with our series shows that more cases could potentially be treated endoscopically. Frequent use of open approaches may reflect efficacy, or perhaps a lack of the expertise and equipment required for endoscopic management. Most contemporary authors favor endoscopic management, limiting open approaches to specific indications (unfavorable anatomy, lateral disease, and scarring). N/A. Copyright © 2013 The American Laryngological, Rhinological and Otological Society, Inc.

  10. A Qualitative Assessment of the Learning Outcomes of Teaching Introductory American Politics in Comparative Perspective

    ERIC Educational Resources Information Center

    Gelbman, Shamira M.

    2011-01-01

    This article discusses the findings of an ethnographic content analysis of students' written reflections as a means for assessing the learning outcomes of teaching introductory American politics in comparative perspective. It focuses especially on determining whether and how this approach enhanced students' understanding and retention of knowledge…

  11. Strategic Role of HRM in Turkey: A Three-Country Comparative Analysis

    ERIC Educational Resources Information Center

    Ozcelik, Ayse Oya; Aydinli, Fulya

    2006-01-01

    Purpose: To explore the strategic role of human resource management (HRM) in Turkey by comparing Turkish companies to Spanish and German companies. Design/methodology/approach: The questionnaire form of the Cranet-G 1999-2000 Survey (Cranfield Network on Strategic International Human Resource Management) has been used to collect the data. The…

  12. Comparative analysis of forest lands cadastral appraisal estimated with regards to wood and food resources

    NASA Astrophysics Data System (ADS)

    Kovyazin, V.; Romanchikov, A.; Pasko, O.

    2015-11-01

    Cadastral appraisal of the forest fund is one of the topical challenges of modern natural resource management. The paper delivers a comparison of different approaches to the cadastral appraisal of forest lands. The authors suggest a unified model for objectively comparing parcels and choosing their most effective use.

  13. Comparing Two Theories of Grammatical Knowledge Assessment: A Bifactor-MIRT Analysis

    ERIC Educational Resources Information Center

    Cai, Yuyang

    2014-01-01

    This study compares two approaches to grammatical knowledge in language assessment: the structural view that regards grammatical knowledge as vocabulary and syntax (Bachman 1990), and the communicative view that perceives it as the binary combination of grammatical form and meaning (Purpura 2004). 1,491 second-year nursing students from eight…

  14. Intradural Procedural Time to Assess Technical Difficulty of Superciliary Keyhole and Pterional Approaches for Unruptured Middle Cerebral Artery Aneurysms

    PubMed Central

    Choi, Yeon-Ju; Son, Wonsoo; Park, Ki-Su

    2016-01-01

    Objective This study used the intradural procedural time to assess the overall technical difficulty involved in surgically clipping an unruptured middle cerebral artery (MCA) aneurysm via a pterional or superciliary approach. The clinical and radiological variables affecting the intradural procedural time were investigated, and the intradural procedural time was compared between a superciliary keyhole approach and a pterional approach. Methods During a 5.5-year period, patients with a single MCA aneurysm were enrolled in this retrospective study. The selection criteria for a superciliary keyhole approach included: 1) maximum diameter of the unruptured MCA aneurysm <15 mm, 2) neck diameter of the MCA aneurysm <10 mm, and 3) aneurysm location involving the sphenoidal or horizontal (M1) segment of the MCA and the MCA bifurcation, excluding aneurysms distal to the MCA genu. Meanwhile, the control comparison group included patients meeting the same selection criteria as for a superciliary approach, yet who preferred a pterional approach to avoid a postoperative facial wound or due to preoperative skin trouble in the supraorbital area. To determine the variables affecting the intradural procedural time, a multiple regression analysis was performed using the patient age and gender, maximum aneurysm diameter, aneurysm neck diameter, and length of the pre-aneurysm M1 segment. In addition, the intradural procedural times were compared between the superciliary and pterional patient groups, along with the other variables. Results A total of 160 patients underwent a superciliary (n=124) or pterional (n=36) approach for an unruptured MCA aneurysm. In the multiple regression analysis, an increase in the diameter of the aneurysm neck (p<0.001) was identified as a statistically significant factor increasing the intradural procedural time. A Pearson correlation analysis also showed a positive correlation (r=0.340) between the neck diameter and the intradural procedural time. When comparing the superciliary and pterional groups, no statistically significant between-group difference was found in the intradural procedural time reflecting the technical difficulty (mean ± standard deviation: 29.8 ± 13.0 min versus 27.7 ± 9.6 min). Conclusion A superciliary keyhole approach can be a useful alternative to a pterional approach for an unruptured MCA aneurysm with a maximum diameter <15 mm and neck diameter <10 mm, presenting no greater technical challenge. For both surgical approaches, the technical difficulty increases with the neck diameter of the MCA aneurysm. PMID:27847568

  15. Critical assessment of inverse gas chromatography as means of assessing surface free energy and acid-base interaction of pharmaceutical powders.

    PubMed

    Telko, Martin J; Hickey, Anthony J

    2007-10-01

    Inverse gas chromatography (IGC) has been employed as a research tool for decades. Despite this record of use and proven utility in a variety of applications, the technique is not routinely used in pharmaceutical research, whereas in other fields it has flourished. IGC is experimentally relatively straightforward, but analysis requires that certain theoretical assumptions are satisfied. The assumptions made to acquire some of the recently reported data are somewhat modified compared to initial reports. Most publications in the pharmaceutical literature have made use of a simplified equation for the determination of acid/base surface properties, resulting in parameter values that are inconsistent with prior methods. In comparing the surface properties of different batches of alpha-lactose monohydrate, new data have been generated and compared with the literature to allow critical analysis of the theoretical assumptions and their importance to the interpretation of the data. The commonly used (simplified) approach was compared with the more rigorous approach originally outlined in the surface chemistry literature. (c) 2007 Wiley-Liss, Inc.

  16. Enabling High-performance Interactive Geoscience Data Analysis Through Data Placement and Movement Optimization

    NASA Astrophysics Data System (ADS)

    Zhu, F.; Yu, H.; Rilee, M. L.; Kuo, K. S.; Yu, L.; Pan, Y.; Jiang, H.

    2017-12-01

    Since the establishment of data archive centers and the standardization of file formats, scientists have been required to search metadata catalogs for the data they need and download the data files to their local machines to carry out data analysis. This approach has facilitated data discovery and access for decades, but it inevitably leads to data transfer from archive centers to scientists' computers through low-bandwidth Internet connections. Data transfer becomes a major performance bottleneck in such an approach. Combined with generally constrained local compute and storage resources, this limits the extent of scientists' studies and deprives them of timely outcomes. Thus, this conventional approach is not scalable with respect to either the volume or the variety of geoscience data. A much more viable solution is to couple analysis and storage systems to minimize data transfer. In our study, we compare loosely coupled approaches (exemplified by Spark and Hadoop) and tightly coupled approaches (exemplified by parallel distributed database management systems, e.g., SciDB). In particular, we investigate the optimization of data placement and movement to effectively tackle the variety challenge, and broaden the use of parallelization to address the volume challenge. Our goal is to enable high-performance interactive analysis for a large portion of geoscience data analysis tasks. We show that tightly coupled approaches can concentrate data traffic between local storage systems and compute units, thereby optimizing bandwidth utilization to achieve better throughput. Based on our observations, we have developed a geoscience data analysis system that tightly couples analysis engines with storage and has direct access to a detailed map of data partition locations. Through an innovative data partitioning and distribution scheme, our system has demonstrated scalable and interactive performance in real-world geoscience data analysis applications.

  17. Simultaneous Proteomic Discovery and Targeted Monitoring using Liquid Chromatography, Ion Mobility Spectrometry, and Mass Spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burnum-Johnson, Kristin E.; Nie, Song; Casey, Cameron P.

    Current proteomics approaches comprise both broad discovery measurements and more quantitative targeted measurements. These two measurement types are used to initially identify potentially important proteins (e.g., candidate biomarkers) and then enable improved quantification for a limited number of selected proteins. However, both approaches suffer from limitations, particularly the lower sensitivity, accuracy, and quantitation precision of discovery approaches compared to targeted approaches, and the limited proteome coverage provided by targeted approaches. Herein, we describe a new proteomics approach that allows both discovery and targeted monitoring (DTM) in a single analysis using liquid chromatography, ion mobility spectrometry and mass spectrometry (LC-IMS-MS). In DTM, heavy labeled peptides for target ions are spiked into tryptic digests and both the labeled and unlabeled peptides are broadly detected using LC-IMS-MS instrumentation, allowing the benefits of discovery and targeted approaches. To understand the possible improvement of the DTM approach, it was compared to LC-MS broad measurements using an accurate mass and time tag database and selected reaction monitoring (SRM) targeted measurements. The DTM results yielded greater peptide/protein coverage and a significant improvement in the detection of lower abundance species compared to LC-MS discovery measurements. DTM was also observed to have detection limits similar to SRM for the targeted measurements, indicating its potential for combining the discovery and targeted approaches.

  18. Evaluation of automated sample preparation, retention time locked gas chromatography-mass spectrometry and data analysis methods for the metabolomic study of Arabidopsis species.

    PubMed

    Gu, Qun; David, Frank; Lynen, Frédéric; Rumpel, Klaus; Dugardeyn, Jasper; Van Der Straeten, Dominique; Xu, Guowang; Sandra, Pat

    2011-05-27

    In this paper, automated sample preparation, retention time locked gas chromatography-mass spectrometry (GC-MS) and data analysis methods for metabolomics studies were evaluated. A miniaturized and automated derivatisation method using sequential oximation and silylation was applied to a polar extract of 4 types (2 types × 2 ages) of Arabidopsis thaliana, a popular model organism often used in plant sciences and genetics. Automation of the derivatisation process offers excellent repeatability, and the time between sample preparation and analysis was short and constant, reducing artifact formation. Retention time locked (RTL) gas chromatography-mass spectrometry was used, resulting in reproducible retention times and GC-MS profiles. Two approaches were compared for data analysis: XCMS followed by principal component analysis (approach 1), and AMDIS deconvolution combined with a commercially available program (Mass Profiler Professional) followed by principal component analysis (approach 2). Several features that were up- or down-regulated in the different types were detected. Copyright © 2011 Elsevier B.V. All rights reserved.

  19. Alternate Methods in Refining the SLS Nozzle Plug Loads

    NASA Technical Reports Server (NTRS)

    Burbank, Scott; Allen, Andrew

    2013-01-01

    Numerical analysis has shown that the SLS nozzle environmental barrier (nozzle plug) design is inadequate for the prelaunch condition, which consists of two dominant loads: 1) the main engines' startup pressure and 2) an environmentally induced pressure. Efforts to reduce load conservatism included a dynamic analysis, which showed a 31% higher safety factor than the standard static analysis. The environmental load is typically approached with a deterministic method using the worst possible combinations of pressures and temperatures. An alternate probabilistic approach, utilizing the distributions of pressures and temperatures, resulted in a 54% reduction in the environmental pressure load. A Monte Carlo simulation of the environmental load, using five years of historical pressure and temperature data, supported the results of the probabilistic analysis, indicating that the probabilistic load reflects a 3-sigma condition (a 1-in-370 probability). Utilizing the probabilistic load analysis eliminated excessive conservatism and will prevent future overdesign of the nozzle plug. Employing a similar probabilistic approach in other design and analysis activities can yield realistic yet adequately conservative solutions.
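
    The probabilistic idea can be sketched with toy numbers (the distributions and the load model below are invented for illustration; the actual analysis used five years of measured pressure and temperature data). Sampling the two inputs jointly shows why stacking their individual extremes is more conservative than taking the 1-in-370 (roughly 3-sigma) quantile of the resulting load:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical distributions standing in for the historical weather data.
pressure = rng.normal(101.3, 0.8, 100_000)  # ambient pressure, kPa
temp = rng.normal(293.0, 6.0, 100_000)      # ambient temperature, K

# Toy load model (illustrative only): plug pressure load grows with ambient
# pressure and shrinks with temperature.
load = pressure * (300.0 / temp)

# Deterministic method: stack the individual 3-sigma extremes, a combination
# far rarer than 1-in-370 because the two extremes seldom coincide.
worst_case = (101.3 + 3 * 0.8) * (300.0 / (293.0 - 3 * 6.0))

# Probabilistic method: quantile of the simulated joint load distribution.
p3sigma = np.quantile(load, 1.0 - 1.0 / 370.0)

print(f"deterministic worst case:    {worst_case:.2f} kPa")
print(f"probabilistic ~3-sigma load: {p3sigma:.2f} kPa")
```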

  20. The soft constraints hypothesis: a rational analysis approach to resource allocation for interactive behavior.

    PubMed

    Gray, Wayne D; Sims, Chris R; Fu, Wai-Tat; Schoelles, Michael J

    2006-07-01

    Soft constraints hypothesis (SCH) is a rational analysis approach that holds that the mixture of perceptual-motor and cognitive resources allocated for interactive behavior is adjusted based on temporal cost-benefit tradeoffs. Alternative approaches maintain that cognitive resources are in some sense protected or conserved in that greater amounts of perceptual-motor effort will be expended to conserve lesser amounts of cognitive effort. One alternative, the minimum memory hypothesis (MMH), holds that people favor strategies that minimize the use of memory. SCH is compared with MMH across 3 experiments and with predictions of an Ideal Performer Model that uses ACT-R's memory system in a reinforcement learning approach that maximizes expected utility by minimizing time. Model and data support the SCH view of resource allocation; at the under-1000-ms level of analysis, mixtures of cognitive and perceptual-motor resources are adjusted based on their cost-benefit tradeoffs for interactive behavior. (© 2006 APA, all rights reserved).

  1. Two different approaches to the affective profiles model: median splits (variable-oriented) and cluster analysis (person-oriented).

    PubMed

    Garcia, Danilo; MacDonald, Shane; Archer, Trevor

    2015-01-01

    Background. The notion of the affective system as being composed of two dimensions led Archer and colleagues to the development of the affective profiles model. The model consists of four different profiles based on combinations of individuals' experience of high/low positive and negative affect: self-fulfilling, low affective, high affective, and self-destructive. During the past 10 years, an increasing number of studies have used this person-centered model as the backdrop for the investigation of between- and within-individual differences in ill-being and well-being. The most common approach to this profiling is to divide individuals' scores of self-reported affect using the median of the population as the reference for high/low splits. However, scores just above and just below the median might become high and low by arbitrariness, not by reality. Thus, it is plausible to criticize the validity of this variable-oriented approach. Our aim was to compare the median splits approach with a person-oriented approach, namely, cluster analysis. Method. The participants (N = 2,225) were recruited through Amazon's Mechanical Turk and asked to self-report affect using the Positive Affect Negative Affect Schedule. We compared the profiles' homogeneity and Silhouette coefficients to discern differences in homogeneity and heterogeneity between approaches. We also conducted exact cell-wise analyses matching the profiles from both approaches and matching profiles and gender to investigate profiling agreement with respect to affectivity levels and affectivity and gender. All analyses were conducted using the ROPstat software. Results. The cluster approach (weighted average of cluster homogeneity coefficients = 0.62, Silhouette coefficient = 0.68) generated profiles with greater homogeneity that were more distinct from each other compared to the median splits approach (weighted average of cluster homogeneity coefficients = 0.75, Silhouette coefficient = 0.59). Most of the participants (n = 1,736, 78.0%) were allocated to the same profile (Rand Index = .83); however, 489 (21.98%) were allocated to different profiles depending on the approach. Both approaches allocated females and males similarly in three of the four profiles. Only the cluster analysis approach classified men significantly more often than chance to a self-fulfilling profile (type) and females less often than chance to this very same profile (antitype). Conclusions. Although the question whether one approach is more appropriate than the other is still without an answer, the cluster method allocated individuals to profiles that are more in accordance with the conceptual basis of the model and also with expected gender differences. More importantly, regardless of the approach, our findings suggest that the model mirrors a complex and dynamic adaptive system.
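
    The contrast between the variable-oriented and person-oriented approaches can be sketched on simulated affect scores. The study itself used PANAS self-reports and the ROPstat software; this sketch substitutes toy data and scikit-learn:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(1)
# Simulated positive-affect (PA) and negative-affect (NA) scores.
pa = rng.normal(3.4, 0.7, 500)
na = rng.normal(2.1, 0.7, 500)
X = np.column_stack([pa, na])

# Variable-oriented: median splits impose four fixed quadrant profiles.
split = (pa > np.median(pa)).astype(int) * 2 + (na > np.median(na)).astype(int)

# Person-oriented: k-means with k=4 lets the data place the boundaries.
km = KMeans(n_clusters=4, n_init=10, random_state=1).fit(X)

print("silhouette, median splits:   ", round(silhouette_score(X, split), 3))
print("silhouette, cluster analysis:", round(silhouette_score(X, km.labels_), 3))
```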

  2. LinkIT: a ludic elicitation game for eliciting risk perceptions.

    PubMed

    Cao, Yan; McGill, William L

    2013-06-01

    The mental models approach, a leading strategy for developing risk communications, involves a time- and labor-intensive interview process and a lengthy questionnaire to elicit group-level risk perceptions. We propose that a similarity ratings approach to structural knowledge elicitation can be adopted to assist the risk mental models approach. The LinkIT game, inspired by games with a purpose (GWAP) technology, is a ludic elicitation tool designed to elicit group understanding of the relations between risk factors in a more enjoyable and productive manner than traditional approaches. That is, consistent with the idea of ludic elicitation, LinkIT was designed to make the elicitation process fun and enjoyable in the hope of increasing participation and data quality in risk studies. As in the mental models approach, the group mental model obtained via the LinkIT game can be represented in the form of an influence diagram. In order to examine the external validity of LinkIT, we conducted a study to compare its performance with that of a more conventional questionnaire-driven approach. Data analysis showed that the two group mental models elicited by the two approaches are similar to an extent. Yet LinkIT was more productive and enjoyable than the questionnaire. However, participants commented that the current game has some usability concerns. This presentation summarizes the design and evaluation of the LinkIT game and suggests areas for future work. © 2012 Society for Risk Analysis.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ha, Kyoo-Man, E-mail: ha1999@hotmail.com

    Although the number of emergency managers has risen in South Korea (hereafter referred to as Korea) over the years, their role is not yet as well defined or as noteworthy as in other professions because of its unidisciplinary approach. This article investigates how Korea has to improve emergency managers' disciplinary approach to ultimately contribute to the goal of effective transnational disaster management. This study uses qualitative content analysis of government policies, college curricula, nongovernmental organizations' (NGOs') emergency-manager certification, and mass media coverage to compare emergency managers' unidisciplinary and multidisciplinary approaches. The key tenet is that Korea must change its emergency managers' unidisciplinary approach into a multidisciplinary approach because the former is less effective when dealing with complicated disaster management systems. To achieve this change, the stakeholders must carry out their assigned responsibilities under risk-oriented management. As for the study's international implications, developing nations may consider the enhancement of related educational curricula, collaborative learning, continuous evaluation, disaster awareness, and disaster prevention for the emergency managers' multidisciplinary approach.

  4. Comparison of CTT and Rasch-based approaches for the analysis of longitudinal Patient Reported Outcomes.

    PubMed

    Blanchin, Myriam; Hardouin, Jean-Benoit; Le Neel, Tanguy; Kubis, Gildas; Blanchard, Claire; Mirallié, Eric; Sébille, Véronique

    2011-04-15

    Health sciences frequently deal with Patient Reported Outcomes (PRO) data for the evaluation of concepts, in particular health-related quality of life, which cannot be directly measured and are often called latent variables. Two approaches are commonly used for the analysis of such data: Classical Test Theory (CTT) and Item Response Theory (IRT). Longitudinal data are often collected to analyze the evolution of an outcome over time. The most adequate strategy to analyze longitudinal latent variables, which can be either based on CTT or IRT models, remains to be identified. This strategy must take into account the latent characteristic of what PROs are intended to measure as well as the specificity of longitudinal designs. A simple and widely used IRT model is the Rasch model. The purpose of our study was to compare CTT and Rasch-based approaches to analyze longitudinal PRO data regarding type I error, power, and time effect estimation bias. Four methods were compared: the Score and Mixed models (SM) method based on the CTT approach, the Rasch and Mixed models (RM), the Plausible Values (PV), and the Longitudinal Rasch model (LRM) methods all based on the Rasch model. All methods have shown comparable results in terms of type I error, all close to 5 per cent. LRM and SM methods presented comparable power and unbiased time effect estimations, whereas RM and PV methods showed low power and biased time effect estimations. This suggests that RM and PV methods should be avoided to analyze longitudinal latent variables. Copyright © 2010 John Wiley & Sons, Ltd.

  5. Short Tree, Long Tree, Right Tree, Wrong Tree: New Acquisition Bias Corrections for Inferring SNP Phylogenies

    PubMed Central

    Leaché, Adam D.; Banbury, Barbara L.; Felsenstein, Joseph; Nieto-Montes de Oca, Adrián; Stamatakis, Alexandros

    2015-01-01

    Single nucleotide polymorphisms (SNPs) are useful markers for phylogenetic studies owing in part to their ubiquity throughout the genome and ease of collection. Restriction site associated DNA sequencing (RADseq) methods are becoming increasingly popular for SNP data collection, but an assessment of the best practices for using these data in phylogenetics is lacking. We use computer simulations, and new double digest RADseq (ddRADseq) data for the lizard family Phrynosomatidae, to investigate the accuracy of RAD loci for phylogenetic inference. We compare the two primary ways RAD loci are used during phylogenetic analysis, including the analysis of full sequences (i.e., SNPs together with invariant sites), or the analysis of SNPs on their own after excluding invariant sites. We find that using full sequences rather than just SNPs is preferable from the perspectives of branch length and topological accuracy, but not of computational time. We introduce two new acquisition bias corrections for dealing with alignments composed exclusively of SNPs, a conditional likelihood method and a reconstituted DNA approach. The conditional likelihood method conditions on the presence of variable characters only (the number of invariant sites that are unsampled but known to exist is not considered), while the reconstituted DNA approach requires the user to specify the exact number of unsampled invariant sites prior to the analysis. Under simulation, branch length biases increase with the amount of missing data for both acquisition bias correction methods, but branch length accuracy is much improved in the reconstituted DNA approach compared to the conditional likelihood approach. Phylogenetic analyses of the empirical data using concatenation or a coalescent-based species tree approach provide strong support for many of the accepted relationships among phrynosomatid lizards, suggesting that RAD loci contain useful phylogenetic signal across a range of divergence times despite the presence of missing data. Phylogenetic analysis of RAD loci requires careful attention to model assumptions, especially if downstream analyses depend on branch lengths. PMID:26227865

  6. Radiative Transfer Modeling and Retrievals for Advanced Hyperspectral Sensors

    NASA Technical Reports Server (NTRS)

    Liu, Xu; Zhou, Daniel K.; Larar, Allen M.; Smith, William L., Sr.; Mango, Stephen A.

    2009-01-01

    A novel radiative transfer model and a physical inversion algorithm based on principal component analysis will be presented. Instead of dealing with channel radiances, the new approach fits the principal component scores of these quantities. Compared to channel-based radiative transfer models, the new approach compresses radiances into a much smaller dimension, making both the forward modeling and the inversion algorithm more efficient.
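
    A minimal sketch of the compression step, using synthetic smooth spectra in place of real forward-model radiances; the channel count, mode structure, and number of retained components are arbitrary choices for illustration:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
# Toy "hyperspectral" radiances: 2000 spectra x 500 channels, built from a
# few smooth physical modes plus noise (stand-in for forward-model output).
channels = np.linspace(0, 1, 500)
modes = np.stack([np.sin((k + 1) * np.pi * channels) for k in range(5)])
coeffs = rng.normal(size=(2000, 5))
radiance = coeffs @ modes + 0.01 * rng.normal(size=(2000, 500))

pca = PCA(n_components=20).fit(radiance)
scores = pca.transform(radiance)        # 500 channels -> 20 PC scores
recon = pca.inverse_transform(scores)   # scores still reproduce the spectra

rms = np.sqrt(np.mean((radiance - recon) ** 2))
print(f"compression 500 -> 20, RMS reconstruction error: {rms:.4f}")
```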

  7. Evaluating Views of Lecturers on the Consistency of Teaching Content with Teaching Approach: Traditional versus Reform Calculus

    ERIC Educational Resources Information Center

    Sevimli, Eyup

    2016-01-01

    This study aims to evaluate the consistency of teaching content with teaching approaches in calculus on the basis of lecturers' views. In this sense, the structures of the examples given in two commonly used calculus textbooks, both in traditional and reform classrooms, are compared. The content analysis findings show that the examples in both…

  8. Methodological Issues in Meta-Analyzing Standard Deviations: Comment on Bond and DePaulo (2008)

    ERIC Educational Resources Information Center

    Pigott, Therese D.; Wu, Meng-Jia

    2008-01-01

    In this comment on C. F. Bond and B. M. DePaulo, the authors raise methodological concerns about the approach used to analyze the data. The authors suggest further refinement of the procedures used, and they compare the approach taken by Bond and DePaulo with standard methods for meta-analysis. (Contains 1 table and 2 figures.)

  9. The Use of New Technologies in Basic Education: An Approach to Profile of Indigenous Ecuadorians

    ERIC Educational Resources Information Center

    Stefos, Efstathios; Castellano, José Manuel; Marchán, Andrés Bonilla; Biloon, Julia Raina Sevy

    2017-01-01

    This article aims to define the profile of Ecuadorian indigenous students who study at different levels of basic education in Ecuador in the context of the application and use of emerging technologies in the last five years. This approach focuses on a comparative analysis between indigenous and non-indigenous students, based on the national data…

  10. Continuous Training and Wages: An Empirical Analysis Using a Comparison-Group Approach

    ERIC Educational Resources Information Center

    Gorlitz, Katja

    2011-01-01

    Using German linked employer-employee data, this paper investigates the short-term impact of on-the-job training on wages. The applied estimation approach was first introduced by Leuven and Oosterbeek (2008). Wages of employees who intended to participate in training but did not do so because of a random event are compared to wages of training…

  11. Equity and Segregation in the Spanish Education System

    ERIC Educational Resources Information Center

    Ferrer, Ferran; Ferrer, Gerard; Baldellou, Jose Luis Castel

    2006-01-01

    This article discusses educational inequalities within the territorial context of Spain, and more particularly in the autonomous community of Catalonia. The analysis, which takes a comparative international approach, looks at the question from two points of view. First, from the angle of students, an analysis is made of the impact produced by…

  12. Important Literature in Endocrinology: Citation Analysis and Historical Methodology.

    ERIC Educational Resources Information Center

    Hurt, C. D.

    1982-01-01

    Results of a study comparing two approaches to the identification of important literature in endocrinology reveal that the association between rankings of cited items under the two methods is not statistically significant and that the use of citation or historical analysis alone will not yield the same set of literature. Forty-two sources are appended. (EJS)

  13. AN APPROACH FOR DETERMINING REGIONAL LAND COVER AND SPECIES HABITAT CONSERVATION STATUS IN THE AMERICAN SOUTHWEST: THE SOUTHWEST REGIONAL GAP ANALYSIS PROJECT

    EPA Science Inventory

    The Gap Analysis Program (GAP) is a national interagency program that maps the distribution of plant communities and selected animal species and compares these distributions with land stewardship to identify biotic elements at potential risk of endangerment. GAP uses remote sens...

  14. Toward exploratory analysis of diversity unified across fields of study: an information visualization approach

    Treesearch

    Tuan Pham; Julia Jones; Ronald Metoyer; Frederick Colwell

    2014-01-01

    The study of the diversity of multivariate objects shares common characteristics and goals across disciplines, including ecology and organizational management. Nevertheless, subject-matter experts have adopted somewhat separate diversity concepts and analysis techniques, limiting the potential for sharing and comparing across disciplines. Moreover, while large and...

  15. Context Matters in Educational Research and International Development: Learning from the Small States Experience

    ERIC Educational Resources Information Center

    Crossley, Michael

    2010-01-01

    The article argues that greater attention should be paid to contextual factors in educational research and international development cooperation. The analysis draws upon principles that underpin socio-cultural approaches to comparative education, a critical analysis of the political economy of contemporary educational research, and recent research…

  16. Advancing Alternative Analysis: Integration of Decision Science

    PubMed Central

    Zaunbrecher, Virginia M.; Batteate, Christina M.; Blake, Ann; Carroll, William F.; Corbett, Charles J.; Hansen, Steffen Foss; Lempert, Robert J.; Linkov, Igor; McFadden, Roger; Moran, Kelly D.; Olivetti, Elsa; Ostrom, Nancy K.; Romero, Michelle; Schoenung, Julie M.; Seager, Thomas P.; Sinsheimer, Peter; Thayer, Kristina A.

    2017-01-01

    Background: Decision analysis—a systematic approach to solving complex problems—offers tools and frameworks to support decision making that are increasingly being applied to environmental challenges. Alternatives analysis is a method used in regulation and product design to identify, compare, and evaluate the safety and viability of potential substitutes for hazardous chemicals. Objectives: We assessed whether decision science may assist the alternatives analysis decision maker in comparing alternatives across a range of metrics. Methods: A workshop was convened that included representatives from government, academia, business, and civil society and included experts in toxicology, decision science, alternatives assessment, engineering, and law and policy. Participants were divided into two groups and were prompted with targeted questions. Throughout the workshop, the groups periodically came together in plenary sessions to reflect on other groups’ findings. Results: We concluded that the further incorporation of decision science into alternatives analysis would advance the ability of companies and regulators to select alternatives to harmful ingredients and would also advance the science of decision analysis. Conclusions: We advance four recommendations: a) engaging the systematic development and evaluation of decision approaches and tools; b) using case studies to advance the integration of decision analysis into alternatives analysis; c) supporting transdisciplinary research; and d) supporting education and outreach efforts. https://doi.org/10.1289/EHP483 PMID:28669940

  17. Comparing model-based and model-free analysis methods for QUASAR arterial spin labeling perfusion quantification.

    PubMed

    Chappell, Michael A; Woolrich, Mark W; Petersen, Esben T; Golay, Xavier; Payne, Stephen J

    2013-05-01

    Amongst the various implementations of arterial spin labeling MRI methods for quantifying cerebral perfusion, the QUASAR method is unique. By using a combination of labeling with and without flow suppression gradients, the QUASAR method offers the separation of macrovascular and tissue signals. This permits local arterial input functions to be defined and "model-free" analysis, using numerical deconvolution, to be used. However, it remains unclear whether arterial spin labeling data are best treated using model-free or model-based analysis. This work provides a critical comparison of these two approaches for QUASAR arterial spin labeling in the healthy brain. An existing two-component (arterial and tissue) model was extended to the mixed flow suppression scheme of QUASAR to provide an optimal model-based analysis. The model-based analysis was extended to incorporate dispersion of the labeled bolus, generally regarded as the major source of discrepancy between the two analysis approaches. Model-free and model-based analyses were compared for perfusion quantification including absolute measurements, uncertainty estimation, and spatial variation in cerebral blood flow estimates. Major sources of discrepancies between model-free and model-based analysis were attributed to the effects of dispersion and the degree to which the two methods can separate macrovascular and tissue signal. Copyright © 2012 Wiley Periodicals, Inc.
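
    The "model-free" numerical deconvolution can be illustrated with toy curves (the input and residue functions below are invented, not QUASAR's measured kinetics): the tissue signal is the arterial input function convolved with a flow-scaled residue function, and truncated-SVD inversion recovers that residue function from noisy data:

```python
import numpy as np

rng = np.random.default_rng(3)
dt = 0.1
t = np.arange(0, 4.0, dt)                  # s, toy sampling grid
aif = np.exp(-((t - 0.7) ** 2) / 0.05)     # toy arterial input function
residue = np.exp(-t / 1.2)                 # toy tissue residue function
cbf = 0.8                                  # "true" flow scaling

# Discrete convolution matrix: tissue = cbf * (A @ residue), A[i, j] = aif[i-j]*dt.
n = len(t)
A = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1):
        A[i, j] = aif[i - j] * dt
tissue = cbf * A @ residue + 0.005 * rng.normal(size=n)

# "Model-free" analysis: truncated-SVD deconvolution regularizes the inversion;
# the peak of the recovered flow-scaled residue function estimates perfusion.
U, s, Vt = np.linalg.svd(A)
keep = s > 0.1 * s[0]   # truncation threshold acts as regularization
flow_residue = Vt.T[:, keep] @ ((U[:, keep].T @ tissue) / s[keep])
print(f"true cbf = {cbf:.2f}, model-free estimate = {flow_residue.max():.2f}")
```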

  18. Minimally Invasive Lateral Access Surgery and Reoperation Rates: A Multi-Institution Retrospective Review of 2060 Patients.

    PubMed

    Nayar, Gautam; Wang, Timothy; Sankey, Eric W; Berry-Candelario, John; Elsamadicy, Aladine A; Back, Adam; Karikari, Isaac; Isaacs, Robert

    2018-05-19

    Risk factors for surgical revision remain important because of the additional readmission, anesthesia, and morbidity for the patient and the significant cost for health care systems. Although the rate of reoperation (RRO) is well described for traditional open posterior (OP) approaches, the RRO in minimally invasive lateral (MIL) surgery remains poorly characterized. This study compares the RRO in patients undergoing decompressive lumbar spine surgery via MIL versus OP approaches. Patient demographics and comorbidities were retrospectively collected for 2060 patients undergoing single-stage elective lumbar spinal surgery at multiple institutions. A subset of 1484 patients had long-term data (long-term [LT] cohort). The RRO was compared between approaches through univariate and multivariate analysis. There were 1292 patients (62.7%) who underwent lateral access surgery, whereas 768 patients (37.3%) underwent OP surgery. The MIL cohort was significantly older, had a higher proportion of men, and had more comorbidities than the OP cohort. In the LT cohort, lateral patients were significantly older and had more comorbidities, with a lower body mass index and a lower proportion of men and smokers. Surgical complication rates were similar between the groups. The MIL cohort had a significantly lower RRO at both 30 days (approximately 57% lower; MIL cohort: 1.01% vs. OP cohort: 2.36%, P = 0.02) and 2 years (approximately 61% lower; MIL cohort: 2.09% vs. OP cohort: 5.37%, P < 0.01) after surgery. On multivariate analysis, surgical approach was the only significant predictor of the RRO at both 30 days (open posterior approach odds ratio [OR], 4.47; 95% confidence interval [CI], 1.33-15.09; P = 0.02) and 2 years (open posterior approach OR, 3.26; 95% CI, 1.26-8.42; P = 0.01). This study shows that MIL surgical approaches, compared with OP approaches, have a significantly lower RRO after lumbar spine surgery. Copyright © 2018 Elsevier Inc. All rights reserved.

  19. Quantitative comparison of microarray experiments with published leukemia related gene expression signatures.

    PubMed

    Klein, Hans-Ulrich; Ruckert, Christian; Kohlmann, Alexander; Bullinger, Lars; Thiede, Christian; Haferlach, Torsten; Dugas, Martin

    2009-12-15

    Multiple gene expression signatures derived from microarray experiments have been published in the field of leukemia research. A comparison of these signatures with results from new experiments is useful for verification as well as for interpretation of the results obtained. Currently, the percentage of overlapping genes is frequently used to compare published gene signatures against a signature derived from a new experiment. However, it has been shown that the percentage of overlapping genes is of limited use for comparing two experiments due to the variability of gene signatures caused by different array platforms or assay-specific influencing parameters. Here, we present a robust approach for a systematic and quantitative comparison of published gene expression signatures with an exemplary query dataset. A database storing 138 leukemia-related published gene signatures was designed. Each gene signature was manually annotated with terms according to a leukemia-specific taxonomy. Two analysis steps are implemented to compare a new microarray dataset with the results from previous experiments stored and curated in the database. First, the global test method is applied to assess gene signatures and to constitute a ranking among them. In a subsequent analysis step, the focus is shifted from single gene signatures to chromosomal aberrations or molecular mutations as modeled in the taxonomy. Potentially interesting disease characteristics are detected based on the ranking of gene signatures associated with these aberrations stored in the database. Two example analyses are presented. An implementation of the approach is freely available as a web-based application. The presented approach helps researchers to systematically integrate the knowledge derived from numerous microarray experiments into the analysis of a new dataset. By means of example leukemia datasets we demonstrate that this approach detects related experiments as well as related molecular mutations and may help to interpret new microarray data.

  20. Joint protection and hand exercises for hand osteoarthritis: an economic evaluation comparing methods for the analysis of factorial trials

    PubMed Central

    Oppong, Raymond; Nicholls, Elaine; Whitehurst, David G. T.; Hill, Susan; Hammond, Alison; Hay, Elaine M.; Dziedzic, Krysia

    2015-01-01

    Objectives. Evidence regarding the cost-effectiveness of joint protection and hand exercises for the management of hand OA is not well established. The primary aim of this study is to assess the cost-effectiveness (cost-utility) of these management options. In addition, given the absence of consensus regarding the conduct of economic evaluation alongside factorial trials, we compare different analytical methodologies. Methods. A trial-based economic evaluation to assess the cost-utility of joint protection only, hand exercises only and joint protection plus hand exercises compared with leaflet and advice was undertaken over a 12 month period from a UK National Health Service perspective. Patient-level mean costs and mean quality-adjusted life years (QALYs) were calculated for each trial arm. Incremental cost-effectiveness ratios (ICERs) were estimated and cost-effectiveness acceptability curves were constructed. The base case analysis used a within-the-table analysis methodology. Two further methods were explored: the at-the-margins approach and a regression-based approach with or without an interaction term. Results. Mean costs (QALYs) were £58.46 (0.662) for leaflet and advice, £92.12 (0.659) for joint protection, £64.51 (0.681) for hand exercises and £112.38 (0.658) for joint protection plus hand exercises. In the base case, hand exercises were the cost-effective option, with an ICER of £318 per QALY gained. Hand exercises remained the most cost-effective management strategy when adopting alternative methodological approaches. Conclusion. This is the first trial evaluating the cost-effectiveness of occupational therapy-supported approaches to self-management for hand OA. Our findings showed that hand exercises were the most cost-effective option. PMID:25339642

  1. Comparison of Object-Based Image Analysis Approaches to Mapping New Buildings in Accra, Ghana Using Multi-Temporal QuickBird Satellite Imagery

    PubMed Central

    Tsai, Yu Hsin; Stow, Douglas; Weeks, John

    2013-01-01

    The goal of this study was to map and quantify the number of newly constructed buildings in Accra, Ghana between 2002 and 2010 based on high spatial resolution satellite image data. Two semi-automated feature detection approaches for detecting and mapping newly constructed buildings based on QuickBird very high spatial resolution satellite imagery were analyzed: (1) post-classification comparison; and (2) bi-temporal layerstack classification. Two software packages were evaluated: Feature Analyst, which is based on a spatial contextual classifier, and ENVI Feature Extraction, which uses a true object-based image analysis approach of image segmentation and segment classification. Final map products representing new building objects were compared and assessed for accuracy using two object-based accuracy measures, completeness and correctness. The bi-temporal layerstack method generated more accurate results compared to the post-classification comparison method due to less confusion with background objects. The spectral/spatial contextual approach (Feature Analyst) outperformed the true object-based feature delineation approach (ENVI Feature Extraction) due to its ability to more reliably delineate individual buildings of various sizes. Semi-automated, object-based detection followed by manual editing appears to be a reliable and efficient approach for detecting and enumerating new building objects. A bivariate regression analysis was performed using neighborhood-level estimates of new building density regressed on a census-derived measure of socio-economic status, yielding an inverse relationship with R2 = 0.31 (n = 27; p = 0.00). The primary utility of the new building delineation results is to support spatial analyses of land cover and land use and demographic change. PMID:24415810

  2. Episodic Memory: A Comparative Approach

    PubMed Central

    Martin-Ordas, Gema; Call, Josep

    2013-01-01

    Historically, episodic memory has been described as autonoetic, personally relevant, complex, context-rich, and allowing mental time travel. In contrast, semantic memory, which is theorized to be free of context and personal relevance, is noetic and consists of general knowledge of facts about the world. The field of comparative psychology has adopted this distinction in order to study episodic memory in non-human animals. Our aim in this article is not only to reflect on the concept of episodic memory and the experimental approaches used in comparative psychology to study this phenomenon, but also to provide a critical analysis of these paradigms. We conclude the article by providing new avenues for future research. PMID:23781179

  3. Effect Size as the Essential Statistic in Developing Methods for mTBI Diagnosis.

    PubMed

    Gibson, Douglas Brandt

    2015-01-01

    The descriptive statistic known as "effect size" measures the distinguishability of two sets of data. Distinguishability is at the core of diagnosis. This article is intended to point out the importance of effect size in the development of effective diagnostics for mild traumatic brain injury (mTBI) and the applicability of the effect size statistic in comparing diagnostic efficiency across the main proposed TBI diagnostic methods: psychological, physiological, biochemical, and radiologic. Comparing diagnostic approaches is difficult because different researchers in different fields take different approaches to measuring efficacy. Converting diverse measures to effect sizes, as is done in meta-analysis, is a relatively easy way to make studies comparable.
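
    As a concrete sketch, Cohen's d with a pooled standard deviation on hypothetical diagnostic scores (the group means and spreads below are invented):

```python
import numpy as np

def cohens_d(a, b):
    """Effect size: standardized mean difference between two groups."""
    na, nb = len(a), len(b)
    pooled = np.sqrt(((na - 1) * np.var(a, ddof=1) + (nb - 1) * np.var(b, ddof=1))
                     / (na + nb - 2))
    return (np.mean(a) - np.mean(b)) / pooled

rng = np.random.default_rng(4)
# Hypothetical diagnostic scores for mTBI patients vs. healthy controls.
patients = rng.normal(62, 10, 80)
controls = rng.normal(55, 10, 80)
print(f"d = {cohens_d(patients, controls):.2f}")  # ~0.7: moderate separability
```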

  4. Statistical primer: how to deal with missing data in scientific research?

    PubMed

    Papageorgiou, Grigorios; Grant, Stuart W; Takkenberg, Johanna J M; Mokhles, Mostafa M

    2018-05-10

    Missing data are a common challenge encountered in research which can compromise the results of statistical inference when not handled appropriately. This paper aims to introduce basic concepts of missing data to a non-statistical audience, list and compare some of the most popular approaches for handling missing data in practice and provide guidelines and recommendations for dealing with and reporting missing data in scientific research. Complete case analysis and single imputation are simple approaches for handling missing data and are popular in practice, however, in most cases they are not guaranteed to provide valid inferences. Multiple imputation is a robust and general alternative which is appropriate for data missing at random, surpassing the disadvantages of the simpler approaches, but should always be conducted with care. The aforementioned approaches are illustrated and compared in an example application using Cox regression.
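
    A minimal sketch of the contrast described above, under a missing-at-random mechanism. The imputer call below performs a single stochastic draw; proper multiple imputation would repeat the draw several times and pool the estimates with Rubin's rules:

```python
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

rng = np.random.default_rng(5)
n = 1000
x1 = rng.normal(size=n)
x2 = 0.6 * x1 + rng.normal(scale=0.8, size=n)   # correlated with x1

# Missing-at-random: the chance that x2 is missing depends on observed x1.
missing = rng.random(n) < 1.0 / (1.0 + np.exp(-1.5 * x1))
X = np.column_stack([x1, np.where(missing, np.nan, x2)])

# Complete-case analysis drops rows with high x1 more often,
# biasing the estimated mean of x2 downward.
cc_mean = X[~np.isnan(X[:, 1]), 1].mean()

# One stochastic imputation draw: x2 is predicted from x1, all rows kept.
X_imp = IterativeImputer(sample_posterior=True, random_state=0).fit_transform(X)

print(f"true mean of x2:    {x2.mean():+.3f}")
print(f"complete-case mean: {cc_mean:+.3f}")
print(f"after imputation:   {X_imp[:, 1].mean():+.3f}")
```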

  5. Molecular Dynamics Approach in Designing Thermostable Aspergillus niger Xylanase

    NASA Astrophysics Data System (ADS)

    Malau, N. D.; Sianturi, M.

    2017-03-01

    We have applied molecular dynamics methods as a tool in designing a thermostable Aspergillus niger xylanase, by examining the root mean square deviation (RMSD) and the stability of the secondary structure of the enzyme at its optimum temperature and comparing them with its behavior at high temperature. As RMSD represents structural fluctuation at a particular temperature, a better understanding of this factor suggests approaches to bioengineer these enzymes to enhance their thermostability. In this work, molecular dynamics simulations of Aspergillus niger xylanase (ANX) were carried out at 400 K (the reported optimum catalytic temperature) for 2.5 ns and at 500 K (the reported inactivation temperature of ANX) for 2.5 ns. The analysis shows that the RMSD increases significantly at the higher temperature compared with the optimum temperature, and that some of the secondary structures of ANX are damaged at high temperature. Structural analysis revealed that the fluctuations of the α-helix and β-sheet regions are larger at the higher temperature than at the optimum temperature.
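
    A minimal sketch of the RMSD computation on toy coordinates. A real trajectory analysis would first superpose each frame on the reference (e.g., with the Kabsch algorithm) and track RMSD over time; here two pre-aligned snapshots stand in for the two temperatures:

```python
import numpy as np

def rmsd(frame: np.ndarray, reference: np.ndarray) -> float:
    """Root mean square deviation between two Nx3 coordinate sets
    (assumes the frames are already superposed)."""
    return float(np.sqrt(np.mean(np.sum((frame - reference) ** 2, axis=1))))

rng = np.random.default_rng(6)
ref = rng.normal(size=(300, 3))                            # toy coordinates, Å
frame_400K = ref + rng.normal(scale=0.3, size=ref.shape)   # small fluctuations
frame_500K = ref + rng.normal(scale=1.2, size=ref.shape)   # larger excursions
print(f"RMSD at optimum T: {rmsd(frame_400K, ref):.2f} Å")
print(f"RMSD at high T:    {rmsd(frame_500K, ref):.2f} Å")
```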

  6. Biochemometrics for Natural Products Research: Comparison of Data Analysis Approaches and Application to Identification of Bioactive Compounds.

    PubMed

    Kellogg, Joshua J; Todd, Daniel A; Egan, Joseph M; Raja, Huzefa A; Oberlies, Nicholas H; Kvalheim, Olav M; Cech, Nadja B

    2016-02-26

    A central challenge of natural products research is identifying the bioactive compounds in complex mixtures. The gold standard approach to address this challenge, bioassay-guided fractionation, is often biased toward abundant, rather than bioactive, mixture components. This study evaluated the combination of bioassay-guided fractionation with untargeted metabolite profiling to improve active component identification early in the fractionation process. Key to this methodology was statistical modeling of the integrated biological and chemical data sets (biochemometric analysis). Three data analysis approaches for biochemometric analysis were compared, namely, partial least-squares loading vectors, S-plots, and the selectivity ratio. Extracts from the endophytic fungi Alternaria sp. and Pyrenochaeta sp. with antimicrobial activity against Staphylococcus aureus served as test cases. Biochemometric analysis incorporating the selectivity ratio performed best in identifying bioactive ions from these extracts early in the fractionation process, yielding altersetin (3, MIC 0.23 μg/mL) and macrosphelide A (4, MIC 75 μg/mL) as antibacterial constituents from Alternaria sp. and Pyrenochaeta sp., respectively. This study demonstrates the potential of biochemometrics coupled with bioassay-guided fractionation to identify bioactive mixture components. A benefit of this approach is the ability to integrate multiple stages of fractionation and bioassay data into a single analysis.
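
    A sketch of the selectivity ratio computed via target projection (Kvalheim) on simulated data; the feature count and the two "active" feature indices are arbitrary, and the study of course applied this to real mass spectral profiles:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(7)
n, p = 60, 200                        # extracts x metabolite features
X = rng.normal(size=(n, p))
y = X[:, 10] - 0.8 * X[:, 55] + 0.3 * rng.normal(size=n)  # activity from 2 ions

pls = PLSRegression(n_components=3).fit(X, y)
b = pls.coef_.ravel()                 # regression vector in X-space
X = X - X.mean(axis=0)                # center before projecting

# Target projection: project X onto the normalized regression vector ...
t_tp = X @ b / np.linalg.norm(b)
p_tp = X.T @ t_tp / (t_tp @ t_tp)
X_explained = np.outer(t_tp, p_tp)
X_residual = X - X_explained

# ... selectivity ratio = explained vs. residual variance per feature.
sr = X_explained.var(axis=0) / X_residual.var(axis=0)
print("top features by selectivity ratio:", np.argsort(sr)[::-1][:5])
```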

  7. Probability and possibility-based representations of uncertainty in fault tree analysis.

    PubMed

    Flage, Roger; Baraldi, Piero; Zio, Enrico; Aven, Terje

    2013-01-01

    Expert knowledge is an important source of input to risk analysis. In practice, experts might be reluctant to characterize their knowledge and the related (epistemic) uncertainty using precise probabilities. The theory of possibility allows for imprecision in probability assignments. The associated possibilistic representation of epistemic uncertainty can be combined with, and transformed into, a probabilistic representation; in this article, we show this with reference to a simple fault tree analysis. We apply an integrated (hybrid) probabilistic-possibilistic computational framework for the joint propagation of the epistemic uncertainty on the values of the (limiting relative frequency) probabilities of the basic events of the fault tree, and we use possibility-probability (probability-possibility) transformations for propagating the epistemic uncertainty within purely probabilistic and possibilistic settings. The results of the different approaches (hybrid, probabilistic, and possibilistic) are compared with respect to the representation of uncertainty about the top event (limiting relative frequency) probability. Both the rationale underpinning the approaches and the computational efforts they require are critically examined. We conclude that the approaches relevant in a given setting depend on the purpose of the risk analysis, and that further research is required to make the possibilistic approaches operational in a risk analysis context. © 2012 Society for Risk Analysis.

  8. Utility of a Systematic Approach to Teaching Photographic Nasal Analysis to Otolaryngology Residents.

    PubMed

    Robitschek, Jon; Dresner, Harley; Hilger, Peter

    2017-12-01

    Photographic nasal analysis constitutes a critical step along the path toward accurate diagnosis and precise surgical planning in rhinoplasty. The learned process by which one assesses photographs, analyzes relevant anatomical landmarks, and generates a global view of the nasal aesthetic is less widely described. To discern the common pitfalls in performing photographic nasal analysis and to quantify the utility of a systematic approach model in teaching photographic nasal analysis to otolaryngology residents. This prospective observational study included 20 participants from a university-based otolaryngology residency program. The control and intervention groups underwent baseline graded assessment of 3 patients. The intervention group received instruction on a systematic approach model for nasal analysis, and both groups underwent postintervention testing at 10 weeks. Data were collected from October 1, 2015, through June 1, 2016. A 10-minute, 11-slide presentation provided instruction on a systematic approach to nasal analysis to the intervention group. Graded photographic nasal analysis using a binary 18-point system. The 20 otolaryngology residents (15 men and 5 women; age range, 24-34 years) were adept at mentioning dorsal deviation and dorsal profile with focused descriptions of tip angle and contour. Areas commonly omitted by residents included verification of the Frankfort plane, position of the lower lateral crura, radix position, and ratio of the ala to tip lobule. The intervention group demonstrated immediate improvement after instruction on the teaching model, with the mean (SD) postintervention test score improving significantly over baseline performance (7.5 [2.7] vs 10.3 [2.5]; P < .001). At 10 weeks after the intervention, the mean comparative improvement in overall graded nasal analysis was 17% (95% CI, 10%-23%; P < .001). Otolaryngology residents demonstrated proficiency at incorporating nasal deviation, tip angle, and dorsal profile contour into their nasal analysis. They often omitted verification of the Frankfort plane, position of lower lateral crura, radix depth, and ala-to-tip lobule ratio. Findings with this novel 10-minute teaching model should be validated at other teaching institutions, and the instruction model should be further enhanced to teach more sophisticated analysis to residents as they proceed through training.

  9. Exploring relation types for literature-based discovery.

    PubMed

    Preiss, Judita; Stevenson, Mark; Gaizauskas, Robert

    2015-09-01

    Literature-based discovery (LBD) aims to identify "hidden knowledge" in the medical literature by: (1) analyzing documents to identify pairs of explicitly related concepts (terms), then (2) hypothesizing novel relations between pairs of unrelated concepts that are implicitly related via a shared concept to which both are explicitly related. Many LBD approaches use simple techniques to identify semantically weak relations between concepts, for example, document co-occurrence. These generate huge numbers of hypotheses, difficult for humans to assess. More complex techniques rely on linguistic analysis, for example, shallow parsing, to identify semantically stronger relations. Such approaches generate fewer hypotheses, but may miss hidden knowledge. The authors investigate this trade-off in detail, comparing techniques for identifying related concepts to discover which are most suitable for LBD. A generic LBD system that can utilize a range of relation types was developed. Experiments were carried out comparing a number of techniques for identifying relations. Two approaches were used for evaluation: replication of existing discoveries and the "time slicing" approach. Previous LBD discoveries could be replicated using relations based either on document co-occurrence or linguistic analysis. Using relations based on linguistic analysis generated many fewer hypotheses, but a significantly greater proportion of them were candidates for hidden knowledge. The use of linguistic analysis-based relations improves accuracy of LBD without overly damaging coverage. LBD systems often generate huge numbers of hypotheses, which are infeasible to manually review. Improving their accuracy has the potential to make these systems significantly more usable. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association.
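
    The co-occurrence end of this spectrum reduces to a few lines of set logic. The sketch below uses Swanson's classic fish oil/Raynaud syndrome discovery as the example, with a toy co-occurrence table in place of relations mined from the literature:

```python
# Swanson-style ABC model: hidden-knowledge candidates are concepts C that
# share an intermediate concept B with the source A but never co-occur with
# A directly. The table below is a toy stand-in for mined co-occurrences.
cooccur = {
    "fish oil": {"blood viscosity", "platelet aggregation"},
    "blood viscosity": {"fish oil", "raynaud syndrome"},
    "platelet aggregation": {"fish oil", "raynaud syndrome"},
    "raynaud syndrome": {"blood viscosity", "platelet aggregation"},
}

def abc_hypotheses(source: str) -> dict[str, set[str]]:
    direct = cooccur[source]
    hypotheses: dict[str, set[str]] = {}
    for b in direct:
        for c in cooccur[b]:
            if c != source and c not in direct:
                hypotheses.setdefault(c, set()).add(b)  # C reached via B
    return hypotheses

print(abc_hypotheses("fish oil"))
# {'raynaud syndrome': {'blood viscosity', 'platelet aggregation'}}
```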

  10. hctsa: A Computational Framework for Automated Time-Series Phenotyping Using Massive Feature Extraction.

    PubMed

    Fulcher, Ben D; Jones, Nick S

    2017-11-22

    Phenotype measurements frequently take the form of time series, but we currently lack a systematic method for relating these complex data streams to scientifically meaningful outcomes, such as relating the movement dynamics of organisms to their genotype or measurements of brain dynamics of a patient to their disease diagnosis. Previous work addressed this problem by comparing implementations of thousands of diverse scientific time-series analysis methods in an approach termed highly comparative time-series analysis. Here, we introduce hctsa, a software tool for applying this methodological approach to data. hctsa includes an architecture for computing over 7,700 time-series features and a suite of analysis and visualization algorithms to automatically select useful and interpretable time-series features for a given application. Using exemplar applications to high-throughput phenotyping experiments, we show how hctsa allows researchers to leverage decades of time-series research to quantify and understand informative structure in time-series data. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
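
    hctsa itself is a MATLAB framework computing roughly 7,700 features; the sketch below merely illustrates the feature-vector idea with a handful of interpretable features (the feature set is illustrative, not hctsa's):

```python
import itertools
import numpy as np

def basic_features(x: np.ndarray) -> dict:
    """A small, interpretable feature vector for one time series."""
    dx = np.diff(x)
    runs = [len(list(g)) for above, g in itertools.groupby(x > x.mean()) if above]
    return {
        "mean": float(x.mean()),
        "std": float(x.std()),
        "lag1_autocorr": float(np.corrcoef(x[:-1], x[1:])[0, 1]),
        "mean_abs_change": float(np.abs(dx).mean()),
        "longest_run_above_mean": max(runs, default=0),
    }

rng = np.random.default_rng(8)
white_noise = rng.normal(size=500)
random_walk = np.cumsum(rng.normal(size=500))   # strongly autocorrelated

for name, series in [("noise", white_noise), ("walk", random_walk)]:
    feats = basic_features(series)
    print(name, {k: round(v, 2) for k, v in feats.items()})
```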

  11. Harnessing Whole Genome Sequencing in Medical Mycology.

    PubMed

    Cuomo, Christina A

    2017-01-01

    Comparative genome sequencing studies of human fungal pathogens enable identification of genes and variants associated with virulence and drug resistance. This review describes current approaches, resources, and advances in applying whole genome sequencing to study clinically important fungal pathogens. Genomes for some important fungal pathogens were only recently assembled, revealing gene family expansions in many species and extreme gene loss in one obligate species. The scale and scope of species sequenced is rapidly expanding, leveraging technological advances to assemble and annotate genomes with higher precision. By using iteratively improved reference assemblies or those generated de novo for new species, recent studies have compared the sequence of isolates representing populations or clinical cohorts. Whole genome approaches provide the resolution necessary for comparison of closely related isolates, for example, in the analysis of outbreaks or sampled across time within a single host. Genomic analysis of fungal pathogens has enabled both basic research and diagnostic studies. The increased scale of sequencing can be applied across populations, and new metagenomic methods allow direct analysis of complex samples.

  12. Evaluation of hierarchical agglomerative cluster analysis methods for discrimination of primary biological aerosol

    NASA Astrophysics Data System (ADS)

    Crawford, I.; Ruske, S.; Topping, D. O.; Gallagher, M. W.

    2015-11-01

    In this paper we present improved methods for discriminating and quantifying primary biological aerosol particles (PBAPs) by applying hierarchical agglomerative cluster analysis to multi-parameter ultraviolet-light-induced fluorescence (UV-LIF) spectrometer data. The methods employed in this study can be applied to data sets in excess of 1 × 10^6 points on a desktop computer, allowing each fluorescent particle in a data set to be explicitly clustered. This reduces the potential for misattribution found in the subsampling and comparative attribution methods used in previous approaches, improving our capacity to discriminate and quantify PBAP meta-classes. We evaluate the performance of several hierarchical agglomerative cluster analysis linkages and data normalisation methods using laboratory samples of known particle types and an ambient data set. Fluorescent and non-fluorescent polystyrene latex spheres were sampled with a Wideband Integrated Bioaerosol Spectrometer (WIBS-4), where the optical size, asymmetry factor and fluorescence measurements were used as inputs to the analysis package. It was found that the Ward linkage with z-score or range normalisation performed best, correctly attributing 98% and 98.1% of the data points, respectively. The best-performing methods were applied to the BEACHON-RoMBAS (Bio-hydro-atmosphere interactions of Energy, Aerosols, Carbon, H2O, Organics and Nitrogen-Rocky Mountain Biogenic Aerosol Study) ambient data set, where it was found that the z-score and range normalisation methods yield similar results, with each method producing clusters representative of fungal spores and bacterial aerosol, consistent with previous results. The z-score result was compared to clusters generated with previous approaches (WIBS AnalysiS Program, WASP), where we observe that the subsampling and comparative attribution method employed by WASP results in the overestimation of the fungal spore concentration by a factor of 1.5 and the underestimation of the bacterial aerosol concentration by a factor of 5. We suggest that this is likely due to errors arising from misattribution due to poor centroid definition and failure to assign particles to a cluster as a result of the subsampling and comparative attribution method employed by WASP. The methods used here allow the entire fluorescent population of particles to be analysed, yielding an explicit cluster attribution for each particle and improving cluster centroid definition and our capacity to discriminate and quantify PBAP meta-classes compared to previous approaches.
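
    A minimal sketch of the best-performing combination reported above (Ward linkage with z-score normalisation), on simulated two-class particle data rather than WIBS measurements; the channel means and spreads are invented:

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

rng = np.random.default_rng(9)
# Toy UV-LIF-like measurements (size, asymmetry, 3 fluorescence channels)
# for two simulated particle classes; scales differ wildly across columns.
a = rng.normal([3.0, 10.0, 200.0, 50.0, 5.0], [0.3, 2.0, 20.0, 8.0, 1.0], (300, 5))
b = rng.normal([1.5, 25.0, 80.0, 120.0, 9.0], [0.3, 2.0, 20.0, 8.0, 1.0], (300, 5))
X = np.vstack([a, b])

# z-score normalisation keeps any single large-valued channel from
# dominating the Euclidean distances used by the Ward criterion.
Xz = (X - X.mean(axis=0)) / X.std(axis=0)

labels = fcluster(linkage(Xz, method="ward"), t=2, criterion="maxclust")
truth = np.repeat([1, 2], 300)
acc = max(np.mean(labels == truth), np.mean(labels == (3 - truth)))
print(f"correctly attributed: {100 * acc:.1f}%")
```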

  13. Using Qualitative Comparative Analysis (QCA) of Key Informant Interviews in Health Services Research: Enhancing a Study of Adjuvant Therapy Use in Breast Cancer Care

    PubMed Central

    McAlearney, Ann Scheck; Walker, Daniel; Moss, Alexandra DeNardis; Bickell, Nina A.

    2015-01-01

    Background Qualitative Comparative Analysis (QCA) is a methodology created to address causal complexity in social sciences research by preserving the objectivity of quantitative data analysis without losing detail inherent in qualitative research. However, its use in health services research (HSR) is limited, and questions remain about its application in this context. Objective To explore the strengths and weaknesses of using QCA for HSR. Research Design Using data from semi-structured interviews conducted as part of a multiple case study about adjuvant treatment underuse among underserved breast cancer patients, findings were compared using qualitative approaches with and without QCA to identify strengths, challenges, and opportunities presented by QCA. Subjects Ninety administrative and clinical key informants interviewed across ten NYC area safety net hospitals. Measures Transcribed interviews were coded by three investigators using an iterative and interactive approach. Codes were calibrated for QCA, as well as examined using qualitative analysis without QCA. Results Relative to traditional qualitative analysis, QCA strengths include: (1) addressing causal complexity, (2) results presentation as pathways as opposed to a list, (3) identification of necessary conditions, (4) the option of fuzzy-set calibrations, and (5) QCA-specific parameters of fit that allow researchers to compare outcome pathways. Weaknesses include: (1) few guidelines and examples exist for calibrating interview data, (2) not designed to create predictive models, and (3) unidirectionality. Conclusions Through its presentation of results as pathways, QCA can highlight factors most important for production of an outcome. This strength can yield unique benefits for HSR not available through other methods. PMID:26908085

  14. Using Qualitative Comparative Analysis of Key Informant Interviews in Health Services Research: Enhancing a Study of Adjuvant Therapy Use in Breast Cancer Care.

    PubMed

    McAlearney, Ann Scheck; Walker, Daniel; Moss, Alexandra D; Bickell, Nina A

    2016-04-01

    Qualitative comparative analysis (QCA) is a methodology created to address causal complexity in social sciences research by preserving the objectivity of quantitative data analysis without losing detail inherent in qualitative research. However, its use in health services research (HSR) is limited, and questions remain about its application in this context. To explore the strengths and weaknesses of using QCA for HSR. Using data from semistructured interviews conducted as part of a multiple case study about adjuvant treatment underuse among underserved breast cancer patients, findings were compared using qualitative approaches with and without QCA to identify strengths, challenges, and opportunities presented by QCA. Ninety administrative and clinical key informants interviewed across 10 NYC area safety net hospitals. Transcribed interviews were coded by 3 investigators using an iterative and interactive approach. Codes were calibrated for QCA, as well as examined using qualitative analysis without QCA. Relative to traditional qualitative analysis, QCA strengths include: (1) addressing causal complexity, (2) results presentation as pathways as opposed to a list, (3) identification of necessary conditions, (4) the option of fuzzy-set calibrations, and (5) QCA-specific parameters of fit that allow researchers to compare outcome pathways. Weaknesses include: (1) few guidelines and examples exist for calibrating interview data, (2) not designed to create predictive models, and (3) unidirectionality. Through its presentation of results as pathways, QCA can highlight factors most important for production of an outcome. This strength can yield unique benefits for HSR not available through other methods.
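
    One QCA-specific parameter of fit, the fuzzy-set consistency of a sufficiency claim, is simple to state concretely. The membership scores below are hypothetical calibrations, not the study's interview data:

```python
import numpy as np

def consistency(x: np.ndarray, y: np.ndarray) -> float:
    """Fuzzy-set consistency of 'X is sufficient for Y' (Ragin):
    sum(min(x, y)) / sum(x), with memberships in [0, 1]."""
    return float(np.minimum(x, y).sum() / x.sum())

# Hypothetical calibrated memberships for 10 hospitals: condition X
# (e.g., strong care coordination) and outcome Y (low treatment underuse).
x = np.array([0.9, 0.8, 0.7, 0.9, 0.2, 0.1, 0.6, 0.8, 0.3, 0.7])
y = np.array([0.8, 0.9, 0.6, 0.9, 0.4, 0.3, 0.7, 0.7, 0.2, 0.8])
print(f"consistency(X -> Y) = {consistency(x, y):.2f}")
```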

  15. Systems thinking, the Swiss Cheese Model and accident analysis: a comparative systemic analysis of the Grayrigg train derailment using the ATSB, AcciMap and STAMP models.

    PubMed

    Underwood, Peter; Waterson, Patrick

    2014-07-01

    The Swiss Cheese Model (SCM) is the most popular accident causation model and is widely used throughout various industries. A debate exists in the research literature over whether the SCM remains a viable tool for accident analysis. Critics of the model suggest that it provides a sequential, oversimplified view of accidents. Conversely, proponents suggest that it embodies the concepts of systems theory, as per the contemporary systemic analysis techniques. The aim of this paper was to consider whether the SCM can provide a systems thinking approach and remain a viable option for accident analysis. To achieve this, the train derailment at Grayrigg was analysed with an SCM-based model (the ATSB accident investigation model) and two systemic accident analysis methods (AcciMap and STAMP). The analysis outputs and usage of the techniques were compared. The findings of the study showed that each model applied the systems thinking approach. However, the ATSB model and AcciMap graphically presented their findings in a more succinct manner, whereas STAMP more clearly embodied the concepts of systems theory. The study suggests that, whilst the selection of an analysis method is subject to trade-offs that practitioners and researchers must make, the SCM remains a viable model for accident analysis. Copyright © 2013 Elsevier Ltd. All rights reserved.

  16. A non-stationary cost-benefit analysis approach for extreme flood estimation to explore the nexus of 'Risk, Cost and Non-stationarity'

    NASA Astrophysics Data System (ADS)

    Qi, Wei

    2017-11-01

    Cost-benefit analysis is commonly used for engineering planning and design problems in practice. However, previous cost-benefit based design flood estimation rests on a stationarity assumption. This study develops a non-stationary cost-benefit based design flood estimation approach. The approach integrates a non-stationary probability distribution function into cost-benefit analysis, so that the influence of non-stationarity on the expected total cost (including flood damage and construction costs) and on design flood estimation can be quantified. To facilitate design flood selection, a 'Risk-Cost' analysis approach is developed, which reveals the nexus of extreme flood risk, expected total cost and design life periods. Two basins, with 54-year and 104-year flood records respectively, are used to illustrate the application. It is found that the developed approach can effectively reveal changes in expected total cost and extreme floods over different design life periods. In addition, trade-offs are found between extreme flood risk and expected total cost, reflecting the increase in cost required to mitigate risk. Compared with stationary approaches, which generate only one expected total cost curve and therefore a single design flood estimate, the proposed approach generates design flood estimation intervals, and the 'Risk-Cost' approach selects a design flood value from these intervals based on the trade-offs between extreme flood risk and expected total cost. This study provides a new approach towards a better understanding of the influence of non-stationarity on expected total cost and design floods, and could be beneficial to cost-benefit based non-stationary design flood estimation across the world.
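
    A toy sketch of the expected-total-cost trade-off with a linearly drifting GEV location parameter; all cost figures, trend values, and distribution parameters below are invented for illustration:

```python
import numpy as np
from scipy.stats import genextreme

def expected_total_cost(q_design, years=50, trend=0.5):
    """Toy total cost: linear construction cost plus expected flood damage
    accumulated over a design life with a drifting GEV location."""
    construction = 0.03 * q_design
    damage = 0.0
    for t in range(years):
        loc = 100.0 + trend * t            # non-stationary location, m^3/s
        p_exceed = genextreme.sf(q_design, -0.1, loc=loc, scale=20.0)
        damage += p_exceed * 50.0          # toy damage per exceedance event
    return construction + damage

candidates = np.arange(150.0, 400.0, 5.0)
costs = [expected_total_cost(q) for q in candidates]
best = candidates[int(np.argmin(costs))]
print(f"cost-optimal design flood: {best:.0f} m^3/s")
```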

  17. Regional analysis of annual maximum rainfall using TL-moments method

    NASA Astrophysics Data System (ADS)

    Shabri, Ani Bin; Daud, Zalina Mohd; Ariff, Noratiqah Mohd

    2011-06-01

    Information related to the distribution of rainfall amounts is of great importance for the design of water-related structures. One of the concerns of hydrologists and engineers is the choice of probability distribution for modeling regional data. In this study, the established approach to regional frequency analysis using L-moments is revisited, and an alternative regional frequency analysis using the TL-moments method is employed; the results from both methods are then compared. The analysis was based on daily annual maximum rainfall data from 40 stations in Selangor, Malaysia. TL-moments for the generalized extreme value (GEV) and generalized logistic (GLO) distributions were derived and used to develop the regional frequency analysis procedure. The TL-moment ratio diagram and the Z-test were employed in determining the best-fit distribution. Comparison between the two approaches showed that the L-moments and TL-moments produced equivalent results. The GLO and GEV distributions were identified as the most suitable for representing the statistical properties of extreme rainfall in Selangor. Monte Carlo simulation was used for performance evaluation, and it showed that the method of TL-moments was more efficient for lower quantile estimation compared with the L-moments.
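
    For reference, the first four sample L-moments can be computed from probability-weighted moments as in the sketch below (Hosking's standard estimators); TL-moments generalize this by trimming extreme order statistics, which is omitted here. The stand-in data series is an assumption for illustration.

    ```python
    # Sketch: first four sample L-moments of an annual-maximum series via
    # unbiased probability-weighted moments. TL-moments would additionally
    # trim the smallest/largest order statistics; only the untrimmed case
    # is shown.
    import numpy as np
    from math import comb

    def sample_l_moments(x):
        x = np.sort(np.asarray(x, dtype=float))
        n = len(x)
        # b_r: unbiased sample probability-weighted moments
        b = [sum(comb(j, r) * x[j] for j in range(r, n)) / (comb(n - 1, r) * n)
             for r in range(4)]
        l1 = b[0]
        l2 = 2 * b[1] - b[0]
        l3 = 6 * b[2] - 6 * b[1] + b[0]
        l4 = 20 * b[3] - 30 * b[2] + 12 * b[1] - b[0]
        return l1, l2, l3 / l2, l4 / l2   # mean, L-scale, L-skewness, L-kurtosis

    rain = np.random.default_rng(0).gumbel(80, 25, size=54)  # stand-in for station data
    print(sample_l_moments(rain))
    ```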

  18. A Comparison of Tissue Spray and Lipid Extract Direct Injection Electrospray Ionization Mass Spectrometry for the Differentiation of Eutopic and Ectopic Endometrial Tissues

    NASA Astrophysics Data System (ADS)

    Chagovets, Vitaliy; Wang, Zhihao; Kononikhin, Alexey; Starodubtseva, Natalia; Borisova, Anna; Salimova, Dinara; Popov, Igor; Kozachenko, Andrey; Chingin, Konstantin; Chen, Huanwen; Frankevich, Vladimir; Adamyan, Leila; Sukhikh, Gennady

    2018-02-01

    Recent research has revealed that tissue spray mass spectrometry enables rapid molecular profiling of biological tissues, which is of great importance for the search for disease biomarkers as well as for online surgery control. However, the trade-off for the high speed of tissue spray analysis is generally lower chemical sensitivity compared with the traditional approach based on offline chemical extraction and electrospray ionization mass spectrometry detection. In this study, high resolution mass spectrometry analysis of endometrium tissues of different localizations obtained using direct tissue spray mass spectrometry in positive ion mode is compared with the results of electrospray ionization analysis of lipid extracts. Identified features in both cases belong to three lipid classes: phosphatidylcholines, phosphoethanolamines, and sphingomyelins. Lipid coverage is validated by hydrophilic interaction liquid chromatography with mass spectrometry of lipid extracts. Multivariate analysis of data from both methods reveals satisfactory differentiation of eutopic and ectopic endometrium tissues. Overall, our results indicate that the chemical information provided by tissue spray ionization is sufficient to allow differentiation of endometrial tissues by localization with similar reliability but higher speed than the traditional approach relying on offline extraction.

  19. Neurostimulation options for failed back surgery syndrome: The need for rational and objective measurements. Proposal of an international clinical network using an integrated database and health economic analysis: the PROBACK network.

    PubMed

    Rigoard, P; Slavin, K

    2015-03-01

    In the context of failed back surgery syndrome (FBSS) treatment, current practice in neurostimulation varies from center to center and most clinical decisions are based on individual diagnosis. Neurostimulation evaluation tools and pain relief assessment are of major concern, as they now constitute one of the main biases of clinical trials. Moreover, the proliferation of technological devices, in a fertile and unsatisfied market, only furthers the confusion. There are three options for bringing these scientific debates into daily neurostimulation practice: intentional ignorance, standardized evidence-based practice, or an alternative data mining approach. In view of the impossibility of conducting multiple randomized clinical trials comparing the various devices one by one, the proposed concept is to redefine the indications and the respective roles of the various spinal cord and peripheral nerve stimulation devices with a large-scale computational modeling/data mining approach, by conducting a multicenter prospective database registry supported by a global clinician network called "PROBACK". We chose to specifically analyze 6 parameters: device coverage performance/coverage selectivity/persistence of the long-term electrical response (technical criteria) and comparative mapping of patient pain relief/persistence of the long-term clinical response/safety and complication occurrence (clinical criteria). Two types of analysis will be performed: immediate analysis (including cost analysis) and computational analysis, i.e. demonstration of the robustness of certain correlations of variables, in order to extract response predictors. By creating an international prospective database, the purpose of the PROBACK project is to set up a process of extraction and comparative analysis of data derived from the selection, implantation and follow-up of FBSS patients who are candidates for implanted neurostimulation. This evaluation strategy should help to move each implanter and each health system towards a more rational decision-making approach grounded in mathematical reality. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  20. Comparison of Image Processing Techniques for Nonviable Tissue Quantification in Late Gadolinium Enhancement Cardiac Magnetic Resonance Images.

    PubMed

    Carminati, M Chiara; Boniotti, Cinzia; Fusini, Laura; Andreini, Daniele; Pontone, Gianluca; Pepi, Mauro; Caiani, Enrico G

    2016-05-01

    The aim of this study was to compare the performance of quantitative methods, either semiautomated or automated, for left ventricular (LV) nonviable tissue analysis from cardiac magnetic resonance late gadolinium enhancement (CMR-LGE) images. The investigated segmentation techniques were: (i) n-standard deviations thresholding; (ii) full width at half maximum thresholding; (iii) Gaussian mixture model classification; and (iv) fuzzy c-means clustering. These algorithms were applied either in each short axis slice (single-slice approach) or globally considering the entire short-axis stack covering the LV (global approach). CMR-LGE images from 20 patients with ischemic cardiomyopathy were retrospectively selected, and results from each technique were assessed against manual tracing. All methods provided comparable performance in terms of accuracy in scar detection, computation of local transmurality, and high correlation in scar mass compared with the manual technique. In general, no significant difference between single-slice and global approach was noted. The reproducibility of manual and investigated techniques was confirmed in all cases with slightly lower results for the nSD approach. Automated techniques resulted in accurate and reproducible evaluation of LV scars from CMR-LGE in ischemic patients with performance similar to the manual technique. Their application could minimize user interaction and computational time, even when compared with semiautomated approaches.
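
    The two threshold-based rules named in this record are simple enough to sketch directly; the masks, image and the n = 5 choice below are illustrative assumptions, not the study's protocol.

    ```python
    # Sketch: n-SD and FWHM scar thresholding on a 2D LGE intensity slice.
    # Myocardium and remote (healthy) tissue masks are assumed to be given.
    import numpy as np

    def nsd_threshold(image, myocardium, remote, n=5.0):
        """Scar = myocardial pixels brighter than mean + n*SD of remote tissue."""
        mu, sd = image[remote].mean(), image[remote].std()
        return myocardium & (image > mu + n * sd)

    def fwhm_threshold(image, myocardium):
        """Scar = myocardial pixels above half the maximum enhanced intensity."""
        peak = image[myocardium].max()
        return myocardium & (image > 0.5 * peak)

    rng = np.random.default_rng(1)
    img = rng.normal(100, 10, (64, 64)); img[20:30, 20:30] += 120  # fake bright scar
    myo = np.ones_like(img, bool)
    rem = np.zeros_like(img, bool); rem[40:, 40:] = True
    print(nsd_threshold(img, myo, rem).sum(), fwhm_threshold(img, myo).sum())
    ```

    The clustering methods (Gaussian mixture, fuzzy c-means) replace the fixed threshold with a model fitted to the intensity histogram, which is why they need no remote-tissue reference region.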

  1. A matching framework to improve causal inference in interrupted time-series analysis.

    PubMed

    Linden, Ariel

    2018-04-01

    Interrupted time-series analysis (ITSA) is a popular evaluation methodology in which a single treatment unit's outcome is studied over time and the intervention is expected to "interrupt" the level and/or trend of the outcome, subsequent to its introduction. When ITSA is implemented without a comparison group, the internal validity may be quite poor. Therefore, adding a comparable control group to serve as the counterfactual is always preferred. This paper introduces a novel matching framework, ITSAMATCH, to create a comparable control group by matching directly on covariates and then use these matches in the outcomes model. We evaluate the effect of California's Proposition 99 (passed in 1988) for reducing cigarette sales, by comparing California to other states not exposed to smoking reduction initiatives. We compare ITSAMATCH results to 2 commonly used matching approaches, synthetic controls (SYNTH), and regression adjustment; SYNTH reweights nontreated units to make them comparable to the treated unit, and regression adjusts covariates directly. Methods are compared by assessing covariate balance and treatment effects. Both ITSAMATCH and SYNTH achieved covariate balance and estimated similar treatment effects. The regression model found no treatment effect and produced inconsistent covariate adjustment. While the matching framework achieved results comparable to SYNTH, it has the advantage of being technically less complicated, while producing statistical estimates that are straightforward to interpret. Conversely, regression adjustment may "adjust away" a treatment effect. Given its advantages, ITSAMATCH should be considered as a primary approach for evaluating treatment effects in multiple-group time-series analysis. © 2017 John Wiley & Sons, Ltd.
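
    For orientation, the single-group segmented regression underlying ITSA can be written in a few lines; the simulated series and break point below are stand-ins, and the matching step of ITSAMATCH is not reproduced.

    ```python
    # Sketch: the standard single-group ITSA segmented regression,
    # y = b0 + b1*t + b2*post + b3*(t - t0)*post, fitted by OLS.
    # Monthly sales and the intervention point are simulated stand-ins.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    t = np.arange(120)                      # months
    t0 = 60                                 # intervention point
    post = (t >= t0).astype(float)
    y = 50 - 0.05 * t - 4 * post - 0.1 * (t - t0) * post + rng.normal(0, 1, t.size)

    X = sm.add_constant(np.column_stack([t, post, (t - t0) * post]))
    fit = sm.OLS(y, X).fit()
    print(fit.params)   # [level, trend, level change, trend change]
    ```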

  2. Gold-standard for computer-assisted morphological sperm analysis.

    PubMed

    Chang, Violeta; Garcia, Alejandra; Hitschfeld, Nancy; Härtel, Steffen

    2017-04-01

    Published algorithms for classification of human sperm heads are based on relatively small image databases that are not open to the public, and thus no direct comparison is available for competing methods. We describe a gold-standard for morphological sperm analysis (SCIAN-MorphoSpermGS), a dataset of sperm head images with expert-classification labels in one of the following classes: normal, tapered, pyriform, small or amorphous. This gold-standard is for evaluating and comparing known techniques and future improvements to present approaches for classification of human sperm heads for semen analysis. Although this paper does not provide a computational tool for morphological sperm analysis, we present a set of experiments comparing common sperm head description and classification techniques. This classification baseline is intended as a reference for future improvements to present approaches for human sperm head classification. The gold-standard provides a label for each sperm head, which is achieved by majority voting among experts. The classification baseline compares four supervised learning methods (1-Nearest Neighbor, naive Bayes, decision trees and Support Vector Machine (SVM)) and three shape-based descriptors (Hu moments, Zernike moments and Fourier descriptors), reporting the accuracy and the true positive rate for each experiment. We used Fleiss' Kappa Coefficient to evaluate the inter-expert agreement and Fisher's exact test for inter-expert variability and statistically significant differences between descriptors and learning techniques. Our results confirm the high degree of inter-expert variability in morphological sperm analysis. Regarding the classification baseline, we show that none of the standard descriptors or classification approaches is well suited to tackling the problem of sperm head classification. We discovered that the correct classification rate was highly variable when trying to discriminate among non-normal sperm heads. By using the Fourier descriptor and SVM, we achieved the best mean correct classification: only 49%. We conclude that the SCIAN-MorphoSpermGS will provide a standard tool for evaluation of characterization and classification approaches for human sperm heads. Indeed, there is a clear need for a specific shape-based descriptor for human sperm heads and a specific classification approach to tackle the problem of high variability within subcategories of abnormal sperm cells. Copyright © 2017 Elsevier Ltd. All rights reserved.
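
    One descriptor/classifier pair from this baseline (Hu moments + SVM) is sketched below. The binary masks and labels are random stand-ins; only the pipeline shape is shown, not the paper's data or tuning.

    ```python
    # Sketch: Hu-moment shape features of binary sperm-head masks fed to an
    # SVM, cross-validated. Masks and labels here are placeholders.
    import numpy as np
    import cv2
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    def hu_features(mask):
        """Log-scaled Hu moment invariants of a binary head mask."""
        hu = cv2.HuMoments(cv2.moments(mask, binaryImage=True)).ravel()
        return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)

    rng = np.random.default_rng(3)
    masks = (rng.random((40, 32, 32)) > 0.5).astype(np.uint8)  # stand-in masks
    labels = np.repeat(np.arange(5), 8)                        # 5 morphology classes
    X = np.array([hu_features(m) for m in masks])
    print(cross_val_score(SVC(kernel="rbf"), X, labels, cv=4).mean())
    ```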

  3. Semi-automatic motion compensation of contrast-enhanced ultrasound images from abdominal organs for perfusion analysis.

    PubMed

    Schäfer, Sebastian; Nylund, Kim; Sævik, Fredrik; Engjom, Trond; Mézl, Martin; Jiřík, Radovan; Dimcevski, Georg; Gilja, Odd Helge; Tönnies, Klaus

    2015-08-01

    This paper presents a system for correcting motion influences in time-dependent 2D contrast-enhanced ultrasound (CEUS) images to assess tissue perfusion characteristics. The system consists of a semi-automatic frame selection method to find images with out-of-plane motion as well as a method for automatic motion compensation. Translational and non-rigid motion compensation is applied by introducing a temporal continuity assumption. A study of 40 clinical datasets was conducted to compare the measured perfusion with simulated perfusion using pharmacokinetic modeling. Overall, the proposed approach decreased the mean average difference between the measured perfusion and the pharmacokinetic model estimation. It was non-inferior to a manual approach for three out of four patient cohorts and reduced the analysis time by 41% compared to manual processing. Copyright © 2014 Elsevier Ltd. All rights reserved.
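
    The translational part of such a scheme, with a crude version of the temporal continuity assumption (smoothing the estimated shift trajectory), might look like the sketch below; the non-rigid refinement and frame selection from the paper are omitted, and the frame stack is a stand-in.

    ```python
    # Sketch: translational motion compensation across a CEUS frame stack,
    # with median smoothing of the estimated shifts as a simple temporal
    # continuity constraint.
    import numpy as np
    from scipy.ndimage import shift as nd_shift, median_filter
    from skimage.registration import phase_cross_correlation

    def compensate(frames):
        ref = frames[0]
        shifts = np.array([phase_cross_correlation(ref, f)[0] for f in frames])
        shifts = median_filter(shifts, size=(5, 1))   # enforce smooth trajectories
        return np.array([nd_shift(f, s) for f, s in zip(frames, shifts)])

    rng = np.random.default_rng(4)
    stack = rng.random((30, 64, 64))          # stand-in CEUS cine loop
    print(compensate(stack).shape)
    ```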

  4. Use of Multiscale Entropy to Facilitate Artifact Detection in Electroencephalographic Signals

    PubMed Central

    Mariani, Sara; Borges, Ana F. T.; Henriques, Teresa; Goldberger, Ary L.; Costa, Madalena D.

    2016-01-01

    Electroencephalographic (EEG) signals present a myriad of challenges to analysis, beginning with the detection of artifacts. Prior approaches to noise detection have utilized multiple techniques, including visual methods, independent component analysis and wavelets. However, no single method is broadly accepted, inviting alternative ways to address this problem. Here, we introduce a novel approach based on a statistical physics method, multiscale entropy (MSE) analysis, which quantifies the complexity of a signal. We postulate that noise-corrupted EEG signals have lower information content and, therefore, reduced complexity compared with their noise-free counterparts. We test the new method on an open-access database of EEG signals with and without added artifacts due to electrode motion. PMID:26738116
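
    As a rough illustration of MSE, the sketch below coarse-grains a signal over increasing scales and computes sample entropy at each scale. The parameters (m = 2, r = 0.15 SD, with the tolerance re-estimated per scale) are common defaults assumed here, not taken from the paper.

    ```python
    # Sketch: multiscale entropy -- coarse-grain the signal at increasing
    # scales, then compute sample entropy of each coarse-grained series.
    import numpy as np

    def sample_entropy(x, m=2, r=0.15):
        x = np.asarray(x, float)
        tol = r * x.std()   # tolerance (some variants fix this to the original SD)
        def count(mm):
            t = np.lib.stride_tricks.sliding_window_view(x, mm)
            d = np.abs(t[:, None, :] - t[None, :, :]).max(-1)
            return (d[np.triu_indices(len(t), 1)] <= tol).sum()
        b, a = count(m), count(m + 1)          # template matches at m and m+1
        return -np.log(a / b) if a and b else np.inf

    def multiscale_entropy(x, max_scale=10):
        out = []
        for s in range(1, max_scale + 1):
            n = len(x) // s
            coarse = np.asarray(x[:n * s]).reshape(n, s).mean(axis=1)
            out.append(sample_entropy(coarse))
        return out

    noisy = np.random.default_rng(5).normal(size=1500)   # stand-in EEG epoch
    print(multiscale_entropy(noisy)[:3])
    ```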

  5. A review of automatic mass detection and segmentation in mammographic images.

    PubMed

    Oliver, Arnau; Freixenet, Jordi; Martí, Joan; Pérez, Elsa; Pont, Josep; Denton, Erika R E; Zwiggelaar, Reyer

    2010-04-01

    The aim of this paper is to review existing approaches to the automatic detection and segmentation of masses in mammographic images, highlighting the key points and main differences between the strategies used. The key objective is to point out the advantages and disadvantages of the various approaches. In contrast with other reviews, which only describe and compare different approaches qualitatively, this review also provides a quantitative comparison. The performance of seven mass detection methods is compared using two different mammographic databases: a public digitised database and a local full-field digital database. The results are given in terms of Receiver Operating Characteristic (ROC) and Free-response Receiver Operating Characteristic (FROC) analysis. Copyright 2009 Elsevier B.V. All rights reserved.

  6. A Comparison of a Centralized Versus De-Centralized Recruitment Schema in Two Community-Based Participatory Research Studies for Cancer Prevention

    PubMed Central

    Adams, Swann Arp; Heiney, Sue P.; Brandt, Heather M.; Wirth, Michael D.; Khan, Samira; Johnson, Hiluv; Davis, Lisa; Wineglass, Cassandra M.; Warren-Jones, Tatiana Y.; Felder, Tisha M.; Drayton, Ruby F.; Davis, Briana; Farr, Deeonna E.; Hébert, James R.

    2014-01-01

    Use of community-based participatory research (CBPR) approaches is increasing with the goal of making more meaningful and impactful advances in eliminating cancer-related health disparities. While many reports have espoused its advantages, few investigations have focused on comparing CBPR-oriented recruitment and retention. Consequently, the purpose of this analysis was to report and compare two different CBPR approaches in two cancer prevention studies. We utilized frequencies and chi-squared tests to compare and contrast subject recruitment and retention for two studies that incorporated a randomized, controlled design of a dietary and physical activity intervention among African Americans (AA). One study utilized a de-centralized approach to recruitment, in which primary responsibility for recruitment was assigned to the general AA community of various church partners, whereas the other incorporated a centralized approach, in which a single lay community individual was hired as research personnel to lead recruitment and intervention delivery. Both studies performed equally well for both recruitment and retention (75 and 88% recruitment rates and 71 and 66% retention rates), far exceeding the rates traditionally cited for cancer clinical trials (~5%). The de-centralized approach appeared to result in statistically greater retention for control participants compared to the centralized approach (77 vs 51%, P<0.01). Consequently, both CBPR approaches appeared to greatly enhance recruitment and retention rates in AA populations. We further note lessons learned and challenges to consider for future research opportunities. PMID:25086566

  7. Accuracy of Digital vs Conventional Implant Impression Approach: A Three-Dimensional Comparative In Vitro Analysis.

    PubMed

    Basaki, Kinga; Alkumru, Hasan; De Souza, Grace; Finer, Yoav

    To assess the three-dimensional (3D) accuracy and clinical acceptability of implant definitive casts fabricated using a digital impression approach and to compare the results with those of a conventional impression method in a partially edentulous condition. A mandibular reference model was fabricated with implants in the first premolar and molar positions to simulate a patient with bilateral posterior edentulism. Ten implant-level impressions per method were made using either an intraoral scanner with scanning abutments for the digital approach or an open-tray technique and polyvinylsiloxane material for the conventional approach. 3D analysis and comparison of implant location on the resultant definitive casts were performed using a laser scanner and quality control software. The inter-implant distances and inter-implant angulations for each implant pair were measured for the reference model and for each definitive cast (n = 20 per group); these measurements were compared to calculate the magnitude of error in 3D for each definitive cast. The influence of implant angulation on definitive cast accuracy was evaluated for both digital and conventional approaches. Statistical analysis was performed using the t test (α = .05) for implant position and angulation. Clinical qualitative assessment of accuracy was done via assessment of the passivity of a master verification stent for each implant pair, and significance was analyzed using the chi-square test (α = .05). A 3D error of implant positioning was observed for the two impression techniques vs the reference model, with mean ± standard deviation (SD) errors of 116 ± 94 μm and 56 ± 29 μm for the digital and conventional approaches, respectively (P = .01). In contrast, the inter-implant angulation errors were not significantly different between the two techniques (P = .83). Implant angulation did not have a significant influence on definitive cast accuracy within either technique (P = .64). The verification stent demonstrated acceptable passive fit for 11 out of 20 casts and 18 out of 20 casts for the digital and conventional methods, respectively (P = .01). Definitive casts fabricated using the digital impression approach were less accurate than those fabricated from the conventional impression approach for this simulated clinical scenario. A significant number of definitive casts generated by the digital technique did not meet clinically acceptable accuracy for the fabrication of a multiple implant-supported restoration.
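
    The headline comparison (116 ± 94 μm vs 56 ± 29 μm, n = 20 per group) can be reproduced from the summary statistics alone; the sketch below assumes a Welch t-test, which is consistent with, though not stated to be, the test used.

    ```python
    # Sketch: two-sample comparison of mean 3D positioning error directly
    # from the reported summary statistics (Welch's t-test assumed).
    from scipy.stats import ttest_ind_from_stats

    stat, p = ttest_ind_from_stats(mean1=116, std1=94, nobs1=20,
                                   mean2=56,  std2=29, nobs2=20,
                                   equal_var=False)
    print(f"t = {stat:.2f}, p = {p:.3f}")   # p comes out near the reported .01
    ```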

  8. Functional vs. Traditional Analysis in Biomechanical Gait Data: An Alternative Statistical Approach

    PubMed Central

    Seeley, Matthew K.; Francom, Devin; Reese, C. Shane; Hopkins, J. Ty

    2017-01-01

    In human motion studies, discrete points such as peak or average kinematic values are commonly selected to test hypotheses. The purpose of this study was to describe a functional data analysis and describe the advantages of using functional data analyses when compared with a traditional analysis of variance (ANOVA) approach. Nineteen healthy participants (age: 22 ± 2 yrs, body height: 1.7 ± 0.1 m, body mass: 73 ± 16 kg) walked under two different conditions: control and pain+effusion. Pain+effusion was induced by injection of sterile saline into the joint capsule and hypertonic saline into the infrapatellar fat pad. Sagittal-plane ankle, knee, and hip joint kinematics were recorded and compared following injections using 2×2 mixed model ANOVAs and FANOVAs. The results of ANOVAs detected a condition × time interaction for the peak ankle (F1,18 = 8.56, p = 0.01) and hip joint angle (F1,18 = 5.77, p = 0.03), but did not for the knee joint angle (F1,18 = 0.36, p = 0.56). The functional data analysis, however, found several differences at initial contact (ankle and knee joint), in the mid-stance (each joint) and at toe off (ankle). Although a traditional ANOVA is often appropriate for discrete or summary data, in biomechanical applications, the functional data analysis could be a beneficial alternative. When using the functional data analysis approach, a researcher can (1) evaluate the entire data as a function, and (2) detect the location and magnitude of differences within the evaluated function. PMID:29339984
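
    To show the spirit of the functional comparison (testing the whole curve rather than a single peak value), here is a simple pointwise permutation test over the gait cycle. It is a stand-in for the FANOVA machinery, not the authors' method, and the curves are simulated.

    ```python
    # Sketch: pointwise permutation test of two sets of kinematic curves,
    # returning a p-value at every point of the gait cycle.
    import numpy as np

    def pointwise_permutation_test(a, b, n_perm=2000, seed=0):
        """a, b: (subjects x time) kinematic curves for two conditions."""
        rng = np.random.default_rng(seed)
        obs = np.abs(a.mean(0) - b.mean(0))            # observed mean difference
        pooled = np.vstack([a, b])
        count = np.zeros_like(obs)
        for _ in range(n_perm):
            idx = rng.permutation(len(pooled))
            pa, pb = pooled[idx[:len(a)]], pooled[idx[len(a):]]
            count += np.abs(pa.mean(0) - pb.mean(0)) >= obs
        return count / n_perm                          # pointwise p-values

    rng = np.random.default_rng(6)
    control = rng.normal(0, 1, (19, 101))              # 19 subjects, 101 time points
    pain = control + np.linspace(0, 0.8, 101)          # effect grows through stance
    print((pointwise_permutation_test(control, pain) < 0.05).sum(), "time points differ")
    ```

    A full functional analysis would additionally control for multiple comparisons across the cycle, which this sketch leaves out.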

  9. Functional vs. Traditional Analysis in Biomechanical Gait Data: An Alternative Statistical Approach.

    PubMed

    Park, Jihong; Seeley, Matthew K; Francom, Devin; Reese, C Shane; Hopkins, J Ty

    2017-12-01

    In human motion studies, discrete points such as peak or average kinematic values are commonly selected to test hypotheses. The purpose of this study was to describe a functional data analysis and describe the advantages of using functional data analyses when compared with a traditional analysis of variance (ANOVA) approach. Nineteen healthy participants (age: 22 ± 2 yrs, body height: 1.7 ± 0.1 m, body mass: 73 ± 16 kg) walked under two different conditions: control and pain+effusion. Pain+effusion was induced by injection of sterile saline into the joint capsule and hypertonic saline into the infrapatellar fat pad. Sagittal-plane ankle, knee, and hip joint kinematics were recorded and compared following injections using 2×2 mixed model ANOVAs and FANOVAs. The results of ANOVAs detected a condition × time interaction for the peak ankle (F1,18 = 8.56, p = 0.01) and hip joint angle (F1,18 = 5.77, p = 0.03), but did not for the knee joint angle (F1,18 = 0.36, p = 0.56). The functional data analysis, however, found several differences at initial contact (ankle and knee joint), in the mid-stance (each joint) and at toe off (ankle). Although a traditional ANOVA is often appropriate for discrete or summary data, in biomechanical applications, the functional data analysis could be a beneficial alternative. When using the functional data analysis approach, a researcher can (1) evaluate the entire data as a function, and (2) detect the location and magnitude of differences within the evaluated function.

  10. Functional vs. Traditional Analysis in Biomechanical Gait Data: An Alternative Statistical Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Jihong; Seeley, Matthew K.; Francom, Devin

    In human motion studies, discrete points such as peak or average kinematic values are commonly selected to test hypotheses. The purpose of this study was to describe a functional data analysis and describe the advantages of using functional data analyses when compared with a traditional analysis of variance (ANOVA) approach. Nineteen healthy participants (age: 22 ± 2 yrs, body height: 1.7 ± 0.1 m, body mass: 73 ± 16 kg) walked under two different conditions: control and pain+effusion. Pain+effusion was induced by injection of sterile saline into the joint capsule and hypertonic saline into the infrapatellar fat pad. Sagittal-plane ankle, knee, and hip joint kinematics were recorded and compared following injections using 2×2 mixed model ANOVAs and FANOVAs. The results of ANOVAs detected a condition × time interaction for the peak ankle (F1,18 = 8.56, p = 0.01) and hip joint angle (F1,18 = 5.77, p = 0.03), but did not for the knee joint angle (F1,18 = 0.36, p = 0.56). The functional data analysis, however, found several differences at initial contact (ankle and knee joint), in the mid-stance (each joint) and at toe off (ankle). Although a traditional ANOVA is often appropriate for discrete or summary data, in biomechanical applications, the functional data analysis could be a beneficial alternative. Thus when using the functional data analysis approach, a researcher can (1) evaluate the entire data as a function, and (2) detect the location and magnitude of differences within the evaluated function.

  11. Functional vs. Traditional Analysis in Biomechanical Gait Data: An Alternative Statistical Approach

    DOE PAGES

    Park, Jihong; Seeley, Matthew K.; Francom, Devin; ...

    2017-12-28

    In human motion studies, discrete points such as peak or average kinematic values are commonly selected to test hypotheses. The purpose of this study was to describe a functional data analysis and describe the advantages of using functional data analyses when compared with a traditional analysis of variance (ANOVA) approach. Nineteen healthy participants (age: 22 ± 2 yrs, body height: 1.7 ± 0.1 m, body mass: 73 ± 16 kg) walked under two different conditions: control and pain+effusion. Pain+effusion was induced by injection of sterile saline into the joint capsule and hypertonic saline into the infrapatellar fat pad. Sagittal-plane ankle, knee, and hip joint kinematics were recorded and compared following injections using 2×2 mixed model ANOVAs and FANOVAs. The results of ANOVAs detected a condition × time interaction for the peak ankle (F1,18 = 8.56, p = 0.01) and hip joint angle (F1,18 = 5.77, p = 0.03), but did not for the knee joint angle (F1,18 = 0.36, p = 0.56). The functional data analysis, however, found several differences at initial contact (ankle and knee joint), in the mid-stance (each joint) and at toe off (ankle). Although a traditional ANOVA is often appropriate for discrete or summary data, in biomechanical applications, the functional data analysis could be a beneficial alternative. Thus when using the functional data analysis approach, a researcher can (1) evaluate the entire data as a function, and (2) detect the location and magnitude of differences within the evaluated function.

  12. Comparative SIFT-MS, GC-MS and FTIR analysis of methane fuel produced in biogas stations and in artificial photosynthesis over acidic anatase TiO2 and montmorillonite

    NASA Astrophysics Data System (ADS)

    Knížek, Antonín; Dryahina, Ksenyia; Španěl, Patrik; Kubelík, Petr; Kavan, Ladislav; Zukalová, Markéta; Ferus, Martin; Civiš, Svatopluk

    2018-06-01

    The era of fossil fuels is slowly nearing its inevitable end, and basic research, exploration and testing of alternative energy sources become ever more important. Storage and alternative production of energy from fuels such as methane represent one of the many alternative approaches. Natural gas containing methane is a powerful source of energy whose combustion produces large volumes of greenhouse gases. However, methane can also be produced in closed, CO2-neutral cycles. In our study, we compare the detailed chemical composition of CH4 fuel produced by two different processes: classical production of biogas in a rendering station, an industrial wastewater treatment station and a landfill gas station, and a novel approach of artificial photosynthesis from CO2 over acidic anatase TiO2 in an experimental apparatus developed in our laboratory. The analysis of CH4 fuel produced in these processes is important: trace gaseous components can be corrosive or toxic, and low quality of the mixture reduces the efficiency of energy production. In this analysis, we present a combination of methods: high resolution Fourier transform infrared spectroscopy (HR-FTIR), suitable for main-component analysis, and the complementary, extremely sensitive methods of Selected Ion Flow Tube Mass Spectrometry (SIFT-MS) and gas chromatography-mass spectrometry (GC-MS), which are in turn best suited for trace analysis. The combination of these methods provides more information than any single one of them and promises a new possible analytical approach to fuel and gaseous mixture analysis.

  13. Insight and Evidence Motivating the Simplification of Dual-Analysis Hybrid Systems into Single-Analysis Hybrid Systems

    NASA Technical Reports Server (NTRS)

    Todling, Ricardo; Diniz, F. L. R.; Takacs, L. L.; Suarez, M. J.

    2018-01-01

    Many hybrid data assimilation systems currently used for NWP employ some form of dual-analysis system approach. Typically a hybrid variational analysis is responsible for creating initial conditions for high-resolution forecasts, and an ensemble analysis system is responsible for creating sample perturbations used to form the flow-dependent part of the background error covariance required in the hybrid analysis component. In many of these, the two analysis components employ different methodologies, e.g., variational and ensemble Kalman filter. In such cases, it is not uncommon to have observations treated rather differently between the two analysis components; recentering of the ensemble analysis around the hybrid analysis is used to compensate for such differences. Furthermore, in many cases, the hybrid variational high-resolution system implements some type of four-dimensional approach, whereas the underlying ensemble system relies on a three-dimensional approach, which again introduces discrepancies into the overall system. Connected to these issues is the expectation that one can reliably estimate observation impact on forecasts issued from hybrid analyses by using an ensemble approach based on the underlying ensemble strategy of dual-analysis systems. Just the realization that the ensemble analysis makes substantially different use of observations compared to its hybrid counterpart should serve as enough evidence of the implausibility of such an expectation. This presentation assembles numerous pieces of anecdotal evidence to illustrate that hybrid dual-analysis systems must, at the very minimum, strive for consistent use of the observations in both analysis sub-components. Beyond that, this work suggests that hybrid systems can reliably be constructed without the need to employ a dual-analysis approach. In practice, the idea of relying on a single analysis system is appealing from a cost-maintenance perspective. More generally, single-analysis systems avoid contradictions such as having to use one sub-component to generate performance diagnostics for another, possibly not fully consistent, component.

  14. Integration of heterogeneous data for classification in hyperspectral satellite imagery

    NASA Astrophysics Data System (ADS)

    Benedetto, J.; Czaja, W.; Dobrosotskaya, J.; Doster, T.; Duke, K.; Gillis, D.

    2012-06-01

    As new remote sensing modalities emerge, it becomes increasingly important to find more suitable algorithms for fusion and integration of different data types for the purposes of target/anomaly detection and classification. Typical techniques that deal with this problem are based on performing detection/classification/segmentation separately in chosen modalities, and then integrating the resulting outcomes into a more complete picture. In this paper we provide a broad analysis of a new approach, based on creating fused representations of the multimodal data, which can then be subjected to analysis by means of state-of-the-art classifiers or detectors. In this scenario we consider hyperspectral imagery combined with spatial information. Our approach involves machine learning techniques based on analysis of joint data-dependent graphs and their associated diffusion kernels. The significant eigenvectors of the derived fused graph Laplace operator then form the new representation, which provides integrated features from the heterogeneous input data. We compare these fused approaches with analysis of integrated outputs of spatial and spectral graph methods.
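
    One plausible reading of the fused-graph construction is sketched below: concatenate spectral and (scaled) spatial coordinates, build a single kNN graph, and use leading Laplacian eigenvectors as fused features. The spatial weighting, graph type and solver are assumptions, not the paper's exact recipe.

    ```python
    # Sketch: fused spectral-spatial embedding via graph Laplacian
    # eigenvectors of a joint data-dependent kNN graph.
    import numpy as np
    from sklearn.neighbors import kneighbors_graph
    from scipy.sparse import identity
    from scipy.sparse.linalg import eigsh

    def fused_embedding(cube, spatial_weight=0.1, k=10, dim=8):
        h, w, bands = cube.shape
        yy, xx = np.mgrid[0:h, 0:w]
        feats = np.hstack([cube.reshape(-1, bands),
                           spatial_weight * np.c_[yy.ravel(), xx.ravel()]])
        W = kneighbors_graph(feats, k, mode="connectivity", include_self=False)
        W = 0.5 * (W + W.T)                       # symmetrize adjacency
        d = np.asarray(W.sum(1)).ravel()
        dinv = np.sqrt(1.0 / np.maximum(d, 1e-12))
        L = identity(W.shape[0]) - W.multiply(dinv[:, None]).multiply(dinv[None, :])
        vals, vecs = eigsh(L.tocsc(), k=dim + 1, which="SM")
        return vecs[:, 1:]                        # drop the trivial eigenvector

    cube = np.random.default_rng(7).random((20, 20, 30))   # stand-in hyperspectral tile
    print(fused_embedding(cube).shape)                     # (400, 8)
    ```

    The fused features can then be handed to any off-the-shelf classifier, which is the point of the representation-first approach the record describes.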

  15. Comparison of Requirements for Composite Structures for Aircraft and Space Applications

    NASA Technical Reports Server (NTRS)

    Raju, Ivatury S.; Elliot, Kenny B.; Hampton, Roy W.; Knight, Norman F., Jr.; Aggarwal, Pravin; Engelstad, Stephen P.; Chang, James B.

    2010-01-01

    In this report, the aircraft and space vehicle requirements for composite structures are compared. It is a valuable exercise to study composite structural design approaches used in the airframe industry and to adopt methodology that is applicable for space vehicles. The missions, environments, analysis methods, analysis validation approaches, testing programs, build quantities, inspection, and maintenance procedures used by the airframe industry, in general, are not transferable to spaceflight hardware. Therefore, while the application of composite design approaches from aircraft and other industries is appealing, many aspects cannot be directly utilized. Nevertheless, experiences and research for composite aircraft structures may be of use in unexpected arenas as space exploration technology develops, and so continued technology exchanges are encouraged.

  16. Application of meta-analysis methods for identifying proteomic expression level differences.

    PubMed

    Amess, Bob; Kluge, Wolfgang; Schwarz, Emanuel; Haenisch, Frieder; Alsaif, Murtada; Yolken, Robert H; Leweke, F Markus; Guest, Paul C; Bahn, Sabine

    2013-07-01

    We present new statistical approaches for identification of proteins with expression levels that are significantly changed when applying meta-analysis to two or more independent experiments. We showed that the Euclidean distance measure has a reduced risk of false positives compared to the rank product method. Our Ψ-ranking method has advantages over the traditional fold-change approach by incorporating both the fold-change direction and the p-value. In addition, the second novel method, Π-ranking, considers the ratio of the fold-change and thus integrates all three parameters. We further improved the latter by introducing our third technique, Σ-ranking, which combines all three parameters in a balanced nonparametric approach. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
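
    For context, the rank product baseline that the record compares against is a few lines of code: the geometric mean of per-experiment ranks. The Ψ/Π/Σ rankings themselves are not reproduced here, and the fold-change data below is simulated.

    ```python
    # Sketch: the classical rank product statistic (baseline method).
    # Small rank product = consistently up-regulated across experiments.
    import numpy as np
    from scipy.stats import rankdata

    def rank_product(fold_changes):
        """fold_changes: (experiments x proteins) log fold-changes."""
        ranks = np.vstack([rankdata(-fc) for fc in fold_changes])  # rank 1 = largest
        return np.exp(np.log(ranks).mean(axis=0))

    rng = np.random.default_rng(8)
    fc = rng.normal(0, 1, (3, 200)); fc[:, :5] += 2.0   # 5 truly changed proteins
    print(np.argsort(rank_product(fc))[:5])             # should recover indices 0..4
    ```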

  17. Approach for classification and taxonomy within family Rickettsiaceae based on the Formal Order Analysis.

    PubMed

    Shpynov, S; Pozdnichenko, N; Gumenuk, A

    2015-01-01

    Genome sequences of 36 Rickettsia and Orientia were analyzed using Formal Order Analysis (FOA). This approach takes into account the arrangement of nucleotides in each sequence. A numerical characteristic, the average distance (remoteness) "g", was used to compare genomes. Our results corroborated the previous separation of three groups within the genus Rickettsia, including the typhus group, the classic spotted fever group and the ancestral group, with Orientia as a separate genus. Rickettsia felis URRWXCal2 and R. akari Hartford were not in the same group based on FOA; therefore, the designation of a so-called transitional Rickettsia group could not be confirmed with this approach. Copyright © 2015 Institut Pasteur. Published by Elsevier Masson SAS. All rights reserved.

  18. Preparation of sumoylated substrates for biochemical analysis.

    PubMed

    Knipscheer, Puck; Klug, Helene; Sixma, Titia K; Pichler, Andrea

    2009-01-01

    Covalent modification of proteins with SUMO (small ubiquitin related modifier) affects many cellular processes like transcription, nuclear transport, DNA repair and cell cycle progression. Although hundreds of SUMO targets have been identified, for several of them the function remains obscure. In the majority of cases sumoylation is investigated via "loss of modification" analysis by mutating the relevant target lysine. However, in other cases this approach is not successful since mapping of the modification site is problematic or mutation does not cause an obvious phenotype. These latter cases ask for different approaches to investigate the target modification. One possibility is to choose the opposite approach, a "gain in modification" analysis by producing both SUMO modified and unmodified protein in vitro and comparing them in functional assays. Here, we describe the purification of the ubiquitin conjugating enzyme E2-25K, its in vitro sumoylation with recombinant enzymes and the subsequent separation and purification of the modified and the unmodified forms.

  19. Spatiotemporal Bayesian analysis of Lyme disease in New York state, 1990-2000.

    PubMed

    Chen, Haiyan; Stratton, Howard H; Caraco, Thomas B; White, Dennis J

    2006-07-01

    Mapping ordinarily increases our understanding of nontrivial spatial and temporal heterogeneities in disease rates. However, the large number of parameters required by the corresponding statistical models often complicates detailed analysis. This study investigates the feasibility of a fully Bayesian hierarchical regression approach to the problem and shows how it outperforms two more widely used methods: crude rate estimates (CRE) and empirical Bayes standardization (EBS). In particular, we apply a fully Bayesian approach to the spatiotemporal analysis of Lyme disease incidence in New York state for the period 1990-2000. These results are compared with those obtained by CRE and EBS in Chen et al. (2005). We show that the fully Bayesian regression model not only gives more reliable estimates of disease rates than the other two approaches but also allows for tractable models that can accommodate more numerous sources of variation and unknown parameters.
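
    As a point of comparison, the empirical Bayes standardization idea can be sketched with a Poisson-gamma shrinkage of crude county rates toward the global rate; the moment-based prior fit below is a rough, generic version (the fully Bayesian model in the record would instead be estimated by MCMC).

    ```python
    # Sketch: empirical Bayes smoothing of area disease rates. Crude rates
    # y/n are shrunk toward the global rate via a gamma prior fitted by a
    # crude method of moments. All data are simulated stand-ins.
    import numpy as np

    def eb_smooth(cases, population):
        crude = cases / population
        m = np.average(crude, weights=population)          # global mean rate
        v = np.average((crude - m) ** 2, weights=population)
        between = max(v - m / population.mean(), 1e-12)    # rough moment estimate
        a, b = m**2 / between, m / between                 # gamma(shape a, rate b)
        return (cases + a) / (population + b)              # posterior mean rates

    rng = np.random.default_rng(9)
    pop = rng.integers(1_000, 100_000, 30).astype(float)
    cases = rng.poisson(pop * 0.0005)
    print(eb_smooth(cases, pop)[:5])
    ```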

  20. Expedited Selection of NMR Chiral Solvating Agents for Determination of Enantiopurity

    PubMed Central

    2016-01-01

    The use of NMR chiral solvating agents (CSAs) for the analysis of enantiopurity has been known for decades, but has been supplanted in recent years by chromatographic enantioseparation technology. While chromatographic methods for the analysis of enantiopurity are now commonplace and easy to implement, there are still individual compounds and entire classes of analytes where enantioseparation can prove extremely difficult, notably, compounds that are chiral by virtue of very subtle differences such as isotopic substitution or small differences in alkyl chain length. NMR analysis using CSAs can often be useful for such problems, but the traditional approach to selection of an appropriate CSA and the development of an NMR-based analysis method often involves a trial-and-error approach that can be relatively slow and tedious. In this study we describe a high-throughput experimentation approach to the selection of NMR CSAs that employs automation-enabled screening of prepared libraries of CSAs in a systematic fashion. This approach affords excellent results for a standard set of enantioenriched compounds, providing a valuable comparative data set for the effectiveness of CSAs for different classes of compounds. In addition, the technique has been successfully applied to challenging pharmaceutical development problems that are not amenable to chromatographic solutions. Overall, this methodology provides a rapid and powerful approach for investigating enantiopurity that complements and augments conventional chromatographic approaches. PMID:27280168

  1. Deep Learning MR Imaging-based Attenuation Correction for PET/MR Imaging.

    PubMed

    Liu, Fang; Jang, Hyungseok; Kijowski, Richard; Bradshaw, Tyler; McMillan, Alan B

    2018-02-01

    Purpose To develop and evaluate the feasibility of deep learning approaches for magnetic resonance (MR) imaging-based attenuation correction (AC) (termed deep MRAC) in brain positron emission tomography (PET)/MR imaging. Materials and Methods A PET/MR imaging AC pipeline was built by using a deep learning approach to generate pseudo computed tomographic (CT) scans from MR images. A deep convolutional auto-encoder network was trained to identify air, bone, and soft tissue in volumetric head MR images coregistered to CT data for training. A set of 30 retrospective three-dimensional T1-weighted head images was used to train the model, which was then evaluated in 10 patients by comparing the generated pseudo CT scan to an acquired CT scan. A prospective study was carried out for utilizing simultaneous PET/MR imaging for five subjects by using the proposed approach. Analysis of covariance and paired-sample t tests were used for statistical analysis to compare PET reconstruction error with deep MRAC and two existing MR imaging-based AC approaches with CT-based AC. Results Deep MRAC provides an accurate pseudo CT scan with a mean Dice coefficient of 0.971 ± 0.005 for air, 0.936 ± 0.011 for soft tissue, and 0.803 ± 0.021 for bone. Furthermore, deep MRAC provides good PET results, with average errors of less than 1% in most brain regions. Significantly lower PET reconstruction errors were realized with deep MRAC (-0.7% ± 1.1) compared with Dixon-based soft-tissue and air segmentation (-5.8% ± 3.1) and anatomic CT-based template registration (-4.8% ± 2.2). Conclusion The authors developed an automated approach that allows generation of discrete-valued pseudo CT scans (soft tissue, bone, and air) from a single high-spatial-resolution diagnostic-quality three-dimensional MR image and evaluated it in brain PET/MR imaging. This deep learning approach for MR imaging-based AC provided reduced PET reconstruction error relative to a CT-based standard within the brain compared with current MR imaging-based AC approaches. © RSNA, 2017 Online supplemental material is available for this article.
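
    A heavily scaled-down 2D stand-in for the auto-encoder idea is sketched below: MR slice in, per-pixel class map (air / soft tissue / bone) out. The paper's actual architecture, training data and losses are not reproduced.

    ```python
    # Sketch: tiny convolutional encoder-decoder producing a 3-class
    # pseudo-CT label map from a single MR slice. Illustrative only.
    import torch
    import torch.nn as nn

    class TinyMRACNet(nn.Module):
        def __init__(self, n_classes=3):
            super().__init__()
            self.encoder = nn.Sequential(
                nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            )
            self.decoder = nn.Sequential(
                nn.ConvTranspose2d(32, 16, 2, stride=2), nn.ReLU(),
                nn.ConvTranspose2d(16, n_classes, 2, stride=2),  # class logits
            )

        def forward(self, x):
            return self.decoder(self.encoder(x))

    net = TinyMRACNet()
    mr = torch.randn(1, 1, 64, 64)                  # stand-in T1-weighted slice
    logits = net(mr)                                # (1, 3, 64, 64)
    pseudo_ct_labels = logits.argmax(dim=1)         # 0=air, 1=soft tissue, 2=bone
    print(pseudo_ct_labels.shape)
    ```

    In the full pipeline, the discrete label map would then be assigned bulk attenuation coefficients per class before PET reconstruction.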

  2. Treating multi-level cervical disc disease with hybrid surgery compared to anterior cervical discectomy and fusion: a systematic review and meta-analysis.

    PubMed

    Lu, Victor M; Zhang, Lucy; Scherman, Daniel B; Rao, Prashanth J; Mobbs, Ralph J; Phan, Kevin

    2017-02-01

    The traditional surgical approach to treat multi-level cervical disc disease (mCDD) has been anterior cervical discectomy and fusion (ACDF). There has been recent development of other surgical approaches to further improve clinical outcomes. Collectively, when elements of these different approaches are combined in surgery, it is known as hybrid surgery (HS) which remains a novel treatment option. A systematic review and meta-analysis was conducted to compare the outcomes of HS versus ACDF for the treatment of mCDD. Relevant articles were identified from six electronic databases from their inception to January 2016. From 8 relevant studies identified, 169 patients undergoing HS were compared with 193 ACDF procedures. Operative time was greater after HS by 42 min (p < 0.00001), with less intraoperative blood loss by 26 mL (p < 0.00001) and shorter return to work by 32 days (p < 0.00001). In terms of clinical outcomes, HS was associated with greater C2-C7 range of motion (ROM) preservation (p < 0.00001) and less functional impairment (p = 0.008) after surgery compared to ACDF. There was no significant difference between HS and ACDF with respect to postoperative pain (p = 0.12). The postoperative course following HS was not significantly different to ACDF in terms of length of stay (p = 0.24) and postoperative complication rates (p = 0.18). HS is a novel surgical approach to treat mCDD, associated with a greater operative time, less intraoperative blood loss and comparable if not superior clinical outcomes compared to ACDF. While it remains a viable consideration, there is a lack of robust clinical evidence in the literature. Future large prospective registries and randomised trials are warranted to validate the findings of this study.
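
    The pooling behind such pairwise meta-analyses is standard inverse-variance weighting; a minimal fixed-effect version is sketched below with made-up per-study numbers (the review's actual study-level data are not reproduced).

    ```python
    # Sketch: fixed-effect inverse-variance pooling of a continuous outcome
    # (e.g., per-study mean differences in intraoperative blood loss, mL).
    import numpy as np

    def fixed_effect_pool(mean_diffs, std_errs):
        w = 1.0 / np.asarray(std_errs) ** 2
        pooled = np.sum(w * mean_diffs) / np.sum(w)
        return pooled, np.sqrt(1.0 / np.sum(w))    # pooled estimate and its SE

    md = np.array([-20.0, -31.0, -25.0, -28.0])    # illustrative mean differences
    se = np.array([6.0, 9.0, 7.0, 8.0])
    est, se_pool = fixed_effect_pool(md, se)
    print(f"pooled MD = {est:.1f} mL, "
          f"95% CI = [{est - 1.96 * se_pool:.1f}, {est + 1.96 * se_pool:.1f}]")
    ```

    A random-effects model would additionally estimate between-study heterogeneity (e.g., via DerSimonian-Laird) and widen the interval accordingly.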

  3. Integrative analysis of environmental sequences using MEGAN4.

    PubMed

    Huson, Daniel H; Mitra, Suparna; Ruscheweyh, Hans-Joachim; Weber, Nico; Schuster, Stephan C

    2011-09-01

    A major challenge in the analysis of environmental sequences is data integration. The question is how to analyze different types of data in a unified approach, addressing both the taxonomic and functional aspects. To facilitate such analyses, we have substantially extended MEGAN, a widely used taxonomic analysis program. The new program, MEGAN4, provides an integrated approach to the taxonomic and functional analysis of metagenomic, metatranscriptomic, metaproteomic, and rRNA data. While taxonomic analysis is performed based on the NCBI taxonomy, functional analysis is performed using the SEED classification of subsystems and functional roles or the KEGG classification of pathways and enzymes. A number of examples illustrate how such analyses can be performed, and show that one can also import and compare classification results obtained using others' tools. MEGAN4 is freely available for academic purposes, and installers for all three major operating systems can be downloaded from www-ab.informatik.uni-tuebingen.de/software/megan.

  4. Critical Discourse Analysis in Comparative Education: A Discursive Study of "Partnership" in Tanzania's Poverty Reduction Policies

    ERIC Educational Resources Information Center

    Vavrus, Frances; Seghers, Maud

    2010-01-01

    The study of policy in comparative education has been approached using methods associated with the principal social science disciplines that have informed the field since its inception. In particular, the disciplines of history, political science, sociology, and anthropology have had a significant influence on determining the acceptable methods…

  5. Growth or Steady State? A Bibliometric Focus on International Comparative Higher Education Research

    ERIC Educational Resources Information Center

    Kosmützky, Anna; Krücken, Georg

    2014-01-01

    The study combines a bibliometric approach with a content analysis of abstracts of articles to explore the patterns of international comparative higher education research in leading international journals. The overall data set covers 4,095 publications from the Web of Science for the period 1992-2012 and the amount of international comparative…

  6. Multi-criteria comparative evaluation of spallation reaction models

    NASA Astrophysics Data System (ADS)

    Andrianov, Andrey; Andrianova, Olga; Konobeev, Alexandr; Korovin, Yury; Kuptsov, Ilya

    2017-09-01

    This paper presents an approach to the comparative evaluation of the predictive ability of spallation reaction models, based on widely used, well-proven multiple-criteria decision analysis methods (MAVT/MAUT, AHP, TOPSIS, PROMETHEE), and reports the results of such a comparison for 17 spallation reaction models of the interaction of high-energy protons with natPb.
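
    Of the MCDA methods listed, TOPSIS is compact enough to sketch in full; the decision matrix (model scores per criterion) and weights below are made up for illustration.

    ```python
    # Sketch: minimal TOPSIS ranking of alternatives (rows) on criteria
    # (columns), with benefit[j]=True meaning higher is better on column j.
    import numpy as np

    def topsis(matrix, weights, benefit):
        norm = matrix / np.linalg.norm(matrix, axis=0)        # vector normalization
        v = norm * weights
        ideal = np.where(benefit, v.max(0), v.min(0))
        anti = np.where(benefit, v.min(0), v.max(0))
        d_pos = np.linalg.norm(v - ideal, axis=1)
        d_neg = np.linalg.norm(v - anti, axis=1)
        return d_neg / (d_pos + d_neg)          # closeness: higher = better

    scores = np.array([[0.80, 0.15, 3.0],       # e.g. accuracy, bias, runtime
                       [0.75, 0.10, 1.0],
                       [0.90, 0.25, 9.0]])
    cc = topsis(scores, np.array([0.5, 0.3, 0.2]), np.array([True, False, False]))
    print(np.argsort(-cc))                      # alternative indices, best first
    ```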

  7. Quality Assurance in an International Higher Education Area: A Summary of a Case-Study Approach and Comparative Analysis

    ERIC Educational Resources Information Center

    Bernhard, Andrea

    2012-01-01

    Transparency and comparability of higher education institutions especially in terms of their academic programmes and research activities are important issues for today's working environment. This paper is an overview of a recently completed PhD thesis which outlines examples of selected Organization for Economic Co-operation and Development…

  8. A Comparative Analysis regarding Pictures Included in Secondary School Geography Textbooks Taught in Turkey

    ERIC Educational Resources Information Center

    Yasar, Okan; Seremet, Mehmet

    2007-01-01

    This study brings in a comparative approach regarding pictures involved in secondary school (14-17 ages) textbooks taught in Turkey. In this respect, following the classification of pictures (line drawings and photographs) included in secondary school education geography textbooks, evaluation of the photographs in books in question in terms of…

  9. Transition Systems and Non-Standard Employment in Early Career: Comparing Japan and Switzerland

    ERIC Educational Resources Information Center

    Imdorf, Christian; Helbling, Laura Alexandra; Inui, Akio

    2017-01-01

    Even though Japan and Switzerland are characterised by comparatively low youth unemployment rates, non-standard forms of employment are on the rise, posing a risk to the stable integration of young labour market entrants. Drawing on the French approach of societal analysis, this paper investigates how country-specific school-to-work transition…

  10. Comparing Delivery Approaches to Teaching Abnormal Psychology: Investigating Student Perceptions and Learning Outcomes

    ERIC Educational Resources Information Center

    Goette, William F.; Delello, Julie A.; Schmitt, Andrew L.; Sullivan, Jeremy R.; Rangel, Angelica

    2017-01-01

    This study compares the academic performance and perceptions of 114 undergraduate students enrolled in an abnormal psychology course. Specifically, this study focuses on whether face-to-face (F2F) or blended modalities are associated with student learning outcomes. In this study, data analysis was based upon the examination of end-of-course…

  11. Comparing Child Protective Investigation Performance between Law Enforcement Agencies and Child Welfare Agencies

    ERIC Educational Resources Information Center

    Jordan, Neil; Yampolskaya, Svetlana; Gustafson, Mara; Armstrong, Mary; McNeish, Roxann; Vargo, Amy

    2011-01-01

    This study examines the comparative effectiveness of using law enforcement agencies for child protective investigation (CPI), in contrast with the traditional approach of CPI conducted by the public child welfare agency. The analysis uses 2006-2007 data from a natural experiment conducted in Florida to show modest differences in performance and…

  12. Wavelength dispersive X-ray fluorescence analysis using fundamental parameter approach of Catha edulis and other related plant samples

    NASA Astrophysics Data System (ADS)

    Shaltout, Abdallah A.; Moharram, Mohammed A.; Mostafa, Nasser Y.

    2012-01-01

    This work is the first attempt to quantify trace elements in the Catha edulis plant (Khat) with a fundamental parameter approach. C. edulis is a famous drug plant in East Africa and the Arabian Peninsula. We have previously confirmed that hydroxyapatite represents one of the main inorganic compounds in the leaves and stalks of C. edulis. Comparable plant leaves from basil, mint and green tea were included in the present investigation, and trifolium leaves were included as a non-related plant. The elemental analyses of the plants were done by Wavelength Dispersive X-Ray Fluorescence (WDXRF) spectroscopy. Standard-less quantitative WDXRF analysis was carried out based on the fundamental parameter approach. According to the standard-less analysis algorithms, there is an essential need for an accurate determination of the amount of organic material in the sample. A new approach, based on differential thermal analysis, was successfully used for the organic material determination. The results obtained with this approach were in good agreement with those of the commonly used methods. Using the developed method, quantitative analysis results for eighteen elements (Al, Br, Ca, Cl, Cu, Fe, K, Na, Ni, Mg, Mn, P, Rb, S, Si, Sr, Ti and Zn) were obtained for each plant. The results for the certified reference material of green tea (NCSZC73014, China National Analysis Center for Iron and Steel, Beijing, China) confirmed the validity of the proposed method.

  13. Comparison of a rational vs. high throughput approach for rapid salt screening and selection.

    PubMed

    Collman, Benjamin M; Miller, Jonathan M; Seadeek, Christopher; Stambek, Julie A; Blackburn, Anthony C

    2013-01-01

    In recent years, high throughput (HT) screening has become the most widely used approach for early phase salt screening and selection in a drug discovery/development setting. The purpose of this study was to compare a rational approach for salt screening and selection to results previously generated using an HT approach. The rational approach involved a much smaller number of initial trials (one salt synthesis attempt per counterion), selected based on a few strategic solubility determinations of the free form combined with a theoretical analysis of the ideal solvent solubility conditions for salt formation. Salt screening results for sertraline, tamoxifen, and trazodone using the rational approach were compared to those previously generated by HT screening. The rational approach produced similar results to HT screening, including identification of the commercially chosen salt forms, but with a fraction of the crystallization attempts. Moreover, the rational approach provided enough solid from the very first crystallization of a salt for more thorough and reliable solid-state characterization and thus rapid decision-making. The crystallization techniques used in the rational approach mimic larger-scale process crystallization, allowing smoother technical transfer of the selected salt to the process chemist.

  14. A Cost-Benefit Analysis of Low-Dose Aspirin Prophylaxis for the Prevention of Preeclampsia in the United States.

    PubMed

    Werner, Erika F; Hauspurg, Alisse K; Rouse, Dwight J

    2015-12-01

    To develop a decision model to evaluate the risks, benefits, and costs of different approaches to aspirin prophylaxis for the approximately 4 million pregnant women in the United States annually. We created a decision model to evaluate four approaches to aspirin prophylaxis in the United States: no prophylaxis, prophylaxis per American College of Obstetricians and Gynecologists (the College) recommendations, prophylaxis per U.S. Preventive Services Task Force recommendations, and universal prophylaxis. We included the costs associated with aspirin, preeclampsia, preterm birth, and potential aspirin-associated adverse effects. TreeAge Pro 2011 was used to perform the analysis. The estimated rate of preeclampsia would be 4.18% without prophylaxis compared with 4.17% with the College approach in which 0.35% (n=14,000) of women receive aspirin, 3.83% with the U.S. Preventive Services Task Force approach in which 23.5% (n=940,800) receive aspirin, and 3.81% with universal prophylaxis. Compared with no prophylaxis, the U.S. Preventive Services Task Force approach would save $377.4 million in direct medical care costs annually, and universal prophylaxis would save $365 million assuming 4 million births each year. The U.S. Preventive Services Task Force approach is the most cost-beneficial in 79% of probabilistic simulations. Assuming a willingness to pay of $100,000 per neonatal quality-adjusted life-year gained, the universal approach is the most cost-effective in more than 99% of simulations. Both the U.S. Preventive Services Task Force approach and universal prophylaxis would reduce morbidity, save lives, and lower health care costs in the United States to a much greater degree than the approach currently recommended by the College.
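
    The arithmetic skeleton of such a strategy comparison is simple to show. The preeclampsia rates and treated fractions below come from the abstract; the cost inputs are placeholders, not the paper's values, and "ACOG" is shorthand for the College's recommendation.

    ```python
    # Sketch: expected annual cost per prophylaxis strategy. Rates and
    # treated fractions are from the abstract; costs are placeholders.
    strategies = {
        # (preeclampsia rate, fraction of women given aspirin)
        "no prophylaxis": (0.0418, 0.000),
        "ACOG":           (0.0417, 0.0035),
        "USPSTF":         (0.0383, 0.235),
        "universal":      (0.0381, 1.000),
    }
    N = 4_000_000                  # annual US births
    COST_PE = 30_000.0             # placeholder cost per preeclampsia case
    COST_ASA = 20.0                # placeholder cost of an aspirin course

    for name, (rate, treated) in strategies.items():
        total = N * (rate * COST_PE + treated * COST_ASA)
        print(f"{name:15s} expected cost ${total / 1e6:,.0f}M")
    ```

    With any plausible cost inputs, the gap between the 4.18% and 3.83% rates dominates the cheap aspirin cost, which is the intuition behind the reported savings.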

  15. Early Versus Delayed Surgical Decompression of Spinal Cord after Traumatic Cervical Spinal Cord Injury: A Cost-Utility Analysis.

    PubMed

    Furlan, Julio C; Craven, B Catharine; Massicotte, Eric M; Fehlings, Michael G

    2016-04-01

    This cost-utility analysis was undertaken to compare early (≤24 hours since trauma) versus delayed surgical decompression of the spinal cord to determine which approach is more cost effective in the management of patients with acute traumatic cervical spinal cord injury (SCI). This study includes the patients enrolled in the Surgical Timing in Acute Spinal Cord Injury Study (STASCIS) and admitted to Toronto Western Hospital. Cases were grouped into patients with motor complete SCI and individuals with motor incomplete SCI. A cost-utility analysis was performed for each group of patients using data for the first 6 months after SCI. The perspective of a public health care insurer was adopted. Costs were estimated in 2014 U.S. dollars. Utilities were estimated from the STASCIS. The baseline analysis indicates that early spinal decompression is the more cost-effective approach compared with delayed spinal decompression. When we considered delayed spinal decompression as the baseline strategy, the incremental cost-effectiveness ratio analysis revealed a saving of US$ 58,368,024.12 per quality-adjusted life year gained for patients with complete SCI and a saving of US$ 536,217.33 per quality-adjusted life year gained in patients with incomplete SCI for early spinal decompression. The probabilistic analysis confirmed the early-decompression strategy as more cost effective than the delayed-decompression approach, even though there is no clearly dominant strategy. The results of this economic analysis suggest that early decompression of the spinal cord was more cost effective than delayed surgical decompression in the management of patients with motor complete and incomplete SCI, even though no strategy was clearly dominant. Copyright © 2016 Elsevier Inc. All rights reserved.
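
    The core computation in any such comparison is the incremental cost-effectiveness ratio; a minimal sketch with hypothetical inputs follows (the study's actual costs and utilities are not reproduced).

    ```python
    # Sketch: incremental cost-effectiveness ratio (ICER). When the new
    # strategy costs less AND yields more QALYs, the comparator is
    # dominated and the ICER is reported as a saving per QALY gained.
    def icer(cost_new, qaly_new, cost_old, qaly_old):
        return (cost_new - cost_old) / (qaly_new - qaly_old)

    # hypothetical 6-month costs and utilities for early vs delayed surgery
    print(icer(cost_new=90_000, qaly_new=0.28, cost_old=110_000, qaly_old=0.24))
    ```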

  16. A Novel Quantitative Approach to Concept Analysis: The Internomological Network

    PubMed Central

    Cook, Paul F.; Larsen, Kai R.; Sakraida, Teresa J.; Pedro, Leli

    2012-01-01

    Background When a construct such as patients’ transition to self-management of chronic illness is studied by researchers across multiple disciplines, the meaning of key terms can become confused. This results from inherent problems in language where a term can have multiple meanings (polysemy) and different words can mean the same thing (synonymy). Objectives To test a novel quantitative method for clarifying the meaning of constructs by examining the similarity of published contexts in which they are used. Method Published terms related to the concept transition to self-management of chronic illness were analyzed using the internomological network (INN), a type of latent semantic analysis that calculates the mathematical relationships between constructs based on the contexts in which researchers use each term. This novel approach was tested by comparing results to those from concept analysis, a best-practice qualitative approach to clarifying meanings of terms. By comparing results of the two methods, the best synonyms of transition to self-management, as well as key antecedent, attribute, and consequence terms, were identified. Results Results from INN analysis were consistent with those from concept analysis. The potential synonyms self-management, transition, and adaptation had the greatest utility. Adaptation was the clearest overall synonym, but had lower cross-disciplinary use. The terms coping and readiness had more circumscribed meanings. The INN analysis confirmed key features of transition to self-management, and suggested related concepts not found by the previous review. Discussion The INN analysis is a promising novel methodology that allows researchers to quantify the semantic relationships between constructs. The method works across disciplinary boundaries, and may help to integrate the diverse literature on self-management of chronic illness. PMID:22592387
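
    The INN is a form of latent semantic analysis; a generic LSA similarity pipeline looks roughly like the sketch below. The toy corpus and the scikit-learn machinery are assumptions for illustration; the published analysis used a large literature corpus and its own implementation.

        # Generic LSA sketch: embed texts by their term usage, reduce with
        # truncated SVD, and compare contexts by cosine similarity.
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.decomposition import TruncatedSVD
        from sklearn.metrics.pairwise import cosine_similarity

        docs = [
            "patients transition to self-management of chronic illness",
            "adaptation and coping after diagnosis of chronic disease",
            "readiness for self-management in diabetes care",
            "transition of care and patient adaptation over time",
        ]
        X = TfidfVectorizer().fit_transform(docs)
        Z = TruncatedSVD(n_components=2, random_state=0).fit_transform(X)

        # Contexts that use terms similarly end up close in the reduced space;
        # the same idea applies to term-term comparisons in the INN.
        print(cosine_similarity(Z))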

  17. A novel bi-level meta-analysis approach: applied to biological pathway analysis.

    PubMed

    Nguyen, Tin; Tagett, Rebecca; Donato, Michele; Mitrea, Cristina; Draghici, Sorin

    2016-02-01

    The accumulation of high-throughput data in public repositories creates a pressing need for integrative analysis of multiple datasets from independent experiments. However, study heterogeneity, study bias, outliers and the lack of power of available methods present a real challenge in integrating genomic data. One practical drawback of many P-value-based meta-analysis methods, including Fisher's, Stouffer's, minP and maxP, is that they are sensitive to outliers. Another drawback is that, because they perform just one statistical test for each individual experiment, they may not fully exploit the potentially large number of samples within each study. We propose a novel bi-level meta-analysis approach that employs the additive method and the Central Limit Theorem within each individual experiment and also across multiple experiments. We prove that the bi-level framework is robust against bias, less sensitive to outliers than other methods, and more sensitive to small changes in signal. For comparative analysis, we demonstrate that the intra-experiment analysis has more power than the equivalent statistical test performed on a single large experiment. For pathway analysis, we compare the proposed framework versus classical meta-analysis approaches (Fisher's, Stouffer's and the additive method) as well as against a dedicated pathway meta-analysis package (MetaPath), using 1252 samples from 21 datasets related to three human diseases, acute myeloid leukemia (9 datasets), type II diabetes (5 datasets) and Alzheimer's disease (7 datasets). Our framework outperforms its competitors in correctly identifying pathways relevant to the phenotypes. The framework is sufficiently general to be applied to any type of statistical meta-analysis. The R scripts are available on demand from the authors. sorin@wayne.edu Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
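
    For reference, the classical p-value combination methods named above can be written compactly; the additive method's reliance on the Central Limit Theorem is what the bi-level framework builds on. This is an illustrative sketch, not the authors' R scripts:

        import numpy as np
        from scipy import stats

        def fisher(p):
            # X = -2 * sum(log p) ~ chi-square with 2n degrees of freedom
            return stats.chi2.sf(-2 * np.sum(np.log(p)), df=2 * len(p))

        def stouffer(p):
            # Z = sum of probit-transformed p-values, scaled by sqrt(n)
            z = np.sum(stats.norm.isf(p)) / np.sqrt(len(p))
            return stats.norm.sf(z)

        def additive(p):
            # Edgington's additive method with the CLT: under H0 each
            # p ~ Uniform(0,1), so sum(p) is approximately N(n/2, n/12).
            n = len(p)
            z = (np.sum(p) - n / 2) / np.sqrt(n / 12)
            return stats.norm.cdf(z)  # small sums are significant

        p = np.array([0.01, 0.04, 0.30, 0.65])  # toy per-study p-values
        print(fisher(p), stouffer(p), additive(p))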

  18. Input-output analysis and the hospital budgeting process.

    PubMed Central

    Cleverly, W O

    1975-01-01

    Two hospital budget systems, a conventional budget and an input-output budget, are compared to determine how they affect management decisions in pricing, output, planning, and cost control. Analysis of data from a 210-bed not-for-profit hospital indicates that adoption of the input-output budget could cause substantial changes in posted hospital rates in individual departments but probably would have no impact on hospital output determination. The input-output approach promises to be a more accurate system for cost control and planning because, unlike the conventional approach, it generates objective signals for investigating variances of expenses from budgeted levels. PMID:1205865
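
    The arithmetic that distinguishes an input-output budget from a conventional one is the Leontief-style inverse, which allocates the costs of interdependent departments simultaneously. A hypothetical two-department sketch:

        # Reciprocal cost allocation via the Leontief inverse: departments
        # consume each other's services, so total cost responsibilities x
        # satisfy x = d + A x, i.e. x = (I - A)^-1 d. Figures are hypothetical.
        import numpy as np

        # A[i, j] = fraction of department j's total cost allocated to dept i
        A = np.array([[0.00, 0.05],   # laundry receives 5% of nursing's cost
                      [0.30, 0.00]])  # nursing receives 30% of laundry's cost
        direct_cost = np.array([100_000, 400_000])  # direct budgets d

        total = np.linalg.solve(np.eye(2) - A, direct_cost)
        print(total)  # full (direct + allocated) cost per department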

  19. Analysis of a Field Study: Programs, Services, and Approaches Toward the Reduction of Adolescent Pregnancy. Final Report.

    ERIC Educational Resources Information Center

    Moore, Audrey

    This field survey on adolescent pregnancy was undertaken through site visits and interviews. Data indicated that: (1) while many people are carrying out excellent programs and activities, the numbers are small compared to the need; (2) in some types of services, the old tried-and-found-wanting approaches are perpetuated; (3) in some,…

  20. Plagiarism Detection: A Comparison of Teaching Assistants and a Software Tool in Identifying Cheating in a Psychology Course

    ERIC Educational Resources Information Center

    Seifried, Eva; Lenhard, Wolfgang; Spinath, Birgit

    2015-01-01

    Essays that are assigned as homework in large classes are prone to cheating via unauthorized collaboration. In this study, we compared the ability of a software tool based on Latent Semantic Analysis (LSA) and student teaching assistants to detect plagiarism in a large group of students. To do so, we took two approaches: the first approach was…

  1. Comparison of Nomothetic versus Idiographic-Oriented Methods for Making Predictions about Distal Outcomes from Time Series Data

    ERIC Educational Resources Information Center

    Castro-Schilo, Laura; Ferrer, Emilio

    2013-01-01

    We illustrate the idiographic/nomothetic debate by comparing 3 approaches to using daily self-report data on affect for predicting relationship quality and breakup. The 3 approaches included (a) the first day in the series of daily data; (b) the mean and variability of the daily series; and (c) parameters from dynamic factor analysis, a…

  2. Simultaneous analysis and design

    NASA Technical Reports Server (NTRS)

    Haftka, R. T.

    1984-01-01

    Optimization techniques are increasingly being used for performing nonlinear structural analysis. The development of element by element (EBE) preconditioned conjugate gradient (CG) techniques is expected to extend this trend to linear analysis. Under these circumstances the structural design problem can be viewed as a nested optimization problem. There are computational benefits to treating this nested problem as a large single optimization problem. The response variables (such as displacements) and the structural parameters are all treated as design variables in a unified formulation which performs simultaneously the design and analysis. Two examples are used for demonstration. A seventy-two bar truss is optimized subject to linear stress constraints and a wing box structure is optimized subject to nonlinear collapse constraints. Both examples show substantial computational savings with the unified approach as compared to the traditional nested approach.
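
    The unified formulation can be illustrated on a toy problem: a single elastic bar standing in for the paper's 72-bar truss, with the displacement and the sizing variable optimized together and equilibrium imposed as an equality constraint rather than solved in a nested analysis. All values are nondimensional placeholders.

        # Simultaneous analysis and design on one bar: x = [area a, displacement u].
        # Equilibrium k(a) * u = f is a constraint, not an inner solve.
        from scipy.optimize import minimize

        E, L, f, u_max = 100.0, 1.0, 1.0, 0.2   # nondimensional toy values

        weight   = lambda x: x[0] * L                              # minimize material
        equil    = {"type": "eq",   "fun": lambda x: (E * x[0] / L) * x[1] - f}
        stiff_ok = {"type": "ineq", "fun": lambda x: u_max - x[1]} # displacement limit

        res = minimize(weight, x0=[0.1, 0.1], bounds=[(1e-6, None), (0.0, None)],
                       constraints=[equil, stiff_ok], method="SLSQP")
        print(res.x)  # -> area ~0.05 with u driven to the 0.2 limit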

  3. Comparison of pre-processing techniques for fluorescence microscopy images of cells labeled for actin.

    PubMed

    Muralidhar, Gautam S; Channappayya, Sumohana S; Slater, John H; Blinka, Ellen M; Bovik, Alan C; Frey, Wolfgang; Markey, Mia K

    2008-11-06

    Automated analysis of fluorescence microscopy images of endothelial cells labeled for actin is important for quantifying changes in the actin cytoskeleton. The current manual approach is laborious and inefficient. The goal of our work is to develop automated image analysis methods, thereby increasing cell analysis throughput. In this study, we present preliminary results on comparing different algorithms for cell segmentation and image denoising.

  4. Statistical Power Analysis with Microsoft Excel: Normal Tests for One or Two Means as a Prelude to Using Non-Central Distributions to Calculate Power

    ERIC Educational Resources Information Center

    Texeira, Antonio; Rosa, Alvaro; Calapez, Teresa

    2009-01-01

    This article presents statistical power analysis (SPA) based on the normal distribution using Excel, adopting textbook and SPA approaches. The objective is to present the latter in a comparative way within a framework that is familiar to textbook level readers, as a first step to understand SPA with other distributions. The analysis focuses on the…
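
    The textbook case the article builds in Excel, the power of a one-mean z-test with known sigma, has a closed form. A sketch, not the authors' workbook:

        # Power of the one-sided test H0: mu = mu0 vs H1: mu = mu1 > mu0,
        # for a normal population with known sigma and sample size n.
        from scipy.stats import norm

        def power_one_mean(mu0, mu1, sigma, n, alpha=0.05):
            z_crit = norm.isf(alpha)                 # rejection threshold
            shift = (mu1 - mu0) * n**0.5 / sigma     # standardized effect
            return norm.sf(z_crit - shift)

        print(power_one_mean(mu0=100, mu1=105, sigma=15, n=36))  # ~0.64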

  5. Clinical Evaluation of Dental Restorative Materials

    DTIC Science & Technology

    1989-01-01

    use of an Actuarial Life Table Survival Analysis procedure. The median survival time for anterior composites was 13.5 years, as compared to 12.1 years...dental materials. For the first time in clinical biomaterials research, we used a statistical approach of Survival Analysis which utilized the... analysis has been established to assure uniformity in usage. This scale is now in use by clinical investigators throughout the country. Its use at the

  6. Effect Size Measure and Analysis of Single Subject Designs

    ERIC Educational Resources Information Center

    Society for Research on Educational Effectiveness, 2013

    2013-01-01

    One of the vexing problems in the analysis of SSD is the assessment of the effect of intervention. Serial dependence notwithstanding, the linear model approach that has been advanced involves, in general, the fitting of regression lines (or curves) to the set of observations within each phase of the design and comparing the parameters of these…
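
    The linear-model approach described above amounts to fitting a line within each phase and comparing the fitted parameters. A minimal sketch with toy data, setting aside the serial-dependence caveat the abstract raises:

        # Fit a regression line per phase of a single-subject design and
        # compare level (intercept) and trend (slope) across phases.
        import numpy as np

        def fit_line(y):
            t = np.arange(len(y))
            slope, intercept = np.polyfit(t, y, 1)
            return intercept, slope

        phase_a = [3, 4, 3, 5, 4, 4]     # baseline observations (toy)
        phase_b = [6, 7, 9, 9, 11, 12]   # intervention observations (toy)

        (a0, a1), (b0, b1) = fit_line(phase_a), fit_line(phase_b)
        print(f"level change: {b0 - a0:.2f}, slope change: {b1 - a1:.2f}")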

  7. From Gain Score t to ANCOVA F (and Vice Versa)

    ERIC Educational Resources Information Center

    Knapp, Thomas R.; Schafer, William D.

    2009-01-01

    Although they test somewhat different hypotheses, analysis of gain scores (or its repeated-measures analog) and analysis of covariance are both common methods that researchers use for pre-post data. The results of the two approaches yield non-comparable outcomes, but since the same generic data are used, it is possible to transform the test…

  8. A comparison of a technical and a participatory application of social impact assessment.

    Treesearch

    Dennis R Becker; Charles C Harris; Erik A Nielsen; William J. McLaughlin

    2004-01-01

    Results of independent applications of a technical and a participatory approach to SIA are compared for an assessment of impacts of the proposed removal of hydroelectric dams to recover threatened and endangered salmon in the Pacific Northwest of the United States. The analysis focuses on empirical differences and similarities between the technical social analysis...

  9. AN INTER-AGENCY APPROACH FOR DETERMINING REGIONAL LAND COVER AND SPECIES HABITAT CONSERVATION STATUS IN THE AMERICAN SOUTHWEST: THE SOUTHWEST REGIONAL GAP ANALYSIS PROJECT

    EPA Science Inventory

    The Gap Analysis Program (GAP) is a national inter-agency program that maps the distribution of plant communities and selected animal species and compares these distributions with land stewardship to identify biotic elements at potential risk of endangerment. GAP uses remote sens...

  10. Comparative Benefit-Cost Analysis of the Abecedarian Program and Its Policy Implications

    ERIC Educational Resources Information Center

    Barnett, W. S.; Masse, Leonard N.

    2007-01-01

    Child care and education are to some extent joint products of preschool programs, but public policy and research frequently approach these two goals independently. We present a benefit-cost analysis of a preschool program that provided intensive education during full-day child care. Data were obtained from a randomized trial with longitudinal…

  11. Higher Education Institutions and the Administration of International Student Rights: A Law and Policy Analysis

    ERIC Educational Resources Information Center

    Ramia, Gaby

    2017-01-01

    The scholarly literature in higher education has not dealt extensively with the responsibilities of institutions for servicing the rights of international students. This paper is a comparative analysis of legal frameworks which guide institutions in their handling of international student rights. Two national approaches, those of Australia and New…

  12. Thirteenth NASTRAN (R) Users' Colloquium

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The application of finite element methods in engineering is discussed and the use of NASTRAN is compared with other approaches. Specific applications, pre- and post-processing or auxiliary programs, and additional methods of analysis with NASTRAN are covered.

  13. MOLECULAR MARKER ANALYSIS OF DEARS SAMPLES

    EPA Science Inventory

    Source apportionment based on organic molecular markers provides a promising approach for meeting the Detroit Exposure and Aerosol Research Study (DEARS) objective of comparing source contributions between community air monitoring stations and various neighborhoods. Source appor...

  14. (ISEA) MOLECULAR MARKER ANALYSIS OF DEARS SAMPLES

    EPA Science Inventory

    Source apportionment based on organic molecular markers provides a promising approach for meeting the Detroit Exposure and Aerosol Research Study (DEARS) objective of comparing source contributions between community air monitoring stations and various neighborhoods. Source appor...

  15. Sharing Resources In Mobile/Satellite Communications

    NASA Technical Reports Server (NTRS)

    Yan, Tsun-Yee; Sue, Miles K.

    1992-01-01

    Report presents preliminary theoretical analysis of several alternative schemes for allocation of satellite resource among terrestrial subscribers of landmobile/satellite communication system. Demand-access and random-access approaches under code-division and frequency-division concepts compared.

  16. Using occupancy modelling to compare environmental DNA to traditional field methods for regional-scale monitoring of an endangered aquatic species.

    PubMed

    Schmelzle, Molly C; Kinziger, Andrew P

    2016-07-01

    Environmental DNA (eDNA) monitoring approaches promise to greatly improve detection of rare, endangered and invasive species in comparison with traditional field approaches. Herein, eDNA approaches and traditional seining methods were applied at 29 research locations to compare method-specific estimates of detection and occupancy probabilities for endangered tidewater goby (Eucyclogobius newberryi). At each location, multiple paired seine hauls and water samples for eDNA analysis were taken, ranging from two to 23 samples per site, depending upon habitat size. Analysis using a multimethod occupancy modelling framework indicated that the probability of detection using eDNA was nearly double (0.74) the rate of detection for seining (0.39). The higher detection rates afforded by eDNA allowed determination of tidewater goby occupancy at two locations where they have not been previously detected and at one location considered to be locally extirpated. Additionally, eDNA concentration was positively related to tidewater goby catch per unit effort, suggesting eDNA could potentially be used as a proxy for local tidewater goby abundance. Compared to traditional field sampling, eDNA provided improved occupancy parameter estimates and can be applied to increase management efficiency across a broad spatial range and within a diversity of habitats. © 2015 John Wiley & Sons Ltd.
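
    With per-sample detection probabilities like those reported (0.74 for eDNA versus 0.39 for seining), the chance of at least one detection at an occupied site grows with replicate samples as 1 - (1 - p)^k. The sketch below illustrates only that relationship, not the full multimethod occupancy model fitted in the study:

        # Cumulative detection probability across k replicate samples.
        def cumulative_detection(p, k):
            return 1 - (1 - p) ** k

        for k in (1, 2, 4, 8):
            print(k, round(cumulative_detection(0.74, k), 3),   # eDNA
                     round(cumulative_detection(0.39, k), 3))   # seining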

  17. A comparative analysis: storm water pollution policy in California, USA and Victoria, Australia.

    PubMed

    Swamikannu, X; Radulescu, D; Young, R; Allison, R

    2003-01-01

    Urban drainage systems historically were developed on principles of hydraulic capacity for the transport of storm water to reduce the risk of flooding. However, with urbanization the percent of impervious surfaces increases dramatically resulting in increased flood volumes, peak discharge rates, velocities and duration, and a significant increase in pollutant loads. Storm water and urban runoff are the leading causes of the impairment of receiving waters and their beneficial uses in Australia and the United States today. Strict environmental and technology controls on wastewater treatment facilities and industry for more than three decades have ensured that these sources are less significant today as the cause of impairment of receiving waters. This paper compares the approach undertaken by the Environmental Protection Authority Victoria for the Melbourne metropolitan area with the approach implemented by the California Environmental Protection Agency for the Los Angeles area to control storm water pollution. Both these communities are largely similar in population size and the extent of urbanization. The authors present an analysis of the different approaches contrasting Australia with the USA, comment on their comparative success, and discuss the relevance of the two experiences for developed and developing nations in the context of environmental policy making to control storm water and urban runoff pollution.

  18. Power flow analysis of two coupled plates with arbitrary characteristics

    NASA Technical Reports Server (NTRS)

    Cuschieri, J. M.

    1988-01-01

    The limitation of keeping two plates identical is removed and the vibrational power input and output are evaluated for different area ratios, plate thickness ratios, and for different values of the structural damping loss factor for the source plate (plate with excitation) and the receiver plate. In performing this parametric analysis, the source plate characteristics are kept constant. The purpose of this parametric analysis is to be able to determine the most critical parameters that influence the flow of vibrational power from the source plate to the receiver plate. In the case of the structural damping parametric analysis, the influence of changes in the source plate damping is also investigated. As was done previously, results obtained from the mobility power flow approach will be compared to results obtained using a statistical energy analysis (SEA) approach. The significance of the power flow results is discussed, together with a comparison between the SEA results and the mobility power flow results. Furthermore, the benefits that can be derived from using the mobility power flow approach are also examined.

  19. Mesh Denoising based on Normal Voting Tensor and Binary Optimization.

    PubMed

    Yadav, Sunil Kumar; Reitebuch, Ulrich; Polthier, Konrad

    2017-08-17

    This paper presents a two-stage mesh denoising algorithm. Unlike other traditional averaging approaches, our approach uses an element-based normal voting tensor to compute smooth surfaces. By introducing a binary optimization on the proposed tensor together with a local binary neighborhood concept, our algorithm better retains sharp features and produces smoother umbilical regions than previous approaches. On top of that, we provide a stochastic analysis on the different kinds of noise based on the average edge length. The quantitative results demonstrate that the performance of our method is better than that of state-of-the-art smoothing approaches.
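
    A normal voting tensor, in its generic form, accumulates weighted outer products of neighboring face normals; the eigenvalue pattern then separates flat regions from sharp features. A simplified sketch in which the weights and neighborhood are placeholders for the paper's element-based construction:

        # Accumulate T = sum_i w_i * n_i n_i^T over a face neighborhood and
        # inspect eigenvalues: one dominant eigenvalue suggests a flat patch,
        # two comparable eigenvalues suggest a sharp edge.
        import numpy as np

        def voting_tensor(normals, weights):
            T = np.zeros((3, 3))
            for n, w in zip(normals, weights):
                n = n / np.linalg.norm(n)
                T += w * np.outer(n, n)
            return T

        # Two clusters of normals -> an edge-like neighborhood
        normals = np.array([[0, 0, 1], [0.05, 0, 1], [1, 0, 0], [1, 0.05, 0]])
        eigvals = np.linalg.eigvalsh(voting_tensor(normals, np.ones(4)))
        print(np.sort(eigvals)[::-1])  # two large eigenvalues => sharp feature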

  20. The Effect of Visual Information on the Manual Approach and Landing

    NASA Technical Reports Server (NTRS)

    Wewerinke, P. H.

    1982-01-01

    The effect of visual information, in combination with basic display information, on approach performance was investigated. A pre-experimental model analysis was performed in terms of the optimal control model. The resulting aircraft approach performance predictions were compared with the results of a moving-base simulator program. The results illustrate that the model provides a meaningful description of the visual (scene) perception process involved in the complex (multivariable, time-varying) manual approach task, with a useful predictive capability. The theoretical framework was shown to allow a straightforward investigation of the complex interaction of a variety of task variables.

  1. A spin column-free approach to sodium hydroxide-based glycan permethylation.

    PubMed

    Hu, Yueming; Borges, Chad R

    2017-07-24

    Glycan permethylation was introduced as a tool to facilitate the study of glycans in 1903. Since that time, permethylation procedures have been continually modified to improve permethylation efficiency and qualitative applicability. Typically, however, either laborious preparation steps or cumbersome and uneconomical spin columns have been needed to obtain decent permethylation yields on small glycan samples. Here we describe a spin column-free (SCF) glycan permethylation procedure that is applicable to both O- and N-linked glycans and can be employed upstream to intact glycan analysis by MALDI-MS, ESI-MS, or glycan linkage analysis by GC-MS. The SCF procedure involves neutralization of NaOH beads by acidified phosphate buffer, which eliminates the risk of glycan oxidative degradation and avoids the use of spin columns. Optimization of the new permethylation procedure provided high permethylation efficiency for both hexose (>98%) and HexNAc (>99%) residues-yields which were comparable to (or better than) those of some widely-used spin column-based procedures. A light vs. heavy labelling approach was employed to compare intact glycan yields from a popular spin-column based approach to the SCF approach. Recovery of intact N-glycans was significantly better with the SCF procedure (p < 0.05), but overall yield of O-glycans was similar or slightly diminished (p < 0.05 for tetrasaccharides or smaller). When the SCF procedure was employed upstream to hydrolysis, reduction and acetylation for glycan linkage analysis of pooled glycans from unfractionated blood plasma, analytical reproducibility was on par with that from previous spin column-based "glycan node" analysis results. When applied to blood plasma samples from stage III-IV breast cancer patients (n = 20) and age-matched controls (n = 20), the SCF procedure facilitated identification of three glycan nodes with significantly different distributions between the cases and controls (ROC c-statistics > 0.75; p < 0.01). In summary, the SCF permethylation procedure expedites and economizes both intact glycan analysis and linkage analysis of glycans from whole biospecimens.

  2. A spin column-free approach to sodium hydroxide-based glycan permethylation

    PubMed Central

    Hu, Yueming; Borges, Chad R.

    2018-01-01

    Glycan permethylation was introduced as a tool to facilitate the study of glycans in 1903. Since that time, permethylation procedures have been continually modified to improve permethylation efficiency and qualitative applicability. Typically, however, either laborious preparation steps or cumbersome and uneconomical spin columns have been needed to obtain decent permethylation yields on small glycan samples. Here we describe a spin column-free (SCF) glycan permethylation procedure that is applicable to both O- and N-linked glycans and can be employed upstream to intact glycan analysis by MALDI-MS, ESI-MS, or glycan linkage analysis by GC-MS. The SCF procedure involves neutralization of NaOH beads by acidified phosphate buffer, which eliminates the risk of glycan oxidative degradation and avoids the use of spin columns. Optimization of the new permethylation procedure provided high permethylation efficiency for both hexose (>98%) and HexNAc (>99%) residues—yields which were comparable to (or better than) those of some widely-used spin column-based procedures. A light vs. heavy labelling approach was employed to compare intact glycan yields from a popular spin-column based approach to the SCF approach. Recovery of intact N-glycans was significantly better with the SCF procedure (p < 0.05), but overall yield of O-glycans was similar or slightly diminished (p < 0.05 for tetrasaccharides or smaller). When the SCF procedure was employed upstream to hydrolysis, reduction and acetylation for glycan linkage analysis of pooled glycans from unfractionated blood plasma, analytical reproducibility was on par with that from previous spin column-based “glycan node” analysis results. When applied to blood plasma samples from stage III–IV breast cancer patients (n = 20) and age-matched controls (n = 20), the SCF procedure facilitated identification of three glycan nodes with significantly different distributions between the cases and controls (ROC c-statistics > 0.75; p < 0.01). In summary, the SCF permethylation procedure expedites and economizes both intact glycan analysis and linkage analysis of glycans from whole biospecimens. PMID:28635997

  3. To Control False Positives in Gene-Gene Interaction Analysis: Two Novel Conditional Entropy-Based Approaches

    PubMed Central

    Lin, Meihua; Li, Haoli; Zhao, Xiaolei; Qin, Jiheng

    2013-01-01

    Genome-wide analysis of gene-gene interactions has been recognized as a powerful avenue to identify the missing genetic components that can not be detected by using current single-point association analysis. Recently, several model-free methods (e.g. the commonly used information based metrics and several logistic regression-based metrics) were developed for detecting non-linear dependence between genetic loci, but they are potentially at risk of inflated false positive error, in particular when the main effects at one or both loci are salient. In this study, we proposed two conditional entropy-based metrics to challenge this limitation. Extensive simulations demonstrated that the two proposed metrics, provided the disease is rare, could maintain a consistently correct false positive rate. In the scenarios for a common disease, our proposed metrics achieved better or comparable control of false positive error, compared to four previously proposed model-free metrics. In terms of power, our methods outperformed several competing metrics in a range of common disease models. Furthermore, in real data analyses, both metrics succeeded in detecting interactions and were competitive with the originally reported results or the logistic regression approaches. In conclusion, the proposed conditional entropy-based metrics are promising as alternatives to current model-based approaches for detecting genuine epistatic effects. PMID:24339984
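
    Information-based interaction metrics of the kind discussed above are typically built from entropies of genotype-disease contingency tables. The sketch below computes conditional mutual information I(A;B|D), a generic member of this family rather than the authors' two proposed metrics:

        # Conditional mutual information I(A;B|D) from a 3x3x2 table of
        # genotype-by-genotype-by-disease counts (toy data).
        import numpy as np

        def cond_mutual_info(counts):
            """counts[a, b, d]: joint counts of locus A, locus B, disease D."""
            p = counts / counts.sum()
            pd = p.sum(axis=(0, 1))    # P(D)
            pad = p.sum(axis=1)        # P(A, D)
            pbd = p.sum(axis=0)        # P(B, D)
            cmi = 0.0
            for a in range(p.shape[0]):
                for b in range(p.shape[1]):
                    for d in range(p.shape[2]):
                        if p[a, b, d] > 0:
                            cmi += p[a, b, d] * np.log(
                                p[a, b, d] * pd[d] / (pad[a, d] * pbd[b, d]))
            return cmi

        counts = np.random.default_rng(0).integers(1, 50, size=(3, 3, 2))
        print(cond_mutual_info(counts))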

  4. Parent-only vs. parent-child (family-focused) approaches for weight loss in obese and overweight children: a systematic review and meta-analysis.

    PubMed

    Jull, A; Chen, R

    2013-09-01

    Families are recommended as the agents of change for weight loss in overweight and obese children; family approaches are more effective than those that focus on the child alone. However, interventions that focus on parents alone have not been summarized. The objective of this review was to assess the effectiveness of interventions that compared a parent-only (PO) condition with a parent-child (PC) condition. Four trials using a similar between-group background approach to weight loss in overweight and obese children met the inclusion criteria, but only one trial reported sufficient data for meta-analysis. Further information was obtained from authors. Meta-analysis showed no significant difference in z-BMI from baseline to end of treatment between the conditions (three trials) or to end of follow up (two trials). The trials were at risk of bias and no single trial was at lower risk of bias than others. There is an absence of high quality evidence regarding the effect of parent-only interventions for weight loss in children compared to parent-child interventions, but current evidence suggests the need for further investigation. © 2013 The Authors. obesity reviews © 2013 International Association for the Study of Obesity.

  5. Optimized Probe Masking for Comparative Transcriptomics of Closely Related Species

    PubMed Central

    Poeschl, Yvonne; Delker, Carolin; Trenner, Jana; Ullrich, Kristian Karsten; Quint, Marcel; Grosse, Ivo

    2013-01-01

    Microarrays are commonly applied to study the transcriptome of specific species. However, many available microarrays are restricted to model organisms, and the design of custom microarrays for other species is often not feasible. Hence, transcriptomics approaches for non-model organisms as well as comparative transcriptomics studies among two or more species often make use of cost-intensive RNAseq studies or, alternatively, hybridize transcripts of a query species to a microarray of a closely related species. When analyzing these cross-species microarray expression data, differences in the transcriptome of the query species can cause problems, such as the following: (i) lower hybridization accuracy of probes due to mismatches or deletions, (ii) probes binding multiple transcripts of different genes, and (iii) probes binding transcripts of non-orthologous genes. So far, methods for (i) exist, but these neglect (ii) and (iii). Here, we propose an approach for comparative transcriptomics addressing problems (i) to (iii), which retains only transcript-specific probes binding transcripts of orthologous genes. We apply this approach to an Arabidopsis lyrata expression data set measured on a microarray designed for Arabidopsis thaliana, and compare it to two alternative approaches, a sequence-based approach and a genomic DNA hybridization-based approach. We investigate the number of retained probe sets, and we validate the resulting expression responses by qRT-PCR. We find that the proposed approach combines the benefit of sequence-based stringency and accuracy while allowing the expression analysis of much more genes than the alternative sequence-based approach. As an added benefit, the proposed approach requires probes to detect transcripts of orthologous genes only, which provides a superior base for biological interpretation of the measured expression responses. PMID:24260119

  6. Saliva Proteomics Analysis Offers Insights on Type 1 Diabetes Pathology in a Pediatric Population

    PubMed Central

    Pappa, Eftychia; Vastardis, Heleni; Mermelekas, George; Gerasimidi-Vazeou, Andriani; Zoidakis, Jerome; Vougas, Konstantinos

    2018-01-01

    The composition of the salivary proteome is affected by pathological conditions. We analyzed by high resolution mass spectrometry approaches saliva samples collected from children and adolescents with type 1 diabetes and healthy controls. The list of more than 2000 high confidence protein identifications constitutes a comprehensive characterization of the salivary proteome. Patients with good glycemic regulation and healthy individuals have comparable proteomic profiles. In contrast, a significant number of differentially expressed proteins were identified in the saliva of patients with poor glycemic regulation compared to patients with good glycemic control and healthy children. These proteins are involved in biological processes relevant to diabetic pathology such as endothelial damage and inflammation. Moreover, a putative preventive therapeutic approach was identified based on bioinformatic analysis of the deregulated salivary proteins. Thus, thorough characterization of saliva proteins in diabetic pediatric patients established a connection between molecular changes and disease pathology. This proteomic and bioinformatic approach highlights the potential of salivary diagnostics in diabetes pathology and opens the way for preventive treatment of the disease. PMID:29755368

  7. Molecular Characterization of Transgenic Events Using Next Generation Sequencing Approach.

    PubMed

    Guttikonda, Satish K; Marri, Pradeep; Mammadov, Jafar; Ye, Liang; Soe, Khaing; Richey, Kimberly; Cruse, James; Zhuang, Meibao; Gao, Zhifang; Evans, Clive; Rounsley, Steve; Kumpatla, Siva P

    2016-01-01

    Demand for the commercial use of genetically modified (GM) crops has been increasing in light of the projected growth of world population to nine billion by 2050. A prerequisite of paramount importance for regulatory submissions is the rigorous safety assessment of GM crops. One of the components of safety assessment is molecular characterization at DNA level which helps to determine the copy number, integrity and stability of a transgene; characterize the integration site within a host genome; and confirm the absence of vector DNA. Historically, molecular characterization has been carried out using Southern blot analysis coupled with Sanger sequencing. While this is a robust approach to characterize the transgenic crops, it is both time- and resource-consuming. The emergence of next-generation sequencing (NGS) technologies has provided a highly sensitive and cost- and labor-effective alternative for molecular characterization compared to traditional Southern blot analysis. Herein, we have demonstrated the successful application of both whole genome sequencing and target capture sequencing approaches for the characterization of single and stacked transgenic events and compared the results and inferences with the traditional method with respect to key criteria required for regulatory submissions.

  8. A quantitative benefit-risk assessment approach to improve decision making in drug development: Application of a multicriteria decision analysis model in the development of combination therapy for overactive bladder.

    PubMed

    de Greef-van der Sandt, I; Newgreen, D; Schaddelee, M; Dorrepaal, C; Martina, R; Ridder, A; van Maanen, R

    2016-04-01

    A multicriteria decision analysis (MCDA) approach was developed and used to estimate the benefit-risk of solifenacin and mirabegron and their combination in the treatment of overactive bladder (OAB). The objectives were 1) to develop an MCDA tool to compare drug effects in OAB quantitatively, 2) to establish transparency in the evaluation of the benefit-risk profile of various dose combinations, and 3) to quantify the added value of combination use compared to monotherapies. The MCDA model was developed using efficacy, safety, and tolerability attributes and the results of a phase II factorial design combination study were evaluated. Combinations of solifenacin 5 mg with mirabegron 25 mg or mirabegron 50 mg (5+25 and 5+50) scored the highest clinical utility and supported combination therapy development of solifenacin and mirabegron for phase III clinical development at these dose regimens. This case study underlines the benefit of using a quantitative approach in clinical drug development programs. © 2015 The American Society for Clinical Pharmacology and Therapeutics.
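
    At its core, an MCDA clinical-utility score is a weighted sum of normalized attribute scores per treatment arm. Attributes, weights, and scores below are illustrative placeholders, not the trial's model:

        # Weighted-sum clinical utility per arm (all values hypothetical).
        weights = {"efficacy": 0.5, "safety": 0.3, "tolerability": 0.2}

        arms = {
            "solifenacin 5":    {"efficacy": 0.55, "safety": 0.80, "tolerability": 0.75},
            "mirabegron 50":    {"efficacy": 0.50, "safety": 0.85, "tolerability": 0.80},
            "combination 5+50": {"efficacy": 0.70, "safety": 0.75, "tolerability": 0.70},
        }

        for arm, scores in arms.items():
            utility = sum(weights[k] * scores[k] for k in weights)
            print(f"{arm:18s} clinical utility: {utility:.3f}")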

  9. Estimating hazard ratios in cohort data with missing disease information due to death.

    PubMed

    Binder, Nadine; Herrnböck, Anne-Sophie; Schumacher, Martin

    2017-03-01

    In clinical and epidemiological studies information on the primary outcome of interest, that is, the disease status, is usually collected at a limited number of follow-up visits. The disease status can often only be retrieved retrospectively in individuals who are alive at follow-up, but will be missing for those who died before. Right-censoring the death cases at the last visit (ad-hoc analysis) yields biased hazard ratio estimates of a potential risk factor, and the bias can be substantial and occur in either direction. In this work, we investigate three different approaches that use the same likelihood contributions derived from an illness-death multistate model in order to more adequately estimate the hazard ratio by including the death cases into the analysis: a parametric approach, a penalized likelihood approach, and an imputation-based approach. We investigate to which extent these approaches allow for an unbiased regression analysis by evaluating their performance in simulation studies and on a real data example. In doing so, we use the full cohort with complete illness-death data as reference and artificially induce missing information due to death by setting discrete follow-up visits. Compared to an ad-hoc analysis, all considered approaches provide less biased or even unbiased results, depending on the situation studied. In the real data example, the parametric approach is seen to be too restrictive, whereas the imputation-based approach could almost reconstruct the original event history information. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Variable Mach number design approach for a parallel waverider with a wide-speed range based on the osculating cone theory

    NASA Astrophysics Data System (ADS)

    Zhao, Zhen-tao; Huang, Wei; Li, Shi-Bin; Zhang, Tian-Tian; Yan, Li

    2018-06-01

    In the current study, a variable Mach number waverider design approach has been proposed based on the osculating cone theory. The design Mach number of the osculating cone constant Mach number waverider with the same volumetric efficiency of the osculating cone variable Mach number waverider has been determined by writing a program for calculating the volumetric efficiencies of waveriders. The CFD approach has been utilized to verify the effectiveness of the proposed approach. At the same time, through the comparative analysis of the aerodynamic performance, the performance advantage of the osculating cone variable Mach number waverider is studied. The obtained results show that the osculating cone variable Mach number waverider achieves a higher lift-to-drag ratio throughout the flight profile when compared with the osculating cone constant Mach number waverider, and it has superior low-speed aerodynamic performance while maintaining nearly the same high-speed aerodynamic performance.

  11. Adjoint Sensitivity Analysis of Orbital Mechanics: Application to Computations of Observables' Partials with Respect to Harmonics of the Planetary Gravity Fields

    NASA Technical Reports Server (NTRS)

    Ustinov, Eugene A.; Sunseri, Richard F.

    2005-01-01

    An approach is presented to the inversion of gravity fields based on evaluation of partials of observables with respect to gravity harmonics using the solution of the adjoint problem of orbital dynamics of the spacecraft. The corresponding adjoint operator is derived directly from the linear operator of the linearized forward problem of orbital dynamics. The resulting adjoint problem is similar to the forward problem and can be solved by the same methods. For a given highest degree N of desired gravity harmonics, this method involves integration of N adjoint solutions, as compared to integration of N^2 partials of the forward solution with respect to gravity harmonics in the conventional approach. Thus, for higher resolution gravity models, this approach becomes increasingly more effective in terms of computer resources as compared to the approach based on the solution of the forward problem of orbital dynamics.
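
    The economy of the adjoint approach can be seen on a toy linear analogue (not the orbital-dynamics equations): for an objective J = c^T u with K(p) u = f, a single adjoint solve yields the partials with respect to all parameters at once.

        # Adjoint vs. forward sensitivities for J = c^T u, K(p) u = f.
        # One adjoint solve K^T lam = c gives dJ/dp_k = -lam^T (dK/dp_k) u
        # for every parameter; the forward route needs one solve per p_k.
        import numpy as np

        rng = np.random.default_rng(1)
        n, n_params = 5, 3
        K = rng.normal(size=(n, n)) + n * np.eye(n)   # well-conditioned system
        f, c = rng.normal(size=n), rng.normal(size=n)
        dK = [rng.normal(size=(n, n)) for _ in range(n_params)]  # dK/dp_k (toy)

        u = np.linalg.solve(K, f)        # one forward solve
        lam = np.linalg.solve(K.T, c)    # one adjoint solve

        grad_adjoint = np.array([-lam @ (dKk @ u) for dKk in dK])

        # Cross-check against forward sensitivities (one solve per parameter):
        grad_forward = np.array([c @ np.linalg.solve(K, -(dKk @ u)) for dKk in dK])
        print(np.allclose(grad_adjoint, grad_forward))  # True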

  12. Efficacy of musculoskeletal manual approach in the treatment of temporomandibular joint disorder: A systematic review with meta-analysis.

    PubMed

    Martins, Wagner Rodrigues; Blasczyk, Juscelino Castro; Aparecida Furlan de Oliveira, Micaele; Lagôa Gonçalves, Karina Ferreira; Bonini-Rocha, Ana Clara; Dugailly, Pierre-Michel; de Oliveira, Ricardo Jacó

    2016-02-01

    Temporomandibular joint disorder (TMD) requires a complex diagnostic and therapeutic approach, which usually involves multidisciplinary management. Among these treatments, musculoskeletal manual techniques are used to improve health and healing. To assess the effectiveness of the musculoskeletal manual approach in temporomandibular joint disorder patients. A systematic review with meta-analysis. During August 2014 a systematic review of relevant databases (PubMed, The Cochrane Library, PEDro and ISI Web of Knowledge) was performed to identify controlled clinical trials without date restriction and restricted to the English language. Clinical outcomes were pain and range of motion of the temporomandibular joint. The mean difference (MD) or standardized mean difference (SMD) with 95% confidence intervals (CIs) and overall effect size were calculated for every post-treatment assessment. The PEDro scale was used to demonstrate the quality of the included studies. From the 308 articles identified by the search strategy, 8 articles met the inclusion criteria. The meta-analysis showed a significant difference (p < 0.0001) and a large effect on active mouth opening (SMD, 0.83; 95% CI, 0.42 to 1.25) and on pain during active mouth opening (MD, 1.69; 95% CI, 1.09 to 2.30) in favor of musculoskeletal manual techniques when compared to other conservative treatments for TMD. Musculoskeletal manual approaches are effective for treating TMD, and in the short term their effect is larger than that of other conservative treatments. Copyright © 2015 Elsevier Ltd. All rights reserved.
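
    The MD (95% CI) figures reported above come from inverse-variance pooling. A minimal fixed-effect sketch with hypothetical per-study values, not the review's extracted data:

        # Fixed-effect inverse-variance pooling of mean differences.
        import numpy as np

        md = np.array([1.4, 2.1, 1.8])   # per-study mean differences (toy)
        se = np.array([0.5, 0.7, 0.6])   # their standard errors (toy)

        w = 1 / se**2                    # inverse-variance weights
        pooled = np.sum(w * md) / np.sum(w)
        pooled_se = np.sqrt(1 / np.sum(w))
        print(f"MD {pooled:.2f} (95% CI {pooled - 1.96 * pooled_se:.2f} "
              f"to {pooled + 1.96 * pooled_se:.2f})")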

  13. Racial classification in the evolutionary sciences: a comparative analysis.

    PubMed

    Billinger, Michael S

    2007-01-01

    Human racial classification has long been a problem for the discipline of anthropology, but much of the criticism of the race concept has focused on its social and political connotations. The central argument of this paper is that race is not a specifically human problem, but one that exists in evolutionary thought in general. This paper looks at various disciplinary approaches to racial or subspecies classification, extending its focus beyond the anthropological race concept by providing a comparative analysis of the use of racial classification in evolutionary biology, genetics, and anthropology.

  14. Comparative analysis of breakdown mechanism in thin SiO2 oxide films in metal-oxide-semiconductor structures under the action of heavy charged particles and a pulsed voltage

    NASA Astrophysics Data System (ADS)

    Zinchenko, V. F.; Lavrent'ev, K. V.; Emel'yanov, V. V.; Vatuev, A. S.

    2016-02-01

    Regularities in the breakdown of thin SiO2 oxide films in metal-oxide-semiconductor structures of power field-effect transistors under the action of single heavy charged particles and a pulsed voltage are studied experimentally. Using a phenomenological approach, we carry out comparative analysis of physical mechanisms and energy criteria of the SiO2 breakdown in extreme conditions of excitation of the electron subsystem in the subpicosecond time range.

  15. As above, so below? Towards understanding inverse models in BCI

    NASA Astrophysics Data System (ADS)

    Lindgren, Jussi T.

    2018-02-01

    Objective. In brain-computer interfaces (BCI), measurements of the user’s brain activity are classified into commands for the computer. With EEG-based BCIs, the origins of the classified phenomena are often considered to be spatially localized in the cortical volume and mixed in the EEG. We investigate if more accurate BCIs can be obtained by reconstructing the source activities in the volume. Approach. We contrast the physiology-driven source reconstruction with data-driven representations obtained by statistical machine learning. We explain these approaches in a common linear dictionary framework and review the different ways to obtain the dictionary parameters. We consider the effect of source reconstruction on some major difficulties in BCI classification, namely information loss, feature selection and nonstationarity of the EEG. Main results. Our analysis suggests that the approaches differ mainly in their parameter estimation. Physiological source reconstruction may thus be expected to improve BCI accuracy if machine learning is not used or where it produces less optimal parameters. We argue that the considered difficulties of surface EEG classification can remain in the reconstructed volume and that data-driven techniques are still necessary. Finally, we provide some suggestions for comparing approaches. Significance. The present work illustrates the relationships between source reconstruction and machine learning-based approaches for EEG data representation. The provided analysis and discussion should help in understanding, applying, comparing and improving such techniques in the future.
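
    The common linear dictionary framework mentioned above models sensor data as a mixture x = As of source activities. A minimum-norm style inverse is one physiology-driven way to invert it, with ICA or CSP as data-driven alternatives; dimensions and the mixing matrix below are arbitrary toy values.

        # Minimum-norm style source reconstruction for x = A s:
        # s_hat = A^T (A A^T + lambda I)^-1 x, with A as a toy lead field.
        import numpy as np

        rng = np.random.default_rng(0)
        n_sensors, n_sources = 8, 20
        A = rng.normal(size=(n_sensors, n_sources))  # forward (lead-field) model
        s = rng.normal(size=n_sources)               # true source activity
        x = A @ s                                    # simulated EEG sample

        lam = 0.1                                    # regularization strength
        s_hat = A.T @ np.linalg.solve(A @ A.T + lam * np.eye(n_sensors), x)
        print(s_hat.shape)  # one reconstructed estimate per source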

  16. A multilayered approach for the analysis of perinatal mortality using different classification systems.

    PubMed

    Gordijn, Sanne J; Korteweg, Fleurisca J; Erwich, Jan Jaap H M; Holm, Jozien P; van Diem, Mariet Th; Bergman, Klasien A; Timmer, Albertus

    2009-06-01

    Many classification systems for perinatal mortality are available, all with their own strengths and weaknesses: none of them has been universally accepted. We present a systematic multilayered approach for the analysis of perinatal mortality based on information related to the moment of death, the conditions associated with death and the underlying cause of death, using a combination of representatives of existing classification systems. We compared the existing classification systems regarding their definition of the perinatal period, level of complexity, inclusion of maternal, foetal and/or placental factors and whether they focus at a clinical or pathological viewpoint. Furthermore, we allocated the classification systems to one of three categories: 'when', 'what' or 'why', dependent on whether the allocation of the individual cases of perinatal mortality is based on the moment of death ('when'), the clinical conditions associated with death ('what'), or the underlying cause of death ('why'). A multilayered approach for the analysis and classification of perinatal mortality is possible by using combinations of existing systems; for example the Wigglesworth or Nordic Baltic ('when'), ReCoDe ('what') and Tulip ('why') classification systems. This approach is useful not only for in depth analysis of perinatal mortality in the developed world but also for analysis of perinatal mortality in the developing countries, where resources to investigate death are often limited.

  17. Active Detection of Shielded Special Nuclear Material in the Presence of Variable High Backgrounds Using a Mixed Photon-Neutron Source

    NASA Astrophysics Data System (ADS)

    Martin, Philip N.; Clemett, Ceri D.; Hill, Cassie; O'Malley, John; Campbell, Ben

    This paper describes and compares two approaches to the analysis of active interrogation data containing the high photon backgrounds associated with mixed photon-neutron source flash active interrogation. Results from liquid scintillation detectors (EJ301/EJ309) fielded at the Naval Research Laboratory (NRL), in collaboration with the Atomic Weapons Establishment (AWE), using the NRL Mercury Inductive Voltage Adder (IVA) operating in both a photon and a mixed photon-neutron mode at a Depleted Uranium (DU) target are presented. The standard approach, which applies a Figure of Merit (FOM) expressed as the number of background sigma above background, is compared with an approach that fits only the time-decaying photon signal with standard delayed photon emission from ∼10-MeV end-point-energy Bremsstrahlung photofission of DU. Examples where each approach performs well and less well are presented, together with a discussion of the relative limitations of both approaches to the type of mixed photon-neutron flash active interrogation being considered.
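
    The "sigma above background" figure of merit is simply the excess count over the expected background, in units of that background's Poisson fluctuation. Counts below are illustrative:

        # Significance of an excess over background, in Poisson sigma.
        def fom(counts, background):
            return (counts - background) / background ** 0.5

        print(fom(counts=1450, background=1200))  # ~7.2 sigma excess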

  18. Data assimilation of non-conventional observations using GOES-R flash lightning: 1D+4D-VAR approach vs. assimilation of images (Invited)

    NASA Astrophysics Data System (ADS)

    Navon, M. I.; Stefanescu, R.

    2013-12-01

    Previous assimilation of lightning used nudging approaches. We develop three approaches, namely 3D-VAR WRFDA and 1D+nD-VAR (n = 3, 4) WRFDA. The present research uses Convective Available Potential Energy (CAPE) as a proxy between lightning data and model variables. To test the performance of the aforementioned schemes, we assess the quality of the resulting analysis and forecasts of precipitation compared to those from a control experiment and verify them against NCEP Stage IV precipitation. Results demonstrate that assimilating lightning observations improves precipitation statistics during the assimilation window and for 3-7 h thereafter. The 1D+4D-VAR approach yielded the best performance, significantly improving precipitation RMSE by 25% and 27.5% compared to control during the assimilation window for two tornadic test cases. Finally, we propose a new approach to assimilate 2-D images of lightning flashes based on pixel intensity, mitigating dimensionality by a reduced-order method.

  19. Comparison of sorting algorithms to increase the range of Hartmann-Shack aberrometry.

    PubMed

    Bedggood, Phillip; Metha, Andrew

    2010-01-01

    Recently many software-based approaches have been suggested for improving the range and accuracy of Hartmann-Shack aberrometry. We compare the performance of four representative algorithms, with a focus on aberrometry for the human eye. Algorithms vary in complexity from the simplistic traditional approach to iterative spline extrapolation based on prior spot measurements. Range is assessed for a variety of aberration types in isolation using computer modeling, and also for complex wavefront shapes using a real adaptive optics system. The effects of common sources of error for ocular wavefront sensing are explored. The results show that the simplest possible iterative algorithm produces comparable range and robustness compared to the more complicated algorithms, while keeping processing time minimal to afford real-time analysis.
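
    The simplest sorting strategy in such comparisons assigns each measured spot to its nearest reference lenslet position and reads off the displacements, which are proportional to local wavefront slope. A toy sketch; real aberrometers refine this iteratively:

        # Nearest-reference spot sorting on a toy 3x3 lenslet grid.
        import numpy as np
        from scipy.spatial import cKDTree

        ref = np.array([(i, j) for i in range(3) for j in range(3)], float)
        spots = ref + np.random.default_rng(2).normal(scale=0.1, size=ref.shape)

        _, idx = cKDTree(ref).query(spots)   # nearest reference per spot
        displacements = spots - ref[idx]     # proportional to local slope
        print(displacements.round(3))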

  20. Comparison of sorting algorithms to increase the range of Hartmann-Shack aberrometry

    NASA Astrophysics Data System (ADS)

    Bedggood, Phillip; Metha, Andrew

    2010-11-01

    Recently many software-based approaches have been suggested for improving the range and accuracy of Hartmann-Shack aberrometry. We compare the performance of four representative algorithms, with a focus on aberrometry for the human eye. Algorithms vary in complexity from the simplistic traditional approach to iterative spline extrapolation based on prior spot measurements. Range is assessed for a variety of aberration types in isolation using computer modeling, and also for complex wavefront shapes using a real adaptive optics system. The effects of common sources of error for ocular wavefront sensing are explored. The results show that the simplest possible iterative algorithm produces comparable range and robustness compared to the more complicated algorithms, while keeping processing time minimal to afford real-time analysis.
