Cınar, Yasin; Cingü, Abdullah Kürşat; Türkcü, Fatih Mehmet; Çınar, Tuba; Yüksel, Harun; Özkurt, Zeynep Gürsel; Çaça, Ihsan
2014-09-01
To compare outcomes of accelerated and conventional corneal cross-linking (CXL) for progressive keratoconus (KC). Patients were divided into two groups: the accelerated CXL group and the conventional CXL group. Uncorrected distance visual acuity (UDVA), corrected distance visual acuity (CDVA), refraction and keratometric values were measured preoperatively and postoperatively. The data of the two groups were compared statistically. The mean UDVA and CDVA were better at six months postoperatively than preoperatively in both groups. While the change in UDVA and CDVA was statistically significant in the accelerated CXL group (p = 0.035 and p = 0.047, respectively), it did not reach statistical significance in the conventional CXL group (p = 0.184 and p = 0.113, respectively). The decreases in mean corneal power (Km) and maximum keratometric value (Kmax) were statistically significant in both groups (p = 0.012 and 0.046, respectively, in the accelerated CXL group; p = 0.012 and 0.041, respectively, in the conventional CXL group). There was no statistically significant difference in visual and refractive results between the two groups (p > 0.05). Refractive and visual results of the accelerated and conventional CXL methods for the treatment of KC were similar over this short follow-up period. The accelerated CXL method is faster and allows a higher throughput of patients.
Revising the lower statistical limit of x-ray grating-based phase-contrast computed tomography.
Marschner, Mathias; Birnbacher, Lorenz; Willner, Marian; Chabior, Michael; Herzen, Julia; Noël, Peter B; Pfeiffer, Franz
2017-01-01
Phase-contrast x-ray computed tomography (PCCT) is currently investigated as an interesting extension of conventional CT, providing high soft-tissue contrast even when examining weakly absorbing specimens. Until now, the potential for dose reduction was thought to be limited compared with attenuation CT, since meaningful phase retrieval fails for scans with very low photon counts when using the conventional phase retrieval method via phase stepping. In this work, we examine the statistical behaviour of the reverse projection method, an alternative phase retrieval approach, and compare the results to the conventional phase retrieval technique. We investigate the noise levels in the projections as well as the image quality and quantitative accuracy of the reconstructed tomographic volumes. The results of our study show that this method performs better in a low-dose scenario than the conventional phase retrieval approach, resulting in lower noise levels, enhanced image quality and more accurate quantitative values. Overall, we demonstrate that the lower statistical limit of the phase stepping procedure as proposed by recent literature does not apply to this alternative phase retrieval technique. However, further development is necessary to overcome the experimental challenges posed by this method, which would enable mainstream or even clinical application of PCCT.
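As background for the comparison, here is a minimal numpy sketch of the conventional phase-stepping retrieval that the reverse projection method replaces: the absorption, differential phase, and visibility signals are read off the Fourier coefficients of each pixel's stepping curve. The function name, step count, and toy numbers are illustrative assumptions, not from the paper.

```python
import numpy as np

def phase_stepping_retrieval(I):
    """Conventional phase retrieval from a phase-stepping scan.

    I : array of shape (K, ...) with K phase steps per detector pixel;
        each stepping curve is modelled as a + b*cos(2*pi*k/K + phi).
    Returns (absorption, differential_phase, visibility).
    """
    K = I.shape[0]
    c = np.fft.fft(I, axis=0)           # Fourier analysis of the stepping curve
    a = c[0].real / K                   # mean intensity -> absorption signal
    phi = np.angle(c[1])                # phase of 1st harmonic -> differential phase
    v = 2 * np.abs(c[1]) / c[0].real    # fringe visibility
    return a, phi, v

# toy example: one pixel, 8 phase steps, low photon counts
rng = np.random.default_rng(0)
k = np.arange(8)
curve = 50 * (1 + 0.3 * np.cos(2 * np.pi * k / 8 + 0.7))
noisy = rng.poisson(curve)              # Poisson noise dominates at low dose
print(phase_stepping_retrieval(noisy[:, None]))
```

At very low counts the retrieved phase from this procedure becomes strongly biased, which is the failure mode the paper's alternative technique avoids.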
AlBarakati, SF; Kula, KS; Ghoneima, AA
2012-01-01
Objective The aim of this study was to assess the reliability and reproducibility of angular and linear measurements of conventional and digital cephalometric methods. Methods A total of 13 landmarks and 16 skeletal and dental parameters were defined and measured on pre-treatment cephalometric radiographs of 30 patients. The conventional and digital tracings and measurements were performed twice by the same examiner with a 6 week interval between measurements. The reliability within the method was determined using Pearson's correlation coefficient (r2). The reproducibility between methods was calculated by paired t-test. The level of statistical significance was set at p < 0.05. Results All measurements for each method were above 0.90 r2 (strong correlation) except maxillary length, which had a correlation of 0.82 for conventional tracing. Significant differences between the two methods were observed in most angular and linear measurements except for ANB angle (p = 0.5), angle of convexity (p = 0.09), anterior cranial base (p = 0.3) and the lower anterior facial height (p = 0.6). Conclusion In general, both methods of conventional and digital cephalometric analysis are highly reliable. Although the reproducibility of the two methods showed some statistically significant differences, most differences were not clinically significant. PMID:22184624
USDA-ARS?s Scientific Manuscript database
Conventional multivariate statistical methods have been used for decades to calculate environmental indicators. These methods generally work fine if they are used in a situation where the method can be tailored to the data. But there is some skepticism that the methods might fail in the context of s...
Demirel, Gamze; Uras, Nurdan; Celik, Istemi H; Aksoy, Hatice T; Oguz, Serife S; Erdeve, Omer; Erel, Ozcan; Dilmen, Ugur
2010-10-01
We evaluated and compared the oxidant and antioxidant status of hyperbilirubinemic infants before and after two forms of phototherapy, conventional and LED, in order to identify the optimal treatment method. Thirty newborns exposed to conventional phototherapy (Group I) and 30 infants exposed to LED phototherapy (Group II) were studied. The serum total antioxidant capacity (TAC) and the total oxidant status (TOS) were assessed by Erel's method. There were no statistically significant differences in TAC or TOS levels between Group I and Group II prior to phototherapy, and no statistically significant difference in TAC levels between the two groups after phototherapy; however, TOS levels were significantly lower in Group II than in Group I after phototherapy. The oxidative stress index (OSI) increased after conventional phototherapy (p < 0.05). The increase in TOS following conventional phototherapy was not observed following LED phototherapy. This difference should be considered when using phototherapy.
Lima, Ana Paula Barbosa; Vitti, Rafael Pino; Amaral, Marina; Neves, Ana Christina Claro; da Silva Concilio, Lais Regiane
2018-04-01
This study evaluated the dimensional stability of a complete-arch prosthesis processed by a conventional water-bath method or by microwave energy and polymerized by two different curing cycles. Forty maxillary complete-arch prostheses were randomly divided into four groups (n = 10): MW1 - acrylic resin cured by one microwave cycle; MW2 - acrylic resin cured by two microwave cycles; WB1 - conventional acrylic resin polymerized using one curing cycle in a water bath; WB2 - conventional acrylic resin polymerized using two curing cycles in a water bath. To evaluate dimensional stability, the occlusal vertical dimension (OVD) and the area of contact points were measured at two times: before and after polymerization. A digital caliper was used for OVD measurement. Occlusal contact registration strips were used between maxillary and mandibular dentures to measure the contact points. The images were measured using the software IpWin32, and the differences before and after the polymerization methods were calculated. The data were statistically analyzed using one-way ANOVA and the Tukey test (α = .05). The results demonstrated statistically significant differences in OVD between measurement times for all groups. MW1 presented the highest OVD values, while WB2 had the lowest (P < .05). No statistical differences were found for area of contact points among the groups (P = .7150). The conventional acrylic resin polymerized using two curing cycles in a water bath led to the least change in OVD of the complete-arch prosthesis.
Vaidya, Sharad; Parkash, Hari; Bhargava, Akshay; Gupta, Sharad
2014-01-01
Abundant resources and techniques have been used for complete coverage crown fabrication. Conventional investing and casting procedures for phosphate-bonded investments require a 2- to 4-h procedure before completion. Accelerated casting techniques have been used, but may not result in castings with matching marginal accuracy. The study measured the marginal gap and determined the clinical acceptability of single cast copings invested in a phosphate-bonded investment with the use of conventional and accelerated methods. One hundred and twenty cast coping samples were fabricated using conventional and accelerated methods, with three finish lines: chamfer, shoulder and shoulder with bevel. Sixty copings were prepared with each technique. Each coping was examined with a stereomicroscope at four predetermined sites, and measurements of marginal gaps were documented for each. A master chart was prepared for all the data, which were analyzed using the Statistical Package for the Social Sciences (SPSS). Marginal gaps were evaluated by t-test; analysis of variance and post hoc analysis were used to compare the two groups as well as to make comparisons between the three subgroups. The measurements recorded showed no statistically significant difference between the conventional and accelerated groups. Among the three marginal designs studied, shoulder with bevel showed the best marginal fit with conventional as well as accelerated casting techniques. The accelerated casting technique could be a viable alternative to the time-consuming conventional casting technique. The marginal fit between the two casting techniques showed no statistical difference.
New heterogeneous test statistics for the unbalanced fixed-effect nested design.
Guo, Jiin-Huarng; Billard, L; Luh, Wei-Ming
2011-05-01
When the underlying variances are unknown and/or unequal, using the conventional F test is problematic in the two-factor hierarchical data structure. Prompted by the approximate test statistics (Welch and Alexander-Govern methods), the authors develop four new heterogeneous test statistics to test factor A and factor B nested within A for the unbalanced fixed-effect two-stage nested design under variance heterogeneity. The actual significance levels and statistical power of the test statistics were compared in a simulation study. The results show that the proposed procedures maintain better Type I error rate control and have greater statistical power than the conventional F test in various conditions. Therefore, the proposed test statistics are recommended in terms of robustness and easy implementation. ©2010 The British Psychological Society.
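For readers who want to experiment, SciPy ships one of the approximate tests that motivated the proposed statistics, the Alexander-Govern test (scipy.stats.alexandergovern, available since SciPy 1.7). A minimal one-factor sketch with made-up unbalanced, heteroscedastic groups (the simulated means, variances, and sizes are illustrative assumptions):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# three unbalanced groups (levels of one factor) with unequal variances
groups = [rng.normal(0.0, 1.0, 12),
          rng.normal(0.5, 3.0, 25),
          rng.normal(0.5, 0.5, 8)]

f = stats.f_oneway(*groups)           # conventional F test (assumes equal variances)
ag = stats.alexandergovern(*groups)   # heteroscedastic alternative (SciPy >= 1.7)
print(f"conventional F test:   p = {f.pvalue:.3f}")
print(f"Alexander-Govern test: p = {ag.pvalue:.3f}")
```

The paper's contribution extends this kind of heteroscedastic testing to both factors of the nested two-stage design, which the snippet does not reproduce.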
Li, Xiaohui; Yu, Jianhua; Gong, Yuekun; Ren, Kaijing; Liu, Jun
2015-04-21
To assess the early postoperative clinical and radiographic outcomes after navigation-assisted or standard instrumentation total knee arthroplasty (TKA). From August 2007 to May 2008, 60 KSS-A type patients underwent 67 primary TKA operations by the same surgical team. Twenty-two operations were performed with an image-free navigation system (average patient age 64.5 years), while the remaining 45 used conventional manual procedures (average age 66 years). Their preoperative demographic and functional data had no statistical differences (P>0.05). Operative duration, blood loss volume and hospitalization days were compared between the two groups. Radiographic data at one month included the coronal femoral component angle, coronal tibial component angle, sagittal femoral component angle, sagittal tibial component angle and coronal tibiofemoral angle. Functional assessment scores were evaluated at 1, 3 and 6 months postoperatively. Operative duration was significantly longer for computer navigation (P<0.05). The average blood loss volume was 555.26 ml in the computer navigation group and 647.56 ml in the conventional group (P<0.05). Hospitalization stay was shorter in the computer navigation group than in the conventional group (7.74 vs 8.68 days) (P=0.04). Alignment deviation was better in the computer-assisted group than in the conventional group (P<0.05). The percentage of patients with a coronal tibiofemoral angle within ±3° of the ideal value was 95.45% for the computer-assisted mini-invasive TKA group and 80% for the conventional TKA group (P=0.003). The Knee Society Clinical Rating Score was higher in the computer-assisted group than in the conventional group at 1 and 3 months post-operation. However, no statistical inter-group difference existed at 6 months post-operation. Navigation allows a surgeon to implant the components precisely for TKA, and it offers faster functional recovery and a shorter hospitalization stay. At 6 months post-operation, there is no statistical inter-group difference in KSS scores.
Engberg, Lovisa; Forsgren, Anders; Eriksson, Kjell; Hårdemark, Björn
2017-06-01
To formulate convex planning objectives for treatment plan multicriteria optimization with explicit relationships to the dose-volume histogram (DVH) statistics used in plan quality evaluation. Conventional planning objectives are designed to minimize the violation of DVH statistics thresholds using penalty functions. Although successful in guiding the DVH curve towards these thresholds, conventional planning objectives offer limited control of the individual points on the DVH curve (doses-at-volume) used to evaluate plan quality. In this study, we abandon the usual penalty-function framework and propose planning objectives that relate more closely to DVH statistics. The proposed planning objectives are based on mean-tail-dose, resulting in convex optimization. We also demonstrate how to adapt a standard optimization method to the proposed formulation in order to obtain a substantial reduction in computational cost. We investigated the potential of the proposed planning objectives as tools for optimizing DVH statistics through juxtaposition with the conventional planning objectives on two patient cases. Sets of treatment plans with differently balanced planning objectives were generated using either the proposed or the conventional approach. Dominance in the sense of better distributed doses-at-volume was observed in plans optimized within the proposed framework. The initial computational study indicates that the DVH statistics are better optimized and more efficiently balanced using the proposed planning objectives than using the conventional approach. © 2017 American Association of Physicists in Medicine.
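A minimal sketch of the mean-tail-dose idea underlying the proposed objectives: the mean dose over the hottest fraction v of a structure is a convex function of the dose vector and bounds the corresponding dose-at-volume statistic from above. The toy gamma-distributed voxel doses are an assumption for illustration; the paper's optimization operates on such quantities inside a treatment planning system.

```python
import numpy as np

def upper_mean_tail_dose(dose, v):
    """Mean dose of the hottest fraction v of the structure.

    Unlike the dose-at-volume D_v itself, this upper mean-tail-dose is
    convex in the dose distribution, and it always satisfies MTD_v >= D_v.
    """
    d = np.sort(dose)[::-1]
    k = max(1, int(np.ceil(v * d.size)))
    return d[:k].mean()

# toy voxel doses (Gy) for one structure
dose = np.random.default_rng(2).gamma(shape=9.0, scale=6.0, size=10_000)
d5 = np.percentile(dose, 95)     # dose-at-volume D_5% (evaluation statistic)
print(f"D_5% = {d5:.1f} Gy, upper MTD_5% = {upper_mean_tail_dose(dose, 0.05):.1f} Gy")
```

Minimizing the convex mean-tail-dose therefore pushes down the evaluation statistic it bounds, which is the mechanism the abstract describes.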
Change Detection in Rough Time Series
2014-09-01
...distribution that can present significant challenges to conventional statistical tracking techniques. To address this problem the proposed method...applies hybrid fuzzy statistical techniques to series granules instead of to individual measures. Three examples demonstrated the robust nature of the...
Prasad, Rahul; Al-Keraif, Abdulaziz Abdullah; Kathuria, Nidhi; Gandhi, P V; Bhide, S V
2014-02-01
The purpose of this study was to determine whether the ringless casting and accelerated wax-elimination techniques can be combined to offer a cost-effective, clinically acceptable, and time-saving alternative for fabricating single unit castings in fixed prosthodontics. Sixty standardized wax copings were fabricated on a type IV stone replica of a stainless steel die. The wax patterns were divided into four groups. The first group was cast using the ringless investment technique and conventional wax-elimination method; the second group was cast using the ringless investment technique and accelerated wax-elimination method; the third group was cast using the conventional metal ring investment technique and conventional wax-elimination method; the fourth group was cast using the metal ring investment technique and accelerated wax-elimination method. The vertical marginal gap was measured at four sites per specimen, using a digital optical microscope at 100× magnification. The results were analyzed using two-way ANOVA to determine statistical significance. The vertical marginal gaps of castings fabricated using the ringless technique (76.98 ± 7.59 μm) were significantly smaller (p < 0.05) than those of castings fabricated using the conventional metal ring technique (138.44 ± 28.59 μm); however, the difference between the conventional (102.63 ± 36.12 μm) and accelerated (112.79 ± 38.34 μm) wax-elimination castings was not statistically significant (p > 0.05). The ringless investment technique can produce castings with higher accuracy and can be favorably combined with the accelerated wax-elimination method as a viable alternative to the time-consuming conventional technique of casting restorations in fixed prosthodontics. © 2013 by the American College of Prosthodontists.
Gould, A Lawrence
2016-12-30
Conventional practice monitors accumulating information about drug safety in terms of the numbers of adverse events reported from trials in a drug development program. Estimates of between-treatment adverse event risk differences can be obtained readily from unblinded trials with adjustment for differences among trials using conventional statistical methods. Recent regulatory guidelines require monitoring the cumulative frequency of adverse event reports to identify possible between-treatment adverse event risk differences without unblinding ongoing trials. Conventional statistical methods for assessing between-treatment adverse event risks cannot be applied when the trials are blinded. However, CUSUM charts can be used to monitor the accumulation of adverse event occurrences. CUSUM charts for monitoring adverse event occurrence in a Bayesian paradigm are based on assumptions about the process generating the adverse event counts in a trial as expressed by informative prior distributions. This article describes the construction of control charts for monitoring adverse event occurrence based on statistical models for the processes, characterizes their statistical properties, and describes how to construct useful prior distributions. Application of the approach to two adverse events of interest in a real trial gave nearly identical results for binomial and Poisson observed event count likelihoods. Copyright © 2016 John Wiley & Sons, Ltd.
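The article's charts are built in a Bayesian framework with informative priors; as a simpler grounding, here is a generic one-sided frequentist CUSUM for Poisson adverse event counts, using the standard log-likelihood-ratio increment. The in-control and shifted rates, threshold, and simulated counts are illustrative assumptions.

```python
import numpy as np

def poisson_cusum(counts, lam0, lam1, h):
    """One-sided CUSUM for Poisson event counts.

    lam0: in-control mean AE count per interval; lam1 > lam0: shift to detect;
    h: decision threshold. The log-likelihood-ratio increment for x ~ Poisson
    is x*log(lam1/lam0) - (lam1 - lam0). Returns the CUSUM path and the index
    of the first signal (or None).
    """
    k = np.log(lam1 / lam0)
    s, path, signal = 0.0, [], None
    for t, x in enumerate(counts):
        s = max(0.0, s + x * k - (lam1 - lam0))
        path.append(s)
        if signal is None and s > h:
            signal = t
    return np.array(path), signal

rng = np.random.default_rng(3)
counts = np.r_[rng.poisson(2.0, 30), rng.poisson(4.0, 10)]  # rate doubles at t=30
path, t_sig = poisson_cusum(counts, lam0=2.0, lam1=4.0, h=5.0)
print("first signal at interval", t_sig)
```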
NASA Technical Reports Server (NTRS)
Sprowls, D. O.; Bucci, R. J.; Ponchel, B. M.; Brazill, R. L.; Bretz, P. E.
1984-01-01
A technique is demonstrated for accelerated stress corrosion testing of high strength aluminum alloys. The method offers better precision and shorter exposure times than traditional pass/fail procedures. The approach uses data from tension tests performed on replicate groups of smooth specimens after various lengths of exposure to static stress. The breaking strength measures degradation in the test specimen's load carrying ability due to environmental attack. Analysis of breaking load data by extreme value statistics enables the calculation of survival probabilities and a statistically defined threshold stress applicable to the specific test conditions. A fracture mechanics model is given which quantifies depth of attack in the stress corroded specimen by an effective flaw size calculated from the breaking stress and the material's strength and fracture toughness properties. Comparisons are made with experimental results from three tempers of 7075 alloy plate tested by the breaking load method and by traditional tests of statically loaded smooth tension bars and conventional precracked specimens.
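A hedged sketch of the statistical core of the breaking load method: fit an extreme-value (Weibull) model to replicate breaking stresses and read survival probabilities off the fitted distribution. The specimen values and proof stress below are invented for illustration; the report additionally ties the threshold stress to exposure conditions and fracture mechanics.

```python
import numpy as np
from scipy import stats

# hypothetical breaking stresses (MPa) of replicate smooth specimens
# after a fixed exposure to static stress in a corrosive environment
loads = np.array([412, 398, 441, 385, 420, 405, 377, 430, 391, 415.0])

# two-parameter Weibull fit (location fixed at zero), a common
# extreme-value model for strength data
shape, loc, scale = stats.weibull_min.fit(loads, floc=0)

# probability that a specimen survives a hypothetical 350 MPa proof stress
print(f"P(strength > 350 MPa) = {stats.weibull_min.sf(350, shape, loc, scale):.3f}")
```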
Datta, Rakesh; Datta, Karuna; Venkatesh, M D
2015-07-01
The classical didactic lecture has been the cornerstone of theoretical undergraduate medical education. Its efficacy is reduced, however, by limited interaction and the short attention span of students. It is hypothesized that the interactive response pad obviates some of these drawbacks. The aim of this study was to evaluate the effectiveness of an interactive response system by comparing it with conventional classroom teaching. A prospective comparative longitudinal study was conducted on 192 students who were exposed to either conventional or interactive teaching over 20 classes. Pre-test, post-test and retention test (8-12 weeks later) scores were collated and statistically analysed. An independent observer counted the number of student interactions in each class. Pre-test scores from both groups were similar (p = 0.71). There was significant improvement in post-test scores compared with pre-test scores with either method (p < 0.001). The interactive post-test score was better than the conventional post-test score (p < 0.001) by 8-10% (difference of means 9.24%, 95% CI 8.2%-10.3%). The interactive retention test score was better than the conventional retention test score (p < 0.001) by 15-18% (difference of means 16.64%, 95% CI 15.0%-18.2%). There were 51 participative events in the interactive group vs 25 in the conventional group. The interactive response pad method was efficacious in teaching. Students taught with the interactive method were likely to score 8-10% higher (statistically significant) immediately after class and 15-18% higher (statistically significant) after 8-12 weeks. The number of student-teacher interactions increases when using interactive response pads.
A Comparison of Spatial Statistical Methods in a School Finance Policy Context
ERIC Educational Resources Information Center
Slagle, Mike
2010-01-01
A shortcoming of the conventional ordinary least squares (OLS) approaches for estimating median voter models of education demand is the inability to more fully explain the spatial relationships between neighboring school districts. Consequently, two school districts that appear to be descriptively similar in terms of conventional measures of…
Dental enamel defect diagnosis through different technology-based devices.
Kobayashi, Tatiana Yuriko; Vitor, Luciana Lourenço Ribeiro; Carrara, Cleide Felício Carvalho; Silva, Thiago Cruvinel; Rios, Daniela; Machado, Maria Aparecida Andrade Moreira; Oliveira, Thais Marchini
2018-06-01
Dental enamel defects (DEDs) are faulty or deficient enamel formations of primary and permanent teeth. Changes during tooth development result in hypoplasia (a quantitative defect) and/or hypomineralisation (a qualitative defect). To compare technology-based diagnostic methods for detecting DEDs. Two-hundred and nine dental surfaces of anterior permanent teeth were selected in patients, 6-11 years of age, with cleft lip with/without cleft palate. First, a conventional clinical examination was conducted according to the modified Developmental Defects of Enamel Index (DDE Index). Dental surfaces were evaluated using an operating microscope and a fluorescence-based device. Interexaminer reproducibility was determined using the kappa test. To compare groups, McNemar's test was used. Cramer's V test was used for comparing the distribution of index codes obtained after classification of all dental surfaces. Cramer's V test revealed statistically significant differences (P < .0001) in the distribution of index codes obtained using the different methods; the coefficients were 0.365 for conventional clinical examination versus fluorescence, 0.961 for conventional clinical examination versus operating microscope and 0.358 for operating microscope versus fluorescence. The sensitivity of the operating microscope and fluorescence method was statistically significant (P = .008 and P < .0001, respectively). Otherwise, the results did not show statistically significant differences in accuracy and specificity for either the operating microscope or the fluorescence methods. This study suggests that the operating microscope performed better than the fluorescence-based device and could be an auxiliary method for the detection of DEDs. © 2017 FDI World Dental Federation.
Evaluation of bearing capacity of piles from cone penetration test data.
DOT National Transportation Integrated Search
2007-12-01
A statistical analysis and ranking criteria were used to compare the CPT methods and the conventional alpha design method. Based on the results, the de Ruiter/Beringen and LCPC methods showed the best capability in predicting the measured load carryi...
Miyazawa, Arata; Hong, Young-Joo; Makita, Shuichi; Kasaragod, Deepa; Yasuno, Yoshiaki
2017-01-01
Jones matrix-based polarization sensitive optical coherence tomography (JM-OCT) simultaneously measures optical intensity, birefringence, degree of polarization uniformity, and OCT angiography. The statistics of the optical features in a local region, such as the local mean of the OCT intensity, are frequently used for image processing and the quantitative analysis of JM-OCT. Conventionally, local statistics have been computed with fixed-size rectangular kernels. However, this results in a trade-off between image sharpness and statistical accuracy. We introduce a superpixel method to JM-OCT for generating flexible kernels for local statistics. A superpixel is a cluster of image pixels formed by the pixels' spatial and signal value proximities. An algorithm for superpixel generation specialized for JM-OCT and its optimization methods are presented in this paper. The spatial proximity is in two-dimensional cross-sectional space and the signal values are the four optical features. Hence, the superpixel method is a six-dimensional clustering technique for JM-OCT pixels. The performance of the JM-OCT superpixels and the optimization methods is evaluated in detail using JM-OCT datasets of posterior eyes. The superpixels were found to preserve tissue structures well, such as layers, sclera, vessels, and the retinal pigment epithelium, and hence are more suitable as local statistics kernels than conventional uniform rectangular kernels. PMID:29082073
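A rough illustration of six-dimensional superpixels (two spatial coordinates plus four optical features): plain k-means over normalized spatial/feature vectors, with a compactness weight trading spatial regularity against feature homogeneity. The paper's algorithm is a specialized SLIC-style clustering; this sketch and its parameter names are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

def jmoct_superpixels(features, n_segments=200, compactness=1.0):
    """Cluster B-scan pixels into superpixels in a 6-D space:
    2 spatial coordinates + 4 optical features (intensity,
    birefringence, DOPU, angiography). `compactness` weights the
    spatial terms relative to the feature terms."""
    h, w, _ = features.shape                     # (rows, cols, 4 features)
    yy, xx = np.mgrid[0:h, 0:w]
    spatial = np.stack([yy, xx], axis=-1) / max(h, w) * compactness
    z = (features - features.mean((0, 1))) / features.std((0, 1))
    vecs = np.concatenate([spatial, z], axis=-1).reshape(-1, 6)
    labels = KMeans(n_clusters=n_segments, n_init=3).fit_predict(vecs)
    return labels.reshape(h, w)

# toy B-scan: 64x64 pixels with 4 random optical features
img = np.random.default_rng(4).normal(size=(64, 64, 4))
print(np.unique(jmoct_superpixels(img, n_segments=50)).size)
```

Local statistics (means, variances) are then computed per superpixel rather than per rectangular window.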
Baker, Arthur W; Haridy, Salah; Salem, Joseph; Ilieş, Iulian; Ergai, Awatef O; Samareh, Aven; Andrianas, Nicholas; Benneyan, James C; Sexton, Daniel J; Anderson, Deverick J
2017-11-24
Traditional strategies for surveillance of surgical site infections (SSI) have multiple limitations, including delayed and incomplete outbreak detection. Statistical process control (SPC) methods address these deficiencies by combining longitudinal analysis with graphical presentation of data. We performed a pilot study within a large network of community hospitals to evaluate the performance of SPC methods for detecting SSI outbreaks. We applied conventional Shewhart and exponentially weighted moving average (EWMA) SPC charts to 10 previously investigated SSI outbreaks that occurred from 2003 to 2013. We compared the results of SPC surveillance to the results of traditional SSI surveillance methods. Then, we analysed the performance of modified SPC charts constructed with different outbreak detection rules, EWMA smoothing factors and baseline SSI rate calculations. Conventional Shewhart and EWMA SPC charts both detected 8 of the 10 SSI outbreaks analysed, in each case prior to the date of traditional detection. Among detected outbreaks, conventional Shewhart chart detection occurred a median of 12 months prior to outbreak onset and 22 months prior to traditional detection. Conventional EWMA chart detection occurred a median of 7 months prior to outbreak onset and 14 months prior to traditional detection. Modified Shewhart and EWMA charts additionally detected several outbreaks earlier than conventional SPC charts. Shewhart and EWMA charts had low false-positive rates when used to analyse SSI data from separate control hospitals. Our findings illustrate the potential usefulness and feasibility of real-time SPC surveillance of SSI to rapidly identify outbreaks and improve patient safety. Further study is needed to optimise SPC chart selection and calculation, statistical outbreak detection rules and the process for reacting to signals of potential outbreaks. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
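For concreteness, a minimal EWMA chart of the kind applied in the study, with the time-varying control limit appropriate for the chart's start-up phase. The target rate, sigma, smoothing factor, and simulated monthly SSI rates are illustrative assumptions, not the study's data.

```python
import numpy as np

def ewma_chart(rates, target, sigma, lam=0.2, L=3.0):
    """EWMA control chart for a sequence of (e.g. monthly) SSI rates.

    lam: smoothing factor; L: control-limit width in sigma units.
    Returns the EWMA path and boolean signals against the upper limit.
    """
    z = np.empty(len(rates))
    signals = np.zeros(len(rates), bool)
    z_prev = target
    for t, x in enumerate(rates):
        z_prev = lam * x + (1 - lam) * z_prev
        z[t] = z_prev
        # exact (time-varying) standard error of the EWMA statistic
        se = sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * (t + 1))))
        signals[t] = z_prev > target + L * se
    return z, signals

rng = np.random.default_rng(5)
rates = np.r_[rng.normal(2.0, 0.5, 24), rng.normal(3.0, 0.5, 12)]  # outbreak in year 3
z, sig = ewma_chart(rates, target=2.0, sigma=0.5)
print("first signal at month", int(np.argmax(sig)) if sig.any() else None)
```

The study's modified charts additionally vary the detection rules and the baseline-rate calculation feeding `target` and `sigma`.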
Counting the peaks in the excitation function for precompound processes
NASA Astrophysics Data System (ADS)
Bonetti, R.; Hussein, M. S.; Mello, P. A.
1983-08-01
The "counting of maxima" method of Brink and Stephen, conventionally used for the extraction of the correlation width of statistical (compound nucleus) reactions, is generalized to include precompound processes as well. It is found that this method supplies an important independent check of the results obtained from autocorrelation studies. An application is made to the reaction 25Mg(3He,p).
Parikh, Harshal R; De, Anuradha S; Baveja, Sujata M
2012-07-01
Physicians and microbiologists have long recognized that the presence of living microorganisms in the blood of a patient carries considerable morbidity and mortality. Hence, blood cultures have become a critically important and frequently performed test in clinical microbiology laboratories for the diagnosis of sepsis. To compare the conventional blood culture method with the lysis centrifugation method in cases of sepsis. Two hundred nonduplicate blood cultures from cases of sepsis were analyzed using two blood culture methods concurrently for recovery of bacteria from patients diagnosed clinically with sepsis: the conventional blood culture method using trypticase soy broth and the lysis centrifugation method using saponin with centrifugation at 3000 g for 30 minutes. Overall, bacteria were recovered from 17.5% of the 200 blood cultures. The conventional blood culture method had a higher yield of organisms, especially Gram positive cocci. The lysis centrifugation method was comparable with the former method with respect to Gram negative bacilli. The sensitivity of the lysis centrifugation method in comparison with the conventional blood culture method was 49.75% in this study, the specificity was 98.21% and the diagnostic accuracy was 89.5%. In almost every instance, growth was detected earlier by the lysis centrifugation method, a statistically significant difference. Contamination by lysis centrifugation was minimal, while that by the conventional method was high. Time to growth by the lysis centrifugation method was significantly shorter (P < 0.001) than that by the conventional blood culture method. For the diagnosis of sepsis, a combination of the lysis centrifugation method and the conventional blood culture method with trypticase soy broth or biphasic media is advisable, in order to achieve faster recovery and a better yield of microorganisms.
Shivasakthy, M.; Asharaf Ali, Syed
2013-01-01
Statement of Problem: A new material is proposed in dentistry in the form of strips for producing gingival retraction. The clinical efficacy of the material remains untested. Purpose of the Study: This study aimed to determine whether polyvinyl acetate strips are able to effectively displace the gingival tissues in comparison with the conventional retraction cord. Material and Methods: Complete metal ceramic preparation with a supra-gingival margin was performed on fourteen maxillary incisors, and gingival retraction was done using Merocel strips and conventional retraction cords alternately at a 2-week interval. The amount of displacement was compared using a digital vernier caliper of 0.01 mm accuracy. Results were analyzed statistically using the paired Student's t-test. Results: The statistical analysis of the data revealed that both the conventional retraction cord and the Merocel strip produce significant retraction. Of the two materials, Merocel proved to be significantly more effective. Conclusion: The Merocel strip produces more gingival displacement than the conventional retraction cord. PMID:24298531
Lower conjunctival fornix packing for mydriasis in premature infants: a randomized trial
Thanathanee, Onsiri; Ratanapakorn, Tanapat; Morley, Michael G; Yospaiboon, Yosanan
2012-01-01
Objective To compare the mydriatic effect of lower conjunctival fornix packing with conventional instillation of eyedrops containing 2.5% phenylephrine and 1% tropicamide in premature infants undergoing examination for retinopathy of prematurity. Methods The patients were randomized to receive either conventional instillation of mydriatic drops or lower conjunctival fornix packing in one eye and the alternate method in the fellow eye. For the eyes receiving lower conjunctival fornix packing (study group), one small piece of cotton wool soaked with one drop of 2.5% phenylephrine and one drop of 1% tropicamide was packed in the lower conjunctival fornix for 15 minutes. For the eyes receiving conventional instillation (control group), 2.5% phenylephrine and 1% tropicamide were alternately instilled every 5 minutes for two doses each. Horizontal pupil diameter was measured with a ruler in millimeters 40 minutes later. Results The mean dilated pupil diameters in the study and control groups were 5.76 ± 1.01 mm and 4.50 ± 1.08 mm, respectively. This difference was statistically significant (P < 0.05). Conclusion The dilated pupil diameter after lower conjunctival fornix packing was larger than after conventional instillation, a statistically significant difference. We recommend the packing method to dilate the preterm infant pupil, especially if the pupil is difficult to dilate. PMID:22368443
Tani, Kazuki; Mio, Motohira; Toyofuku, Tatsuo; Kato, Shinichi; Masumoto, Tomoya; Ijichi, Tetsuya; Matsushima, Masatoshi; Morimoto, Shoichi; Hirata, Takumi
2017-01-01
Spatial normalization is a significant image pre-processing operation in statistical parametric mapping (SPM) analysis. The purpose of this study was to clarify the optimal method of spatial normalization for improving diagnostic accuracy in SPM analysis of arterial spin-labeling (ASL) perfusion images. We evaluated the SPM results of five spatial normalization methods obtained by comparing patients with Alzheimer's disease or normal pressure hydrocephalus complicated by dementia with cognitively healthy subjects. We used the following methods: 3DT1-conventional, based on spatial normalization using anatomical images; 3DT1-DARTEL, based on spatial normalization with DARTEL using anatomical images; 3DT1-conventional template and 3DT1-DARTEL template, created by averaging cognitively healthy subjects spatially normalized using the above methods; and ASL-DARTEL template, created by averaging cognitively healthy subjects spatially normalized with DARTEL using ASL images only. Our results showed that the ASL-DARTEL template was smaller than the other two templates, and the SPM results obtained with the ASL-DARTEL template method were inaccurate. There were no significant differences between the 3DT1-conventional and 3DT1-DARTEL template methods. In contrast, the 3DT1-DARTEL method showed higher detection sensitivity and more precise anatomical localization. Our SPM results suggest that spatial normalization should be performed with DARTEL using anatomical images.
Chan, Kwun Chuen Gary; Qin, Jing
2015-10-01
Existing linear rank statistics cannot be applied to cross-sectional survival data without follow-up, since all subjects are essentially censored. However, partial survival information is available from backward recurrence times, which are frequently collected in health surveys without prospective follow-up. Under length-biased sampling, a class of linear rank statistics is proposed based only on backward recurrence times without any prospective follow-up. When follow-up data are available, the proposed rank statistic and a conventional rank statistic that utilizes follow-up information from the same sample are shown to be asymptotically independent. We discuss four ways to combine these two statistics when follow-up is present. Simulations show that all combined statistics have substantially improved power compared with conventional rank statistics, and a Mantel-Haenszel test performed best among the proposed statistics. The method is applied to a cross-sectional health survey without follow-up and a study of Alzheimer's disease with prospective follow-up. © The Author 2015. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
A Non-Intrusive Algorithm for Sensitivity Analysis of Chaotic Flow Simulations
NASA Technical Reports Server (NTRS)
Blonigan, Patrick J.; Wang, Qiqi; Nielsen, Eric J.; Diskin, Boris
2017-01-01
We demonstrate a novel algorithm for computing the sensitivity of statistics in chaotic flow simulations to parameter perturbations. The algorithm is non-intrusive but requires exposing an interface. Based on the principle of shadowing in dynamical systems, this algorithm is designed to reduce the effect of the sampling error in computing sensitivity of statistics in chaotic simulations. We compare the effectiveness of this method to that of the conventional finite difference method.
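The sampling-error problem the shadowing-based algorithm addresses can be seen with a plain finite-difference sensitivity of a chaotic long-time average, sketched here for the Lorenz system (a standard stand-in, not the flow solvers used in the paper): the estimate fluctuates strongly unless the averaging window T is long.

```python
import numpy as np

def lorenz_time_avg_z(rho, T=2000.0, dt=0.01, sigma=10.0, beta=8/3):
    """Long-time average of z for the Lorenz system at parameter rho
    (explicit Euler; crude but adequate for a sketch)."""
    x, y, z = 1.0, 1.0, 25.0
    n = int(T / dt)
    acc, cnt = 0.0, 0
    for i in range(n):
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        if i >= n // 10:          # discard the initial transient
            acc += z
            cnt += 1
    return acc / cnt

# conventional finite-difference sensitivity d<z>/d(rho): dominated by
# sampling error unless T is very large, which is the effect the
# shadowing-based method is designed to reduce
drho = 0.5
for T in (200.0, 2000.0):
    g = (lorenz_time_avg_z(28 + drho, T) - lorenz_time_avg_z(28 - drho, T)) / (2 * drho)
    print(f"T={T:6.0f}:  d<z>/drho ~ {g:.3f}   (reference value is near 1)")
```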
Identification of the isomers using principal component analysis (PCA) method
NASA Astrophysics Data System (ADS)
Kepceoǧlu, Abdullah; Gündoǧdu, Yasemin; Ledingham, Kenneth William David; Kilic, Hamdi Sukur
2016-03-01
In this work, we have carried out a detailed statistical analysis of experimental mass spectra of xylene isomers. Principal Component Analysis (PCA) was used to identify the isomers, which cannot be distinguished using conventional statistical methods for interpretation of their mass spectra. Experiments were carried out using a linear TOF-MS coupled to a femtosecond laser system as an energy source for the ionisation processes. We collected data that were analysed and interpreted using PCA as a multivariate analysis of these spectra. This demonstrates the strength of the method in providing insight for distinguishing isomers that cannot be identified by conventional mass analysis of dissociative ionisation processes in these molecules. PCA results depending on the laser pulse energy and the background pressure in the spectrometer are presented in this work.
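A small sketch of the analysis pattern: PCA scores computed over a matrix of shots × m/z bins separate groups whose mean spectra differ only by subtle intensity patterns. The synthetic spectra below are stand-ins for the experimental xylene data.

```python
import numpy as np
from sklearn.decomposition import PCA

# toy stand-in for TOF mass spectra: rows = laser shots, columns = m/z bins.
# o-, m-, p-xylene give nearly identical fragment masses; isomer identity is
# encoded in small intensity-ratio differences that PCA can separate.
rng = np.random.default_rng(6)
base = rng.random(300)
spectra, labels = [], []
for iso, shift in enumerate((0.00, 0.02, 0.04)):
    pattern = base + shift * rng.random(300)                 # isomer-specific offset
    spectra.append(pattern + 0.01 * rng.normal(size=(40, 300)))  # 40 shots per isomer
    labels += [iso] * 40
X = np.vstack(spectra)

scores = PCA(n_components=2).fit_transform(X)
for iso in range(3):
    m = scores[np.array(labels) == iso].mean(0)
    print(f"isomer {iso}: mean PC1/PC2 = {m[0]:+.3f}, {m[1]:+.3f}")
```

Distinct mean scores per group indicate the separability that conventional per-peak mass analysis misses.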
Goodwin, Cody R; Sherrod, Stacy D; Marasco, Christina C; Bachmann, Brian O; Schramm-Sapyta, Nicole; Wikswo, John P; McLean, John A
2014-07-01
A metabolic system is composed of inherently interconnected metabolic precursors, intermediates, and products. The analysis of untargeted metabolomics data has conventionally been performed through the use of comparative statistics or multivariate statistical analysis-based approaches; however, each falls short in representing the related nature of metabolic perturbations. Herein, we describe a complementary method for the analysis of large metabolite inventories using a data-driven approach based upon a self-organizing map algorithm. This workflow allows for the unsupervised clustering, and subsequent prioritization of, correlated features through Gestalt comparisons of metabolic heat maps. We describe this methodology in detail, including a comparison to conventional metabolomics approaches, and demonstrate the application of this method to the analysis of the metabolic repercussions of prolonged cocaine exposure in rat sera profiles.
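A toy version of the workflow using the third-party MiniSom package (an assumption; any SOM implementation would do): features are clustered on a small map, and node occupancy gives the heat-map representation used for Gestalt comparison.

```python
import numpy as np
from minisom import MiniSom  # third-party SOM implementation (pip install minisom)

# toy metabolite feature table: rows = features (m/z, RT pairs),
# columns = fold-changes across conditions; correlated features should
# map to neighbouring SOM nodes and can then be prioritised together
rng = np.random.default_rng(7)
block1 = rng.normal(+1.0, 0.3, (60, 6))   # features rising with exposure
block2 = rng.normal(-1.0, 0.3, (60, 6))   # features falling with exposure
data = np.vstack([block1, block2])

som = MiniSom(8, 8, data.shape[1], sigma=1.5, learning_rate=0.5, random_seed=0)
som.train_random(data, 2000)

# node occupancy ("heat map") inspected visually for Gestalt comparison
hits = np.zeros((8, 8), int)
for v in data:
    i, j = som.winner(v)
    hits[i, j] += 1
print(hits)
```

Unlike a ranked list from comparative statistics, the map keeps correlated features adjacent, which is the property the abstract emphasizes.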
Indoor Location Sensing with Invariant Wi-Fi Received Signal Strength Fingerprinting
Husen, Mohd Nizam; Lee, Sukhan
2016-01-01
A method of location fingerprinting based on the Wi-Fi received signal strength (RSS) in an indoor environment is presented. The method aims to overcome the RSS instability due to varying channel disturbances in time by introducing the concept of invariant RSS statistics. The invariant RSS statistics represent here the RSS distributions collected at individual calibration locations under minimal random spatiotemporal disturbances in time. The invariant RSS statistics thus collected serve as the reference pattern classes for fingerprinting. Fingerprinting is carried out at an unknown location by identifying the reference pattern class that maximally supports the spontaneous RSS sensed from individual Wi-Fi sources. A design guideline is also presented as a rule of thumb for estimating the number of Wi-Fi signal sources required to be available for any given number of calibration locations under a certain level of random spatiotemporal disturbances. Experimental results show that the proposed method not only provides 17% higher success rate than conventional ones but also removes the need for recalibration. Furthermore, the resolution is shown finer by 40% with the execution time more than an order of magnitude faster than the conventional methods. These results are also backed up by theoretical analysis. PMID:27845711
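A compact sketch of the fingerprinting step as described: per-location RSS statistics collected at calibration serve as reference pattern classes, and an unknown scan is assigned to the class with maximal likelihood. Gaussian per-AP statistics and the class/method names are simplifying assumptions.

```python
import numpy as np

class RssFingerprinter:
    """Location fingerprinting from per-location RSS statistics.

    Calibration stores, for every location, the mean and variance of the
    RSS from each Wi-Fi source; an unknown scan is assigned to the
    reference class with maximal Gaussian log-likelihood.
    """
    def fit(self, calib):                      # calib: {location: (n_scans, n_aps)}
        self.stats = {loc: (r.mean(0), r.var(0) + 1e-6) for loc, r in calib.items()}
        return self

    def locate(self, rss):                     # rss: (n_aps,) spontaneous scan
        def loglik(mu_var):
            mu, var = mu_var
            return -0.5 * np.sum((rss - mu) ** 2 / var + np.log(var))
        return max(self.stats, key=lambda loc: loglik(self.stats[loc]))

rng = np.random.default_rng(8)
calib = {f"room{k}": rng.normal(-50 - 5 * k, 2.0, (30, 4)) for k in range(3)}
fp = RssFingerprinter().fit(calib)
print(fp.locate(rng.normal(-55, 2.0, 4)))      # expect "room1"
```

The paper's contribution is collecting the calibration statistics under minimal disturbances so they stay invariant, removing the recalibration the plain scheme above would eventually need.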
Adaptive interference cancel filter for evoked potential using high-order cumulants.
Lin, Bor-Shyh; Lin, Bor-Shing; Chong, Fok-Ching; Lai, Feipei
2004-01-01
This paper presents evoked potential (EP) processing using an adaptive interference cancellation (AIC) filter with second- and higher-order cumulants. In the conventional ensemble-averaging method, experiments must be repeated many times to record the required data. Recently, the use of the AIC structure with second-order statistics in processing EPs has proved more efficient than the traditional averaging method, but it is sensitive both to the reference signal statistics and to the choice of step size. We therefore propose a higher-order-statistics-based AIC method to overcome these disadvantages. The study used somatosensory EPs corrupted with EEG. A gradient-type algorithm is used in the AIC method. Comparisons of AIC filters based on second-, third- and fourth-order statistics are also presented in this paper. We observed that the AIC filter with third-order statistics has better convergence performance for EP processing and is not sensitive to the selection of step size and reference input.
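For orientation, a standard second-order-statistics AIC filter (LMS) of the kind the paper starts from; its third-order-cumulant variant replaces the correlation-based update with cumulant estimates and is not reproduced here. Signal shapes, filter length, and step size are illustrative assumptions.

```python
import numpy as np

def lms_cancel(primary, reference, n_taps=8, mu=0.01):
    """Second-order-statistics AIC filter (LMS): the reference input
    (raw EEG channel) is filtered to predict the noise in the primary
    input (EP + EEG); the prediction error is the enhanced EP."""
    w = np.zeros(n_taps)
    out = np.zeros_like(primary)
    for n in range(n_taps - 1, len(primary)):
        x = reference[n - n_taps + 1 : n + 1][::-1]   # most recent sample first
        e = primary[n] - w @ x            # error = primary minus noise estimate
        w += mu * e * x                   # gradient-descent weight update
        out[n] = e
    return out

rng = np.random.default_rng(9)
t = np.arange(2000)
ep = np.exp(-((t - 1000) / 40.0) ** 2)               # toy evoked potential
eeg = rng.normal(0, 1, t.size)                        # background EEG (noise)
primary = ep + np.convolve(eeg, [0.6, 0.3, 0.1])[: t.size]
print(np.corrcoef(lms_cancel(primary, eeg), ep)[0, 1])
```

The paper's point is that replacing the second-order correlations in this update with third-order cumulants makes convergence far less sensitive to `mu` and to the reference statistics.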
Statistical methods and neural network approaches for classification of data from multiple sources
NASA Technical Reports Server (NTRS)
Benediktsson, Jon Atli; Swain, Philip H.
1990-01-01
Statistical methods for classification of data from multiple data sources are investigated and compared to neural network models. A general problem with conventional multivariate statistical approaches to classifying data of multiple types is that a common multivariate distribution cannot be assumed for the classes in the data sources. Another common problem with statistical classification methods is that the data sources are not equally reliable. This means that the data sources need to be weighted according to their reliability, but most statistical classification methods have no mechanism for this. This research focuses on statistical methods which can overcome these problems: a method of statistical multisource analysis and consensus theory. Reliability measures for weighting the data sources in these methods are suggested and investigated. Secondly, this research focuses on neural network models. The neural networks are distribution-free, since no prior knowledge of the statistical distribution of the data is needed. This is an obvious advantage over most statistical classification methods. The neural networks also automatically take care of the question of how much weight each data source should have. On the other hand, their training process is iterative and can take a very long time. Methods to speed up the training procedure are introduced and investigated. Experimental results of classification using both neural network models and statistical methods are given, and the approaches are compared based on these results.
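One way to make the weighting idea concrete is a consensus-theoretic log-opinion pool, in which each source's class posteriors enter with a reliability weight; the sources, posteriors, and weights below are invented for illustration.

```python
import numpy as np

def log_opinion_pool(posteriors, weights):
    """Consensus-theoretic combination of per-source class posteriors.

    posteriors: (n_sources, n_classes) array; weights: reliability of each
    source (larger = more trusted). Returns combined class probabilities
    as a normalized weighted geometric mean.
    """
    logp = np.log(np.clip(posteriors, 1e-12, None))
    combined = np.exp(weights @ logp)
    return combined / combined.sum()

# two sources disagree; the more reliable first source dominates
p = np.array([[0.7, 0.2, 0.1],     # source 1 (e.g. multispectral), reliable
              [0.3, 0.5, 0.2]])    # source 2 (e.g. elevation), less reliable
print(log_opinion_pool(p, weights=np.array([0.8, 0.2])))
```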
Statistical inference for noisy nonlinear ecological dynamic systems.
Wood, Simon N
2010-08-26
Chaotic ecological dynamic systems defy conventional statistical analysis. Systems with near-chaotic dynamics are little better. Such systems are almost invariably driven by endogenous dynamic processes plus demographic and environmental process noise, and are only observable with error. Their sensitivity to history means that minute changes in the driving noise realization, or the system parameters, will cause drastic changes in the system trajectory. This sensitivity is inherited and amplified by the joint probability density of the observable data and the process noise, rendering it useless as the basis for obtaining measures of statistical fit. Because the joint density is the basis for the fit measures used by all conventional statistical methods, this is a major theoretical shortcoming. The inability to make well-founded statistical inferences about biological dynamic models in the chaotic and near-chaotic regimes, other than on an ad hoc basis, leaves dynamic theory without the methods of quantitative validation that are essential tools in the rest of biological science. Here I show that this impasse can be resolved in a simple and general manner, using a method that requires only the ability to simulate the observed data on a system from the dynamic model about which inferences are required. The raw data series are reduced to phase-insensitive summary statistics, quantifying local dynamic structure and the distribution of observations. Simulation is used to obtain the mean and the covariance matrix of the statistics, given model parameters, allowing the construction of a 'synthetic likelihood' that assesses model fit. This likelihood can be explored using a straightforward Markov chain Monte Carlo sampler, but one further post-processing step returns pure likelihood-based inference. I apply the method to establish the dynamic nature of the fluctuations in Nicholson's classic blowfly experiments.
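A condensed sketch of the synthetic likelihood recipe: simulate replicates at a candidate parameter, reduce each to summary statistics, fit a Gaussian to the summaries, and score the observed summaries under it. The toy noisy logistic map and the particular summaries are assumptions standing in for the ecological models in the paper.

```python
import numpy as np

def synthetic_loglik(theta, simulate, summarize, s_obs, n_rep=200, seed=0):
    """Gaussian synthetic log-likelihood in the spirit of Wood (2010)."""
    rng = np.random.default_rng(seed)
    S = np.array([summarize(simulate(theta, rng)) for _ in range(n_rep)])
    mu, cov = S.mean(0), np.cov(S.T) + 1e-8 * np.eye(S.shape[1])
    r = s_obs - mu
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (r @ np.linalg.solve(cov, r) + logdet)

def simulate(theta, rng, n=200):
    """Toy dynamic model: noisy logistic map with growth parameter theta."""
    x = np.empty(n); x[0] = 0.5
    for t in range(1, n):
        x[t] = np.clip(theta * x[t-1] * (1 - x[t-1]) + rng.normal(0, 0.01), 0.0, 1.0)
    return x

# phase-insensitive summaries: mean, spread, lag-1 autocorrelation
summarize = lambda y: np.array([y.mean(), y.std(), np.corrcoef(y[:-1], y[1:])[0, 1]])
s_obs = summarize(simulate(3.7, np.random.default_rng(42)))
for th in (3.5, 3.7, 3.9):
    print(th, round(synthetic_loglik(th, simulate, summarize, s_obs), 2))
```

The synthetic likelihood should peak near the data-generating value (3.7 here); the full method then explores it with MCMC.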
Tuna, Süleyman Hakan; Özçiçek Pekmez, Nuran; Kürkçüoğlu, Işin
2015-11-01
The effects of fabrication methods on the corrosion resistance of frameworks produced with Co-Cr alloys are not clear. The purpose of this in vitro study was to evaluate the electrochemical corrosion resistance of Co-Cr alloy specimens fabricated by conventional casting, milling, and laser sintering. The specimens fabricated with the 3 different methods were investigated by potentiodynamic tests and electrochemical impedance spectroscopy in an artificial saliva. Ions released into the artificial saliva were estimated with inductively coupled plasma-mass spectrometry, and the results were statistically analyzed. The specimen surfaces were investigated with scanning electron microscopy before and after the tests. In terms of corrosion current and Rct properties, statistically significant differences were found both among the means of the methods and among the means of the material groups (P<.05). With regard to ions released, a statistically significant difference was found among the material groups (P<.05); however, no difference was found among the methods. Scanning electron microscopic imaging revealed that the specimens produced by conventional casting were affected to a greater extent by etching and electrochemical corrosion than those produced by milling and laser sintering. The corrosion resistance of Co-Cr alloy specimens fabricated by milling or laser sintering was greater than that of the conventionally cast alloy specimens. Co-Cr specimens produced by the same method also differed from one another in terms of corrosion resistance. These differences may be related to variations in the alloy compositions. Copyright © 2015 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
Evaluation of airborne lidar data to predict vegetation Presence/Absence
Palaseanu-Lovejoy, M.; Nayegandhi, A.; Brock, J.; Woodman, R.; Wright, C.W.
2009-01-01
This study evaluates the capabilities of the Experimental Advanced Airborne Research Lidar (EAARL) in delineating vegetation assemblages in Jean Lafitte National Park, Louisiana. Five-meter-resolution grids of bare earth, canopy height, canopy-reflection ratio, and height of median energy were derived from EAARL data acquired in September 2006. Ground-truth data were collected along transects to assess species composition, canopy cover, and ground cover. To decide which model is more accurate, comparisons of general linear models and generalized additive models were conducted using conventional evaluation methods (i.e., sensitivity, specificity, Kappa statistics, and area under the curve) and two new indexes, net reclassification improvement and integrated discrimination improvement. Generalized additive models were superior to general linear models in modeling presence/absence in training vegetation categories, but no statistically significant differences between the two models were achieved in determining the classification accuracy at validation locations using conventional evaluation methods, although statistically significant improvements in net reclassifications were observed. ?? 2009 Coastal Education and Research Foundation.
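Since net reclassification improvement (NRI) is less familiar than AUC or kappa, here is a minimal two-category version for presence/absence predictions; the simulated probabilities from a weaker and a stronger model are illustrative assumptions.

```python
import numpy as np

def net_reclassification_improvement(p_old, p_new, events, threshold=0.5):
    """Two-category NRI comparing a new presence/absence model to an old one.

    NRI = P(up | event) - P(down | event)
        + P(down | non-event) - P(up | non-event),
    where up/down are reclassifications across the probability threshold.
    """
    old = p_old >= threshold
    new = p_new >= threshold
    up, down = new & ~old, old & ~new
    e = events.astype(bool)
    return (up[e].mean() - down[e].mean()) + (down[~e].mean() - up[~e].mean())

rng = np.random.default_rng(10)
truth = rng.random(500) < 0.4                                 # presence/absence
p_glm = np.clip(truth * 0.6 + rng.random(500) * 0.40, 0, 1)   # weaker model
p_gam = np.clip(truth * 0.7 + rng.random(500) * 0.35, 0, 1)   # stronger model
print(f"NRI = {net_reclassification_improvement(p_glm, p_gam, truth):+.3f}")
```

A positive NRI indicates net movement of presences above and absences below the threshold, the kind of improvement the study detected even when conventional metrics showed no significant difference.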
Li, Yue; Zhang, Di; Capoglu, Ilker; Hujsak, Karl A; Damania, Dhwanil; Cherkezyan, Lusik; Roth, Eric; Bleher, Reiner; Wu, Jinsong S; Subramanian, Hariharan; Dravid, Vinayak P; Backman, Vadim
2017-06-01
Essentially all biological processes are highly dependent on the nanoscale architecture of the cellular components where these processes take place. Statistical measures, such as the autocorrelation function (ACF) of the three-dimensional (3D) mass-density distribution, are widely used to characterize cellular nanostructure. However, conventional methods of reconstruction of the deterministic 3D mass-density distribution, from which these statistical measures can be calculated, have been inadequate for thick biological structures, such as whole cells, due to the conflict between the need for nanoscale resolution and its inverse relationship with thickness after conventional tomographic reconstruction. To tackle the problem, we have developed a robust method to calculate the ACF of the 3D mass-density distribution without tomography. Assuming the biological mass distribution is isotropic, our method allows for accurate statistical characterization of the 3D mass-density distribution by ACF with two data sets: a single projection image by scanning transmission electron microscopy and a thickness map by atomic force microscopy. Here we present validation of the ACF reconstruction algorithm, as well as its application to calculate the statistics of the 3D distribution of mass-density in a region containing the nucleus of an entire mammalian cell. This method may provide important insights into architectural changes that accompany cellular processes.
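The statistical object at the heart of the method, an ACF under an isotropy assumption, can be illustrated with a rotationally averaged autocorrelation of a 2-D field via the Wiener-Khinchin theorem; the paper's reconstruction from a projection plus a thickness map is more involved, so this is only a conceptual sketch on a toy field.

```python
import numpy as np

def radial_acf(image):
    """Rotationally averaged autocorrelation of a 2-D field
    (ACF = inverse FFT of the power spectrum); isotropy is what
    licenses collapsing the 2-D ACF to a 1-D radial profile."""
    f = np.fft.fft2(image - image.mean())
    acf2d = np.fft.ifft2(np.abs(f) ** 2).real / image.size
    acf2d = np.fft.fftshift(acf2d) / acf2d.max()
    cy, cx = np.array(acf2d.shape) // 2
    yy, xx = np.indices(acf2d.shape)
    r = np.hypot(yy - cy, xx - cx).astype(int)
    return np.bincount(r.ravel(), acf2d.ravel()) / np.bincount(r.ravel())

# toy correlated mass-density map: smoothed white noise
rng = np.random.default_rng(11)
field = rng.normal(size=(256, 256))
kernel = np.exp(-np.linspace(-3, 3, 15) ** 2)
field = np.apply_along_axis(lambda m: np.convolve(m, kernel, "same"), 0, field)
field = np.apply_along_axis(lambda m: np.convolve(m, kernel, "same"), 1, field)
print(radial_acf(field)[:5])   # decays over the kernel's correlation length
```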
Ramadan, Wijdan H; Khreis, Noura A; Kabbara, Wissam K
2015-01-01
Background The aim of the study was to evaluate the simplicity, safety, patients' preference, and convenience of the administration of insulin using the pen device versus the conventional vial/syringe in patients with diabetes. Methods This observational study was conducted in multiple community pharmacies in Lebanon. The investigators interviewed patients with diabetes using an insulin pen or conventional vial/syringe. A total of 74 questionnaires were filled over a period of 6 months. Answers were entered into the Statistical Package for the Social Sciences (SPSS) software and an Excel spreadsheet. t-tests, logistic regression analysis, and correlation analysis were used to analyze the results. Results A higher percentage of patients from the insulin pen users group (95.2%) found the method easy to use as compared to only 46.7% of the conventional users group (P = 0.001, relative risk [RR]: 2.041, 95% confidence interval [CI]: 1.178-3.535). Moreover, 61.9% of pen users and 26.7% of conventional users could read the scale easily (P = 0.037, RR 2.321, 95% CI: 0.940-5.731), while 85.7% of pen users found it more convenient to shift to the pen, and 86.7% of the conventional users would want to shift to the pen if it had the same cost. Pain perception was statistically different between the groups: a much higher percentage (76.2%) of pen users reported no pain during injection compared to only 26.7% of conventional users (P = 0.003, RR 2.857, 95% CI: 1.194-6.838). Conclusion The insulin pen was significantly easier to use and less painful than the conventional vial/syringe. Proper education on the methods of administration/storage and disposal of needles/syringes is needed in both groups. PMID:25848231
Effect of Correlated Rotational Noise
NASA Astrophysics Data System (ADS)
Hancock, Benjamin; Wagner, Caleb; Baskaran, Aparna
The traditional model of a self-propelled particle (SPP) is one where the body axis along which the particle travels reorients itself through rotational diffusion. If the reorientation process is instead driven by colored noise, rather than the standard Gaussian white noise, the resulting statistical mechanics cannot be accessed through conventional methods. In this talk we present results comparing three methods of deriving the statistical mechanics of an SPP whose reorientation process is driven by colored noise. We illustrate the differences and similarities in the resulting statistical mechanics through their ability to accurately capture the particle's response to external aligning fields.
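A minimal simulation of the model being discussed: a self-propelled particle whose heading is driven by Ornstein-Uhlenbeck (exponentially correlated) noise. The parameters and the correlation convention below are assumptions for illustration.

```python
import numpy as np

def spp_trajectory(n_steps=20_000, dt=1e-3, v0=1.0, tau=0.5, D=1.0, seed=0):
    """Self-propelled particle whose heading angle is driven by
    Ornstein-Uhlenbeck colored noise; with the scaling below,
    tau -> 0 recovers white rotational noise with diffusivity D."""
    rng = np.random.default_rng(seed)
    x = np.zeros((n_steps, 2))
    theta, eta = 0.0, 0.0
    for t in range(1, n_steps):
        # OU process with <eta(t) eta(t')> = (D/tau) exp(-|t-t'|/tau)
        eta += (-eta / tau) * dt + np.sqrt(2 * D / tau**2 * dt) * rng.normal()
        theta += eta * dt
        x[t] = x[t-1] + v0 * dt * np.array([np.cos(theta), np.sin(theta)])
    return x

traj = spp_trajectory()
msd = np.mean(np.sum((traj[1000:] - traj[:-1000]) ** 2, axis=1))
print(f"MSD over 1 time unit ~ {msd:.3f}")
```

Simulations like this provide the reference against which approximate analytic treatments of the colored-noise statistical mechanics can be checked.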
Statistical image reconstruction from correlated data with applications to PET
Alessio, Adam; Sauer, Ken; Kinahan, Paul
2008-01-01
Most statistical reconstruction methods for emission tomography are designed for data modeled as conditionally independent Poisson variates. In reality, due to scanner detectors, electronics and data processing, correlations are introduced into the data resulting in dependent variates. In general, these correlations are ignored because they are difficult to measure and lead to computationally challenging statistical reconstruction algorithms. This work addresses the second concern, seeking to simplify the reconstruction of correlated data and provide a more precise image estimate than the conventional independent methods. In general, correlated variates have a large non-diagonal covariance matrix that is computationally challenging to use as a weighting term in a reconstruction algorithm. This work proposes two methods to simplify the use of a non-diagonal covariance matrix as the weighting term by (a) limiting the number of dimensions in which the correlations are modeled and (b) adopting flexible, yet computationally tractable, models for correlation structure. We apply and test these methods with simple simulated PET data and data processed with the Fourier rebinning algorithm which include the one-dimensional correlations in the axial direction and the two-dimensional correlations in the transaxial directions. The methods are incorporated into a penalized weighted least-squares 2D reconstruction and compared with a conventional maximum a posteriori approach. PMID:17921576
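A toy penalized weighted least-squares solve illustrating the paper's central trade-off: using the full (here tridiagonal) covariance as the weighting versus the diagonal approximation. Dimensions, the random system matrix, and the quadratic penalty are illustrative assumptions far smaller than real PET geometry.

```python
import numpy as np

def pwls_reconstruct(A, y, cov, beta, R):
    """Penalized weighted least-squares estimate
        x = argmin (y - Ax)^T W (y - Ax) + beta * x^T R x,   W = cov^{-1}.
    A non-diagonal cov models the correlations introduced by rebinning;
    the diagonal approximation simply drops the off-diagonal terms."""
    W = np.linalg.inv(cov)
    return np.linalg.solve(A.T @ W @ A + beta * R, A.T @ W @ y)

rng = np.random.default_rng(12)
n_pix, n_meas = 16, 40
A = rng.random((n_meas, n_pix))
x_true = rng.random(n_pix)
# correlated noise: tridiagonal covariance (neighbouring bins correlated)
cov = 0.25 * (np.eye(n_meas) + 0.4 * np.eye(n_meas, k=1) + 0.4 * np.eye(n_meas, k=-1))
y = A @ x_true + rng.multivariate_normal(np.zeros(n_meas), cov)
R = np.eye(n_pix)                    # simple quadratic penalty

x_full = pwls_reconstruct(A, y, cov, 0.1, R)
x_diag = pwls_reconstruct(A, y, np.diag(np.diag(cov)), 0.1, R)
print(f"RMSE full-cov {np.sqrt(np.mean((x_full - x_true) ** 2)):.4f}  "
      f"diag {np.sqrt(np.mean((x_diag - x_true) ** 2)):.4f}")
```

The paper's contribution is making the full-covariance weighting tractable at realistic sizes by restricting the dimensions and structure of the modeled correlations.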
Comparisons of non-Gaussian statistical models in DNA methylation analysis.
Ma, Zhanyu; Teschendorff, Andrew E; Yu, Hong; Taghia, Jalil; Guo, Jun
2014-06-16
As a key regulatory mechanism of gene expression, DNA methylation patterns are widely altered in many complex genetic diseases, including cancer. DNA methylation is naturally quantified by bounded support data; therefore, it is non-Gaussian distributed. In order to capture such properties, we introduce some non-Gaussian statistical models to perform dimension reduction on DNA methylation data. Afterwards, non-Gaussian statistical model-based unsupervised clustering strategies are applied to cluster the data. Comparisons and analysis of different dimension reduction strategies and unsupervised clustering methods are presented. Experimental results show that the non-Gaussian statistical model-based methods are superior to the conventional Gaussian distribution-based method. They are meaningful tools for DNA methylation analysis. Moreover, among several non-Gaussian methods, the one that captures the bounded nature of DNA methylation data reveals the best clustering performance.
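The motivation for non-Gaussian models can be demonstrated in a few lines: fit a Gaussian and a Beta density to bounded, bimodal "beta values" and compare log-likelihoods. This is an illustrative sketch on simulated data, not the authors' mixture-model pipeline.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Methylation "beta values" live in [0, 1]; simulate a bimodal bounded sample
m = np.concatenate([rng.beta(0.5, 8, 500), rng.beta(8, 0.5, 500)])

# A Gaussian fit ignores the bounded support; a Beta fit respects it
mu, sd = stats.norm.fit(m)
a, b, _, _ = stats.beta.fit(m, floc=0, fscale=1)

ll_norm = stats.norm.logpdf(m, mu, sd).sum()
ll_beta = stats.beta.logpdf(m, a, b).sum()
print(f"log-likelihood  Gaussian: {ll_norm:.1f}   Beta: {ll_beta:.1f}")
# the Beta model fits the bounded, U-shaped data far better
```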
Statistical evaluation of forecasts
NASA Astrophysics Data System (ADS)
Mader, Malenka; Mader, Wolfgang; Gluckman, Bruce J.; Timmer, Jens; Schelter, Björn
2014-08-01
Reliable forecasts of extreme but rare events, such as earthquakes, financial crashes, and epileptic seizures, would render interventions and precautions possible. Therefore, forecasting methods have been developed which intend to raise an alarm if an extreme event is about to occur. In order to statistically validate the performance of a prediction system, it must be compared to the performance of a random predictor, which raises alarms independent of the events. Such a random predictor can be obtained by bootstrapping or analytically. We propose an analytic statistical framework which, in contrast to conventional methods, allows for validating independently the sensitivity and specificity of a forecasting method. Moreover, our method accounts for the periods during which an event has to remain absent or occur after a respective forecast.
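The chance-level benchmark can be made concrete with a simple binomial calculation. The sketch below is a simplification of the proposed framework (which additionally validates specificity and handles event-absence periods): a random predictor whose alarms cover a fraction q of the observation time hits each of N events independently with probability q, so the probability of matching an observed hit count by chance follows directly.

```python
from scipy import stats

N_events = 20          # events in the test period
k_hit = 14             # events preceded by a forecast alarm
alarm_fraction = 0.3   # fraction of time covered by alarms

# A random predictor covering 30% of the time hits each event
# independently with probability 0.3; sensitivity beyond chance:
p_value = stats.binom.sf(k_hit - 1, N_events, alarm_fraction)
print(f"P(random predictor hits >= {k_hit}/{N_events}) = {p_value:.2e}")
```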
Quick, Jacob A; MacIntyre, Allan D; Barnes, Stephen L
2014-02-01
Surgical airway creation has a high potential for disaster. Conventional methods can be cumbersome and require special instruments. A simple method utilizing three steps and readily available equipment exists, but has yet to be adequately tested. Our objective was to compare conventional cricothyroidotomy with the three-step method utilizing high-fidelity simulation. Utilizing a high-fidelity simulator, 12 experienced flight nurses and paramedics performed both methods after a didactic lecture, simulator briefing, and demonstration of each technique. Six participants performed the three-step method first, and the remaining 6 performed the conventional method first. Each participant was filmed and timed. We analyzed videos with respect to the number of hand repositions, number of airway instrumentations, and technical complications. Times to successful completion were measured from incision to balloon inflation. The three-step method was completed faster (52.1 s vs. 87.3 s; p = 0.007) as compared with conventional surgical cricothyroidotomy. The two methods did not differ statistically regarding number of hand movements (3.75 vs. 5.25; p = 0.12) or instrumentations of the airway (1.08 vs. 1.33; p = 0.07). The three-step method resulted in 100% successful airway placement on the first attempt, compared with 75% of the conventional method (p = 0.11). Technical complications occurred more with the conventional method (33% vs. 0%; p = 0.05). The three-step method, using an elastic bougie with an endotracheal tube, was shown to require fewer total hand movements, took less time to complete, resulted in more successful airway placement, and had fewer complications compared with traditional cricothyroidotomy. Published by Elsevier Inc.
Farjood, Ehsan; Vojdani, Mahroo; Torabi, Kiyanoosh; Khaledi, Amir Ali Reza
2017-01-01
Given the limitations of conventional waxing, computer-aided design and computer-aided manufacturing (CAD-CAM) technologies have been developed as alternative methods of making patterns. The purpose of this in vitro study was to compare the marginal and internal fit of metal copings derived from wax patterns fabricated by rapid prototyping (RP) to those created by the conventional handmade technique. Twenty-four standardized brass dies were milled and divided into 2 groups (n=12) according to the wax pattern fabrication method. The CAD-RP group was assigned to the experimental group, and the conventional group to the control group. The cross-sectional technique was used to assess the marginal and internal discrepancies at 15 points on the master die by using a digital microscope. An independent t test was used for statistical analysis (α=.01). The CAD-RP group had a total mean (±SD) for absolute marginal discrepancy of 117.1 (±11.5) μm and a mean marginal discrepancy of 89.8 (±8.3) μm. The conventional group had an absolute marginal discrepancy of 88.1 (±10.7) μm and a mean marginal discrepancy of 69.5 (±15.6) μm. The overall mean (±SD) of the total internal discrepancy, separately calculated as the axial internal discrepancy and occlusal internal discrepancy, was 95.9 (±8.0) μm for the CAD-RP group and 76.9 (±10.2) μm for the conventional group. The independent t test results showed significant differences between the 2 groups. The CAD-RP group had larger discrepancies at all measured areas than the conventional group, and the difference was statistically significant (P<.01). Within the limitations of this in vitro study, the conventional method of wax pattern fabrication produced copings with better marginal and internal fit than the CAD-RP method. However, the marginal and internal fit for both groups were within clinically acceptable ranges. Copyright © 2016 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Peterson, Ivars
1991-01-01
A method that enables people to obtain the benefits of statistics and probability theory without the shortcomings of conventional methods is described; the method is free of mathematical formulas and is easy to understand and use. A resampling technique called the "bootstrap" is discussed in terms of application and development. (KR)
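The bootstrap itself fits in a few lines of Python; the sketch below resamples an observed data set with replacement to obtain a formula-free 95% confidence interval for the mean (the exponential sample stands in for any observed data).

```python
import numpy as np

rng = np.random.default_rng(3)
sample = rng.exponential(scale=2.0, size=50)   # any observed data set

# Bootstrap: resample with replacement, recompute the statistic many times
boot_means = np.array([
    rng.choice(sample, size=sample.size, replace=True).mean()
    for _ in range(10_000)
])
lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"mean = {sample.mean():.2f}, 95% bootstrap CI: ({lo:.2f}, {hi:.2f})")
```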
a Probability-Based Statistical Method to Extract Water Body of TM Images with Missing Information
NASA Astrophysics Data System (ADS)
Lian, Shizhong; Chen, Jiangping; Luo, Minghai
2016-06-01
Water information cannot be accurately extracted from TM images in which true surface information is lost to blocking clouds and missing data stripes. Because water is continuously distributed under natural conditions, this paper proposes a new water body extraction method based on probability statistics to improve the accuracy of water information extraction from TM images with missing information. Different kinds of disturbance from clouds and missing data stripes are simulated, and water information is extracted from the simulated images using global histogram matching, local histogram matching, and the probability-based statistical method. Experiments show that a smaller Areal Error and higher Boundary Recall can be obtained using this method compared with the conventional methods.
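For context, the global histogram-matching baseline used in the comparison is a one-call operation in common image libraries. The sketch below uses invented stand-in arrays purely for illustration; it shows the conventional baseline, not the proposed probability-based method.

```python
import numpy as np
from skimage import exposure

rng = np.random.default_rng(4)
# Toy stand-ins for a TM band with a data gap and a clean reference scene
degraded = rng.normal(100, 20, (64, 64)).clip(0, 255)
degraded[20:30, :] = 0                       # simulated missing-data stripe
reference = rng.normal(120, 15, (64, 64)).clip(0, 255)

# Global histogram matching: map the degraded image's intensity
# distribution onto the reference distribution
matched = exposure.match_histograms(degraded, reference)
```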
Evaluation of bending rigidity behaviour of ultrasonic seaming on woven fabrics
NASA Astrophysics Data System (ADS)
Şevkan Macit, Ayşe; Tiber, Bahar
2017-10-01
In recent years, ultrasonic seaming, presented as an alternative to conventional seaming, has been investigated by many researchers. In our study, the bending behaviour of this alternative method is examined while varying parameters such as fabric type, seam type, roller type and seaming velocity. For this purpose, fifteen types of sewn fabrics were tested according to the bending rigidity test standard before and after washing, and the results were evaluated with the SPSS statistical analysis programme. Consequently, the bending length values of the ultrasonically sewn fabrics are found to be higher than those of conventionally sewn fabrics, and the effect of seam type on bending length is statistically significant. Bending length values are also related to all of the remaining parameters except roller type.
Early-Stage Estimated Value of Blend Sign on the Prognosis of Patients with Intracerebral Hemorrhage
Zhou, Ningquan; Wang, Chao
2018-01-01
Background and Purpose This study aimed to explore the relationship between blend sign and prognosis of patients with intracerebral hemorrhage (ICH). Methods Between January 2014 and December 2016, the results of cranial computed tomography imaging within 24 h after the onset of symptoms from 275 patients with ICH were retrospectively analyzed. The patients with or without blend sign were compared to observe and analyze the difference in coagulation function abnormality, rebleeding, mortality, and bad prognosis rates in the early stages. Results Of the 275 patients with ICH, 47 patients had Blend Sign I (17.09%) and 17 patients had Blend Sign II (6.18%). The coagulation function abnormality rate had no statistical difference among Blend Sign I, Blend Sign II, and conventional groups (P > 0.05). In the Blend Sign I group, the rebleeding rate was 4.26%, bad prognosis rate was 25.53%, and mortality rate was 6.38%, which were not statistically significantly different compared with those in the conventional group (P > 0.05). The rebleeding rate in the Blend Sign II group was 47.06%, bad prognosis rate was 82.35%, and mortality rate was 47.06%, which were statistically significantly different compared with those in the conventional and Blend Sign I groups (P < 0.05). Conclusions For the patients associated with Blend Sign I, the prognosis was equivalent to that in the conventional group, with no statistically significant difference. The rebleeding, bad prognosis, and mortality rates were higher in the Blend Sign II group than in the conventional group and deserved more attention.
Agrawal, Yuvraj; Desai, Aravind; Mehta, Jaysheel
2011-12-01
We aimed to quantify the severity of hallux valgus based on the lateral sesamoid position and to establish the correlation of our simple assessment method with conventional radiological assessments. We reviewed one hundred and twenty-two dorso-plantar weight-bearing radiographs of feet. The intermetatarsal and hallux valgus angles were measured by the conventional methods, and the position of the lateral sesamoid in relation to the first metatarsal neck was assessed by our new and simple method. Significant correlation was noted between the intermetatarsal angle and lateral sesamoid position (Rho 0.74, p < 0.0001), and between lateral sesamoid position and hallux valgus angle (Rho 0.56, p < 0.0001). Similar trends were noted in the different grades of severity of hallux valgus across all three methods of assessment. Our method of assessing hallux valgus deformity based on the lateral sesamoid position is simple, less time consuming, and correlates significantly with the established conventional radiological measurements. Copyright © 2011 European Foot and Ankle Society. Published by Elsevier Ltd. All rights reserved.
Conventionalism and Methodological Standards in Contending with Skepticism about Uncertainty
NASA Astrophysics Data System (ADS)
Brumble, K. C.
2012-12-01
What it means to measure and interpret confidence and uncertainty in a result is often particular to a specific scientific community and its methodology of verification. Additionally, methodology in the sciences varies greatly across disciplines and scientific communities. Understanding the accuracy of predictions of a particular science thus depends largely upon having an intimate working knowledge of the methods, standards, and conventions utilized and underpinning discoveries in that scientific field. Thus, valid criticism of scientific predictions and discoveries must be conducted by those who are literate in the field in question: they must have intimate working knowledge of the methods of the particular community and of the particular research under question. The interpretation and acceptance of uncertainty is one such shared, community-based convention. In the philosophy of science, this methodological and community-based way of understanding scientific work is referred to as conventionalism. By applying the conventionalism of historian and philosopher of science Thomas Kuhn to recent attacks upon methods of multi-proxy mean temperature reconstructions, I hope to illuminate how climate skeptics and their adherents fail to appreciate the need for community-based fluency in the methodological standards for understanding uncertainty shared by the wider climate science community. Further, I will flesh out a picture of climate science community standards of evidence and statistical argument following the work of philosopher of science Helen Longino. I will describe how failure to appreciate the conventions of professionalism and standards of evidence accepted in the climate science community results in the application of naïve falsification criteria. Appeal to naïve falsification in turn has allowed scientists outside the standards and conventions of the mainstream climate science community to consider themselves and to be judged by climate skeptics as valid critics of particular statistical reconstructions with naïve and misapplied methodological criticism. Examples will include the skeptical responses to multi-proxy mean temperature reconstructions and congressional hearings criticizing the work of Michael Mann et al.'s Hockey Stick.
Comparison between Bactec Peds Plus F Broth and Conventional Medium for Vitreous Culture.
Tabatabaei, Seyed Ali; Tabatabaei, Seyed Mehdi; Soleimani, Mohammad; Hejrati, Bahman; Mirshahi, Ahmad; Khadabandeh, Alireza; Ahmadraji, Aliasghar; Valipour, Niloofar
2018-05-10
To evaluate the yield of Bactec Peds Plus F broth for vitreous sample culture in cases of infectious endophthalmitis in comparison with conventional media. Consecutive cases of clinically suspected endophthalmitis were prospectively enrolled in this study. Cultures of the vitreous sample were performed in both Bactec Peds Plus F broth and conventional media. Forty eyes of 40 patients who were clinically suspected of infectious endophthalmitis with different etiologies were enrolled in this study. The positive culture yield was 50% and 35% in Bactec Peds Plus F broth and conventional media, respectively (p = 0.07). The result of the Bactec group was not significantly different among patients who had a history of intravitreal antibiotic injection (p > 0.05). However, results of the conventional method were significantly negative in the previous intravitreal antibiotic injection group (p = 0.02). There was no correlation between the methods of vitreous sampling in either culture method. Although the difference between the two culture methods was not statistically significant in this study, Bactec Peds Plus F broth showed a higher positive culture yield in patients with a history of intravitreal antibiotic injection.
Antoszewska-Smith, Joanna; Sarul, Michał; Łyczek, Jan; Konopka, Tomasz; Kawala, Beata
2017-03-01
The aim of this systematic review was to compare the effectiveness of orthodontic miniscrew implants (temporary intraoral skeletal anchorage devices, TISADs) in anchorage reinforcement during en-masse retraction with that of conventional methods of anchorage. A search of PubMed, Embase, Cochrane Central Register of Controlled Trials, and Web of Science was performed. The keywords were orthodontic, mini-implants, miniscrews, miniplates, and temporary anchorage device. Relevant articles were assessed for quality according to Cochrane guidelines and the data extracted for statistical analysis. A meta-analysis of raw mean differences concerning anchorage loss, tipping of molars, retraction of incisors, tipping of incisors, and treatment duration was carried out. Initially, we retrieved 10,038 articles. The selection process finally resulted in 14 articles including 616 patients (451 female, 165 male) for detailed analysis. Quality of the included studies was assessed as moderate. Meta-analysis showed that use of TISADs facilitates better anchorage reinforcement compared with conventional methods. On average, TISADs enabled 1.86 mm more anchorage preservation than did conventional methods (P <0.001). The results of the meta-analysis showed that TISADs are more effective than conventional methods of anchorage reinforcement. The average difference of about 2 mm seems not only statistically but also clinically significant. However, the results should be interpreted with caution because of the moderate quality of the included studies. More high-quality studies on this issue are necessary to enable drawing more reliable conclusions. Copyright © 2016 American Association of Orthodontists. Published by Elsevier Inc. All rights reserved.
Velasco, Valeria; Sherwood, Julie S.; Rojas-García, Pedro P.; Logue, Catherine M.
2014-01-01
The aim of this study was to compare a real-time PCR assay, with a conventional culture/PCR method, to detect S. aureus, mecA and Panton-Valentine Leukocidin (PVL) genes in animals and retail meat, using a two-step selective enrichment protocol. A total of 234 samples were examined (77 animal nasal swabs, 112 retail raw meat, and 45 deli meat). The multiplex real-time PCR targeted the genes: nuc (identification of S. aureus), mecA (associated with methicillin resistance) and PVL (virulence factor), and the primary and secondary enrichment samples were assessed. The conventional culture/PCR method included the two-step selective enrichment, selective plating, biochemical testing, and multiplex PCR for confirmation. The conventional culture/PCR method recovered 95/234 positive S. aureus samples. Application of real-time PCR on samples following primary and secondary enrichment detected S. aureus in 111/234 and 120/234 samples respectively. For detection of S. aureus, the kappa statistic was 0.68–0.88 (from substantial to almost perfect agreement) and 0.29–0.77 (from fair to substantial agreement) for primary and secondary enrichments, using real-time PCR. For detection of mecA gene, the kappa statistic was 0–0.49 (from no agreement beyond that expected by chance to moderate agreement) for primary and secondary enrichment samples. Two pork samples were mecA gene positive by all methods. The real-time PCR assay detected the mecA gene in samples that were negative for S. aureus, but positive for Staphylococcus spp. The PVL gene was not detected in any sample by the conventional culture/PCR method or the real-time PCR assay. Among S. aureus isolated by conventional culture/PCR method, the sequence type ST398, and multi-drug resistant strains were found in animals and raw meat samples. The real-time PCR assay may be recommended as a rapid method for detection of S. aureus and the mecA gene, with further confirmation of methicillin-resistant S. aureus (MRSA) using the standard culture method. PMID:24849624
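Agreement statistics like those reported here are straightforward to compute. The sketch below uses scikit-learn's Cohen's kappa on invented per-sample detection labels (1 = S. aureus detected, 0 = not detected) to show the calculation behind the "substantial to almost perfect agreement" interpretation.

```python
from sklearn.metrics import cohen_kappa_score

# Illustrative labels only, one entry per sample; not the study data
culture_pcr   = [1, 1, 0, 0, 1, 0, 1, 1, 0, 0, 1, 0]
real_time_pcr = [1, 1, 0, 1, 1, 0, 1, 1, 0, 1, 1, 0]

kappa = cohen_kappa_score(culture_pcr, real_time_pcr)
print(f"kappa = {kappa:.2f}")  # 0.61-0.80 ~ substantial, >0.80 ~ almost perfect
```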
Liu, Mei-Juan; Men, Yan-Ming; Zhang, Yong-Lin; Zhang, Yu-Xi; Liu, Hao
2017-01-01
We aimed to evaluate the diagnostic values of conventional ultrasound (US), ultrasound contrast (UC) and ultrasound elastography (UE) in distinguishing the benign and malignant thyroid nodules. A total of 100 patients with thyroid nodules receiving operative treatment were selected; they underwent the conventional US, UE and UC examinations before operation, respectively. The nodules received pathological examination after operation to distinguish benign from malignant lesions. The sensitivity, specificity and diagnostic accordance rate of each diagnostic method was evaluated by receiver operating characteristic (ROC) curve, and the area under the curve (AUC) of ROC was calculated. The manifestations of malignant thyroid nodules in conventional US examination were mostly the hypoecho, heterogeneous echo, irregular shape, unclear boundary, aspect ratio <1, microcalcification and irregular peripheral echo halo, and there were statistically significant differences compared with the benign nodules (P<0.05). UE showed that the differences between benign and malignant nodules in 2, 3 and 4 points were statistically significant (P<0.05). The manifestations of malignant nodules in UC were mostly the irregular shape, obscure boundary, no obvious enhancement, heterogeneous enhancement and visible perfusion defects, and there were statistically significant differences compared with the benign nodules (P<0.05). ROC curve showed that both sensitivity and specificity of UE and UC were superior to those of conventional US. AUC was the largest (AUC = 0.908) and the diagnostic value was the highest in the conventional US combined with UE and UC. Conventional US combined with elastography and UC can significantly improve the sensitivity, specificity and accuracy of diagnosis of benign and malignant thyroid nodules. PMID:28693244
Effect of different mixing methods on the bacterial microleakage of calcium-enriched mixture cement.
Shahi, Shahriar; Jeddi Khajeh, Soniya; Rahimi, Saeed; Yavari, Hamid R; Jafari, Farnaz; Samiei, Mohammad; Ghasemi, Negin; Milani, Amin S
2016-10-01
Calcium-enriched mixture (CEM) cement is used in the field of endodontics. It is similar to mineral trioxide aggregate in its main ingredients. The present study investigated the effect of different mixing methods on the bacterial microleakage of CEM cement. A total of 55 single-rooted human permanent teeth were decoronated so that 14-mm-long samples were obtained and obturated with AH26 sealer and gutta-percha using the lateral condensation technique. Three millimeters of the root end were cut off, and the samples were randomly divided into 3 groups of 15 each (3 mixing methods: amalgamator, ultrasonic and conventional) and 2 negative and positive control groups (each containing 5 samples). A BHI (brain-heart infusion agar) suspension containing Enterococcus faecalis was used for bacterial leakage assessment. Statistical analysis was carried out using descriptive statistics, Kaplan-Meier survival analysis with censored data and the log rank test. Statistical significance was set at P<0.05. The survival means for the conventional, amalgamator and ultrasonic methods were 62.13±12.44, 68.87±12.79 and 77.53±12.52 days, respectively. The log rank test showed no significant differences between the groups. Based on the results of the present study, it can be concluded that the different mixing methods had no significant effect on the bacterial microleakage of CEM cement.
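The survival analysis used here (Kaplan-Meier estimation with censored data plus a log-rank test) can be reproduced with the lifelines package. The sketch below uses simulated leakage times loosely matched to the reported group means, not the study's raw data.

```python
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(5)
# Days until bacterial leakage per sample; censored at a 90-day study end
t_conv  = np.minimum(rng.exponential(62, 15), 90)
t_ultra = np.minimum(rng.exponential(78, 15), 90)
e_conv  = (t_conv  < 90).astype(int)   # 1 = leaked, 0 = censored
e_ultra = (t_ultra < 90).astype(int)

km = KaplanMeierFitter().fit(t_conv, e_conv, label="conventional")
print(f"median survival (conventional): {km.median_survival_time_:.0f} days")

result = logrank_test(t_conv, t_ultra,
                      event_observed_A=e_conv, event_observed_B=e_ultra)
print(f"log-rank p = {result.p_value:.3f}")
```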
Blood culture bottles are superior to conventional media for vitreous culture.
Thariya, Patsuda; Yospaiboon, Yosanan; Sinawat, Suthasinee; Sanguansak, Thuss; Bhoomibunchoo, Chavakij; Laovirojjanakul, Wipada
2016-08-01
To compare blood culture bottles and conventional media for vitreous culture in patients with clinically suspected infectious endophthalmitis. Retrospective comparative study at KKU Eye Center, Khon Kaen University. A total of 342 patients with clinically suspected infectious endophthalmitis participated in the study. The vitreous specimens were inoculated both into blood culture bottles and onto conventional culture media (blood agar, MacConkey agar, chocolate agar, Sabouraud dextrose agar and thioglycolate broth). The outcome measure was the number of positive culture yields with each method. Positive cultures by either method were found in 151 eyes (49.5%). There were 136 of 151 eyes (90.1%) with positive cultures in blood culture bottles, whereas 99 of 151 eyes (65.6%) yielded positive cultures in conventional media. This difference was statistically significant (P < 0.00001), with an odds ratio of 3.47 (95% confidence interval 1.92, 6.63). A combination of blood culture bottles and conventional media improved the yield. Blood culture bottles are superior to conventional media for vitreous culture in clinically suspected infectious endophthalmitis. Vitreous culture using blood culture bottles should be recommended as the primary method for microbiological diagnosis. A combination of both methods further improves the positive culture yield. © 2016 Royal Australian and New Zealand College of Ophthalmologists.
Performance of digital RGB reflectance color extraction for plaque lesion
NASA Astrophysics Data System (ADS)
Hashim, Hadzli; Taib, Mohd Nasir; Jailani, Rozita; Sulaiman, Saadiah; Baba, Roshidah
2005-01-01
Several clinical psoriasis lesion groups have been studied for digital RGB color feature extraction. Previous work used sample sizes that included all outliers lying beyond given standard deviation distances from the histogram peaks. This paper describes the statistical performance of the RGB model with and without these outliers removed. Plaque lesions are compared with other types of psoriasis. The statistical tests are compared across three sample sizes: the original 90 samples, a first reduction that removes outliers beyond 2 standard deviations (2SD), and a second reduction that removes outliers beyond 1 standard deviation (1SD). Quantification of the image data through the direct and differential forms of the conventional reflectance method is considered. Performance is assessed from error plots with 95% confidence intervals and from the inferential t-tests applied. The statistical test outcomes show that the B component of the conventional differential method can be used to distinctively classify plaque from the other psoriasis groups, consistent with the error-plot findings, with an improvement in p-value to greater than 0.5.
Jin, Hong-Ying; Li, Da-Wei; Zhang, Na; Gu, Zhen; Long, Yi-Tao
2015-06-10
We demonstrate a practical method to analyze carbohydrate-protein interactions on single plasmonic nanoparticles by conventional dark field microscopy (DFM). The protein concanavalin A (ConA) was immobilized on large gold nanoparticles (AuNPs), and dextran was conjugated to small AuNPs. As the interaction between ConA and dextran coupled the two kinds of gold nanoparticles, and thereby their plasmonic oscillations, apparent color changes (from green to yellow) of single AuNPs were observed through DFM. The color information was then transformed into a statistical peak-wavelength distribution in less than 1 min by a self-developed statistical program (nanoparticleAnalysis). In addition, the interaction between ConA and dextran was verified by biospecific recognition. This high-throughput, real-time approach is a convenient method to analyze carbohydrate-protein interactions efficiently at the single-nanoparticle level.
Bautista, Ami C; Zhou, Lei; Jawa, Vibha
2013-10-01
Immunogenicity support during nonclinical biotherapeutic development can be resource intensive if supported by conventional methodologies. A universal indirect species-specific immunoassay can eliminate the need for biotherapeutic-specific anti-drug antibody immunoassays without compromising quality. By implementing the R's of sustainability (reduce, reuse, rethink), conservation of resources and greener laboratory practices were achieved in this study. Statistical analysis across four biotherapeutics supported identification of consistent product performance standards (cut points, sensitivity and reference limits) and a streamlined universal anti-drug antibody immunoassay method implementation strategy. We propose an efficient, fit-for-purpose, scientifically and statistically supported nonclinical immunogenicity assessment strategy. Utilization of a universal method and streamlined validation, while retaining comparability to conventional immunoassays and meeting the industry recommended standards, provides environmental credits in the scientific laboratory. Collectively, individual reductions in critical material consumption, energy usage, waste and non-environment friendly consumables, such as plastic and paper, support a greener laboratory environment.
Manual tracing versus smartphone application (app) tracing: a comparative study.
Sayar, Gülşilay; Kilinc, Delal Dara
2017-11-01
This study aimed to compare the results of conventional manual cephalometric tracing with those acquired by smartphone application (app) tracing. The cephalometric radiographs of 55 patients (25 females and 30 males) were traced by the manual and app methods and subsequently examined with Steiner's analysis. Five skeletal measurements, five dental measurements and two soft tissue measurements were made based on 21 landmarks. The time required by each method was also compared. SNA (Sella, Nasion, A point angle) and SNB (Sella, Nasion, B point angle) values for the manual method were statistically lower (p < .001) than those for the app method. The ANB value for the manual method was statistically lower than that for the app method. L1-NB (°) and upper lip protrusion values for the manual method were statistically higher than those for the app method. Go-GN/SN, U1-NA (°) and U1-NA (mm) values for the manual method were statistically lower than those for the app method. No differences between the two methods were found in the L1-NB (mm), occlusal plane to SN, interincisal angle or lower lip protrusion values. Although statistically significant differences were found between the two methods, cephalometric tracing proceeded faster with the app method than with the manual method.
Strappini, Francesca; Gilboa, Elad; Pitzalis, Sabrina; Kay, Kendrick; McAvoy, Mark; Nehorai, Arye; Snyder, Abraham Z
2017-03-01
Temporal and spatial filtering of fMRI data is often used to improve statistical power. However, conventional methods, such as smoothing with fixed-width Gaussian filters, remove fine-scale structure in the data, necessitating a tradeoff between sensitivity and specificity. Specifically, smoothing may increase sensitivity (reduce noise and increase statistical power) but at the cost of specificity, in that fine-scale structure in neural activity patterns is lost. Here, we propose an alternative smoothing method based on Gaussian process (GP) regression for single-subject fMRI experiments. This method adapts the level of smoothing on a voxel-by-voxel basis according to the characteristics of the local neural activity patterns. GP-based fMRI analysis has heretofore been impractical owing to computational demands. Here, we demonstrate a new implementation of GP that makes it possible to handle the massive data dimensionality of the typical fMRI experiment. We demonstrate how GP can be used as a drop-in replacement for conventional preprocessing steps for temporal and spatial smoothing in a standard fMRI pipeline. We present simulated and experimental results that show increased sensitivity and specificity compared to conventional smoothing strategies. Hum Brain Mapp 38:1438-1459, 2017. © 2016 Wiley Periodicals, Inc.
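As a drop-in illustration of the idea, the sketch below smooths a noisy voxel time series with GP regression in scikit-learn, where the smoothing scale and noise level are learned from the data rather than fixed a priori. This toy omits the paper's voxelwise spatial adaptation and fast large-scale solver.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(6)
t = np.arange(200, dtype=float)[:, None]     # scan times (TRs)
signal = np.sin(t[:, 0] / 15.0)              # toy "neural" time course
y = signal + 0.5 * rng.standard_normal(200)  # noisy voxel time series

# Kernel hyperparameters (length scale ~ smoothing width, noise level)
# are optimized on the data itself, unlike a fixed-width Gaussian filter
kernel = 1.0 * RBF(length_scale=5.0) + WhiteKernel(noise_level=0.25)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(t, y)
y_smooth = gp.predict(t)                     # adaptively smoothed series
```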
ERIC Educational Resources Information Center
Baker, Bruce D.; Richards, Craig E.
1999-01-01
Applies neural network methods for forecasting 1991-95 per-pupil expenditures in U.S. public elementary and secondary schools. Forecasting models included the National Center for Education Statistics' multivariate regression model and three neural architectures. Regarding prediction accuracy, neural network results were comparable or superior to…
Saadati, Farzaneh; Ahmad Tarmizi, Rohani; Mohd Ayub, Ahmad Fauzi; Abu Bakar, Kamariah
2015-01-01
Students' ability to use statistics, which is mathematical in nature, is one of the concerns of educators, and embedding the pedagogical characteristics of learning within an e-learning system is 'value added' because it facilitates the conventional method of learning mathematics. Many researchers emphasize the effectiveness of cognitive apprenticeship in learning and problem solving in the workplace. In a cognitive apprenticeship learning model, skills are learned within a community of practitioners through observation of modelling and then practice plus coaching. This study utilized an internet-based Cognitive Apprenticeship Model (i-CAM) in three phases and evaluated its effectiveness for improving statistics problem-solving performance among postgraduate students. The results showed that, compared with the conventional mathematics learning model, i-CAM significantly promoted students' problem-solving performance at the end of each phase. In addition, the combined differences in students' test scores were statistically significant after controlling for pre-test scores. The findings reported in this paper confirm the considerable value of i-CAM in improving statistics learning for non-specialized postgraduate students.
[Evaluation of Wits appraisal with superimposition method].
Xu, T; Ahn, J; Baumrind, S
1999-07-01
To compare the conventional Wits appraisal with a superimposed Wits appraisal in evaluating the change in sagittal jaw relationship between pre- and post-orthodontic treatment. The sample consisted of pre- and post-treatment lateral head films from 48 cases. Computerized digitizing was used to obtain the cephalometric landmarks and to measure the conventional Wits value, the superimposed Wits value and the ANB angle. Correlation analysis among these three measures was performed with the SAS statistical package. The change in ANB angle correlates more strongly with the change in superimposed Wits than with that in conventional Wits; the r-value is as high as 0.849 (P < 0.001). The superimposed Wits appraisal reflects the change in sagittal jaw relationship more objectively than the conventional one.
Comparison of Time-to-First Event and Recurrent Event Methods in Randomized Clinical Trials.
Claggett, Brian; Pocock, Stuart; Wei, L J; Pfeffer, Marc A; McMurray, John J V; Solomon, Scott D
2018-03-27
Background: Most Phase-3 trials feature time-to-first event endpoints for their primary and/or secondary analyses. In chronic diseases where a clinical event can occur more than once, recurrent-event methods have been proposed to more fully capture disease burden and have been assumed to improve statistical precision and power compared to conventional "time-to-first" methods. Methods: To better characterize factors that influence statistical properties of recurrent-events and time-to-first methods in the evaluation of randomized therapy, we repeatedly simulated trials with 1:1 randomization of 4000 patients to active vs control therapy, with true patient-level risk reduction of 20% (i.e. RR=0.80). For patients who discontinued active therapy after a first event, we assumed their risk reverted subsequently to their original placebo-level risk. Through simulation, we varied (a) the degree of between-patient heterogeneity of risk and (b) the extent of treatment discontinuation. Findings were compared with those from actual randomized clinical trials. Results: As the degree of between-patient heterogeneity of risk was increased, both time-to-first and recurrent-events methods lost statistical power to detect a true risk reduction and confidence intervals widened. The recurrent-events analyses continued to estimate the true RR=0.80 as heterogeneity increased, while the Cox model produced estimates that were attenuated. The power of recurrent-events methods declined as the rate of study drug discontinuation post-event increased. Recurrent-events methods provided greater power than time-to-first methods in scenarios where drug discontinuation was ≤30% following a first event, lesser power with drug discontinuation rates of ≥60%, and comparable power otherwise. We confirmed in several actual trials in chronic heart failure that treatment effect estimates were attenuated when estimated via the Cox model and that increased statistical power from recurrent-events methods was most pronounced in trials with lower treatment discontinuation rates. Conclusions: We find that the statistical power of both recurrent-events and time-to-first methods is reduced by increasing heterogeneity of patient risk, a parameter not included in conventional power and sample size formulas. Data from real clinical trials are consistent with simulation studies, confirming that the greatest statistical gains from use of recurrent-events methods occur in the presence of high patient heterogeneity and low rates of study drug discontinuation.
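A stripped-down version of such a simulation is easy to set up. The sketch below omits the paper's treatment-discontinuation mechanism: it draws gamma-distributed patient frailties, then contrasts the rate ratio from a recurrent-event Poisson model with the hazard ratio from a time-to-first Cox model. With strong heterogeneity the Cox estimate is visibly attenuated toward 1 relative to the true 0.80, while the rate ratio is not.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from lifelines import CoxPHFitter

rng = np.random.default_rng(7)
n, follow_up, rr_true = 4000, 3.0, 0.80
treat = rng.integers(0, 2, n)

# Between-patient heterogeneity: gamma frailty with mean 1, variance 2
frailty = rng.gamma(shape=0.5, scale=2.0, size=n)
rate = 0.3 * frailty * np.where(treat == 1, rr_true, 1.0)

# Recurrent events: Poisson counts over follow-up, analyzed by Poisson GLM
counts = rng.poisson(rate * follow_up)
X = sm.add_constant(treat.astype(float))
pois = sm.GLM(counts, X, family=sm.families.Poisson(),
              exposure=np.full(n, follow_up)).fit()
print("recurrent-event rate ratio:", np.exp(pois.params[1]))

# Time-to-first: exponential first-event times, censored at follow-up
t_first = rng.exponential(1.0 / rate)
df = pd.DataFrame({"time": np.minimum(t_first, follow_up),
                   "event": (t_first < follow_up).astype(int),
                   "treat": treat})
cox = CoxPHFitter().fit(df, duration_col="time", event_col="event")
print("time-to-first hazard ratio:", cox.hazard_ratios_["treat"])
```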
Adaptation of zirconia crowns created by conventional versus optical impression: in vitro study
Bahrami, Babak; Fossoyeux, Inès; Atash, Ramin
2017-01-01
PURPOSE The aim of this study was to compare the precision of optical impressions (Trios, 3Shape) versus conventional impressions (Imprint IV, 3M-ESPE) with three different margins (shoulder, chamfer, and knife-edge) on Frasaco teeth. MATERIALS AND METHODS The sample comprised 60 zirconia half-crowns, divided into six groups according to the type of impression and margin. Scanning electron microscopy enabled us to analyze the gap between the zirconia crowns and the Frasaco teeth, using ImageJ software, based on eight reproducible and standardized measuring points. RESULTS No statistically significant difference was found between conventional impressions and optical impressions, except for two of the eight points. A statistically significant difference was observed between the three margin types; the chamfer and knife-edge finishing lines appeared to offer better adaptation than the shoulder margin. CONCLUSION Zirconia crowns created from optical impressions and those created from conventional impressions present similar adaptation. While offering identical results, the former have many advantages. In view of our findings, we believe the chamfer margin should be favored. PMID:28680553
Time Series Expression Analyses Using RNA-seq: A Statistical Approach
Oh, Sunghee; Song, Seongho; Grabowski, Gregory; Zhao, Hongyu; Noonan, James P.
2013-01-01
RNA-seq is becoming the de facto standard approach for transcriptome analysis with ever-reducing cost. It has considerable advantages over conventional technologies (microarrays) because it allows for direct identification and quantification of transcripts. Many time series RNA-seq datasets have been collected to study the dynamic regulations of transcripts. However, statistically rigorous and computationally efficient methods are needed to explore the time-dependent changes of gene expression in biological systems. These methods should explicitly account for the dependencies of expression patterns across time points. Here, we discuss several methods that can be applied to model timecourse RNA-seq data, including statistical evolutionary trajectory index (SETI), autoregressive time-lagged regression (AR(1)), and hidden Markov model (HMM) approaches. We use three real datasets and simulation studies to demonstrate the utility of these dynamic methods in temporal analysis. PMID:23586021
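As one concrete instance, the AR(1) approach mentioned above amounts to regressing each observation on its predecessor, so the fit explicitly models dependence between adjacent time points. A minimal sketch with statsmodels follows, using an invented single-gene series rather than any of the datasets discussed.

```python
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(8)
# Toy expression time course for one gene (e.g. log counts at 12 time points)
expr = np.cumsum(rng.normal(0.3, 1.0, 12))   # drifting, time-dependent series

# AR(1): regress each time point on the previous one
res = AutoReg(expr, lags=1).fit()
print("AR(1) coefficient:", res.params[1], " p-value:", res.pvalues[1])
```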
Benic, Goran I; Mühlemann, Sven; Fehmer, Vincent; Hämmerle, Christoph H F; Sailer, Irena
2016-11-01
Trials comparing the overall performance of fully digital and conventional workflows in reconstructive dentistry are needed. The purpose of the first part of this randomized controlled clinical trial was to determine whether optical impressions produce different results from conventional impressions with respect to time efficiency and patient and operator perceptions of the clinical workflow. Three digital impressions and 1 conventional impression were made in each of 10 participants according to a randomly generated sequence. The digital systems were Lava COS, iTero, and Cerec Bluecam. The conventional impression was made with the closed-mouth technique and polyvinyl siloxane material. The time needed for powdering, impressions, and interocclusal record was recorded. Patient and clinician perceptions of the procedures were rated by means of visual analog scales. The paired t test with Bonferroni correction was applied to detect differences (α=.05/6=.0083). The mean total working time ±standard deviation amounted to 260 ±66 seconds for the conventional impression, 493 ±193 seconds for Lava, 372 ±126 seconds for iTero, and 357 ±55 seconds for Cerec. The total working time for the conventional impression was significantly lower than that for Lava and Cerec. With regard to the working time without powdering, the differences between the methods were not statistically significant. The patient rating (very uncomfortable=0; comfortable=100) measured 61 ±34 for conventional impression, 71 ±18 for Lava, 66 ±20 for iTero, and 48 ±18 for Cerec. The differences were not statistically significant. The clinician rating (simple=0; very difficult=100) was 13 ±13 for the conventional impression, 54 ±27 for Lava, 22 ±11 for iTero, and 36 ±23 for Cerec. The differences between the conventional impression and Lava and between iTero and Lava were statistically significant. The conventional impression was more time-effective than the digital impressions. In terms of patient comfort, no differences were found between the conventional and the digital techniques. With respect to the clinician perception of difficulty, the conventional impression and the digital impression with iTero revealed more favorable outcomes than the digital impression with Lava. Copyright © 2016 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
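The analysis described (a paired t test with the Bonferroni-corrected threshold α=.05/6=.0083) is a two-liner in scipy. The sketch below uses invented per-participant timings, not the study data.

```python
import numpy as np
from scipy import stats

# Total working time in seconds per participant (n = 10), toy numbers
conventional = np.array([255, 310, 190, 240, 280, 265, 230, 300, 270, 260])
lava         = np.array([480, 530, 420, 510, 470, 490, 505, 460, 515, 500])

t_stat, p_raw = stats.ttest_rel(conventional, lava)   # paired t test
alpha_adj = 0.05 / 6        # Bonferroni correction for 6 pairwise comparisons
print(f"p = {p_raw:.4f}, significant at alpha = {alpha_adj:.4f}: {p_raw < alpha_adj}")
```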
Nilsson, Markus; Szczepankiewicz, Filip; van Westen, Danielle; Hansson, Oskar
2015-01-01
Conventional motion and eddy-current correction, where each diffusion-weighted volume is registered to a non diffusion-weighted reference, suffers from poor accuracy for high b-value data. An alternative approach is to extrapolate reference volumes from low b-value data. We aim to compare the performance of conventional and extrapolation-based correction of diffusional kurtosis imaging (DKI) data, and to demonstrate the impact of the correction approach on group comparison studies. DKI was performed in patients with Parkinson's disease dementia (PDD), and healthy age-matched controls, using b-values of up to 2750 s/mm2. The accuracy of conventional and extrapolation-based correction methods was investigated. Parameters from DTI and DKI were compared between patients and controls in the cingulum and the anterior thalamic projection tract. Conventional correction resulted in systematic registration errors for high b-value data. The extrapolation-based methods did not exhibit such errors, yielding more accurate tractography and up to 50% lower standard deviation in DKI metrics. Statistically significant differences were found between patients and controls when using the extrapolation-based motion correction that were not detected when using the conventional method. We recommend that conventional motion and eddy-current correction should be abandoned for high b-value data in favour of more accurate methods using extrapolation-based references.
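The extrapolation idea can be sketched compactly: fit a monoexponential signal decay per voxel on the low b-value data and predict the signal at the high b-value, yielding a synthetic registration reference. The code below is a bare-bones illustration under that monoexponential assumption; the actual correction pipeline, including the registration step itself, is omitted.

```python
import numpy as np

def extrapolate_reference(signals, b_low, b_target):
    """Predict a high-b reference from low-b diffusion-weighted signals.

    signals: array (n_low_b, n_voxels); fits ln S(b) = ln S0 - b*D per
    voxel and extrapolates to b_target.
    """
    logS = np.log(np.maximum(signals, 1e-6))
    coeffs = np.polyfit(b_low, logS, deg=1)   # per-voxel slope (-D) and ln S0
    return np.exp(coeffs[1] + coeffs[0] * b_target)

# Toy data: 100 voxels with S0 = 1000, D = 1e-3 mm^2/s, plus noise
b_low = np.array([0.0, 250.0, 500.0])         # s/mm^2
rng = np.random.default_rng(9)
signals = 1000.0 * np.exp(-np.outer(b_low, np.full(100, 1e-3)))
signals += rng.normal(0, 5, signals.shape)
ref_2750 = extrapolate_reference(signals, b_low, 2750.0)
```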
Mason, H. E.; Uribe, E. C.; Shusterman, J. A.
2018-01-01
Tensor-rank decomposition methods have been applied to variable contact time 29Si{1H} CP/CPMG NMR data sets to extract NMR dynamics information and dramatically decrease conventional NMR acquisition times.
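For orientation, a tensor-rank (CP/PARAFAC) decomposition of a three-way data cube is a single call in the tensorly package. The sketch below factorizes an invented stand-in for a (contact time x echo x spectrum) array and is not tied to the authors' processing.

```python
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac

rng = np.random.default_rng(10)
# Toy stand-in for a (contact time x echo train x chemical shift) data cube
data = tl.tensor(rng.random((8, 16, 64)))

# Rank-2 CP decomposition: the cube is approximated as a sum of two
# outer products, with one factor vector per mode
weights, factors = parafac(data, rank=2)
for mode, f in enumerate(factors):
    print("mode", mode, "factor shape:", f.shape)
```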
Lee, Jung-Ju; Lee, Sang Kun; Choi, Jang Wuk; Kim, Dong-Wook; Park, Kyung Il; Kim, Bom Sahn; Kang, Hyejin; Lee, Dong Soo; Lee, Seo-Young; Kim, Sung Hun; Chung, Chun Kee; Nam, Hyeon Woo; Kim, Kwang Ki
2009-12-01
Ictal single-photon emission computed tomography (SPECT) is a valuable method for localizing the ictal onset zone in the presurgical evaluation of patients with intractable epilepsy. Conventional methods used to localize the ictal onset zone suffer from the time lag between seizure onset and tracer injection. To evaluate the clinical usefulness of a method that we developed, which involves an attachable automated injector (AAI), in reducing this time lag and improving localization of the zone of seizure onset, patients admitted to the epilepsy monitoring unit (EMU) between January 1, 2003, and June 30, 2008, were included. The ictal onset zone was defined by comprehensive review of medical records, magnetic resonance imaging (MRI), data from video electroencephalography (EEG) monitoring, and invasive EEG monitoring if available. We comprehensively evaluated the time lag to injection and the image patterns of ictal SPECT using traditional visual analysis, statistical parametric mapping-assisted analysis, and subtraction ictal SPECT coregistered to MRI-assisted analysis. Image patterns were classified as localizing, lateralizing, and nonlateralizing. A total of 99 patients were included: 48 in the conventional group and 51 in the AAI group. The mean (SD) delay from seizure onset to injection was 12.4+/-12.0 s in the group injected by the AAI method and 40.4+/-26.3 s in the group injected by the conventional method (P<0.001). The mean delay from seizure detection to injection was 3.2+/-2.5 s with the AAI method and 21.4+/-9.7 s with the conventional method (P<0.001). The AAI method was superior to the conventional method in localizing the area of seizure onset (36 of 51 with the AAI method vs. 21 of 48 with the conventional method, P=0.009), especially in non-temporal lobe epilepsy (non-TLE) patients (17 of 27 with the AAI method vs. 3 of 13 with the conventional method, P=0.041), and in lateralizing the seizure onset hemisphere (47 of 51 with the AAI method vs. 33 of 48 with the conventional method, P=0.004). The AAI method was superior to the conventional method in reducing the time lag of tracer injection and in localizing and lateralizing the ictal onset zone, especially in patients with non-TLE.
Shokry, Mohamed; Aboelsaad, Nayer
2016-01-01
The purpose of this study was to test the effect of surgical removal of impacted mandibular third molars using piezosurgery versus the conventional surgical technique on postoperative sequelae and bone healing. Material and Methods. This study was carried out as a randomized controlled clinical trial with a split-mouth design. Twenty patients with bilateral mandibular third molar mesioangular impaction (class II, position B) indicated for surgical extraction were treated randomly using either the piezosurgery or the conventional bur technique on each site. Duration of the procedure, postoperative edema, trismus, pain, healing, and bone density and quantity were evaluated up to 6 months postoperatively. Results. Test and control sites were compared using the paired t-test. Pain and swelling were significantly reduced at test sites, whereas procedure time was significantly longer at test sites. For bone quantity and quality, statistically significant differences were found in favor of the test sites. Conclusion. The piezosurgery technique improves patients' quality of life in the form of decreased postoperative pain, trismus, and swelling. Furthermore, it enhances bone quality within the extraction socket and bone quantity along the distal aspect of the mandibular second molar. PMID:27597866
A feature refinement approach for statistical interior CT reconstruction
NASA Astrophysics Data System (ADS)
Hu, Zhanli; Zhang, Yunwan; Liu, Jianbo; Ma, Jianhua; Zheng, Hairong; Liang, Dong
2016-07-01
Interior tomography is clinically desired to reduce the radiation dose rendered to patients. In this work, a new statistical interior tomography approach for computed tomography is proposed. The developed design focuses on taking into account the statistical nature of local projection data and recovering fine structures which are lost in the conventional total-variation (TV)-minimization reconstruction. The proposed method falls within the compressed sensing framework of TV minimization, which only assumes that the interior ROI is piecewise constant or polynomial and does not need any additional prior knowledge. To integrate the statistical distribution property of projection data, the objective function is built under the criteria of penalized weighted least-square (PWLS-TV). In the implementation of the proposed method, the interior projection extrapolation based FBP reconstruction is first used as the initial guess to mitigate truncation artifacts and also provide an extended field-of-view. Moreover, an interior feature refinement step, as an important processing operation, is performed after each iteration of PWLS-TV to recover the desired structure information which is lost during the TV minimization. Here, a feature descriptor is specifically designed and employed to distinguish structure from noise and noise-like artifacts. A modified steepest descent algorithm is adopted to minimize the associated objective function. The proposed method is applied to both digital phantom and in vivo Micro-CT datasets, and compared to FBP, ART-TV and PWLS-TV. The reconstruction results demonstrate that the proposed method performs better than other conventional methods in suppressing noise, reducing truncated and streak artifacts, and preserving features. The proposed approach demonstrates its potential usefulness for feature preservation of interior tomography under truncated projection measurements.
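The backbone of the method, steepest descent on a penalized weighted least-squares objective with a TV penalty, can be sketched in NumPy. The toy below assumes an identity system matrix (i.e. weighted denoising rather than tomographic reconstruction) and omits the projection extrapolation and feature-refinement steps.

```python
import numpy as np

def tv_grad(x, eps=1e-8):
    """Gradient of a smoothed isotropic total-variation penalty on a 2D image."""
    dx = np.diff(x, axis=0, append=x[-1:, :])
    dy = np.diff(x, axis=1, append=x[:, -1:])
    mag = np.sqrt(dx**2 + dy**2 + eps)
    gx, gy = dx / mag, dy / mag
    # approximate divergence (negative adjoint of the forward differences)
    div = (gx - np.roll(gx, 1, axis=0)) + (gy - np.roll(gy, 1, axis=1))
    return -div

# PWLS-TV objective: (y - A x)^T W (y - A x) + beta * TV(x), here with A = I
rng = np.random.default_rng(11)
x_true = np.zeros((32, 32)); x_true[8:24, 8:24] = 1.0
noise_var = 0.01 + 0.05 * x_true            # statistics vary with the signal
y = x_true + rng.normal(0, np.sqrt(noise_var))
W = 1.0 / noise_var                         # diagonal statistical weights

x, beta, step = y.copy(), 0.05, 0.004
for _ in range(500):                        # steepest descent on the objective
    grad = -2 * W * (y - x) + beta * tv_grad(x)
    x -= step * grad
```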
Reliability of clinical guideline development using mail-only versus in-person expert panels.
Washington, Donna L; Bernstein, Steven J; Kahan, James P; Leape, Lucian L; Kamberg, Caren J; Shekelle, Paul G
2003-12-01
Clinical practice guidelines quickly become outdated. One reason they might not be updated as often as needed is the expense of collecting expert judgment regarding the evidence. The RAND-UCLA Appropriateness Method is one commonly used method for collecting expert opinion. We tested whether a less expensive, mail-only process could substitute for the standard in-person process normally used. We performed a 4-way replication of the appropriateness panel process for coronary revascularization and hysterectomy, conducting 3 panels using the conventional in-person method and 1 panel entirely by mail. All indications were classified as inappropriate or not (to evaluate overuse), and coronary revascularization indications were classified as necessary or not (to evaluate underuse). Kappa statistics were calculated for the comparison in ratings from the 2 methods. Agreement beyond chance between the 2 panel methods ranged from moderate to substantial. The kappa statistic to detect overuse was 0.57 for coronary revascularization and 0.70 for hysterectomy. The kappa statistic to detect coronary revascularization underuse was 0.76. There were no cases in which coronary revascularization was considered inappropriate by 1 method, but necessary or appropriate by the other. Three of 636 (0.5%) hysterectomy cases were categorized as inappropriate by 1 method but appropriate by the other. The reproducibility of the overuse and underuse assessments from the mail-only compared with the conventional in-person conduct of expert panels in this application was similar to the underlying reproducibility of the process. This suggests a potential role for updating guidelines using an expert judgment process conducted entirely through the mail.
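As a concrete illustration of the agreement measure reported above, a short sketch of the kappa statistic for two categorical rating vectors (the example ratings are invented):

```python
# Cohen's kappa: agreement beyond chance between two rating methods
# applied to the same set of indications.
import numpy as np

def cohens_kappa(ratings_a, ratings_b):
    a, b = np.asarray(ratings_a), np.asarray(ratings_b)
    cats = np.union1d(a, b)
    p_obs = np.mean(a == b)
    # chance agreement from each method's marginal category frequencies
    p_chance = sum(np.mean(a == c) * np.mean(b == c) for c in cats)
    return (p_obs - p_chance) / (1 - p_chance)

# 1 = inappropriate, 0 = not inappropriate (toy data)
print(cohens_kappa([1, 0, 1, 1, 0, 0], [1, 0, 1, 0, 0, 0]))  # ~0.67
```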
[Role of BoBs technology in early missed abortion chorionic villi].
Li, Z Y; Liu, X Y; Peng, P; Chen, N; Ou, J; Hao, N; Zhou, J; Bian, X M
2018-05-25
Objective: To investigate the value of bacterial artificial chromosome-on-beads (BoBs) technology in the genetic analysis of early missed abortion chorionic villi. Methods: Early missed abortion chorionic villi were examined with both the conventional karyotyping method and BoBs technology in Peking Union Medical Hospital from July 2014 to March 2015. The BoBs results were compared with conventional karyotyping analysis to evaluate the sensitivity, specificity and accuracy of the new method. Results: (1) A total of 161 samples were tested successfully with BoBs technology, and 131 samples were tested successfully with conventional karyotyping. (2) BoBs results were obtained in (2.7±0.6) days and conventional karyotyping results in (22.5±1.9) days, a statistically significant difference (t=123.315, P<0.01). (3) Of the 161 cases tested with BoBs, 85 (52.8%, 85/161) had chromosomal abnormalities, including 79 cases of numerical chromosome abnormality, 4 cases of chromosome segment deletion and 2 cases of mosaicism. Of the 131 cases tested successfully with conventional karyotyping, 79 (60.3%, 79/131) had chromosomal abnormalities, including 62 cases of numerical chromosome abnormality and 17 cases of other abnormalities; the rate of chromosomal abnormality did not differ significantly between the two methods (P=0.198). (4) With conventional karyotyping results as the gold standard, the accuracy of BoBs for abnormal chromosomes was 82.4% (108/131); considering only the normal karyotypes (52 cases) and numerical chromosome abnormalities (62 cases) identified by conventional karyotyping, the accuracy of BoBs for numerical chromosome abnormality was 94.7% (108/114). Conclusion: BoBs is a rapid, reliable and easily operated method for detecting chromosomal abnormalities in early missed abortion chorionic villi.
Enrichment analysis in high-throughput genomics - accounting for dependency in the NULL.
Gold, David L; Coombes, Kevin R; Wang, Jing; Mallick, Bani
2007-03-01
Translating the overwhelming amount of data generated in high-throughput genomics experiments into biologically meaningful evidence, which may for example point to a series of biomarkers or hint at a relevant pathway, is a matter of great interest in bioinformatics these days. Genes showing similar experimental profiles, it is hypothesized, share biological mechanisms that, if understood, could provide clues to the molecular processes leading to pathological events. It is the topic of further study to learn if or how a priori information about the known genes may serve to explain coexpression. One popular method of knowledge discovery in high-throughput genomics experiments, enrichment analysis (EA), seeks to infer whether an interesting collection of genes is 'enriched' for a particular set of a priori Gene Ontology Consortium (GO) classes. For the purposes of statistical testing, the conventional methods offered in EA software implicitly assume independence between the GO classes. Genes may be annotated for more than one biological classification, and therefore the resulting test statistics of enrichment between GO classes can be highly dependent if the overlapping gene sets are relatively large. There is a need to formally determine if conventional EA results are robust to the independence assumption. We derive the exact null distribution for testing enrichment of GO classes by relaxing the independence assumption using well-known statistical theory. In applications with publicly available data sets, our test results are similar to those of the conventional approach which assumes independence. We argue that the independence assumption is not detrimental.
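For reference, the conventional per-class EA test discussed above is typically a one-sided hypergeometric (Fisher) test; a minimal sketch (the gene counts are invented):

```python
# Conventional enrichment test for a single GO class, treating classes
# as independent: probability of seeing at least the observed overlap.
from scipy.stats import hypergeom

def enrichment_p(n_genome, n_class, n_selected, n_overlap):
    """P(overlap >= observed) when drawing n_selected genes from the genome."""
    return hypergeom.sf(n_overlap - 1, n_genome, n_class, n_selected)

# 20000 genes, a GO class of 150 genes, 300 interesting genes, 12 in the class
print(enrichment_p(20000, 150, 300, 12))
```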
Kaur, Ravinder; Dhakad, Megh Singh; Goyal, Ritu; Haque, Absarul; Mukhopadhyay, Gauranga
2016-01-01
Candida infection is a major cause of morbidity and mortality in immunocompromised patients; accurate and early identification is a prerequisite for effective patient management. The purpose of this study was to compare the conventional identification of Candida species with identification by the Vitek-2 system, and antifungal susceptibility testing (AST) by the broth microdilution method with the Vitek-2 AST system. A total of 172 Candida isolates were subjected to identification by conventional methods, the Vitek-2 system, restriction fragment length polymorphism, and random amplified polymorphic DNA analysis. AST was carried out as per the Clinical and Laboratory Standards Institute M27-A3 document and by the Vitek-2 system. Candida albicans (82.51%) was the most common Candida species, followed by Candida tropicalis (6.29%), Candida krusei (4.89%), Candida parapsilosis (3.49%), and Candida glabrata (2.79%). With the Vitek-2 system, 155 of the 172 isolates were correctly identified, 13 were misidentified, and four were identified with low discrimination, whereas with conventional methods, 171 Candida isolates were correctly identified and only a single isolate of C. albicans was misidentified as C. tropicalis. The average measurement of agreement between the Vitek-2 system and conventional methods was >94%. Most of the isolates were susceptible to fluconazole (88.95%) and amphotericin B (97.67%). The measurement of agreement between the AST methods was >94% for fluconazole and >99% for amphotericin B, which was statistically significant (P < 0.01). The study confirmed the importance and reliability of conventional and molecular methods, and the acceptable agreement suggests the Vitek-2 system as an alternative method for speciation and susceptibility testing of Candida species.
1994-08-01
Descriptive Statistics of Sediment Conventional Parameters and Statistical Comparisons of Oil and Grease and Total Petroleum Hydrocarbon Concentrations in... 7000 Series, USEPA 1986, Bloom and Crecelius 1987). Oil and grease, total petroleum hydrocarbons. BPNL. Oil and grease were determined according to... infrared spectrometer. Total petroleum hydrocarbons were determined according to Method 418.1 (USEPA 1983). Sediment samples were extracted with freon
Kim, Dong Wook; Kim, Hwiyoung; Nam, Woong; Kim, Hyung Jun; Cha, In-Ho
2018-04-23
The aim of this study was to build and validate five types of machine learning models that can predict the occurrence of bisphosphonate-related osteonecrosis of the jaw (BRONJ) associated with dental extraction in patients taking bisphosphonates for the management of osteoporosis. A retrospective review of the medical records was conducted to obtain cases and controls for the study. A total of 125 patients, consisting of 41 cases and 84 controls, were selected. Five machine learning prediction algorithms, namely a multivariable logistic regression model, decision tree, support vector machine, artificial neural network, and random forest, were implemented. The outputs of these models were compared with each other and also with conventional methods, such as serum CTX level. The area under the receiver operating characteristic (ROC) curve (AUC) was used to compare the results. The performance of the machine learning models was significantly superior to conventional statistical methods and single predictors. The random forest model yielded the best performance (AUC = 0.973), followed by the artificial neural network (AUC = 0.915), support vector machine (AUC = 0.882), logistic regression (AUC = 0.844), decision tree (AUC = 0.821), drug holiday alone (AUC = 0.810), and CTX level alone (AUC = 0.630). Machine learning methods showed superior performance in predicting BRONJ associated with dental extraction compared to conventional statistical methods using drug holiday and serum CTX level. Machine learning can thus be applied in a wide range of clinical studies. Copyright © 2017. Published by Elsevier Inc.
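A hedged sketch of the evaluation pattern described (synthetic data standing in for the study's case-control records; only two of the five model families are shown, and the class balance roughly mimics the 41/84 split):

```python
# Fit two classifiers on a synthetic 125-patient cohort and compare AUCs.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=125, n_features=10,
                           weights=[0.67], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

for model in (RandomForestClassifier(random_state=0),
              LogisticRegression(max_iter=1000)):
    p = model.fit(X_tr, y_tr).predict_proba(X_te)[:, 1]
    print(type(model).__name__, round(roc_auc_score(y_te, p), 3))
```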
Tambe, Varsha H; Nagmode, Pradnya S; Vishwas, Jayshree R; P, Saujanya K; Angadi, Prabakar; Ali, Fareedi Mukram
2013-01-01
Background: To compare the amount of debris extruded apically by using conventional syringe, EndoVac and ultrasonic irrigation. Materials & Methods: Thirty freshly extracted mandibular premolars were selected, the working length was determined, and the teeth were mounted in a debris collection apparatus. The canals were prepared. After each instrument change, 1 ml of 3% sodium hypochlorite was used for irrigation. Debris extruded apically by the conventional syringe, EndoVac and ultrasonic irrigation techniques was weighed on an electronic balance, and statistical analysis was performed. The mean differences within and between the groups were determined using statistical analysis for equal variances. Results: Among all the groups, significantly less debris was found apically in the EndoVac group (0.96) compared with the conventional syringe and ultrasonic groups (1.23). Conclusion: The present study showed that the EndoVac system extrudes less debris apically than ultrasonic irrigation, followed by the conventional syringe, so the incidence of flare-ups can be reduced by using the EndoVac irrigation system. How to cite this article: Tambe V H, Nagmode P S, Vishwas J R, Saujanya K P, Angadi P, Ali F M. Evaluation of the Amount of Debris extruded apically by using Conventional Syringe, Endovac and Ultrasonic Irrigation Technique: An In Vitro Study. J Int Oral Health 2013; 5(3):63-66. PMID:24155604
Dimensional Changes of Acrylic Resin Denture Bases: Conventional Versus Injection-Molding Technique
Gharechahi, Jafar; Asadzadeh, Nafiseh; Shahabian, Foad; Gharechahi, Maryam
2014-01-01
Objective: Acrylic resin denture bases undergo dimensional changes during polymerization. Injection molding techniques are reported to reduce these changes and thereby improve physical properties of denture bases. The aim of this study was to compare dimensional changes of specimens processed by conventional and injection-molding techniques. Materials and Methods: SR-Ivocap Triplex Hot resin was used for conventional pressure-packed and SR-Ivocap High Impact was used for injection-molding techniques. After processing, all the specimens were stored in distilled water at room temperature until measured. For dimensional accuracy evaluation, measurements were recorded at 24-hour, 48-hour and 12-day intervals using a digital caliper with an accuracy of 0.01 mm. Statistical analysis was carried out by SPSS (SPSS Inc., Chicago, IL, USA) using t-test and repeated-measures ANOVA. Statistical significance was defined at P<0.05. Results: After each water storage period, the acrylic specimens produced by injection exhibited less dimensional changes compared to those produced by the conventional technique. Curing shrinkage was compensated by water sorption with an increase in water storage time decreasing dimensional changes. Conclusion: Within the limitations of this study, dimensional changes of acrylic resin specimens were influenced by the molding technique used and SR-Ivocap injection procedure exhibited higher dimensional accuracy compared to conventional molding. PMID:25584050
Bayesian statistics in radionuclide metrology: measurement of a decaying source
NASA Astrophysics Data System (ADS)
Bochud, François O.; Bailat, Claude J.; Laedermann, Jean-Pascal
2007-08-01
The most intuitive way of defining a probability is perhaps through the frequency at which it appears when a large number of trials are realized in identical conditions. The probability derived from the obtained histogram characterizes the so-called frequentist or conventional statistical approach. In this sense, probability is defined as a physical property of the observed system. By contrast, in Bayesian statistics, a probability is not a physical property or a directly observable quantity, but a degree of belief or an element of inference. The goal of this paper is to show how Bayesian statistics can be used in radionuclide metrology and what its advantages and disadvantages are compared with conventional statistics. This is performed through the example of an yttrium-90 source typically encountered in environmental surveillance measurement. Because of the very low activity of this kind of source and the short half-life of the radionuclide, this measurement takes several days, during which the source decays significantly. Several methods are proposed to compute simultaneously the number of unstable nuclei at a given reference time, the decay constant and the background. Asymptotically, all approaches give the same result. However, Bayesian statistics produces coherent estimates and confidence intervals in a much smaller number of measurements. Apart from the conceptual understanding of statistics, the main difficulty that could deter radionuclide metrologists from using Bayesian statistics is the complexity of the computation.
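A toy sketch of the Bayesian computation described (flat priors on a grid; the count rates, measurement schedule and grid ranges are invented, with only the 90Y half-life taken as known):

```python
# Grid posterior for a decaying source: counts in each time bin are Poisson
# with rate N0 * lam * exp(-lam * t) * dt + b * dt (source plus background).
import numpy as np
from scipy.stats import poisson

lam = np.log(2) / 64.1            # 90Y decay constant, half-life in hours
t = np.arange(0.0, 96.0, 1.0)     # hourly measurements over four days
dt = 1.0
rng = np.random.default_rng(1)
true_N0, true_b = 5000.0, 2.0     # illustrative source strength / background
counts = rng.poisson(true_N0 * lam * np.exp(-lam * t) * dt + true_b * dt)

N0_grid = np.linspace(3000, 7000, 81)
b_grid = np.linspace(0.5, 4.0, 36)
logpost = np.array([[poisson.logpmf(counts,
                                    N0 * lam * np.exp(-lam * t) * dt + b * dt).sum()
                     for b in b_grid] for N0 in N0_grid])
post = np.exp(logpost - logpost.max())
post /= post.sum()
print("posterior mean N0:", (post.sum(axis=1) * N0_grid).sum())
```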
NASA Astrophysics Data System (ADS)
Decraene, Carolina; Dijckmans, Arne; Reynders, Edwin P. B.
2018-05-01
A method is developed for computing the mean and variance of the diffuse field sound transmission loss of finite-sized layered wall and floor systems that consist of solid, fluid and/or poroelastic layers. This is achieved by coupling a transfer matrix model of the wall or floor to statistical energy analysis subsystem models of the adjacent room volumes. The modal behavior of the wall is approximately accounted for by projecting the wall displacement onto a set of sinusoidal lateral basis functions. This hybrid modal transfer matrix-statistical energy analysis method is validated on multiple wall systems: a thin steel plate, a polymethyl methacrylate panel, a thick brick wall, a sandwich panel, a double-leaf wall with poro-elastic material in the cavity, and a double glazing. The predictions are compared with experimental data and with results obtained using alternative prediction methods such as the transfer matrix method with spatial windowing, the hybrid wave based-transfer matrix method, and the hybrid finite element-statistical energy analysis method. These comparisons confirm the prediction accuracy of the proposed method and the computational efficiency against the conventional hybrid finite element-statistical energy analysis method.
Strength and life criteria for corrugated fiberboard by three methods
Thomas J. Urbanik
1997-01-01
The conventional test method for determining the stacking life of corrugated containers at a fixed load level does not adequately predict a safe load when storage time is fixed. This study introduced multiple load levels and related the probability of time at failure to load. A statistical analysis of logarithm-of-time failure data varying with load level predicts the...
Saadati, Farzaneh; Ahmad Tarmizi, Rohani
2015-01-01
Because students' ability to use statistics, which is mathematical in nature, is one of the concerns of educators, embedding the pedagogical characteristics of learning within an e-learning system adds value to the conventional method of learning mathematics. Many researchers emphasize the effectiveness of cognitive apprenticeship in learning and problem solving in the workplace. In a cognitive apprenticeship learning model, skills are learned within a community of practitioners through observation of modelling and then practice plus coaching. This study utilized an internet-based Cognitive Apprenticeship Model (i-CAM) in three phases and evaluated its effectiveness for improving statistics problem-solving performance among postgraduate students. The results showed that, when compared to the conventional mathematics learning model, the i-CAM could significantly promote students' problem-solving performance at the end of each phase. In addition, the differences in students' test scores were statistically significant after controlling for the pre-test scores. The findings conveyed in this paper confirm the considerable value of i-CAM in the improvement of statistics learning for non-specialized postgraduate students. PMID:26132553
Treatment of Chemical Equilibrium without Using Thermodynamics or Statistical Mechanics.
ERIC Educational Resources Information Center
Nelson, P. G.
1986-01-01
Discusses the conventional approaches to teaching about chemical equilibrium in advanced physical chemistry courses. Presents an alternative approach to the treatment of this concept by using Boltzmann's distribution law. Lists five advantages to using this method as compared with the other approaches. (TW)
Baek, Hye Jin; Kim, Dong Wook; Ryu, Ji Hwa; Lee, Yoo Jin
2013-01-01
Background There has been no study to compare the diagnostic accuracy of an experienced radiologist with that of a trainee in nasal bone fracture. Objectives To compare the diagnostic accuracy between conventional radiography and computed tomography (CT) for the identification of nasal bone fractures and to evaluate the interobserver reliability between a staff radiologist and a trainee. Patients and Methods A total of 108 patients who underwent conventional radiography and CT after acute nasal trauma were included in this retrospective study. Two readers, a staff radiologist and a second-year resident, independently assessed the results of the imaging studies. Results Of the 108 patients, the presence of a nasal bone fracture was confirmed in 88 (81.5%) patients. The number of non-depressed fractures was higher than the number of depressed fractures. In nine (10.2%) patients, nasal bone fractures were only identified on conventional radiography, including three depressed and six non-depressed fractures. CT was more accurate than conventional radiography for the identification of nasal bone fractures as determined by both readers (P < 0.05); all diagnostic indices of the experienced radiologist were similar to or higher than those of the trainee, and κ statistics showed moderate agreement between the two diagnostic tools for both readers. There was no statistical difference in the assessment of interobserver reliability for both imaging modalities in the identification of nasal bone fractures. Conclusion For the identification of nasal bone fractures, CT was significantly superior to conventional radiography. Although the staff radiologist showed better values in the identification of nasal bone fractures and differentiation between depressed and non-depressed fractures than the trainee, there was no statistically significant difference in the interpretation of conventional radiography and CT between a radiologist and a trainee. PMID:24348599
Percutaneous dilatational versus conventional surgical tracheostomy in intensive care patients
Youssef, Tarek F.; Ahmed, Mohamed Rifaat; Saber, Aly
2011-01-01
Background: Tracheostomy is usually performed in patients with difficult weaning from mechanical ventilation or some catastrophic neurologic insult. Conventional tracheostomy involves dissection of the pretracheal tissues and insertion of the tracheostomy tube into the trachea under direct vision. Percutaneous dilatational tracheostomy is increasingly popular and has gained widespread acceptance in many intensive care units and trauma centers. Aim: The aim of the study was to compare percutaneous dilatational tracheostomy with conventional surgical tracheostomy in intensive care patients. Patients and Methods: Sixty-four critically ill patients admitted to the intensive care unit and requiring tracheostomy were randomly divided into two groups: percutaneous dilatational tracheostomy and conventional surgical tracheostomy. Results: The mean duration of the procedure was similar between the two techniques, while the mean size of the tracheostomy tube was smaller with the percutaneous technique. In addition, the lowest SpO2 during the procedure, the PaCO2 after the operation and intra-operative bleeding were nearly similar in both groups, without any statistically significant difference. Postoperative infection at 7 days was statistically lower and the length of the scar tended to be smaller among PDT patients. Conclusion: The PDT technique is as effective and safe as CST, with a low incidence of postoperative complications. PMID:22361497
Shabzendedar, Mahbobeh; Moosavi, Horieh; Talbi, Maryam; Sharifi, Marjan
2011-11-01
The goal of the study was to evaluate the effect of caries removal by three different methods on the permeability of class II composite resin restorations in primary molar teeth. Forty-five recently extracted primary molars were randomly assigned to three groups for three different methods of caries removal: group 1, mechanical; group 2, caries detector dye; and group 3, Carisolv (n = 15). After that, class II cavities in all groups were restored with the adhesive (Opti Bond Solo Plus), which was applied according to the manufacturer's instructions, and a posterior composite (Herculite XRV), which was placed incrementally. After 24 hours the samples were thermocycled in water for 500 cycles between 5 and 55°C with a dwell time of 30 sec. Permeability was assessed by the fluid filtration method. The data were analyzed using the ANOVA test, and the study groups were compared with the Tukey test for statistically significant differences at a 5% significance level. The evaluation of the tested groups indicated that the highest (0.80) and lowest (0.37) mean permeability was observed in groups 2 and 3, respectively. A significant difference was revealed among the tested groups (p = 0.045). The comparison of the Carisolv and caries detector dye groups indicated a statistically significant difference (p = 0.037). There was no significant difference between either the Carisolv or the caries detector dye group and the conventional group. Using the chemomechanical and staining methods for caries removal had no more detrimental effect on permeability than the conventional technique. However, caries detection dye for caries removal could be more harmful than the chemomechanical method. None of the current caries-excavation techniques could eliminate permeability in class II composite resin restorations. Furthermore, staining methods do not have an adverse effect on sealing ability in comparison to the conventional technique.
Xu, Junzhong; Li, Ke; Smith, R. Adam; Waterton, John C.; Zhao, Ping; Ding, Zhaohua; Does, Mark D.; Manning, H. Charles; Gore, John C.
2016-01-01
Background Diffusion-weighted MRI (DWI) signal attenuation is often not mono-exponential (i.e. non-Gaussian diffusion) with stronger diffusion weighting. Several non-Gaussian diffusion models have been developed and may provide new information or higher sensitivity compared with the conventional apparent diffusion coefficient (ADC) method. However, the relative merits of these models for detecting tumor therapeutic response are not fully clear. Methods Conventional ADC and three widely used non-Gaussian models (bi-exponential, stretched exponential, and statistical) were implemented and compared for assessing SW620 human colon cancer xenografts responding to barasertib, an agent known to induce apoptosis via polyploidy. The Bayesian Information Criterion (BIC) was used for model selection among the three non-Gaussian models. Results Tumor volume, histology, conventional ADC, and all three non-Gaussian DWI models showed significant differences between control and treatment groups after four days of treatment. However, only the non-Gaussian models detected significant changes after two days of treatment. For every treatment or control group, over 65.7% of tumor voxels indicated that the bi-exponential model was strongly or very strongly preferred. Conclusion Non-Gaussian DWI model-derived biomarkers can detect the chemotherapeutic response of tumors earlier than conventional ADC and tumor volume. The bi-exponential model provides better fitting than the statistical and stretched exponential models for the tumor and treatment models used in the current work. PMID:27919785
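A small sketch of the model comparison described (synthetic single-voxel decay with assumed parameter values; BIC is computed from residuals under a Gaussian-noise assumption):

```python
# Fit mono-exponential (ADC) and bi-exponential models to a DWI decay
# and compare them with the Bayesian Information Criterion.
import numpy as np
from scipy.optimize import curve_fit

b = np.linspace(0, 3000, 12)  # diffusion weightings, s/mm^2
mono = lambda b, S0, adc: S0 * np.exp(-b * adc)
biexp = lambda b, S0, f, d1, d2: S0 * (f * np.exp(-b * d1)
                                       + (1 - f) * np.exp(-b * d2))

rng = np.random.default_rng(0)
sig = biexp(b, 1.0, 0.7, 2.0e-3, 0.2e-3) + rng.normal(0, 0.01, b.size)

def bic(resid, k, n):
    return n * np.log(np.mean(resid**2)) + k * np.log(n)

p1, _ = curve_fit(mono, b, sig, p0=[1.0, 1e-3])
p2, _ = curve_fit(biexp, b, sig, p0=[1.0, 0.5, 2e-3, 0.3e-3], maxfev=10000)
print("BIC mono:", bic(sig - mono(b, *p1), 2, b.size))
print("BIC biexp:", bic(sig - biexp(b, *p2), 4, b.size))  # lower is preferred
```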
Lotfy, Hayam M; Amer, Sawsan M; Zaazaa, Hala E; Mostafa, Noha S
2015-01-01
Two novel, simple, specific, accurate and precise spectrophotometric methods manipulating ratio spectra are developed and validated for the simultaneous determination of esomeprazole magnesium trihydrate (ESO) and naproxen (NAP), namely absorbance subtraction and ratio difference. The results were compared to those of the conventional spectrophotometric methods, namely dual wavelength and the isoabsorptive point coupled with the first derivative of ratio spectra and derivative ratio. The suggested methods were validated in compliance with the ICH guidelines and were successfully applied for the determination of ESO and NAP in their laboratory-prepared mixtures and pharmaceutical preparation. No preliminary separation steps are required for the proposed spectrophotometric procedures. The statistical comparison showed that there is no significant difference between the proposed methods and the reported method with respect to both accuracy and precision. Copyright © 2015 Elsevier B.V. All rights reserved.
Pre-Then-Post Testing: A Tool To Improve the Accuracy of Management Training Program Evaluation.
ERIC Educational Resources Information Center
Mezoff, Bob
1981-01-01
Explains a procedure to avoid the detrimental biases of conventional self-reports of training outcomes. The evaluation format provided is a method for using statistical procedures to increase the accuracy of self-reports by overcoming response-shift-bias. (Author/MER)
NASA Astrophysics Data System (ADS)
Kim, Kyungmin; Harry, Ian W.; Hodge, Kari A.; Kim, Young-Min; Lee, Chang-Hwan; Lee, Hyun Kyu; Oh, John J.; Oh, Sang Hoon; Son, Edwin J.
2015-12-01
We apply a machine learning algorithm, the artificial neural network, to the search for gravitational-wave signals associated with short gamma-ray bursts (GRBs). Multi-dimensional samples consisting of data corresponding to the statistical and physical quantities from the coherent search pipeline are fed into the artificial neural network to distinguish simulated gravitational-wave signals from background noise artifacts. Our results show that the data classification efficiency at a fixed false alarm probability (FAP) is improved by the artificial neural network in comparison to the conventional detection statistic. Specifically, the distance at 50% detection probability at a fixed false positive rate is increased by about 8%-14% for the considered waveform models. We also evaluate a few seconds of gravitational-wave data segments using the trained networks and obtain the FAP. We suggest that the artificial neural network can be a complementary method to the conventional detection statistic for identifying gravitational-wave signals related to short GRBs.
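An illustrative sketch of the comparison pattern (simulated multi-dimensional samples; the feature distributions, network size, and the single "detection statistic" stand-in are invented for the example):

```python
# Compare an MLP classifier against a single detection-statistic feature
# by detection probability at a fixed false-alarm probability (FAP).
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 4000
noise = rng.normal(0.0, 1, (n, 5))    # background noise triggers
signal = rng.normal(0.6, 1, (n, 5))   # simulated signal injections
X = np.vstack([noise, signal])
y = np.r_[np.zeros(n), np.ones(n)]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

ann = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                    random_state=0).fit(X_tr, y_tr)
score_ann = ann.predict_proba(X_te)[:, 1]
score_conv = X_te[:, 0]               # stand-in single statistic

def det_prob_at_fap(score, y, fap=0.01):
    thr = np.quantile(score[y == 0], 1 - fap)  # threshold fixing the FAP
    return np.mean(score[y == 1] > thr)

print("ANN:", det_prob_at_fap(score_ann, y_te))
print("single statistic:", det_prob_at_fap(score_conv, y_te))
```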
Statistical physics inspired energy-efficient coded-modulation for optical communications.
Djordjevic, Ivan B; Xu, Lei; Wang, Ting
2012-04-15
Because Shannon's entropy can be obtained by Stirling's approximation of the thermodynamic entropy, statistical physics energy minimization methods are directly applicable to signal constellation design. We demonstrate that statistical physics inspired energy-efficient (EE) signal constellation designs, in combination with large-girth low-density parity-check (LDPC) codes, significantly outperform conventional LDPC-coded polarization-division multiplexed quadrature amplitude modulation schemes. We also describe an EE signal constellation design algorithm. Finally, we propose the discrete-time implementation of a D-dimensional transceiver and the corresponding EE polarization-division multiplexed system. © 2012 Optical Society of America
Extraction of coffee silverskin to convert waste into a source of antioxidant
NASA Astrophysics Data System (ADS)
Tangguh, Patrick; Kusumocahyo, Samuel P.
2017-01-01
Coffee silverskin (CS) is a thin layer of the coffee bean that is discarded as waste during the coffee roasting process. In this work, coffee silverskin was extracted by three methods: conventional extraction (CE) with agitation, conventional extraction (CE) without agitation, and ultrasound-assisted extraction (UAE). The total phenolic content, the total flavonoid content and the antioxidant activity of the extract were analyzed. It was found that the type of extraction method, the extraction time and the extraction temperature strongly influenced the total phenolic content, the total flavonoid content and the antioxidant activity of the extract. The comparison between conventional extraction (CE) and ultrasound-assisted extraction (UAE) was statistically analyzed using a 3-way ANOVA test. The optimum extraction time and temperature for each method were analyzed using a 2-way ANOVA test. The optimum condition, yielding a high antioxidant activity of 68.9%, was CE with agitation at an extraction time of 60 minutes and a temperature of 60°C.
van IJsseldijk, E A; Valstar, E R; Stoel, B C; Nelissen, R G H H; Baka, N; Van't Klooster, R; Kaptein, B L
2016-08-01
An important measure for the diagnosis and monitoring of knee osteoarthritis is the minimum joint space width (mJSW). This requires accurate alignment of the x-ray beam with the tibial plateau, which may not be accomplished in practice. We investigate the feasibility of a new mJSW measurement method from stereo radiographs using 3D statistical shape models (SSM) and evaluate its sensitivity to changes in the mJSW and its robustness to variations in patient positioning and bone geometry. A validation study was performed using five cadaver specimens. The actual mJSW was varied and images were acquired with variation in the cadaver positioning. For comparison purposes, the mJSW was also assessed from plain radiographs. To study the influence of SSM model accuracy, the 3D mJSW measurement was repeated with models from the actual bones, obtained from CT scans. The SSM-based measurement method was more robust (consistent output under varying measurement circumstances) than the conventional 2D method, showing that the 3D reconstruction indeed reduces the influence of patient positioning. However, the SSM-based method showed sensitivity to changes in the mJSW comparable to that of the conventional method. The CT-based measurement was more accurate than the SSM-based measurement (smallest detectable differences 0.55 mm versus 0.82 mm, respectively). The proposed measurement method is not a substitute for the conventional 2D measurement due to limitations in the SSM model accuracy. However, further improvement of the model accuracy and optimisation technique can be obtained. Combined with the promising options for applications using quantitative information on bone morphology, SSM-based 3D reconstructions of natural knees are attractive for further development. Cite this article: E. A. van IJsseldijk, E. R. Valstar, B. C. Stoel, R. G. H. H. Nelissen, N. Baka, R. van't Klooster, B. L. Kaptein. Three dimensional measurement of minimum joint space width in the knee from stereo radiographs using statistical shape models. Bone Joint Res 2016;320-327. DOI: 10.1302/2046-3758.58.2000626. © 2016 van IJsseldijk et al.
Singh, Santosh K; Singh, Sanjay K; Tripathi, Vinayak R; Khare, Sunil K; Garg, Satyendra K
2011-12-28
Production of alkaline protease from various bacterial strains using statistical methods is customary nowadays. The present work is the first attempt at production optimization of a solvent-stable thermoalkaline protease by a psychrotrophic Pseudomonas putida isolate using conventional methods, response surface methods, and fermentor-level optimization. The pre-screening medium amended with optimized (w/v) 1.0% glucose, 2.0% gelatin and 0.5% yeast extract produced 278 U protease ml(-1) at 72 h incubation. Enzyme production increased to 431 U ml(-1) when Mg2+ (0.01%, w/v) was supplemented. Optimization of physical factors further enhanced protease production to 514 U ml(-1) at pH 9.0, 25°C and 200 rpm within 60 h. The combined effect of the conventionally optimized variables (glucose, yeast extract, MgSO4 and pH), thereafter predicted by response surface methodology, yielded 617 U protease ml(-1) at glucose 1.25% (w/v), yeast extract 0.5% (w/v), MgSO4 0.01% (w/v) and pH 8.8. Bench-scale bioreactor optimization resulted in an enhanced production of 882 U protease ml(-1) at 0.8 vvm aeration and 150 rpm agitation during only 48 h of incubation. The optimization of fermentation variables using conventional and statistical approaches and aeration/agitation at the fermentor level resulted in a ~13.5-fold increase (882 U ml(-1)) in protease production compared to un-optimized conditions (65 U ml(-1)). This is the highest level of thermoalkaline protease reported so far for any psychrotrophic bacterium.
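A compact sketch of the response-surface step in such a workflow: fit a second-order model to two of the conventionally screened factors and solve for the stationary point (the design points and activities below are invented, not the study's data):

```python
# Second-order response surface for (glucose, yeast extract) -> activity.
import numpy as np

# columns: glucose %(w/v), yeast extract %(w/v), protease (U/ml), toy values
runs = np.array([
    [0.75, 0.25, 420], [1.25, 0.25, 480], [0.75, 0.75, 470],
    [1.25, 0.75, 540], [1.00, 0.50, 600], [1.00, 0.50, 590],
    [0.65, 0.50, 450], [1.35, 0.50, 510], [1.00, 0.15, 460], [1.00, 0.85, 500],
])
g, ye, act = runs[:, 0], runs[:, 1], runs[:, 2]
# design matrix for y = b0 + b1*g + b2*ye + b11*g^2 + b22*ye^2 + b12*g*ye
D = np.column_stack([np.ones_like(g), g, ye, g**2, ye**2, g * ye])
beta, *_ = np.linalg.lstsq(D, act, rcond=None)
# stationary point: set the gradient of the fitted quadratic to zero
H = np.array([[2 * beta[3], beta[5]], [beta[5], 2 * beta[4]]])
opt = np.linalg.solve(H, -beta[1:3])
print("predicted optimum (glucose, yeast extract):", opt)
```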
Access to opioids: a global pain management crisis.
Buitrago, Rosa
2013-03-01
The lack of availability of opioids in many countries has created a pain management crisis. Because the Single Convention on Narcotic Drugs requires governments to report annual opioid statistics, there is a need for methods to calculate individual nations' opioid needs. Ways to address this need are discussed.
Applying Bayesian statistics to the study of psychological trauma: A suggestion for future research.
Yalch, Matthew M
2016-03-01
Several contemporary researchers have noted the virtues of Bayesian methods of data analysis. Although debates continue about whether conventional or Bayesian statistics is the "better" approach for researchers in general, there are reasons why Bayesian methods may be well suited to the study of psychological trauma in particular. This article describes how Bayesian statistics offers practical solutions to the problems of data non-normality, small sample size, and missing data common in research on psychological trauma. After a discussion of these problems and the effects they have on trauma research, this article explains the basic philosophical and statistical foundations of Bayesian statistics and how it provides solutions to these problems using an applied example. Results of the literature review and the accompanying example indicate the utility of Bayesian statistics in addressing problems common in trauma research. Bayesian statistics provides a set of methodological tools and a broader philosophical framework that is useful for trauma researchers. Methodological resources are also provided so that interested readers can learn more. (c) 2016 APA, all rights reserved.
Maximum Likelihood Reconstruction for Magnetic Resonance Fingerprinting
Zhao, Bo; Setsompop, Kawin; Ye, Huihui; Cauley, Stephen; Wald, Lawrence L.
2017-01-01
This paper introduces a statistical estimation framework for magnetic resonance (MR) fingerprinting, a recently proposed quantitative imaging paradigm. Within this framework, we present a maximum likelihood (ML) formalism to estimate multiple parameter maps directly from highly undersampled, noisy k-space data. A novel algorithm, based on variable splitting, the alternating direction method of multipliers, and the variable projection method, is developed to solve the resulting optimization problem. Representative results from both simulations and in vivo experiments demonstrate that the proposed approach yields significantly improved accuracy in parameter estimation, compared to the conventional MR fingerprinting reconstruction. Moreover, the proposed framework provides new theoretical insights into the conventional approach. We show analytically that the conventional approach is an approximation to the ML reconstruction; more precisely, it is exactly equivalent to the first iteration of the proposed algorithm for the ML reconstruction, provided that a gridding reconstruction is used as an initialization. PMID:26915119
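For context, a toy sketch of the conventional MRF reconstruction step this paper analyzes: dictionary matching by maximum inner product after a per-frame (e.g. gridding) reconstruction. The dictionary, parameter values and noise level are synthetic placeholders:

```python
# Match each voxel's signal evolution to the nearest dictionary atom.
import numpy as np

rng = np.random.default_rng(0)
n_t, n_atoms, n_vox = 200, 500, 64           # time points, atoms, voxels
dictionary = rng.standard_normal((n_atoms, n_t))
dictionary /= np.linalg.norm(dictionary, axis=1, keepdims=True)
params = np.linspace(100, 3000, n_atoms)     # e.g. a T1 value per atom (ms)

true_idx = rng.integers(0, n_atoms, n_vox)
signals = dictionary[true_idx] + 0.05 * rng.standard_normal((n_vox, n_t))

corr = signals @ dictionary.T                # inner products with unit atoms
match = corr.argmax(axis=1)                  # best-matching atom per voxel
print("fraction matched:", np.mean(match == true_idx))
print("estimated T1 of voxel 0:", params[match[0]], "ms")
```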
Comparison of Accuracy Between a Conventional and Two Digital Intraoral Impression Techniques.
Malik, Junaid; Rodriguez, Jose; Weisbloom, Michael; Petridis, Haralampos
To compare the accuracy (ie, precision and trueness) of full-arch impressions fabricated using either a conventional polyvinyl siloxane (PVS) material or one of two intraoral optical scanners. Full-arch impressions of a reference model were obtained using addition silicone impression material (Aquasil Ultra; Dentsply Caulk) and two optical scanners (Trios, 3Shape, and CEREC Omnicam, Sirona). Surface matching software (Geomagic Control, 3D Systems) was used to superimpose the scans within groups to determine the mean deviations in precision and trueness (μm) between the scans, which were calculated for each group and compared statistically using one-way analysis of variance with post hoc Bonferroni (trueness) and Games-Howell (precision) tests (IBM SPSS ver 24, IBM UK). Qualitative analysis was also carried out from three-dimensional maps of differences between scans. Means and standard deviations (SD) of deviations in precision for conventional, Trios, and Omnicam groups were 21.7 (± 5.4), 49.9 (± 18.3), and 36.5 (± 11.12) μm, respectively. Means and SDs for deviations in trueness were 24.3 (± 5.7), 87.1 (± 7.9), and 80.3 (± 12.1) μm, respectively. The conventional impression showed statistically significantly improved mean precision (P < .006) and mean trueness (P < .001) compared to both digital impression procedures. There were no statistically significant differences in precision (P = .153) or trueness (P = .757) between the digital impressions. The qualitative analysis revealed local deviations along the palatal surfaces of the molars and incisal edges of the anterior teeth of < 100 μm. Conventional full-arch PVS impressions exhibited improved mean accuracy compared to two direct optical scanners. No significant differences were found between the two digital impression methods.
Comparative Analysis Between Computed and Conventional Inferior Alveolar Nerve Block Techniques.
Araújo, Gabriela Madeira; Barbalho, Jimmy Charles Melo; Dias, Tasiana Guedes de Souza; Santos, Thiago de Santana; Vasconcellos, Ricardo José de Holanda; de Morais, Hécio Henrique Araújo
2015-11-01
The aim of this randomized, double-blind, controlled trial was to compare the computed and conventional inferior alveolar nerve block techniques in symmetrically positioned inferior third molars. Both computed and conventional anesthetic techniques were performed in 29 healthy patients (58 surgeries) aged between 18 and 40 years. The anesthetic of choice was 2% lidocaine with 1:200,000 epinephrine. The Visual Analogue Scale assessed the pain variable after anesthetic infiltration. Patient satisfaction was evaluated using the Likert Scale. Heart and respiratory rates, mean time to perform the technique, and the need for additional anesthesia were also evaluated. Mean pain scores were higher for the conventional technique than for the computed technique, 3.45 ± 2.73 and 2.86 ± 1.96, respectively, but the difference was not statistically significant (P > 0.05). Patient satisfaction showed no statistically significant differences. The mean times to perform the computed and conventional techniques were 3.85 and 1.61 minutes, respectively, a statistically significant difference (P < 0.001). The computed anesthetic technique showed lower mean pain perception, but the difference from the conventional technique was not statistically significant.
Tupinambá, Rogerio Amaral; Claro, Cristiane Aparecida de Assis; Pereira, Cristiane Aparecida; Nobrega, Celestino José Prudente; Claro, Ana Paula Rosifini Alves
2017-01-01
Introduction: Plasma-polymerized film deposition was developed to modify the surface properties of metallic orthodontic brackets in order to inhibit bacterial adhesion. Methods: Hexamethyldisiloxane (HMDSO) polymer films were deposited on conventional (n = 10) and self-ligating (n = 10) stainless steel orthodontic brackets using the Plasma-Enhanced Chemical Vapor Deposition (PECVD) radio frequency technique. The samples were divided into two groups according to the kind of bracket and into two subgroups according to surface treatment. Scanning Electron Microscopy (SEM) analysis was performed to assess bacterial adhesion on the sample surfaces (slot and wing regions) and the integrity of the film layer. Surface roughness was assessed by Confocal Interferometry (CI) and surface wettability by goniometry. For bacterial adhesion analysis, samples were exposed for 72 hours to a Streptococcus mutans solution for biofilm formation. The values obtained for surface roughness were analyzed using the Mann-Whitney test, while biofilm adhesion was assessed by the Kruskal-Wallis and SNK tests. Results: Statistically significant differences (p < 0.05) in surface roughness and bacterial adhesion reduction were observed on conventional brackets after surface treatment and between conventional and self-ligating brackets; no statistically significant differences were observed between the self-ligating groups (p > 0.05). Conclusion: Plasma-polymerized film deposition was only effective in reducing surface roughness and bacterial adhesion on conventional brackets. It was also noted that conventional brackets showed lower biofilm adhesion than self-ligating brackets despite the absence of the film. PMID:28902253
van Bochove, J A; van Amerongen, W E
2006-03-01
The aim is to investigate possible differences in discomfort during treatment with the atraumatic restorative treatment (ART) or the conventional restorative method, each with and without local analgesia (LA). The study group consisted of 6- and 7-year-old children with no dental experience (mean age 6.98, SD ± 0.52), randomly divided into four treatment groups: the conventional method with and without LA and ART with and without LA. One or two proximal lesions in primary molars were treated. The heart rate and behaviour (Venham score) were measured. Statistical analysis was performed in SPSS version 10.0. In the first session 300 children were treated; 109 children were treated a second time in the same way as at the first visit. During the first session ART without LA gave the least discomfort, while the conventional method without LA gave the most discomfort. During the second treatment the least discomfort was observed with ART without LA and the most discomfort with the conventional method with LA. There was a constant preference for hand instruments; the bur was increasingly accepted. The experience with LA was the reverse.
Ghoveizi, Rahab; Alikhasi, Marzieh; Siadat, Mohammad-Reza; Siadat, Hakimeh; Sorouri, Majid
2013-01-01
Objective: Crestal bone loss is a biological complication in implant dentistry. The aim of this study was to compare the effect of progressive and conventional loading on crestal bone height and bone density around single osseointegrated implants in the posterior maxilla by a longitudinal radiographic assessment technique. Materials and Methods: Twenty micro-thread implants were placed in 10 patients (two implants per patient). One of the two implants in each patient was assigned to the progressive and the other to the conventional loading group. Eight weeks after surgery, conventional implants were restored with a metal-ceramic crown and the progressive group underwent a progressive loading protocol. The progressive loading group received different temporary acrylic crowns at 2, 4 and 6 months. After eight months, the acrylic crowns were replaced with a metal-ceramic crown. Computed radiography of both progressive and conventional implants was taken at 2, 4, 6, and 12 months. Image analysis was performed to measure the height of crestal bone loss and bone density. Results: The mean values of crestal bone loss at month 12 were 0.11 (0.19) mm for progressively and 0.36 (0.36) mm for conventionally loaded implants, a statistically significant difference (P < 0.05) by the Wilcoxon signed-rank test. The progressively loaded group showed a trend toward higher bone density gain compared to the conventionally loaded group, but when tested with repeated-measures ANOVA, the differences were not statistically significant (P > 0.05). Conclusion: The progressive group showed less crestal bone loss around single osseointegrated implants than the conventional group. Bone density around progressively loaded implants increased in the crestal, middle and apical areas. PMID:23724215
NASA Astrophysics Data System (ADS)
Ren, Weiwei; Yang, Tao; Shi, Pengfei; Xu, Chong-yu; Zhang, Ke; Zhou, Xudong; Shao, Quanxi; Ciais, Philippe
2018-06-01
Climate change imposes profound influence on the regional hydrological cycle and water security in many alpine regions worldwide. Investigating regional climate impacts using watershed-scale hydrological models requires a large amount of input data such as topography, meteorological and hydrological data. However, data scarcity in alpine regions seriously restricts the evaluation of climate change impacts on the water cycle using conventional approaches based on global or regional climate models, statistical downscaling methods and hydrological models. Therefore, this study is dedicated to the development of a probabilistic model to replace the conventional approaches for streamflow projection. The probabilistic model was built upon an advanced Bayesian Neural Network (BNN) approach directly fed by large-scale climate predictor variables and tested in a typical data-sparse alpine region, the Kaidu River basin in Central Asia. Results show that the BNN model performs better than the conventional methods across a number of statistical measures. The BNN method, with flexible model structures given by active indicator functions that reduce the dependence on the initial specification of the input variables and the number of hidden units, can work well in a data-limited region. Moreover, it can provide more reliable streamflow projections with a robust generalization ability. Forced by the latest bias-corrected GCM scenarios, streamflow projections for the 21st century under three RCP emission pathways were constructed and analyzed. In brief, the proposed probabilistic projection approach could improve runoff predictive ability over conventional methods and provide better support for water resources planning and management under data-limited conditions, as well as facilitate climate change impact analysis on runoff and water resources in alpine regions worldwide.
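A hedged, much-simplified stand-in for the probabilistic mapping described (a bootstrap ensemble of small networks rather than a true BNN; the predictors and streamflow series are synthetic):

```python
# Ensemble of small networks mapping climate predictors to streamflow,
# giving a predictive spread analogous in spirit to a Bayesian posterior.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.standard_normal((300, 6))                      # climate predictors
y = 2.0 * X[:, 0] - X[:, 1] + rng.normal(0, 0.3, 300)  # synthetic streamflow

models = []
for seed in range(20):                                 # bootstrap members
    idx = rng.integers(0, len(y), len(y))
    m = MLPRegressor(hidden_layer_sizes=(8,), max_iter=3000, random_state=seed)
    models.append(m.fit(X[idx], y[idx]))

x_new = rng.standard_normal((1, 6))
preds = np.array([m.predict(x_new)[0] for m in models])
print("projection mean/std:", preds.mean(), preds.std())
```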
Evaluation of Statistical Methods for Modeling Historical Resource Production and Forecasting
NASA Astrophysics Data System (ADS)
Nanzad, Bolorchimeg
This master's thesis project consists of two parts. Part I compares modeling of historical resource production and forecasting of future production trends using the logit/probit transform advocated by Rutledge (2011) with conventional Hubbert curve fitting, using global coal production as a case study. The conventional Hubbert/Gaussian method fits a curve to historical production data, whereas a logit/probit transform uses a linear fit to a subset of transformed production data. Within the errors and limitations inherent in this type of statistical modeling, these methods provide comparable results. That is, despite the apparent goodness-of-fit achievable using the logit/probit methodology, neither approach provides a significant advantage over the other in either explaining the observed data or in making future projections. For mature production regions, those that have already substantially passed peak production, results obtained by either method are closely comparable and reasonable, and estimates of ultimately recoverable resources obtained by either method are consistent with geologically estimated reserves. In contrast, for immature regions, estimates of ultimately recoverable resources generated by either of these alternative methods are unstable and thus need to be used with caution. Although the logit/probit transform generates a high quality of fit to historical production data, this approach provides no new information compared to conventional Gaussian or Hubbert-type models and may have the effect of masking the noise and/or instability in the data and the derived fits. In particular, production forecasts for immature or marginally mature production systems based on either method need to be regarded with considerable caution. Part II investigates the utility of a novel alternative method for multicyclic Hubbert modeling, tentatively termed "cycle-jumping", wherein the overlap of multiple cycles is limited. The model is designed so that each cycle is described by the same three parameters as in the conventional multicyclic Hubbert model, and every two cycles are connected with a transition width. The transition width indicates the shift from one cycle to the next and is described as a weighted co-addition of the two neighboring cycles; it is determined by three parameters: the transition year, the transition width, and a gamma parameter for weighting. The cycle-jumping method provides a superior model compared to the conventional multicyclic Hubbert model and reflects historical production behavior more reasonably and practically, by better modeling the effects of technological transitions and socioeconomic factors that affect historical resource production behavior through explicitly considering the form of the transitions between production cycles.
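A minimal sketch of the conventional single-cycle Hubbert fit referenced above (the production series is synthetic; cumulative production follows the logistic Q(t) = URR / (1 + exp(-k(t - tm))), and annual production is its derivative):

```python
# Fit the logistic-derivative (Hubbert) curve to an annual production series.
import numpy as np
from scipy.optimize import curve_fit

def hubbert(t, urr, k, tm):
    # derivative of the cumulative logistic; urr is the ultimately
    # recoverable resource, k the steepness, tm the peak year
    e = np.exp(-k * (t - tm))
    return urr * k * e / (1 + e)**2

t = np.arange(1900, 2020)
rng = np.random.default_rng(0)
prod = hubbert(t, 1000.0, 0.06, 1995) * (1 + rng.normal(0, 0.05, t.size))

popt, _ = curve_fit(hubbert, t, prod, p0=[800, 0.05, 1990])
print("URR, k, peak year:", popt)
```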
Use of Negative Pressure Wound Therapy for Lower Limb Bypass Incisions.
Tan, Kah Wei; Lo, Zhiwen Joseph; Hong, Qiantai; Narayanan, Sriram; Tan, Glenn Wei Leong; Chandrasekar, Sadhana
2017-12-25
Objective: The use of negative pressure wound therapy (NPWT) for post-surgical cardiothoracic, orthopedic, plastic, and obstetric and gynecologic procedures has been described. However, there are no data regarding its use for lower limb bypass incisions. We aimed to investigate the outcomes of NPWT in preventing surgical site infection (SSI) in patients with lower limb arterial bypass incisions. Materials and Methods: We retrospectively used data of 42 patients who underwent lower limb arterial bypass with reversed great saphenous vein between March 2014 and June 2016 and compared conventional wound therapy and NPWT with regard to preventing SSI. Results: Twenty-eight (67%) patients underwent conventional wound therapy and 14 (33%) underwent NPWT. There were no statistical differences regarding patient characteristics and mean SSI risk scores between the two patient groups (13.7% for conventional wound therapy vs. 13.4% for NPWT; P=0.831). In the conventional group, there were nine instances of SSI (32%), and three (11%) of these required subsequent surgical wound debridement, whereas in the NPWT group, there was no SSI incidence (P=0.019). Secondary outcomes such as the length of hospital stay, 30-day readmission rate, and need for secondary vascular procedures were not statistically different between the two groups. Conclusion: The use of NPWT for lower limb arterial bypass incisions is superior to conventional wound therapy because it may prevent SSIs.
Ing, Alex; Schwarzbauer, Christian
2014-01-01
Functional connectivity has become an increasingly important area of research in recent years. At a typical spatial resolution, approximately 300 million connections link each voxel in the brain with every other. This pattern of connectivity is known as the functional connectome. Connectivity is often compared between experimental groups and conditions. Standard methods used to control the type 1 error rate are likely to be insensitive when comparisons are carried out across the whole connectome, due to the huge number of statistical tests involved. To address this problem, two new cluster based methods--the cluster size statistic (CSS) and cluster mass statistic (CMS)--are introduced to control the family wise error rate across all connectivity values. These methods operate within a statistical framework similar to the cluster based methods used in conventional task based fMRI. Both methods are data driven, permutation based and require minimal statistical assumptions. Here, the performance of each procedure is evaluated in a receiver operator characteristic (ROC) analysis, utilising a simulated dataset. The relative sensitivity of each method is also tested on real data: BOLD (blood oxygen level dependent) fMRI scans were carried out on twelve subjects under normal conditions and during the hypercapnic state (induced through the inhalation of 6% CO2 in 21% O2 and 73%N2). Both CSS and CMS detected significant changes in connectivity between normal and hypercapnic states. A family wise error correction carried out at the individual connection level exhibited no significant changes in connectivity.
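A simplified 1D analogue of the cluster-based permutation idea described above (real connectomes are graphs rather than strips; the data, threshold and permutation count here are invented):

```python
# Cluster mass statistic with a permutation-based family-wise error null.
import numpy as np
from scipy.stats import ttest_ind

def max_cluster_mass(t_vals, thr):
    """Largest summed t-value over contiguous supra-threshold runs."""
    best, cur = 0.0, 0.0
    for t in t_vals:
        cur = cur + t if t > thr else 0.0
        best = max(best, cur)
    return best

rng = np.random.default_rng(0)
a = rng.normal(0, 1, (12, 300)); a[:, 100:120] += 1.0  # group A, true effect
b = rng.normal(0, 1, (12, 300))                        # group B

t_obs, _ = ttest_ind(a, b, axis=0)
obs = max_cluster_mass(t_obs, thr=2.0)

null, data = [], np.vstack([a, b])
for _ in range(500):                                   # permute group labels
    perm = rng.permutation(24)
    t_p, _ = ttest_ind(data[perm[:12]], data[perm[12:]], axis=0)
    null.append(max_cluster_mass(t_p, thr=2.0))
print("FWE-corrected p:", np.mean(np.array(null) >= obs))
```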
NASA Astrophysics Data System (ADS)
Kurnia, H.; Noerhadi, N. A. I.
2017-08-01
Three-dimensional digital study models were introduced following advances in digital technology. This study was carried out to assess the reliability of digital study models scanned by a newly assembled laser scanning device; the aim was to compare the digital study models with conventional models. Twelve sets of dental impressions were taken from patients with mild-to-moderate crowding. The impressions were taken twice, once with alginate and once with polyvinylsiloxane. The alginate impressions were made into conventional models, and the polyvinylsiloxane impressions were scanned to produce digital models. The mesiodistal tooth width and Little’s irregularity index (LII) were measured manually with digital calipers on the conventional models and digitally on the digital study models. Bolton analysis was performed on each study model. Each method was carried out twice to check for intra-observer variability. The reproducibility (comparison of the methods) was assessed using independent-sample t-tests. The mesiodistal tooth widths of the conventional and digital models did not differ significantly (p > 0.05). Independent-sample t-tests did not identify statistically significant differences for the Bolton analysis or the LII (p = 0.603 for Bolton and p = 0.894 for LII). The measurements of the digital study models are as accurate as those of the conventional models.
Reproducibility of techniques using Archimedes' principle in measuring cancellous bone volume.
Zou, L; Bloebaum, R D; Bachus, K N
1997-01-01
Researchers have been interested in developing techniques to accurately and reproducibly measure the volume fraction of cancellous bone. Historically, bone researchers have used Archimedes' principle with water to measure the volume fraction of cancellous bone. Preliminary results in our lab suggested that the calibrated water technique did not provide reproducible results. Because of this difficulty, it was decided to compare the conventional water method with a water-with-surfactant method and a helium method using a micropycnometer. The water/surfactant and helium methods were attempts to improve fluid penetration into the small voids present in the cancellous bone structure. In order to compare the reproducibility of the new methods with the conventional water method, 16 cancellous bone specimens were obtained from the femoral condyles of human and greyhound dog femora. The volume fraction measurements on each specimen were repeated three times with all three techniques. The results showed that the helium displacement method was more than an order of magnitude more reproducible than the two water methods (p < 0.05). Statistical analysis also showed that the conventional water method produced the lowest reproducibility (p < 0.05). The data from this study indicate that the helium displacement technique is a useful, rapid and reproducible tool for quantitatively characterizing anisotropic porous tissue structures such as cancellous bone.
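For orientation, a minimal sketch of the underlying Archimedes calculation, assuming the pore fluid fully penetrates the specimen and that the bulk (envelope) volume is known separately; the masses and names are hypothetical, not the paper's data.

```python
# Volume fraction from Archimedes' principle: the apparent weight loss of the
# submerged specimen equals the weight of fluid displaced by the solid phase.
RHO_WATER = 0.9982  # g/cm^3 at ~20 degC

def material_volume(mass_dry_g, mass_submerged_g, rho_fluid=RHO_WATER):
    """Volume of the solid phase, from buoyancy."""
    return (mass_dry_g - mass_submerged_g) / rho_fluid

def volume_fraction(mass_dry_g, mass_submerged_g, bulk_volume_cm3):
    return material_volume(mass_dry_g, mass_submerged_g) / bulk_volume_cm3

# Hypothetical cancellous bone cube with a 1 cm^3 envelope volume
print(volume_fraction(mass_dry_g=0.35, mass_submerged_g=0.17, bulk_volume_cm3=1.0))
```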
Ramírez-Carrasco, A; Butrón-Téllez Girón, C; Sanchez-Armass, O; Pierdant-Pérez, M
2017-01-01
Background and Objective. Anxiety/pain are experiences that make dental treatment difficult for children, especially during the time of anesthesia. Hypnosis is used in pediatric clinical situations to modify thinking, behavior, and perception, as well as, recently, in dentistry; therefore the aim of this study was to evaluate the effectiveness of hypnosis combined with conventional behavior management techniques during infiltration anesthesia. Methods. Anxiety/pain were assessed with the FLACC scale during the anesthetic moment; heart rate variability and skin conductance were assessed before and during the anesthetic moment in the control and experimental groups. Results. A marginal statistical difference (p = 0.05) was found in heart rate between baseline and the anesthetic moment, being lower in the hypnosis group. No statistically significant differences were found with the FLACC scale or in skin conductance (p > 0.05). Conclusion. Hypnosis combined with conventional behavior management techniques decreases heart rate during anesthetic infiltration, suggesting that anxiety/pain control may be improved through hypnotic therapy. PMID:28490941
Gene-expression programming for flip-bucket spillway scour.
Guven, Aytac; Azamathulla, H Md
2012-01-01
During the last two decades, researchers have observed that soft computing techniques, used as an alternative to conventional statistical methods based on controlled laboratory or field data, give significantly better results. Gene-expression programming (GEP), an extension of genetic programming (GP), has recently attracted the attention of researchers for the prediction of hydraulic data. This study presents GEP as an alternative tool for predicting scour downstream of a flip-bucket spillway. Actual field measurements were used to develop the GEP models. The proposed GEP models are compared with earlier conventional GP results (Azamathulla et al. 2008b; RMSE = 2.347, δ = 0.377, R = 0.842) and with commonly used regression-based formulae. The predictions of the GEP models agreed closely with the measured values and were considerably better than those of conventional GP and the regression-based formulae. The results are tabulated in terms of statistical error measures (GEP1; RMSE = 1.596, δ = 0.109, R = 0.917) and illustrated via scatter plots.
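The sketch below recomputes the quoted error measures for a pair of hypothetical observed/predicted scour series; since the abstract does not define δ, it is taken here, as an assumption, to be the mean absolute relative error.

```python
import numpy as np

def error_measures(observed, predicted):
    observed = np.asarray(observed, float)
    predicted = np.asarray(predicted, float)
    rmse = np.sqrt(np.mean((observed - predicted) ** 2))
    delta = np.mean(np.abs(observed - predicted) / observed)  # assumed definition of delta
    r = np.corrcoef(observed, predicted)[0, 1]                # Pearson's R
    return rmse, delta, r

obs = [4.2, 6.1, 8.0, 10.5, 12.3]   # hypothetical measured scour depths
pred = [4.0, 6.5, 7.6, 10.9, 12.0]  # hypothetical model predictions
print(error_measures(obs, pred))
```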
NASA Astrophysics Data System (ADS)
Lee, Dong-Sup; Cho, Dae-Seung; Kim, Kookhyun; Jeon, Jae-Jin; Jung, Woo-Jin; Kang, Myeng-Hwan; Kim, Jae-Ho
2015-01-01
Independent Component Analysis (ICA), one of the blind source separation methods, can extract unknown source signals from received signals alone. This is accomplished by exploiting the statistical independence of the signal mixtures, and it has been applied successfully in myriad fields such as medical science and image processing. Nevertheless, inherent problems have been reported when using this technique: instability and invalid ordering of the separated signals, particularly when a conventional ICA technique is used for vibratory source signal identification in complex structures. In this study, a simple iterative algorithm based on conventional ICA is proposed to mitigate these problems. The proposed method extracts more stable source signals in a valid order by iteratively reordering the extracted mixing matrix to reconstruct the finally converged source signals, guided by the magnitudes of the correlation coefficients between the intermediately separated signals and signals measured on or near the sources. To review the problems of the conventional ICA technique and to validate the proposed method, numerical analyses were carried out for a virtual response model and a 30 m class submarine model. Moreover, to investigate the applicability of the proposed method to a real problem involving a complex structure, an experiment was carried out on a scaled submarine mockup. The results show that the proposed method can resolve the inherent problems of the conventional ICA technique.
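A hedged sketch of the reordering idea, using scikit-learn's FastICA in place of the authors' implementation: separate the mixtures, then assign each estimated component to a source by the magnitude of its correlation with a reference signal measured near that source. The iterative convergence loop of the paper is not reproduced.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 2000)
s1 = np.sin(2 * np.pi * 7 * t)                   # hypothetical source 1
s2 = np.sign(np.sin(2 * np.pi * 3 * t))          # hypothetical source 2
S = np.c_[s1, s2]                                # true sources (samples x sources)
X = S @ np.array([[1.0, 0.6], [0.4, 1.0]]).T     # mixed "receiver" signals

est = FastICA(n_components=2, random_state=0).fit_transform(X)

# reference signals: noisy observations taken on or near each source
refs = S + 0.3 * rng.normal(size=S.shape)
corr = np.abs(np.corrcoef(est.T, refs.T)[:2, 2:])  # |corr| of estimate i vs reference j
order = corr.argmax(axis=0)                        # which estimate matches each source
print("estimate assigned to each source:", order)
```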
Lee, Anthony; Yau, Christopher; Giles, Michael B.; Doucet, Arnaud; Holmes, Christopher C.
2011-01-01
We present a case-study on the utility of graphics cards to perform massively parallel simulation of advanced Monte Carlo methods. Graphics cards, containing multiple Graphics Processing Units (GPUs), are self-contained parallel computational devices that can be housed in conventional desktop and laptop computers and can be thought of as prototypes of the next generation of many-core processors. For certain classes of population-based Monte Carlo algorithms they offer massively parallel simulation, with the added advantage over conventional distributed multi-core processors that they are cheap, easily accessible, easy to maintain, easy to code, dedicated local devices with low power consumption. On a canonical set of stochastic simulation examples including population-based Markov chain Monte Carlo methods and Sequential Monte Carlo methods, we find speedups from 35 to 500 fold over conventional single-threaded computer code. Our findings suggest that GPUs have the potential to facilitate the growth of statistical modelling into complex data rich domains through the availability of cheap and accessible many-core computation. We believe the speedup we observe should motivate wider use of parallelizable simulation methods and greater methodological attention to their design. PMID:22003276
A risk assessment methodology using intuitionistic fuzzy set in FMEA
NASA Astrophysics Data System (ADS)
Chang, Kuei-Hu; Cheng, Ching-Hsue
2010-12-01
Most current risk assessment methods use the risk priority number (RPN) to evaluate the risk of failure. However, the conventional RPN methodology has been criticised for five main shortcomings: (1) the assumption that the RPN elements are equally weighted leads to oversimplification; (2) the RPN scale itself has some non-intuitive statistical properties; (3) the RPN elements produce many duplicate numbers; (4) the RPN is derived from only three factors, mainly in terms of safety; and (5) the conventional RPN method does not consider indirect relations between components. To address these issues, an efficient and comprehensive algorithm for evaluating the risk of failure is needed. This article proposes an innovative approach that integrates the intuitionistic fuzzy set (IFS) and the decision-making trial and evaluation laboratory (DEMATEL) approach for risk assessment. The proposed approach resolves some of the shortcomings of the conventional RPN method. A case study assessing the risk of a 0.15 µm DRAM etching process is used to demonstrate the effectiveness of the proposed approach. Finally, the result of the proposed method is compared with existing risk assessment approaches.
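A tiny sketch of shortcoming (3), the duplicate-number problem: the conventional RPN is a bare product of severity, occurrence, and detection, so very different failure profiles can collapse to the same rank. The failure modes below are hypothetical.

```python
def rpn(severity, occurrence, detection):
    """Conventional risk priority number: a plain product of the three ratings."""
    return severity * occurrence * detection

modes = {"mode A": (9, 2, 4), "mode B": (4, 9, 2), "mode C": (6, 4, 3)}
for name, (s, o, d) in modes.items():
    print(name, rpn(s, o, d))  # all three print 72: equal RPN, very different risk profiles
```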
Wiuf, Carsten; Schaumburg-Müller Pallesen, Jonatan; Foldager, Leslie; Grove, Jakob
2016-08-01
In many areas of science it is customary to perform many tests, potentially millions, simultaneously. To gain statistical power it is common to group tests based on a priori criteria such as predefined regions or sliding windows. However, it is not straightforward to choose grouping criteria, and the results may depend on the criteria chosen. Methods that summarize, or aggregate, test statistics or p-values without relying on a priori criteria are therefore desirable. We present a simple method to aggregate a sequence of stochastic variables, such as test statistics or p-values, into fewer variables without assuming a priori defined groups. We provide different ways to evaluate the significance of the aggregated variables based on theoretical considerations and resampling techniques, and show that under certain assumptions the FWER is controlled in the strong sense. The validity of the method was demonstrated using simulations and real data analyses. Our method may be a useful supplement to standard procedures that evaluate test statistics individually. Moreover, being agnostic and not relying on predefined regions, it may be a practical alternative to conventionally used methods of aggregating p-values over regions. The method is implemented in Python and freely available online (through GitHub, see the Supplementary information).
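To illustrate the flavour of agnostic aggregation (not the paper's actual scheme), the sketch below combines adjacent p-values in non-overlapping pairs with Fisher's method, with no predefined regions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

def fisher_pairs(pvals):
    """Combine adjacent, non-overlapping pairs of p-values with Fisher's method."""
    p = np.asarray(pvals, float)
    if len(p) % 2:                 # drop a trailing odd element
        p = p[:-1]
    stat = -2.0 * np.log(p).reshape(-1, 2).sum(axis=1)
    return stats.chi2.sf(stat, df=4)   # combined p per adjacent pair

pvals = np.r_[rng.uniform(size=8), [1e-4, 5e-4]]  # "signal" sits in the last pair
print(fisher_pairs(pvals))
```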
Choi, Ki Hwan; Chung, Song Ee; Chung, Tae Young; Chung, Eui Sang
2007-04-01
To assess the efficacy of the ultrasound biomicroscopic (UBM) method in estimating the sulcus-to-sulcus horizontal diameter for Visian Implantable Contact Lens (ICL, model V4) length determination to obtain an optimal ICL vault. The results of postoperative ICL vaults in 30 eyes of 18 patients were retrospectively analyzed. In 17 eyes, ICL length was determined using the conventional method, and in 13 eyes, ICL length was determined using the UBM method. The UBM method was carried out by measuring the sulcus-to-limbus distance on each side with 50 MHz UBM and adding the white-to-white diameter measured by caliper or Orbscan. The ICL vaults were measured using the UBM method at 1 and 6 months postoperatively and the results were compared between the two groups. An ideal ICL vault was defined as a vault between 250 and 750 µm. The relation between the ICL vault, footplate location, and ICL power was also investigated. In the UBM method group, the ICL vault was within the ideal range in all 13 (100%) eyes at 1 and 6 months postoperatively, whereas in the conventional method group, 10 (58.8%) eyes showed an ideal vault at 1 month postoperatively (P = .01) and 9 (52.9%) eyes at 6 months postoperatively (P < .01). The ideal ICL footplate location in the ciliary sulcus was achieved in 11 (84.6%) eyes of the UBM method group and 10 (64.7%) eyes of the conventional method group; however, the difference between the two groups was not statistically significant. The ICL vault was not significantly affected by the ICL power. ICL length determined by the UBM method achieved a significantly more ideal ICL vault than the conventional white-to-white method. The UBM method is superior to the conventional method for predicting the sulcus-to-sulcus horizontal diameter for ICL length determination.
Statistical science: a grammar for research.
Cox, David R
2017-06-01
I greatly appreciate the invitation to give this lecture with its century-long history. The title is a warning that the lecture is rather discursive, not highly focused and technical. The theme is simple: statistical thinking provides a unifying set of general ideas and specific methods relevant whenever appreciable natural variation is present. To be most fruitful, these ideas should merge seamlessly with subject-matter considerations. By contrast, there is sometimes a temptation to regard formal statistical analysis as a ritual to be added after the serious work has been done, a ritual to satisfy convention, referees, and regulatory agencies. I want implicitly to refute that idea.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wurtz, R.; Kaplan, A.
Pulse shape discrimination (PSD) is a variety of statistical classifier. Fully realized statistical classifiers rely on a comprehensive set of tools for design, construction, and implementation, and advances in PSD rely on improvements to the implemented algorithm, which can draw on conventional statistical classifier and machine learning methods. This paper provides the reader with a glossary of classifier-building elements and their functions in a fully designed and operational classifier framework, which can be used to discover opportunities for improving PSD classifier projects. This paper recommends reporting the PSD classifier's receiver operating characteristic (ROC) curve and its behavior at a gamma rejection rate (GRR) relevant for realistic applications.
Estimating procedure times for surgeries by determining location parameters for the lognormal model.
Spangler, William E; Strum, David P; Vargas, Luis G; May, Jerrold H
2004-05-01
We present an empirical study of methods for estimating the location parameter of the lognormal distribution. Our results identify the best order statistic to use, and indicate that using the best order statistic instead of the median may lead to less frequent incorrect rejection of the lognormal model, more accurate critical value estimates, and higher goodness-of-fit. Using simulation data, we constructed and compared two models for identifying the best order statistic, one based on conventional nonlinear regression and the other using a data mining/machine learning technique. Better surgical procedure time estimates may lead to improved surgical operations.
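The abstract does not state which order statistic performs best, so the sketch below uses a generic closed-form shift estimator built from the sample minimum, median, and maximum before fitting the remaining lognormal parameters; it is illustrative only, not the paper's model.

```python
import numpy as np
from scipy import stats

def location_from_order_stats(x):
    """Closed-form shift estimate for a three-parameter lognormal from min, median, max."""
    x = np.sort(np.asarray(x, float))
    lo, med, hi = x[0], np.median(x), x[-1]
    return (lo * hi - med**2) / (lo + hi - 2.0 * med)

rng = np.random.default_rng(2)
# hypothetical procedure times: a fixed 15-minute setup plus a lognormal component
times = 15.0 + rng.lognormal(mean=3.0, sigma=0.5, size=500)

gamma_hat = location_from_order_stats(times)
shape, loc, scale = stats.lognorm.fit(times, floc=gamma_hat)  # fix the shift, fit the rest
print("shift:", round(gamma_hat, 2), "sigma:", round(shape, 3), "mu:", round(np.log(scale), 3))
```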
NASA Astrophysics Data System (ADS)
Hardy, D.; Janée, G.; Gallagher, J.; Frew, J.; Cornillon, P.
2006-12-01
The OPeNDAP Data Access Protocol (DAP) is a community standard for sharing scientific data across the Internet. Data providers using DAP have adopted a variety of metadata conventions to improve data utility, such as COARDS (1995) and CF (2003). Our results show, however, that metadata do not follow these conventions in practice. We collected metadata from over a hundred DAP servers, tens of thousands of data objects, and hundreds of collections. We found that a minority claim to adhere to a metadata convention, and a small percentage accurately adhere to their stated convention. We present descriptive statistics of our survey and highlight common traits such as well-populated attributes. Our empirical results indicate that unified search services cannot rely solely on metadata conventions. Although we encourage all providers to adopt a small subset of the CF convention for discovery purposes, we have no evidence to suggest that improved conventions would simplify the fundamental problem of heterogeneity. Large-scale discovery services must find methods for integrating incompatible metadata.
Comparison of surgically induced astigmatism in patients with horizontal rectus muscle recession
Çakmak, Harun; Kocatürk, Tolga; Dündar, Sema Oruç
2014-01-01
AIM To compare surgically induced astigmatism (SIA) following horizontal rectus muscle recession surgery between the suspension ("hang-back") recession technique and the conventional recession technique. METHODS In total, 48 eyes of 24 patients who had undergone horizontal rectus muscle recession surgery were reviewed retrospectively. The patients were divided into two groups: 12 patients were operated on with the hang-back technique (Group 1) and 12 with the conventional recession technique (Group 2). SIA was calculated at the 1st week and the 1st and 3rd months after surgery using an SIA calculator. RESULTS SIA was statistically higher in Group 1 at all postoperative follow-up visits. SIA was highest at the 1st week and decreased gradually in both groups. CONCLUSION The suspension recession technique induced much more SIA than the conventional recession technique, and this difference persisted at subsequent visits. Therefore, refractive power should be checked postoperatively to avoid refractive amblyopia. Conventional recession surgery should be the preferred method to minimize postoperative refractive changes in patients with amblyopia. PMID:25161948
Sun, Xing; Li, Xiaoyun; Chen, Cong; Song, Yang
2013-01-01
The frequent occurrence of interval-censored time-to-event data in randomized clinical trials (e.g., progression-free survival [PFS] in oncology) challenges statistical researchers in the pharmaceutical industry in various ways. These challenges exist in both trial design and data analysis. Conventional statistical methods that treat intervals as fixed points, as generally practiced in the pharmaceutical industry, sometimes yield inferior or, in extreme cases, flawed analysis results for interval-censored data. In this article, we examine the limitations of these standard methods under typical clinical trial settings and further review and compare several existing nonparametric likelihood-based methods for interval-censored data, which are more sophisticated but more robust. Trial design issues involved with interval-censored data comprise another topic explored in this article. Unlike right-censored survival data, the expected sample size or power for a trial with interval-censored data relies heavily on the parametric distribution of the baseline survival function as well as the frequency of assessments, and there can be substantial power loss if the assessments are very infrequent. Such an additional dependency contradicts many fundamental assumptions and principles in conventional survival trial designs, especially the group sequential design (e.g., the concept of information fraction). In this article, we discuss these fundamental changes and the available tools to work around their impacts. Although progression-free survival is often used as a discussion point in the article, the general conclusions are equally applicable to other interval-censored time-to-event endpoints.
Computer-Based Instruction and Health Professions Education: A Meta-Analysis of Outcomes.
ERIC Educational Resources Information Center
Cohen, Peter A.; Dacanay, Lakshmi S.
1992-01-01
The meta-analytic techniques of G. V. Glass were used to statistically integrate findings from 47 comparative studies on computer-based instruction (CBI) in health professions education. A clear majority of the studies favored CBI over conventional methods of instruction. Results show higher-order applications of computers to be especially…
Harrison, Jay M; Breeze, Matthew L; Harrigan, George G
2011-08-01
Statistical comparisons of compositional data generated on genetically modified (GM) crops and their near-isogenic conventional (non-GM) counterparts typically rely on classical significance testing. This manuscript presents an introduction to Bayesian methods for compositional analysis along with recommendations for model validation. The approach is illustrated using protein and fat data from two herbicide tolerant GM soybeans (MON87708 and MON87708×MON89788) and a conventional comparator grown in the US in 2008 and 2009. Guidelines recommended by the US Food and Drug Administration (FDA) in conducting Bayesian analyses of clinical studies on medical devices were followed. This study is the first Bayesian approach to GM and non-GM compositional comparisons. The evaluation presented here supports a conclusion that a Bayesian approach to analyzing compositional data can provide meaningful and interpretable results. We further describe the importance of method validation and approaches to model checking if Bayesian approaches to compositional data analysis are to be considered viable by scientists involved in GM research and regulation.
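As a much-reduced sketch of a Bayesian two-group comparison of an analyte such as protein, the code below assumes normal data with known variance and a vague conjugate prior, then reports the posterior probability that the GM-versus-conventional mean difference lies within an equivalence band; the paper's actual models and validation checks are far richer, and all numbers are synthetic.

```python
import numpy as np
from scipy import stats

def posterior_mean_diff(y_gm, y_conv, sigma=1.0, prior_var=1e6):
    """Posterior of the mean difference under a conjugate normal model (known variance)."""
    def post(y):
        n = len(y)
        var = 1.0 / (n / sigma**2 + 1.0 / prior_var)  # posterior variance (prior mean 0)
        return var * (y.sum() / sigma**2), var
    m1, v1 = post(np.asarray(y_gm, float))
    m2, v2 = post(np.asarray(y_conv, float))
    return m1 - m2, np.sqrt(v1 + v2)  # difference of independent normals is normal

rng = np.random.default_rng(3)
y_gm = rng.normal(37.0, 1.0, 20)    # hypothetical % protein, GM line
y_conv = rng.normal(36.6, 1.0, 20)  # hypothetical % protein, conventional comparator
d, sd = posterior_mean_diff(y_gm, y_conv)
print("P(|diff| < 1.0) =", stats.norm.cdf(1.0, d, sd) - stats.norm.cdf(-1.0, d, sd))
```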
Comparison of Olive Tipped and Conventional Steven's Cannula for Sub-Tenon Ophthalmic Anesthesia
Al-Motowa, Saeed; Ahmad, Nauman; Khandekar, Rajiv; Zahoor, Abdul
2016-01-01
PURPOSE: To compare the efficacy of the olive tipped (OT) cannula with the conventional Steven's cannula for sub-Tenon block (STB) before cataract surgery. METHODS: This prospective, randomized, double-masked study compared STB delivered to cataract surgery patients with an OT cannula or a conventional Steven's cannula (ST). Outcome variables included the akinesia and lid movement scores at 5 and 10 min. Patient perception of pain during delivery of the STB and during surgery, and surgeon satisfaction with anesthesia, were also compared between groups. P < 0.05 was considered statistically significant. RESULTS: There were sixty patients in each group. Age was not statistically different between groups (P = 0.4). The body mass index was higher in the ST group than in the OT group (P < 0.001). The akinesia score at 5 and 10 min did not differ between groups (P = 0.07 and P = 0.6, respectively). Patient perception of pain during STB and surgery was similar between groups (P = 0.1 and P = 0.06, respectively). There were six patients with mild chemosis and redness in the OT group and 15 patients in the ST group. CONCLUSION: An OT cannula is as effective as the conventional Steven's cannula for delivering STB anesthesia before cataract surgery. PMID:27994394
Zhou, Zheng; Dai, Cong; Liu, Wei-Xin
2015-01-01
TNF-α has an important role in the pathogenesis of ulcerative colitis (UC), and anti-TNF-α therapy appears beneficial in its treatment. The aim was to assess the effectiveness of infliximab and adalimumab in UC compared with conventional therapy. The PubMed and Embase databases were searched for studies investigating the efficacy of infliximab and adalimumab in UC. Infliximab had a statistically significant effect on induction of clinical response (RR = 1.67; 95% CI 1.12 to 2.50) in UC compared with conventional therapy, but not on clinical remission (RR = 1.63; 95% CI 0.84 to 3.18) or reduction of the colectomy rate (RR = 0.54; 95% CI 0.26 to 1.12). Adalimumab had statistically significant effects on induction of clinical remission (RR = 1.82; 95% CI 1.24 to 2.67) and clinical response (RR = 1.36; 95% CI 1.13 to 1.64) in UC compared with conventional therapy. Our meta-analyses therefore suggest that, compared with conventional therapy, infliximab significantly improves induction of clinical response, while adalimumab significantly improves induction of both clinical remission and clinical response in UC.
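For orientation, a sketch of the standard fixed-effect (inverse-variance) pooling on the log scale that underlies RR summaries like those quoted above; the study-level numbers below are hypothetical, not the meta-analysis data.

```python
import numpy as np

def pooled_rr(rrs, ci_los, ci_his):
    """Fixed-effect pooled relative risk from study RRs and their 95% CIs."""
    log_rr = np.log(rrs)
    se = (np.log(ci_his) - np.log(ci_los)) / (2 * 1.96)  # back out SE from the 95% CI
    w = 1.0 / se**2                                      # inverse-variance weights
    est = np.sum(w * log_rr) / w.sum()
    se_pool = 1.0 / np.sqrt(w.sum())
    return np.exp(est), np.exp(est - 1.96 * se_pool), np.exp(est + 1.96 * se_pool)

# three hypothetical trials
print(pooled_rr(rrs=[1.5, 1.9, 1.6], ci_los=[1.0, 1.1, 0.9], ci_his=[2.3, 3.3, 2.8]))
```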
Ultrasonography in determining pubertal growth and bone age.
Torenek Ağırman, Kubra; Bilge, Osman Murat; Miloğlu, Özkan
2018-05-02
The purpose of this study was to evaluate the agreement of ultrasonographic data with hand-wrist radiographs taken to determine pubertal growth and bone age, and to investigate whether ionizing-radiation-free ultrasonography can be used instead of conventional radiography. In this study, a total of 120 children aged 10 to 17 years (mean age 168 ± 27.5 months) underwent routine radiography before orthodontic treatment, and ultrasonographic imaging of the wrists was performed on the same day. The phalanges, sesamoid bone, and distal radial epiphysis-diaphysis were examined comparatively in each patient with both imaging methods and evaluated statistically. There was no statistically significant difference between conventional radiography and ultrasonography values at 13 points (p > 0.05); PP1 (proximal phalanx of the first finger), PP2 (proximal phalanx of the second finger), and the radial epiphysis showed statistically significant differences (p < 0.05). The CBA (bone age obtained from conventional radiographs) of the females was larger than their CA (chronological age) and their UBA (ultrasonographic bone age). For males, the means of the CBA, UBA, and CA values were close to each other. In both females and males, there was a strong correlation between the CA, the UBA, and the CBA (p < 0.01). Ultrasonography gives detailed information about epiphyseal-diaphyseal relations. It can be used as an alternative to conventional radiography in the determination of bone age and pubertal growth, owing to the absence of ionizing radiation.
Lee, Jong Hwa; Kim, Sang Beom; Lee, Kyeong Woo; Lee, Sook Joung; Park, Hyuntae; Kim, Dong Won
2017-09-01
The use of whole-body vibration (WBV) therapy has recently been applied and investigated as a rehabilitation method for subacute stroke patients. To evaluate the effects of WBV therapy on the recovery of balance in subacute stroke patients who were unable to attain sitting balance. The conventional rehabilitation group (CG) received conventional physical therapy, including sitting balance training by a physical therapist, for 30 min per session, twice a day, five days a week, for two weeks. The whole-body vibration group (VG) received one daily session of conventional physical therapy and, in place of the second session, WBV therapy for 30 min a day, five days a week, for two weeks. Fifteen patients in the CG and 15 patients in the VG completed the two-week therapy. After the two weeks, both groups showed functional improvement. Patients in the VG improved in functional ambulation category, Berg balance scale, and trunk impairment scale scores. However, no statistically significant correlations between therapeutic method and outcome were observed in either group. Our results suggest that WBV therapy improved balance recovery in subacute stroke patients. Because WBV therapy was as effective as conventional physical therapy, it can be considered a clinical method to improve the sitting balance of subacute stroke patients.
Srivastava, Rajeshwar N; Dwivedi, Mukesh K; Bhagat, Amit K; Raj, Saloni; Agarwal, Rajiv; Chandra, Abhijit
2016-06-01
The conventional methods of treatment of pressure ulcers (PUs) by serial debridement and daily dressings require prolonged hospitalisation, associated with considerable morbidity. There is, however, recent evidence to suggest that negative pressure wound therapy (NPWT) accelerates healing. The commercial devices for NPWT are costly, cumbersome, and electricity dependent. We compared PU wound healing in traumatic paraplegia patients by conventional dressing and by an innovative negative pressure device (NPD). In this prospective, non-randomised trial, 48 traumatic paraplegia patients with PUs of stages 3 and 4 were recruited. Patients were divided into two groups: group A (n = 24) received NPWT with our NPD, and group B (n = 24) received conventional methods of dressing. All patients were followed up for 9 weeks. At week 9, all patients on NPD showed a statistically significant improvement in PU healing in terms of slough clearance, granulation tissue formation, wound discharge and culture. A significant reduction in wound size and ulcer depth was observed in NPD as compared with conventional methods at all follow-up time points (P = 0·0001). NPWT by the innovative device heals PUs at a significantly higher rate than conventional treatment. The device is safe, easy to apply and cost-effective.
PROBABILITY SAMPLING AND POPULATION INFERENCE IN MONITORING PROGRAMS
A fundamental difference between probability sampling and conventional statistics is that "sampling" deals with real, tangible populations, whereas "conventional statistics" usually deals with hypothetical populations that have no real-world realization. The focus here is on real ...
A dose-response model for the conventional phototherapy of the newborn.
Osaku, Nelson Ossamu; Lopes, Heitor Silvério
2006-06-01
Jaundice of the newborn is a common problem resulting from the rapid increase of blood bilirubin in the first days of life. In most cases it is considered a transient physiological condition, but unmanaged hyperbilirubinemia can lead to death or serious injury in survivors. For decades, phototherapy has been the main method for prevention and treatment of hyperbilirubinemia of the newborn. This work aims at finding a predictive model for the decrement of blood bilirubin in patients undergoing conventional phototherapy. Data from the phototherapy of 90 term newborns were collected and used in a multiple regression method, and a rigorous statistical analysis was done to ensure a correct and valid model. The obtained model was able to explain 78% of the variation of the dependent variable. We show that it is possible to predict the total serum bilirubin of a patient under conventional phototherapy from birth weight, bilirubin level at the beginning of treatment, and radiant energy density (dose). Moreover, it is possible to infer the time necessary for a given decrement of bilirubin under approximately constant irradiance. Statistical analysis of the obtained model shows that it is valid over several ranges of birth weight, initial bilirubin level, and radiant energy density. It is expected that the proposed model will be useful in the clinical management of hyperbilirubinemia of the newborn.
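A hedged sketch of the kind of model described: an ordinary least-squares regression of the bilirubin decrement on birth weight, initial bilirubin, and radiant energy density. The coefficients and data are synthetic; the paper's fitted model is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 90
weight = rng.normal(3200, 400, n)        # birth weight, g (synthetic)
bili0 = rng.normal(15.0, 2.5, n)         # bilirubin at start of treatment, mg/dL (synthetic)
dose = rng.uniform(5, 40, n)             # radiant energy density, arbitrary units (synthetic)
decrement = 0.001 * weight + 0.25 * bili0 + 0.05 * dose + rng.normal(0, 0.8, n)

# design matrix with intercept, fitted by ordinary least squares
X = np.column_stack([np.ones(n), weight, bili0, dose])
beta, *_ = np.linalg.lstsq(X, decrement, rcond=None)

resid = decrement - X @ beta
r2 = 1 - resid.var() / decrement.var()   # fraction of variation explained
print("coefficients:", beta.round(4), "R^2:", round(r2, 3))
```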
Singh, Sunint; Palaskar, Jayant N.; Mittal, Sanjeev
2013-01-01
Background: Conventional heat-cure polymethyl methacrylate (PMMA) is the most commonly used denture base resin despite some shortcomings, lengthy polymerization time being one of them; microwave curing was recommended to overcome this, but the unavailability of specially designed microwavable acrylic resin made it unpopular. Therefore, in this study, conventional heat-cure PMMA was polymerized by microwave energy. Aim and Objectives: This study was designed to evaluate the surface porosities in PMMA cured by conventional water bath and by microwave energy, and to compare them with a microwavable acrylic resin cured by microwave energy. Materials and Methods: Wax samples were obtained by pouring molten wax into a metal mold of 25 mm × 12 mm × 3 mm. The samples were divided into three groups: C, CM, and M. Group C denotes conventional heat-cure PMMA cured by the water bath method, CM denotes conventional heat-cure PMMA cured by microwave energy, and M denotes a specially designed microwavable acrylic denture base resin cured by microwave energy. After polymerization, each sample was scanned in three pre-marked areas for surface porosities using an optical microscope. As per the available literature, this instrument is being used for the first time to measure porosity in acrylic resin; it is a reliable method of measuring the area of surface pores. The scanned portion of the sample is displayed on the computer, the area of each pore was measured with the software, and the data were analyzed. Results: Conventional heat-cure PMMA samples cured by microwave energy showed more porosities than the samples cured by the conventional water bath method and the microwavable acrylic resin cured by microwave energy. The higher percentage of porosities was statistically significant but well within a clinically acceptable range. Conclusion: Within the limitations of this in-vitro study, conventional heat-cure PMMA can be cured by microwave energy without compromising properties such as surface porosity. PMID:24015000
Przylipiak, Andrzej Feliks; Galicka, Elżbieta; Donejko, Magdalena; Niczyporuk, Marek; Przylipiak, Jerzy
2013-01-01
Background Liposuction is a type of aesthetic surgery that has been performed on humans for decades. There is not much literature addressing pre- and post-surgery blood parameters, although this information is of considerable interest. Documentation on patients who received laser-assisted liposuction is particularly scarce. Until now, there has been no literature reporting values of platelets, lymphocytes, and neutrophils after liposuction. Purpose The aim of this work is to analyze and interpret values of platelets, lymphocytes, and neutrophils in patient blood before and after liposuction, a surgery in which an extraordinarily large amount of potent drugs is used, and to compare the changes in these values between conventional and laser-assisted liposuction patients. Material and methods We evaluated standard blood samples from patients prior to and after liposuction. This paper covers the numbers of platelets, lymphocytes, and neutrophils. A total of 54 patients were examined. We also compared the change in postoperative values in laser-assisted liposuction patients with the change in conventional liposuction patients. A paired two-sided Student's t-test was used for statistical evaluation; P < 0.005 was considered statistically significant. Results Platelet values were raised in both conventional and laser-assisted liposuction patients, but the difference was statistically non-significant and platelet levels remained within the normal range for healthy patients. Values of neutrophils rose by up to 79.49% ± 7.74% standard deviation (SD) and values of lymphocytes dropped by up to 12.68% ± 5.61% SD. The before/after changes in conventional tumescent local anesthesia liposuction and in laser-assisted liposuction were similar for all measured parameters and showed no statistically significant differences. The mean total operation time without laser assistance was 3 hours 42 minutes (±57 minutes SD, range 2 hours 50 minutes to 5 hours 10 minutes). Surgeries with laser assistance were on average 16 minutes shorter, with a mean duration of 3 hours 26 minutes (±45 minutes SD, range 2 hours 40 minutes to 4 hours 10 minutes); the difference was not statistically significant (P < 0.06). The mean aspirate volume for liposuctions performed without laser support was 2,618 mL (±633.7 SD, range 700 mL to 3,500 mL). The mean aspirate volume for liposuctions with laser assistance was greater by up to 61 mL (2,677 mL ± 499.5 SD, range 1,800 mL to 3,500 mL); the difference was not statistically significant (P < 0.71). Conclusion We conclude that conventional liposuction and laser-assisted liposuction have a similar influence on platelets, lymphocytes, and neutrophils. Moreover, laser-assisted liposuction appears to be less time-consuming than conventional liposuction. PMID:24143076
Shivakumarswamy, Udasimath; Arakeri, Surekha U; Karigowdar, Mahesh H; Yelikar, Br
2012-01-01
The cytological examination of serous effusions is well accepted, and a positive diagnosis is often considered a definitive diagnosis. It helps in staging, prognosis, and management of patients with malignancies and also gives information about various inflammatory and non-inflammatory lesions. Diagnostic problems arise in everyday practice in differentiating reactive atypical mesothelial cells from malignant cells with the routine conventional smear (CS) method. To compare the morphological features of the CS method with those of the cell block (CB) method and to assess the utility and sensitivity of the CB method in the cytodiagnosis of pleural effusions. The study was conducted in the cytology section of the Department of Pathology. Sixty pleural fluid samples were subjected to diagnostic evaluation over a period of 20 months. Along with the conventional smears, cell blocks were prepared using 10% alcohol-formalin as a fixative agent. Statistical analysis with the z test was performed to compare cellularity between the CS and CB methods, and McNemar's χ² test was used to identify the additional yield for malignancy by the CB method. Cellularity and the additional yield for malignancy were 15% higher with the CB method. The CB method provides higher cellularity, better architectural patterns and morphological features, and an additional yield of malignant cells, thereby increasing the sensitivity of cytodiagnosis compared with the CS method.
Allen, R W; Harnsberger, H R; Shelton, C; King, B; Bell, D A; Miller, R; Parkin, J L; Apfelbaum, R I; Parker, D
1996-08-01
To determine whether unenhanced high-resolution T2-weighted fast spin-echo MR imaging provides an acceptable and less expensive alternative to contrast-enhanced conventional T1-weighted spin-echo MR techniques in the diagnosis of acoustic schwannoma. We reviewed in a blinded fashion the records of 25 patients with pathologically documented acoustic schwannoma and of 25 control subjects, all of whom had undergone both enhanced conventional spin-echo MR imaging and unenhanced fast spin-echo MR imaging of the cerebellopontine angle/internal auditory canal region. The patients were imaged with a quadrature head receiver coil for the conventional spin-echo sequences and dual 3-inch phased-array receiver coils for the fast spin-echo sequences. The size of the acoustic schwannomas ranged from 2 to 40 mm in maximum dimension; the mean maximum diameter was 12 mm, and 12 neoplasms were less than 10 mm in diameter. Acoustic schwannoma was correctly diagnosed on 98% of the fast spin-echo images and on 100% of the enhanced conventional spin-echo images. Statistical analysis of the data using the kappa coefficient demonstrated agreement beyond chance between the two imaging techniques for the diagnosis of acoustic schwannoma. There is no statistically significant difference in the sensitivity and specificity of unenhanced high-resolution fast spin-echo imaging and enhanced T1-weighted conventional spin-echo imaging in the detection of acoustic schwannoma. We believe that the unenhanced high-resolution fast spin-echo technique provides a cost-effective method for the diagnosis of acoustic schwannoma.
NASA Technical Reports Server (NTRS)
Talpe, Matthieu J.; Nerem, R. Steven; Forootan, Ehsan; Schmidt, Michael; Lemoine, Frank G.; Enderlin, Ellyn M.; Landerer, Felix W.
2017-01-01
We construct long-term time series of Greenland and Antarctic ice sheet mass change from satellite gravity measurements. A statistical reconstruction approach is developed based on a principal component analysis (PCA) to combine high-resolution spatial modes from the Gravity Recovery and Climate Experiment (GRACE) mission with the gravity information from conventional satellite tracking data. Uncertainties of this reconstruction are rigorously assessed; they include temporal limitations for short GRACE measurements, spatial limitations for the low-resolution conventional tracking data measurements, and limitations of the estimated statistical relationships between low- and high-degree potential coefficients reflected in the PCA modes. Trends of mass variations in Greenland and Antarctica are assessed against a number of previous studies. The resulting time series for Greenland show a higher rate of mass loss than other methods before 2000, while the Antarctic ice sheet appears heavily influenced by interannual variations.
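A very reduced sketch of the reconstruction idea: learn spatial modes (EOFs) from the short high-resolution record, regress the long low-resolution record onto the leading mode amplitudes, and use that regression to extend the high-resolution fields backwards in time. All arrays are synthetic stand-ins for GRACE and conventional-tracking data.

```python
import numpy as np

rng = np.random.default_rng(5)
t_short, t_long, n_grid = 120, 480, 500
modes_true = rng.normal(size=(3, n_grid))
amps = rng.normal(size=(t_long, 3)).cumsum(0)              # slowly varying amplitudes
field = amps @ modes_true + 0.1 * rng.normal(size=(t_long, n_grid))

high_res = field[-t_short:]                                # "GRACE"-era fields only
low_res = field @ rng.normal(size=(n_grid, 20)) / n_grid   # coarse projection, full era

# leading spatial modes (EOFs) from the high-resolution era
U, s, Vt = np.linalg.svd(high_res - high_res.mean(0), full_matrices=False)
eofs = Vt[:3]

# regress low-res observations onto mode amplitudes during the overlap period
amps_short = (high_res - high_res.mean(0)) @ eofs.T
G, *_ = np.linalg.lstsq(low_res[-t_short:], amps_short, rcond=None)

# predict amplitudes for all epochs, then rebuild high-resolution fields
recon = low_res @ G @ eofs + high_res.mean(0)
print(recon.shape)  # (480, 500): reconstructed fields over the full era
```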
Guoxin, Hu; Ying, Yang; Yuemei, Jiang; Wenjing, Xia
2017-04-01
This study evaluated antagonist wear and the friction and wear properties of a dental zirconia ceramic subjected to microwave and conventional sintering methods. Ten specimens were fabricated from Lava brand zirconia and randomly assigned to microwave and conventional sintering groups. A surface roughness profile tester was used to measure the roughness of the specimens. A wear test was performed with steatite ceramic as the antagonist. Friction coefficient curves were recorded and wear volumes were calculated. Finally, optical microscopy was used to observe the surface morphology of the zirconia and steatite ceramics, and field emission scanning electron microscopy was used to observe the microstructure of the zirconia. Wear volumes of microwave and conventionally sintered zirconia were (6.940±1.382)×10⁻² and (7.952±1.815)×10⁻² mm³, respectively, and the corresponding wear volumes of the antagonist were (14.189±4.745)×10⁻² and (15.813±3.481)×10⁻² mm³. No statistically significant difference was observed in the wear resistance of the zirconia or the wear volume of the steatite ceramic between the two sintering methods. Optical microscopy showed apparent ploughing of the zirconia surfaces; the worn surface of the steatite ceramic antagonist showed crazing accompanied by ploughing. Scanning electron microscopy showed that the zirconia was sintered compactly by both conventional and microwave methods, whereas the grains of the microwave-sintered zirconia were smaller and more uniform. The two sintering methods produced dental zirconia ceramics with similar friction and wear properties.
Hinged Capsulotomy – Does it Decrease Floaters After Yttrium Aluminum Garnet Laser Capsulotomy?
Alipour, Fatemeh; Jabbarvand, Mahmoud; Hashemian, Hesam; Hosseini, Simindokht; Khodaparast, Mehdi
2015-01-01
Objectives: The objective was to compare conventional circular yttrium aluminum garnet (YAG) laser capsulotomy with hinged capsulotomy to manage posterior capsular opacification (PCO). Materials and Methods: This prospective, randomized clinical trial enrolled pseudophakic patients with visually significant posterior capsule opacification. Patients were randomized to undergo posterior YAG laser capsulotomy with either conventional circular technique or a new technique with an inferior hinge. At 1-month postoperatively, patients were asked if they had any annoying floaters and the responses were compared between groups. P < 0.05 was considered statistically significant. Results: A total of 83 patients were enrolled. Forty-three patients underwent hinged posterior YAG capsulotomy and 40 patients underwent routine circular capsulotomy. At 1-month postoperatively, there was a statistically significant decrease in annoying floaters in the group that underwent circular capsulotomy (P = 0.02). There was no statistically significant association in the total energy delivered (P = 0.4) or the number of spots (P = 0.2) and patient perception of annoying floaters. Conclusion: Hinged YAG capsulotomy was effective at decreasing the rate of floaters in patients with PCO. PMID:26180476
Correlative weighted stacking for seismic data in the wavelet domain
Zhang, S.; Xu, Y.; Xia, J.; ,
2004-01-01
Horizontal stacking plays a crucial role in modern seismic data processing: it not only compresses random noise and multiple reflections but also provides the foundational data for subsequent migration and inversion. However, a number of examples have shown that random noise in adjacent traces exhibits correlation and coherence. Both average stacking and weighted stacking based on the conventional correlation function produce false events caused by such noise. Wavelet transforms and higher-order statistics are very useful tools in modern signal processing: multiresolution analysis in wavelet theory can decompose a signal on different scales, and higher-order correlation functions can suppress correlated noise against which the conventional correlation function is powerless. Based on the theory of the wavelet transform and higher-order statistics, a high-order correlative weighted stacking (HOCWS) technique is presented in this paper. Its essence is to stack common-midpoint gathers after normal moveout correction using weights calculated from higher-order correlative statistics in the wavelet domain. Synthetic examples demonstrate its advantages in improving the signal-to-noise (S/N) ratio and suppressing correlated random noise.
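A simplified sketch of correlative weighted stacking, computed in the time domain for brevity (the paper works per scale in the wavelet domain): each NMO-corrected trace is weighted by a fourth-order cross-moment with a robust median pilot trace, so traces dominated by noise are down-weighted. The weight definition here is an illustrative stand-in, not the paper's statistic.

```python
import numpy as np

def hocws_stack(gather):
    """gather: (n_traces, n_samples) NMO-corrected CMP gather."""
    pilot = np.median(gather, axis=0)               # robust pilot trace
    p = (pilot - pilot.mean()) / pilot.std()
    weights = np.empty(len(gather))
    for i, tr in enumerate(gather):
        z = (tr - tr.mean()) / tr.std()
        weights[i] = max(np.mean(z**3 * p), 0.0)    # fourth-order cross-moment weight (illustrative)
    total = weights.sum()
    weights = weights / total if total > 0 else np.full(len(gather), 1.0 / len(gather))
    return weights @ gather                          # weighted stack

rng = np.random.default_rng(6)
x = np.linspace(0.0, 4.0, 400)
signal = np.sin(2 * np.pi * x) * np.exp(-x)          # synthetic reflection wavelet train
gather = signal + 0.5 * rng.normal(size=(24, 400))
print(round(np.corrcoef(hocws_stack(gather), signal)[0, 1], 3))
```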
A statistical method for the conservative adjustment of false discovery rate (q-value).
Lai, Yinglei
2017-03-14
The q-value is a widely used statistical method for estimating the false discovery rate (FDR), a conventional significance measure in the analysis of genome-wide expression data. The q-value is a random variable and may underestimate the FDR in practice. An underestimated FDR can lead to unexpected false discoveries in follow-up validation experiments. This issue has not been well addressed in the literature, especially in situations where a permutation procedure is necessary for p-value calculation. We propose a statistical method for the conservative adjustment of the q-value. In practice, it is usually necessary to calculate the p-value by a permutation procedure, and this is also accounted for in our adjustment method. We used simulated data as well as experimental microarray and sequencing data to illustrate the usefulness of our method. The conservativeness of our approach has been mathematically confirmed in this study. We have demonstrated the importance of the conservative adjustment of the q-value, particularly when the proportion of differentially expressed genes is small or the overall differential expression signal is weak.
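For orientation, a standard Benjamini-Hochberg style q-value computation is sketched below; the paper's conservative adjustment on top of such estimates, and its permutation-based p-values, are not reproduced.

```python
import numpy as np

def bh_qvalues(pvals):
    """Benjamini-Hochberg q-values: smallest FDR at which each test is called significant."""
    p = np.asarray(pvals, float)
    n = len(p)
    order = np.argsort(p)
    q = p[order] * n / np.arange(1, n + 1)
    q = np.minimum.accumulate(q[::-1])[::-1]   # enforce monotonicity from the largest p down
    out = np.empty(n)
    out[order] = np.minimum(q, 1.0)
    return out

print(bh_qvalues([0.001, 0.009, 0.04, 0.2, 0.9]))
```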
Video-based teleradiology for intraosseous lesions. A receiver operating characteristic analysis.
Tyndall, D A; Boyd, K S; Matteson, S R; Dove, S B
1995-11-01
Private dental practitioners may lack immediate access to off-site expert diagnostic consultants regarding unusual radiographic findings or radiographic quality assurance issues. Teleradiology, a system for transmitting radiographic images, offers a potential solution to this problem. Although much research has been done to evaluate the feasibility and utilization of teleradiology systems in medical imaging, little research on dental applications has been performed. In this investigation 47 panoramic films, with an equal distribution of images with intraosseous jaw lesions and no disease, were viewed by a panel of observers using teleradiology and conventional viewing methods. The teleradiology system consisted of an analog video-based system simulating remote radiographic consultation between a general dentist and a dental imaging specialist; conventional viewing consisted of traditional viewbox methods. Observers were asked to identify the presence or absence of 24 intraosseous lesions and to determine their locations. No statistically significant differences between modalities or among observers were identified at the 0.05 level. The results indicate that viewing intraosseous lesions on video-based panoramic images is equivalent to conventional viewbox viewing.
Mirzakouchaki, Behnam; Sharghi, Reza; Shirazi, Samaneh; Moghimi, Mahsan; Shahrbaf, Shirin
2016-01-01
Background Different in-vitro studies have reported various results regarding the shear bond strength (SBS) of orthodontic brackets when the SEP technique is compared with the conventional system. This in-vivo study was designed to compare the effect of conventional acid-etching and self-etching primer (SEP) adhesive systems on the SBS and debonding characteristics of metal and ceramic orthodontic brackets. Material and Methods 120 intact maxillary and mandibular first premolars of 30 orthodontic patients were selected and bonded with metal and ceramic brackets using the conventional acid-etch or self-etch primer system. The bonded brackets were incorporated into the archwire during the study period to simulate real orthodontic treatment conditions. The teeth were extracted and debonded after 30 days. The SBS, debonding characteristics, and adhesive remnant indices (ARI) were determined in all groups. Results The mean SBS of metal brackets was 10.63±1.42 MPa with the conventional system and 9.38±1.53 MPa with the SEP system (P=0.004). No statistically significant difference was noted between the conventional and SEP systems for ceramic brackets. ARI scores of 1, 2, and 3 and debonding within the adhesive were the most common among all groups. No statistically significant difference was observed in ARI or failure mode of debonded specimens between the different brackets or bonding systems. Conclusions The SBS of metal brackets bonded using the conventional system was significantly higher than with the SEP system, although the SBS of the SEP system was clinically acceptable. No significant difference was found between the conventional and SEP systems used with ceramic brackets. The total SBS of metal brackets was significantly higher than that of ceramic brackets. Given the adequate SBS of the SEP system in bonding metal brackets, it can be used as an alternative to the conventional system. Key words: Shear bond strength, Orthodontic brackets, Adhesive remnant index, self-etch. PMID:26855704
Vojdani, M; Torabi, K; Farjood, E; Khaledi, AAR
2013-01-01
Statement of Problem: Metal-ceramic crowns are the most commonly used complete-coverage restorations in daily clinical use. Disadvantages of conventional hand-made wax patterns have prompted alternative approaches based on CAD/CAM technologies. Purpose: This study compares the marginal and internal fit of copings cast from CAD/CAM-fabricated and conventionally fabricated wax patterns. Materials and Method: Twenty-four standardized brass dies were prepared and randomly divided into two groups according to the wax-pattern fabrication method (CAD/CAM technique and conventional method) (n=12). All the wax patterns were fabricated in a standard fashion with respect to contour, thickness, and internal relief (M1-M12: CAD/CAM group; C1-C12: conventional group). A CAD/CAM milling machine (Cori TEC 340i; imes-icore GmbH, Eiterfeld, Germany) was used to fabricate the CAD/CAM group wax patterns. The copings cast from the 24 wax patterns were cemented to the corresponding dies. For all coping-die assemblies, a cross-sectional technique was used to evaluate the marginal and internal fit at 15 points. The Student's t-test was used for statistical analysis (α=0.05). Results: The overall mean (SD) absolute marginal discrepancy (AMD) was 254.46 (25.10) µm for the CAD/CAM group and 88.08 (10.67) µm for the conventional (control) group. The overall mean (SD) total internal gap (IGT) was 110.77 (5.92) µm for the CAD/CAM group and 76.90 (10.17) µm for the conventional group. The Student's t-test revealed significant differences between the two groups: marginal and internal gaps were significantly higher at all measured areas in the CAD/CAM group than in the conventional group (p< 0.001). Conclusion: Within the limitations of this study, the conventional method of wax-pattern fabrication produced copings with significantly better marginal and internal fit than the CAD/CAM (machine-milled) technique. All factors were standardized between the two groups except the wax-pattern fabrication technique; therefore, only the conventional group resulted in copings with clinically acceptable margins of less than 120 µm. PMID:24724133
An indirect approach to the extensive calculation of relationship coefficients
Colleau, Jean-Jacques
2002-01-01
A method was described for calculating population statistics on relationship coefficients without using corresponding individual data. It relied on the structure of the inverse of the numerator relationship matrix between individuals under investigation and ancestors. Computation times were observed on simulated populations and were compared to those incurred with a conventional direct approach. The indirect approach turned out to be very efficient for multiplying the relationship matrix corresponding to planned matings (full design) by any vector. Efficiency was generally still good or very good for calculating statistics on these simulated populations. An extreme implementation of the method is the calculation of inbreeding coefficients themselves. Relative performances of the indirect method were good except when many full-sibs during many generations existed in the population. PMID:12270102
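A small sketch of the indirect idea: the inverse numerator relationship matrix A⁻¹ is sparse and cheap to build from a pedigree, so products A @ v can be obtained by solving the sparse system A⁻¹ x = v rather than forming the dense A. The 3x3 pedigree below (two unrelated, non-inbred parents and one offspring) is illustrative.

```python
import numpy as np
from scipy.sparse import csc_matrix
from scipy.sparse.linalg import spsolve

# A^-1 for parents 1 and 2 (unrelated, non-inbred) and their offspring 3,
# built from Henderson's rules
A_inv = csc_matrix(np.array([[ 1.5,  0.5, -1.0],
                             [ 0.5,  1.5, -1.0],
                             [-1.0, -1.0,  2.0]]))

v = np.array([1.0, 0.0, 0.0])
x = spsolve(A_inv, v)   # x = A @ v, without ever forming the dense A
print(x)                # first column of A: [1, 0, 0.5]
```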
A New Method for Extubation: Comparison between Conventional and New Methods.
Yousefshahi, Fardin; Barkhordari, Khosro; Movafegh, Ali; Tavakoli, Vida; Paknejad, Omalbanin; Bina, Payvand; Yousefshahi, Hadi; Sheikh Fathollahi, Mahmood
2012-08-01
Extubation is associated with the risk of complications such as accumulation of secretions above the endotracheal tube cuff, eventual atelectasis following a reduction in pulmonary volumes due to the lack of physiological positive end-expiratory pressure, and intratracheal suction. In order to reduce these complications, and based on basic physiological principles, a new practical extubation method is presented in this article. The study was designed as a six-month prospective cross-sectional clinical trial. Two hundred fifty-seven patients undergoing coronary artery bypass grafting (CABG) were divided into two groups based on their scheduled surgery time. The first group underwent the conventional extubation method, while the other group was extubated according to the newly described method. Arterial blood gas (ABG) analysis results before and after extubation were compared between the two groups to determine the effect of the extubation method on ABG parameters and the oxygenation profile. At all time intervals, the ratio of the partial pressure of oxygen in arterial blood to the fraction of inspired oxygen (PaO2/FiO2) was better in the new-method group than in the conventional-method group; the difference in PaO2/FiO2 four hours after extubation was statistically significant (p = 0.0063). The new extubation method improved some respiratory parameters, thereby attenuating oxygenation complications and enhancing oxygenation after extubation.
Jadhav, Vivek Dattatray; Motwani, Bhagwan K.; Shinde, Jitendra; Adhapure, Prasad
2017-01-01
Aims: The aim of this study was to evaluate the marginal fit and surface roughness of complete cast crowns made by a conventional and an accelerated casting technique. Settings and Design: The study was divided into three parts. In Part I, the vertical marginal fit of full metal crowns made by both casting techniques was checked; in Part II, the horizontal fit of sectional metal crowns made by both casting techniques was checked; and in Part III, the surface roughness of disc-shaped metal plate specimens made by both casting techniques was checked. Materials and Methods: A conventional technique was compared with an accelerated technique for each of the three parts described above. Statistical Analysis Used: The results of the independent-samples t-test did not indicate statistically significant differences in the marginal discrepancy detected between the two casting techniques. Results: For marginal discrepancy and surface roughness, crowns fabricated with the accelerated technique were significantly different from those fabricated with the conventional technique. Conclusions: The accelerated casting technique showed quite satisfactory results, but the conventional technique was superior in terms of marginal fit and surface roughness. PMID:29042726
Jabbari, Hamidreza; Fakhri, Mohammad; Lotfaliani, Mojtaba; Kiani, Arda
2013-01-01
It has been suggested that hot electrocoagulation-enabled forceps (hot biopsy) may reduce the risk of hemorrhage after biopsy of endobronchial tumors. The main concern with this method is a possible reduction in specimen quality. The aim was to compare procedure-related hemorrhage and the diagnostic quality of the specimens obtained with hot biopsy versus conventional forceps biopsy. In this prospective study, assessment of the biopsy samples and quantification of hemorrhage were done in a blinded fashion. First, a definite clinical diagnosis was made for each patient based on pathologic examination of all available samples, clinical data, and imaging findings. A second pathologist then reviewed all samples to evaluate their quality. A total of 36 patients with endobronchial lesions were included in this study. A definite diagnosis was made in 83% of the patients. The diagnostic yields of the two methods were not statistically different, while the mean hemorrhage grades of all hot biopsy protocols were significantly lower than that of conventional biopsy (p=0.003, p<0.001 and p<0.001 for the 10, 20 and 40 voltage settings, respectively). No significant difference was detected between the quality of specimens obtained by the hot biopsy methods and conventional biopsy (p>0.05 for all three voltages). Hot biopsy can be a valuable alternative to conventional forceps biopsy in the evaluation of endobronchial lesions.
P Waiker, Veena; Shivalingappa, Shanthakumar
2015-01-01
Platelet-rich plasma is known for its hemostatic, adhesive and healing properties owing to the multiple growth factors released from the platelets at the wound site. The primary objective of this study was to use autologous platelet-rich plasma (PRP) in wound beds for anchorage of skin grafts instead of conventional methods such as sutures, staples or glue. In a single-center randomized controlled prospective study of nine months' duration, 200 patients with wounds were divided into two equal groups. Autologous PRP was applied to the wound beds in the PRP group, while conventional methods such as staples/sutures were used to anchor the skin grafts in the control group. Instant graft adherence to the wound bed was statistically significantly better in the PRP group. Time to first post-graft inspection could be delayed, and hematoma, graft edema, discharge from the graft site, frequency of dressings and duration of stay in the plastic surgery unit were all significantly lower in the PRP group. Autologous PRP ensured instant skin graft adherence to the wound bed in comparison with conventional methods of anchorage. Hence, we recommend the routine use of autologous PRP on wounds prior to resurfacing to ensure the benefits of early healing.
Roos, Malgorzata; Stawarczyk, Bogna
2012-07-01
This study evaluated and compared Weibull parameters of resin bond strength values using six different general-purpose statistical software packages for the two-parameter Weibull distribution. Two hundred human teeth were randomly divided into 4 groups (n=50), prepared and bonded on dentin according to the manufacturers' instructions using the following resin cements: (i) Variolink (VAN, conventional resin cement), (ii) Panavia21 (PAN, conventional resin cement), (iii) RelyX Unicem (RXU, self-adhesive resin cement) and (iv) G-Cem (GCM, self-adhesive resin cement). Subsequently, all specimens were stored in water for 24 h at 37°C. Shear bond strength was measured and the data were analyzed using the Anderson-Darling goodness-of-fit test (MINITAB 16) and two-parameter Weibull statistics with the following statistical software packages: Excel 2011, SPSS 19, MINITAB 16, R 2.12.1, SAS 9.1.3 and STATA 11.2 (p≤0.05). Additionally, the three-parameter Weibull distribution was fitted using MINITAB 16. Two-parameter Weibull estimates calculated with MINITAB and STATA can be compared using an omnibus test and 95% CIs. In SAS, only 95% CIs were directly obtained from the output. R provided no estimates of 95% CIs. In both SAS and R, the global comparison of the characteristic bond strength among groups is provided by means of Weibull regression. Excel and SPSS provided no default information about 95% CIs and no significance test for the comparison of Weibull parameters among the groups. In summary, the conventional resin cement VAN showed the highest Weibull modulus and characteristic bond strength. There are discrepancies in the Weibull statistics depending on the software package and the estimation method, and the information content of the default output provided by the packages differs to a very high extent. Copyright © 2012 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
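For readers without any of the six packages, a two-parameter Weibull fit of the kind compared above can be reproduced with scipy; a minimal sketch on simulated (not the study's) bond-strength data, with a bootstrap CI standing in for the package-specific interval methods:

    # Two-parameter Weibull fit: shape m (Weibull modulus), scale (characteristic strength)
    import numpy as np
    from scipy.stats import weibull_min

    rng = np.random.default_rng(0)
    strengths = weibull_min.rvs(c=8.0, scale=20.0, size=50, random_state=rng)

    m, loc, sigma0 = weibull_min.fit(strengths, floc=0)  # floc=0 -> two-parameter form
    print(f"Weibull modulus m = {m:.2f}, characteristic strength = {sigma0:.2f} MPa")

    # Rough 95% CI for the modulus via a nonparametric bootstrap (one of the
    # points on which, per the paper, package defaults differ)
    boots = [weibull_min.fit(rng.choice(strengths, size=50), floc=0)[0]
             for _ in range(1000)]
    print("95% CI for m:", np.percentile(boots, [2.5, 97.5]))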
Stream-Dashboard: A Big Data Stream Clustering Framework with Applications to Social Media Streams
ERIC Educational Resources Information Center
Hawwash, Basheer
2013-01-01
Data mining is concerned with detecting patterns of data in raw datasets, which are then used to unearth knowledge that might not have been discovered using conventional querying or statistical methods. This discovered knowledge has been used to empower decision makers in countless applications spanning across many multi-disciplinary areas…
Task-based statistical image reconstruction for high-quality cone-beam CT
NASA Astrophysics Data System (ADS)
Dang, Hao; Webster Stayman, J.; Xu, Jennifer; Zbijewski, Wojciech; Sisniega, Alejandro; Mow, Michael; Wang, Xiaohui; Foos, David H.; Aygun, Nafi; Koliatsos, Vassilis E.; Siewerdsen, Jeffrey H.
2017-11-01
Task-based analysis of medical imaging performance underlies many ongoing efforts in the development of new imaging systems. In statistical image reconstruction, regularization is often formulated in terms that encourage smoothness and/or sharpness (e.g. a linear, quadratic, or Huber penalty) but without explicit formulation of the task. We propose an alternative regularization approach in which a spatially varying penalty is determined that maximizes task-based imaging performance at every location in a 3D image. We apply the method to model-based image reconstruction (MBIR; viz., penalized weighted least-squares, PWLS) in cone-beam CT (CBCT) of the head, focusing on the task of detecting a small, low-contrast intracranial hemorrhage (ICH), and we test the performance of the algorithm in the context of a recently developed CBCT prototype for point-of-care imaging of brain injury. Theoretical predictions of local spatial resolution and noise are computed via an optimization by which regularization (specifically, the quadratic penalty strength) is allowed to vary throughout the image to maximize the local task-based detectability index (d′). Simulation studies and test-bench experiments were performed using an anthropomorphic head phantom. Three PWLS implementations were tested: a conventional (constant) penalty; a certainty-based penalty derived to enforce a constant point-spread function (PSF); and the task-based penalty derived to maximize local detectability at each location. Conventional (constant) regularization exhibited a fairly strong degree of spatial variation in d′, and the certainty-based method achieved a uniform PSF, but each exhibited a reduction in detectability compared to the task-based method, which improved detectability by up to ~15%. The improvement was strongest in areas of high attenuation (skull base), where the conventional and certainty-based methods tended to over-smooth the data. The task-driven reconstruction method presents a promising regularization approach in MBIR by explicitly incorporating task-based imaging performance as the objective. The results demonstrate improved ICH conspicuity and support the development of high-quality CBCT systems.
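A minimal sketch of the PWLS ingredient described above, with a hand-chosen spatially varying quadratic penalty on a toy 1D problem; the task-driven optimization of the penalty map itself (maximizing local d′) is beyond this sketch:

    # PWLS: x_hat = argmin (y - A x)' W (y - A x) + x' D' diag(beta) D x,
    # solved by plain gradient descent; beta varies across the image.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 64
    A = rng.standard_normal((128, n)) / np.sqrt(n)   # toy forward model
    x_true = np.zeros(n); x_true[28:36] = 1.0        # small "lesion"
    y = A @ x_true + 0.05 * rng.standard_normal(128)
    W = np.eye(128)                                  # statistical weights

    beta = np.full(n, 0.1)                           # spatially varying penalty map
    beta[:16] = 1.0                                  # smooth harder near one edge

    D = np.eye(n) - np.roll(np.eye(n), 1, axis=1)    # first-difference operator
    x = np.zeros(n)
    for _ in range(500):
        grad = A.T @ W @ (A @ x - y) + D.T @ (beta * (D @ x))
        x -= 0.1 * grad                              # fixed-step gradient descent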
Ezoddini Ardakani, Fatemeh; Zangoie Booshehri, Maryam; Banadaki, Seyed Hossein Saeed; Nafisi-Moghadam, Reza
2012-01-01
Background Scaphoid fractures are the most common type of carpal fractures. Objectives The aim of the study was to compare the diagnostic value of panoramic and conventional radiographs of the wrist in scaphoid fractures. Patients and Methods The panoramic and conventional radiographs of 122 patients with acute and chronic wrist trauma were studied. The radiographs were analyzed and examined by two independent radiologist observers; one physician radiologist and one maxillofacial radiologist. The final diagnosis was made by an orthopedic specialist. Kappa test was used for statistical calculations, inter- and intra-observer agreement and correlation between the two techniques. Results Wrist panoramic radiography was more accurate than conventional radiography for ruling out scaphoid fractures. There was an agreement in 85% or more of the cases. Agreement values were higher with better inter and intra observer agreement for panoramic examinations than conventional radiographic examinations. Conclusion The panoramic examination of the wrist is a useful technique for the diagnosis and follow-up of scaphoid fractures. Its use is recommended as a complement to conventional radiography in cases with inconclusive findings. PMID:23599708
The potential for increased power from combining P-values testing the same hypothesis.
Ganju, Jitendra; Julie Ma, Guoguang
2017-02-01
The conventional approach to hypothesis testing for formal inference is to prespecify a single test statistic thought to be optimal. However, we usually have more than one test statistic in mind for testing the null hypothesis of no treatment effect, but we do not know which is the most powerful. Rather than relying on a single p-value, p-values from prespecified multiple test statistics can be combined for inference. Combining functions include Fisher's combination test and the minimum p-value. With randomization-based tests, the increase in power can be remarkable compared with a single test or Simes's method. The versatility of the method is that it also applies when the number of covariates exceeds the number of observations. The increase in power is large enough to prefer combined p-values over a single p-value. The limitation is that the method does not provide an unbiased estimator of the treatment effect and does not apply when the model includes a treatment-by-covariate interaction.
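The two combining functions named above are easy to state concretely; a minimal sketch using illustrative p-values (scipy's combine_pvalues implements Fisher's method, and the min-p null distribution shown assumes independent tests, whereas the paper obtains reference distributions by randomization):

    import numpy as np
    from scipy.stats import combine_pvalues, beta

    pvals = np.array([0.04, 0.11, 0.07])  # p-values from prespecified tests

    # Fisher: -2 * sum(log p) ~ chi-square with 2k df under the null
    stat, p_fisher = combine_pvalues(pvals, method='fisher')

    # Minimum p-value: under independence, min(p) ~ Beta(1, k) under the null
    p_min = beta.cdf(pvals.min(), 1, len(pvals))

    print(f"Fisher combined p = {p_fisher:.4f}, min-p combined p = {p_min:.4f}")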
Huang, Zhi; Wang, Xin-zhi; Hou, Yue-Zhong
2015-02-01
Making impressions for maxillectomy patients is an essential but difficult task. This study developed a novel method of fabricating individual trays by computer-aided design (CAD) and rapid prototyping (RP) to simplify the process and enhance patient safety. Five unilateral maxillectomy patients were recruited for this study. For each patient, a computed tomography (CT) scan was taken. Based on the 3D surface reconstruction of the target area, an individual tray was manufactured by CAD/RP. With a conventional custom tray as control, two final impressions were made using the two types of tray for each patient. The trays were sectioned, and in each section the thickness of the material was measured at six evenly distributed points. Descriptive statistics and the paired t-test were used to examine differences in impression thickness; SAS 9.3 was used for the statistical analysis. All casts were then optically 3D scanned and compared digitally to evaluate the feasibility of the method. Impressions of all five maxillectomy patients were successfully made with both the individual trays fabricated by CAD/RP and the traditional trays. The descriptive statistics of the impression thickness measurements showed slightly more uneven results for the traditional trays, but the difference was not statistically significant. The 3D digital comparison showed acceptable discrepancies within 1 mm in the majority of cast areas. The largest difference, of 3 mm, was observed in the buccal wall of the defect areas. Moderate deviations of 1 to 2 mm were detected in the buccal and labial vestibular groove areas. This study confirmed the feasibility of a novel method of fabricating individual trays by CAD/RP. Impressions made with individual trays manufactured by CAD/RP had a uniform thickness, with an acceptable level of accuracy compared with those made through conventional processes. © 2014 by the American College of Prosthodontists.
Santamaría-Arrieta, Gorka; Brizuela-Velasco, Aritza; Fernández-González, Felipe J.; Chávarri-Prado, David; Chento-Valiente, Yelko; Solaberrieta, Eneko; Diéguez-Pereira, Markel; Yurrebaso-Asúa, Jaime
2016-01-01
Background This study evaluated the influence of implant site preparation depth on primary stability measured by insertion torque and resonance frequency analysis (RFA). Material and Methods Thirty-two implant sites were prepared in eight veal rib blocks. Sixteen sites were prepared using the conventional drilling sequence recommended by the manufacturer to a working depth of 10 mm. The remaining 16 sites were prepared using an oversize drilling technique (overpreparation) to a working depth of 12 mm. Bone density was determined using cone beam computed tomography (CBCT). The implants were placed and primary stability was measured by two methods: insertion torque (Ncm) and RFA (implant stability quotient [ISQ]). Results The highest torque values were achieved with the conventional drilling technique (10 mm). The ANOVA test confirmed a significant correlation between torque and drilling depth (p<0.05). However, no statistically significant differences were found between ISQ values at 10 and 12 mm drilling depths (p>0.05) in either measurement direction (cortical and medullar). No statistical relation was identified between torque and ISQ values, or between bone density and primary stability (p>0.05). Conclusions Vertical overpreparation of the implant bed yields lower insertion torque values but does not produce statistically significant differences in ISQ values. Key words: Implant stability quotient, overdrilling, primary stability, resonance frequency analysis, torque. PMID:27398182
NASA Astrophysics Data System (ADS)
Filintas, Agathos, , Dr; Hatzigiannakis, Evagellos, , Dr; Arampatzis, George, , Dr; Ilias, Andreas; Panagopoulos, Andreas, , Dr; Hatzispiroglou, Ioannis
2015-04-01
The aim of the present study is a thorough comparison of conventional and innovative hydrometric methods and tools for river flow monitoring. A case study was conducted in the Stara river at the Agios Germanos monitoring station (northwest Greece), in order to investigate possible deviations between conventional and innovative methods and tools for river flow velocity and discharge. Two flowmeters, both manufactured in 2013 (OTT Messtechnik GmbH, 2013), were used: a) a conventional propeller flow velocity meter (OTT Model C2), a mechanical current meter with a BARGO certification of calibration, operated with a rod and a relocating device, along with a digital measuring device including an electronic flow calculator, data logger and real-time control display unit; this flowmeter has a measurement velocity range of 0.025-4.000 m/s; and b) an innovative electromagnetic flowmeter (OTT Model MF pro), which consists of a compact, light-weight sensor and a robust handheld unit, both designed to be attached to conventional wading rods. The electromagnetic flowmeter uses Faraday's law of electromagnetic induction to measure the flow: when an electrically conductive fluid flows past the meter, a voltage is induced between a pair of electrodes placed at right angles to the direction of the magnetic field, and this electrode voltage is directly proportional to the average fluid velocity. The electromagnetic flowmeter was operated with a rod and relocating device, along with a digital measuring device with various logging and graphical capabilities and various methods of velocity measurement (ISO/USGS standards); it has a measurement velocity range of 0.000-6.000 m/s. The river flow data were averaged over paired measurements of 60+60 seconds, and the measured flow velocities, depths and widths of the segments were used to estimate the cross-section's mean flow velocity in each measured segment. The mid-section method was then used for the overall discharge calculation across all segments of the flow area. The cross-section characteristics, the flow velocity of each segment, and the mean water flow velocity and total discharge profile were measured, calculated and annotated, respectively. A series of concurrent conventional and innovative (electromagnetic) flow measurements was performed during 2014. The results and statistical analysis showed that the Froude number during the measurement period was Fr<1 in all cases, meaning that the flow of the Stara river is classified as subcritical. The 12-month study showed various advantages of the electromagnetic sensor: it is virtually maintenance-free because there are no moving parts, no calibration was required in practice, and it can be used even at the lowest water velocities, from 0.000 m/s. Moreover, based on the concurrent hydrometric measurements of the Stara river, the velocity and discharge modelling and the statistical analysis, no significant statistical difference (α=0.05) was found between mean velocities measured with a) the conventional and b) the electromagnetic method, except at low velocities, where a significant statistical difference was found and the electromagnetic method appears to be more accurate. Acknowledgments: Data in this study were collected in the framework of the elaboration of the national water resources monitoring network, supervised by the Special Secretariat for Water, Hellenic Ministry for the Environment and Climate Change.
This project is elaborated in the framework of the operational program "Environment and Sustainable Development" which is co-funded by the National Strategic Reference Framework (NSRF) and the Public Investment Program (PIP).
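The mid-section discharge calculation mentioned above has a compact form: each vertical's mean velocity and depth are applied to a width extending halfway to the neighbouring verticals. A minimal sketch with made-up example values:

    # Mid-section method: Q = sum over verticals of v_i * d_i * (b_{i+1} - b_{i-1}) / 2
    def midsection_discharge(b, d, v):
        """b: distances from bank (m), d: depths (m), v: mean velocities (m/s)."""
        n = len(b)
        q = 0.0
        for i in range(n):
            left = b[i] - (b[i - 1] if i > 0 else b[0])
            right = (b[i + 1] if i < n - 1 else b[-1]) - b[i]
            q += v[i] * d[i] * (left + right) / 2.0
        return q  # total discharge, m^3/s

    # Example: 5 verticals across a small stream (illustrative numbers)
    print(midsection_discharge(b=[0.0, 0.5, 1.0, 1.5, 2.0],
                               d=[0.0, 0.30, 0.45, 0.35, 0.0],
                               v=[0.0, 0.40, 0.60, 0.45, 0.0]))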
A study of methods to estimate debris flow velocity
Prochaska, A.B.; Santi, P.M.; Higgins, J.D.; Cannon, S.H.
2008-01-01
Debris flow velocities are commonly back-calculated from superelevation events, which require subjective estimates of the radii of curvature of bends in the debris flow channel, or predicted using flow equations that require the selection of appropriate rheological models and material property inputs. This research investigated difficulties associated with the use of these conventional velocity estimation methods. Radius of curvature estimates were found to vary with the extent of the channel investigated and with the scale of the media used, and back-calculated velocities varied among different investigated locations along a channel. Distinct populations of Bingham properties were found to exist between those measured by laboratory tests and those back-calculated from field data; thus, laboratory-obtained values would not be representative of field-scale debris flow behavior. To avoid these difficulties with conventional methods, a new preliminary velocity estimation method is presented that statistically relates flow velocity to channel slope and flow depth. This method presents ranges of reasonable velocity predictions based on 30 previously measured velocities. © 2008 Springer-Verlag.
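The superelevation back-calculation whose inputs the paper criticizes is conventionally based on the forced-vortex equation; a minimal sketch with illustrative numbers (the correction factor k and all values are assumptions, not the paper's data):

    # Forced-vortex superelevation equation: v = sqrt(k * g * Rc * dh / w),
    # where Rc is the (subjectively estimated) radius of curvature of the bend,
    # dh the flow-surface superelevation across the bend, w the flow width,
    # and k a correction factor often taken as 1.
    import math

    def superelevation_velocity(Rc_m, dh_m, width_m, k=1.0, g=9.81):
        return math.sqrt(k * g * Rc_m * dh_m / width_m)

    # Example: Rc = 30 m, superelevation 1.2 m across a 10 m wide channel
    print(f"{superelevation_velocity(30.0, 1.2, 10.0):.1f} m/s")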
High-order fuzzy time-series based on multi-period adaptation model for forecasting stock markets
NASA Astrophysics Data System (ADS)
Chen, Tai-Liang; Cheng, Ching-Hsue; Teoh, Hia-Jong
2008-02-01
Stock investors usually make short-term investment decisions according to recent stock information such as late-breaking market news, technical analysis reports, and price fluctuations. To reflect these short-term factors that impact stock prices, this paper proposes a comprehensive fuzzy time-series model, which factors linear relationships between recent periods of stock prices and fuzzy logical relationships (nonlinear relationships) mined from the time-series into the forecasting process. In the empirical analysis, the TAIEX (Taiwan Stock Exchange Capitalization Weighted Stock Index) and HSI (Hang Seng Index) are employed as experimental datasets, and four recent fuzzy time-series models, Chen's (1996), Yu's (2005), Cheng's (2006) and Chen's (2007), are used as comparison models. In addition, to compare with a conventional statistical method, the method of least squares is used to estimate auto-regressive models over the testing periods within the databases. The performance comparisons indicate that the multi-period adaptation model proposed in this paper can effectively improve the forecasting performance of conventional fuzzy time-series models, which factor only fuzzy logical relationships into the forecasting process. From the empirical study, both the traditional statistical method and the proposed model reveal that stock price patterns in the Taiwan and Hong Kong stock markets are short-term.
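The least-squares baseline mentioned above amounts to fitting an auto-regressive model by ordinary least squares; a minimal sketch with a toy price series (not the TAIEX/HSI data):

    # AR(p) fit by ordinary least squares, plus a one-step-ahead forecast
    import numpy as np

    def fit_ar(series, p=2):
        y = np.asarray(series, dtype=float)
        X = np.column_stack([y[p - k - 1:len(y) - k - 1] for k in range(p)])
        X = np.column_stack([np.ones(len(y) - p), X])      # intercept column
        coef, *_ = np.linalg.lstsq(X, y[p:], rcond=None)
        return coef                                        # [c, phi_1, ..., phi_p]

    def forecast_next(series, coef):
        p = len(coef) - 1
        lags = np.asarray(series[-p:], dtype=float)[::-1]  # most recent lag first
        return coef[0] + coef[1:] @ lags

    prices = [100, 102, 101, 105, 107, 106, 110, 113]      # toy index values
    print(forecast_next(prices, fit_ar(prices, p=2)))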
Naghibi, Saeed; Seifirad, Sirous; Adami Dehkordi, Mahboobeh; Einolghozati, Sasan; Ghaffarian Eidgahi Moghadam, Nafiseh; Akhavan Rezayat, Amir; Seifirad, Soroush
2016-01-01
Chronic otitis media (COM) can be treated with tympanoplasty with or without mastoidectomy. In patients undergoing middle ear surgery, three-dimensional spiral computed tomography (CT) plays an important role in optimizing surgical planning. This study was performed to compare the findings of three-dimensionally reconstructed spiral CT and conventional CT of the ossicular chain in patients with COM. Fifty patients enrolled in the study underwent plain and three-dimensional CT scanning (PHILIPS-MX 8000). Ossicular changes, the mastoid cavity, the tympanic cavity, and the presence of cholesteatoma were evaluated. The results of the two methods were compared and interpreted by a radiologist, recorded in questionnaires, and analyzed; the logistic regression test and the kappa coefficient of agreement were used for statistical analyses. Sixty-two ears with COM were found on physical examination. A significant difference was observed between the findings of the two methods for ossicular erosion (11.3% on conventional CT vs. 37.1% on spiral CT, P = 0.0001), reduction of mastoid air cells (82.3% on conventional CT vs. 93.5% on spiral CT, P = 0.001), and tympanic cavity opacity (12.9% on conventional CT vs. 40.3% on spiral CT, P = 0.0001). No significant difference was observed between the findings of the two methods for ossicular destruction (6.5% on conventional CT vs. 56.4% on spiral CT, P = 0.125) or the presence of cholesteatoma (3.2% on conventional CT vs. 42% on spiral CT, P = 0.172). In this study, spiral CT demonstrated ossicular dislocation in 9.6%, reduction of mastoid air cells in 4.8%, and reduced tympanic cavity volume in 1.6% of patients, none of which were reported on the conventional CT scans. Spiral CT is superior to conventional CT in the preoperative diagnosis of lesions in COM and can be used for detailed evaluation of the ossicular chain in such patients.
Prediction of Multiple-Trait and Multiple-Environment Genomic Data Using Recommender Systems.
Montesinos-López, Osval A; Montesinos-López, Abelardo; Crossa, José; Montesinos-López, José C; Mota-Sanchez, David; Estrada-González, Fermín; Gillberg, Jussi; Singh, Ravi; Mondal, Suchismita; Juliana, Philomin
2018-01-04
In genomic-enabled prediction, the task of improving the accuracy of the prediction of lines in environments is difficult because the available information is generally sparse and usually has low correlations between traits. In current genomic selection, although researchers have a large amount of information and appropriate statistical models to process it, there is still limited computing efficiency to do so. Although some statistical models are usually mathematically elegant, many of them are also computationally inefficient, and they are impractical for many traits, lines, environments, and years because they need to sample from huge normal multivariate distributions. For these reasons, this study explores two recommender systems: item-based collaborative filtering (IBCF) and the matrix factorization algorithm (MF) in the context of multiple traits and multiple environments. The IBCF and MF methods were compared with two conventional methods on simulated and real data. Results of the simulated and real data sets show that the IBCF technique was slightly better in terms of prediction accuracy than the two conventional methods and the MF method when the correlation was moderately high. The IBCF technique is very attractive because it produces good predictions when there is high correlation between items (environment-trait combinations) and its implementation is computationally feasible, which can be useful for plant breeders who deal with very large data sets. Copyright © 2018 Montesinos-Lopez et al.
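A minimal sketch of the IBCF idea in this setting: treat environment-trait combinations as items, measure item-item similarity on co-observed lines, and predict a missing cell as a similarity-weighted average. The matrix values and the cosine-similarity choice are illustrative assumptions, not the paper's implementation:

    import numpy as np

    def ibcf_predict(R, line, item):
        """R: lines x items matrix with np.nan for unobserved cells."""
        n_items = R.shape[1]
        sims = np.zeros(n_items)
        for j in range(n_items):
            if j == item or np.isnan(R[line, j]):
                continue  # skip the target item and items unobserved for this line
            mask = ~np.isnan(R[:, item]) & ~np.isnan(R[:, j])
            if mask.sum() < 2:
                continue
            a, b = R[mask, item], R[mask, j]
            denom = np.linalg.norm(a) * np.linalg.norm(b)
            sims[j] = (a @ b) / denom if denom > 0 else 0.0
        obs = [j for j in range(n_items) if sims[j] > 0]
        if not obs:
            return np.nan
        w = sims[obs]
        return float(w @ R[line, obs] / w.sum())

    # Toy 3 lines x 3 environment-trait items, one missing cell
    R = np.array([[3.1, 2.8, np.nan],
                  [2.9, 2.7, 3.3],
                  [3.4, 3.0, 3.6]])
    print(ibcf_predict(R, line=0, item=2))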
Aslanimehr, Masoomeh; Rezvani, Shirin; Mahmoudi, Ali; Moosavi, Najmeh
2017-01-01
Statement of the Problem: Candida species are believed to play an important role in the initiation and progression of denture stomatitis. The type of denture material also influences the adhesion of Candida and the development of stomatitis. Purpose: The aim of this study was to compare the adherence of Candida albicans to conventional and injection-molded acrylic denture base materials. Materials and Method: Twenty injection-molded and 20 conventional pressure-packed acrylic discs (10×10×2 mm) were prepared according to the manufacturers' instructions. Immediately before the study, the samples were placed in sterile water for 3 days to remove residual monomers and then sterilized under ultraviolet light for 10 minutes. A 1×10^8 CFU/mL suspension of Candida albicans ATCC-10231 was prepared from organisms cultured for 48 h on Sabouraud dextrose agar plates incubated at 37°C, and 100 μL of this suspension was placed on the surface of each disc. After incubation at 37°C for 1 hour, the samples were washed with normal saline to remove non-adherent cells. Attached cells were counted using the colony count method after shaking at 3000 rpm for 20 seconds. Each group was tested 108 times, and the data were statistically analyzed by t-test. Results: Quantitative analysis revealed that the difference in mean colony counts of Candida albicans adhering to the conventional acrylic material (8.3×10^3) versus the injection-molded acrylic resin (6×10^3) was statistically significant (p<0.001). Conclusion: The significant reduction in Candida albicans adherence to injection-molded acrylic resin materials makes them valuable for patients at high risk of denture stomatitis. PMID:28280761
Xiao, Yongling; Abrahamowicz, Michal
2010-03-30
We propose two bootstrap-based methods to correct the standard errors (SEs) from Cox's model for within-cluster correlation of right-censored event times. The cluster-bootstrap method resamples, with replacement, only the clusters, whereas the two-step bootstrap method resamples (i) the clusters, and (ii) individuals within each selected cluster, with replacement. In simulations, we evaluate both methods and compare them with the existing robust variance estimator and the shared gamma frailty model, which are available in statistical software packages. We simulate clustered event time data with latent cluster-level random effects, which are ignored in the conventional Cox model. For cluster-level covariates, both proposed bootstrap methods yield accurate SEs, correct type I error rates, and acceptable coverage rates, regardless of the true random-effects distribution, and avoid the serious variance under-estimation of conventional Cox-based standard errors. However, the two-step bootstrap method over-estimates the variance for individual-level covariates. We also apply the proposed bootstrap methods to obtain confidence bands around flexible estimates of time-dependent effects in a real-life analysis of clustered event times.
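A minimal sketch of the cluster-bootstrap SE, assuming a pandas DataFrame with hypothetical columns time, event, x and cluster, and using the lifelines package for the Cox fit (the paper's own software is not specified here); the two-step variant would additionally resample rows within each drawn cluster:

    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    def cluster_bootstrap_se(df, n_boot=200, seed=0):
        rng = np.random.default_rng(seed)
        ids = df["cluster"].unique()
        coefs = []
        for _ in range(n_boot):
            # Resample whole clusters with replacement
            drawn = rng.choice(ids, size=len(ids), replace=True)
            sample = pd.concat([df[df["cluster"] == c] for c in drawn],
                               ignore_index=True)
            cph = CoxPHFitter()
            cph.fit(sample[["time", "event", "x"]],
                    duration_col="time", event_col="event")
            coefs.append(cph.params_["x"])
        return np.std(coefs, ddof=1)  # bootstrap SE of the log-hazard ratio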
Petigny, Loïc; Périno, Sandrine; Minuti, Matteo; Visinoni, Francesco; Wajsman, Joël; Chemat, Farid
2014-01-01
Microwave extraction and separation was used to increase the concentration of the extract compared with the conventional method at the same solid/liquid ratio, reducing extraction time while simultaneously separating volatile organic compounds (VOC) from non-volatile organic compounds (NVOC) of boldo leaves. As a preliminary study, a response surface method was used to optimize the extraction of soluble material and the separation of VOC from the plant at laboratory scale. The statistical analysis revealed the optimized conditions to be: microwave power 200 W, extraction time 56 min and a solid/liquid ratio of 7.5% plants in water. The lab-scale optimized microwave method was compared with conventional distillation and requires a power/mass ratio of 0.4 W/g of water engaged. This power/mass ratio was kept constant in order to scale up from the laboratory to the pilot plant. PMID:24776762
Comparison of superior septal approach with left atriotomy in mitral valve surgery
Aydin, Ebuzer; Arslan, Akin; Ozkokeli, Mehmet
2014-01-01
Objective In this study, we aimed to compare clinical outcomes of superior transseptal approach with the conventional left atriotomy in patients undergoing mitral valve surgery. Methods Between January 2010 and November 2012, a total of 91 consecutive adult patients (39 males, 52 females; mean age: 54.0±15.4 years; range, 16 to 82 years) who underwent mitral valve surgery in the Division of Cardiovascular Surgery at Koşuyolu Training Hospital were included. The patients were randomized to either superior transseptal approach (n=47) or conventional left atriotomy (n=44). Demographic characteristics of the patients, comorbidities, additional interventions, intraoperational data, pre- and postoperative electrophysiological study findings, and postoperative complications were recorded. Results Of all patients, 86.7% (n=79) were in New York Heart Association Class III, while 12 were in New York Heart Association Class IV. All patients underwent annuloplasty (42.9%) or valve replacement surgery (57.1%). There was no significant difference in pre- and postoperative electrocardiogram findings between the groups. Change from baseline in the cardiac rhythm was statistically significant in superior transseptal approach group alone (P<0.001). There was no statistically significant difference in mortality rate between the groups. Permanent pacemaker implantation was performed in 10.6% of the patients in superior transseptal approach group and 4.5% in the conventional left atriotomy group. No statistically significant difference in bleeding, total length of hospital and intensive care unit stay, the presence of low cardiac output syndrome was observed between the groups. Conclusion Our study results suggest that superior transseptal approach does not lead to serious or fatal adverse effects on sinus node function or atrial vulnerability, compared to conventional approach. PMID:25372911
Kim, Jung Hoon; Lee, Jae Young; Baek, Jee Hyun; Eun, Hyo Won; Kim, Young Jae; Han, Joon Koo; Choi, Byung Ihn
2015-02-01
OBJECTIVE. The purposes of this study were to compare the staging accuracy of high-resolution sonography (HRUS) with combined low- and high-MHz transducers with that of conventional sonography for gallbladder cancer and to investigate differences in the imaging findings of neoplastic and nonneoplastic gallbladder polyps. MATERIALS AND METHODS. Our study included 37 surgically proven gallbladder cancers (T1a = 7, T1b = 2, T2 = 22, T3 = 6), including 15 malignant neoplastic polyps, and 73 surgically proven polyps (neoplastic = 31, nonneoplastic = 42) examined with HRUS and conventional transabdominal sonography. Two radiologists assessed the T category and predefined polyp findings on HRUS and conventional transabdominal sonography. Statistical analyses were performed using chi-square and McNemar tests. RESULTS. The diagnostic accuracy for the T category with HRUS was T1a = 92-95%, T1b = 89-95%, T2 = 78-86%, and T3 = 84-89%, all with good agreement (κ = 0.642). The diagnostic accuracy for differentiating T1 from T2 or greater was 92% and 89% on HRUS versus 65% and 70% with conventional transabdominal sonography. Findings significantly associated with neoplastic polyps included size greater than 1 cm, single lobular surface, vascular core, hypoechoic polyp, and hypoechoic foci (p < 0.05). In the differential diagnosis of gallbladder polyps, HRUS depicted internal echo foci more clearly than conventional transabdominal sonography (39 vs 21). A polyp size greater than 1 cm was independently associated with a neoplastic polyp (odds ratio = 7.5, p = 0.02); the AUC for this criterion was 0.877, with a sensitivity of 66.67% and a specificity of 89.13%. CONCLUSION. HRUS is a simple method that enables accurate T categorization of gallbladder carcinoma. It provides high-resolution images of gallbladder polyps and may have a role in stratifying the risk of malignancy.
Shivakumarswamy, Udasimath; Arakeri, Surekha U; Karigowdar, Mahesh H; Yelikar, BR
2012-01-01
Background: The cytological examination of serous effusions is well accepted, and a positive diagnosis is often considered definitive. It helps in the staging, prognosis and management of malignancies and also gives information about various inflammatory and non-inflammatory lesions. Diagnostic problems arise in everyday practice in differentiating reactive atypical mesothelial cells from malignant cells by the routine conventional smear (CS) method. Aims: To compare the morphological features seen with the CS method with those of the cell block (CB) method, and to assess the utility and sensitivity of the CB method in the cytodiagnosis of pleural effusions. Materials and Methods: The study was conducted in the cytology section of the Department of Pathology. Sixty pleural fluid samples were subjected to diagnostic evaluation over a period of 20 months. Along with the conventional smears, cell blocks were prepared using 10% alcohol-formalin as a fixative agent. Statistical analysis with the z-test was performed to compare cellularity between the CS and CB methods, and McNemar's χ² test was used to identify the additional yield for malignancy by the CB method. Results: Cellularity and the additional yield for malignancy were 15% higher with the CB method. Conclusions: The CB method provides higher cellularity, better-preserved architectural patterns and morphological features, and an additional yield of malignant cells, thereby increasing the sensitivity of cytodiagnosis compared with the CS method. PMID:22438610
Chung, Tae Nyoung; Kim, Sun Wook; You, Je Sung; Chung, Hyun Soo
2016-01-01
Objective Tube thoracostomy (TT) is a commonly performed intensive care procedure. Simulator training may be a good alternative to conventional TT training methods such as apprenticeship and animal skills laboratories; however, there is insufficient evidence supporting the use of a simulator. The aim of this study was to determine whether training with a medical simulator is associated with a faster TT procedure compared with conventional training without a simulator. Methods This is a simulation study. Eligible participants were emergency medicine residents with minimal TT experience (≤3 procedures). Participants were randomized to two groups: conventional training and simulator training. While the simulator training group used the simulator to practice TT, the conventional training group watched the instructor perform TT on a cadaver. After training, all participants performed a TT on a cadaver. Performance quality was measured as correct placement and time delay, and participants were graded if they had difficulty with the process. Results The estimated median procedure time was 228 seconds in the conventional training group and 75 seconds in the simulator training group, a statistically significant difference (P=0.040). Difficulty grading did not differ significantly between the groups (overall performance scale, 2 vs. 3; P=0.094). Conclusion Tube thoracostomy training with a medical simulator, compared with training without a simulator, is associated with a significantly faster procedure when performed on a human cadaver. PMID:27752610
Leclercq, A; Labeille, B; Perrot, J-L; Vercherin, P; Cambazard, F
2016-01-01
Leg ulcers are a common condition. There have been very few studies of combined therapy involving VAC (vacuum-assisted closure) and skin grafting. We performed a randomized controlled trial of VAC therapy vs. hydrocolloid dressings over the 5 days following autologous grafting of chronic leg ulcers. The primary objective was to assess the difference in success (defined as a reduction in wound area of at least 50% at 1 month) between the two dressing methods. Forty-six patients with ulcers present for over one month were included. Following a 7-day hospitalization period, follow-up was performed for 3 months on an outpatient basis. Our study did not demonstrate a statistically significant difference, with a 45.8% success rate in the VAC group vs. 40.9% in the conventional dressing group (P=0.73). In the venous ulcer subgroup, the success rate was 57.9% for VAC vs. 40% for conventional dressings (P=0.3); the difference in favor of VAC in this subgroup was not statistically significant, most likely because of an insufficient number of patients. Our study does not demonstrate the superiority of VAC combined with skin grafting over conventional dressings. We observed more complications with VAC (40%) than with conventional dressings (23%) (P=0.06). Copyright © 2015 Elsevier Masson SAS. All rights reserved.
Jonsen, Ian D; Myers, Ransom A; James, Michael C
2006-09-01
1. Biological and statistical complexity are features common to most ecological data that hinder our ability to extract meaningful patterns using conventional tools. Recent work on implementing modern statistical methods for analysis of such ecological data has focused primarily on population dynamics but other types of data, such as animal movement pathways obtained from satellite telemetry, can also benefit from the application of modern statistical tools. 2. We develop a robust hierarchical state-space approach for analysis of multiple satellite telemetry pathways obtained via the Argos system. State-space models are time-series methods that allow unobserved states and biological parameters to be estimated from data observed with error. We show that the approach can reveal important patterns in complex, noisy data where conventional methods cannot. 3. Using the largest Atlantic satellite telemetry data set for critically endangered leatherback turtles, we show that the diel pattern in travel rates of these turtles changes over different phases of their migratory cycle. While foraging in northern waters the turtles show similar travel rates during day and night, but on their southward migration to tropical waters travel rates are markedly faster during the day. These patterns are generally consistent with diving data, and may be related to changes in foraging behaviour. Interestingly, individuals that migrate southward to breed generally show higher daytime travel rates than individuals that migrate southward in a non-breeding year. 4. Our approach is extremely flexible and can be applied to many ecological analyses that use complex, sequential data.
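To make the state-space idea concrete, a minimal sketch of a one-dimensional random-walk Kalman filter recovering a track from positions observed with Argos-like error; the paper's hierarchical, multi-animal model is far richer than this:

    import numpy as np

    def kalman_random_walk(obs, q=0.01, r=1.0):
        """obs: noisy positions; q: process variance; r: observation variance."""
        x, p = obs[0], r          # initialize at the first observation
        states = []
        for z in obs:
            p = p + q             # predict: random-walk state evolution
            k = p / (p + r)       # Kalman gain
            x = x + k * (z - x)   # update with the new observation
            p = (1 - k) * p
            states.append(x)
        return np.array(states)

    rng = np.random.default_rng(3)
    truth = np.cumsum(rng.normal(0, 0.1, 100))   # true movement path
    noisy = truth + rng.normal(0, 1.0, 100)      # Argos-style observation error
    print(np.mean((kalman_random_walk(noisy) - truth) ** 2))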
Zhou, Zheng; Dai, Cong; Liu, Wei-xin
2015-06-01
TNF-α has an important role in the pathogenesis of ulcerative colitis (UC), and anti-TNF-α therapy appears to be beneficial in its treatment. The aim was to assess the effectiveness of infliximab and adalimumab in UC compared with conventional therapy. The PubMed and Embase databases were searched for studies investigating the efficacy of infliximab and adalimumab in UC. Infliximab had a statistically significant effect on the induction of clinical response (RR = 1.67; 95% CI 1.12 to 2.50) in UC compared with conventional therapy, but not on clinical remission (RR = 1.63; 95% CI 0.84 to 3.18) or reduction of the colectomy rate (RR = 0.54; 95% CI 0.26 to 1.12). Adalimumab had statistically significant effects on the induction of clinical remission (RR = 1.82; 95% CI 1.24 to 2.67) and clinical response (RR = 1.36; 95% CI 1.13 to 1.64) in UC compared with conventional therapy. Our meta-analyses suggest that, compared with conventional therapy, infliximab significantly improves the induction of clinical response in UC, while adalimumab significantly improves the induction of both clinical remission and clinical response.
Comparative study between manual injection intraosseous anesthesia and conventional oral anesthesia
Ata-Ali, Javier; Oltra-Moscardó, María J.; Peñarrocha-Diago, María; Peñarrocha, Miguel
2012-01-01
Objective: To compare intraosseous anesthesia (IA) with conventional oral anesthesia techniques. Materials and methods: A single-blind, prospective clinical study was carried out. Each patient underwent two anesthetic techniques: conventional (local infiltration and locoregional anesthetic block) and intraosseous, for respective dental operations. To allow comparison of IA versus conventional anesthesia, the two operations were similar and affected the same two teeth in opposite quadrants. Results: A total of 200 oral anesthetic procedures were carried out in 100 patients. The mean patient age was 28.6±9.92 years. Fifty-five vestibular infiltrations and 45 mandibular blocks were performed, and all patients were also subjected to IA. The type of intervention (conservative or endodontic) exerted no significant influence (p=0.58 and p=0.62, respectively). The latency period was 8.52±2.44 minutes for the conventional techniques and 0.89±0.73 minutes for IA, a statistically significant difference (p<0.05). Regarding the duration of anesthetic sensation, the infiltrative techniques lasted a maximum of one hour, the inferior alveolar nerve blocks lasted between 1 and 3 hours, and IA lasted only 2.5 minutes, the differences being statistically significant (p<0.0001, Φ=0.29). Anesthetic success was recorded in 89% of the conventional procedures and in 78% of the IA procedures. Most patients preferred IA (61%) (p=0.0032). Conclusions: The two anesthetic procedures were compared for latency, duration of anesthetic effect, anesthetic success rate and patient preference. Intraosseous anesthesia has been shown to be a technique to be taken into account when planning conservative and endodontic treatments. Key words: Anesthesia, intraosseous, oral anesthesia, Stabident®, infiltrative, mandibular block. PMID:22143700
Statistical, Graphical, and Learning Methods for Sensing, Surveillance, and Navigation Systems
2016-06-28
harsh propagation environments. Conventional filtering techniques fail to provide satisfactory performance in many important nonlinear or non...Gaussian scenarios. In addition, there is a lack of a unified methodology for the design and analysis of different filtering techniques. To address...these problems, we have proposed a new filtering methodology called belief condensation (BC) DISTRIBUTION A: Distribution approved for public release
Results of a real-time irradiation of lithium P/N and conventional N/P silicon solar cells.
NASA Technical Reports Server (NTRS)
Reynard, D. L.; Peterson, D. G.
1972-01-01
Eight types of lithium-diffused P/N and three types of conventional 10 ohm-cm N/P silicon solar cells were irradiated at four different temperatures with a strontium-90 radioisotope at a rate typical of that expected in earth orbit. The six-month irradiation confirmed earlier accelerator results, showed that certain cell types outperform others at the various temperatures, and, in general, verified the recent improvements and potential usefulness of lithium solar cells. The experimental approach and statistical methods and analyses employed yielded increased confidence in the validity of the results. Injection level effects were observed to be significant.
Cheah, A K W; Kangkorn, T; Tan, E H; Loo, M L; Chong, S J
2018-01-01
Accurate estimation of the total body surface area burned (TBSAB) is a crucial aspect of early burn management. It helps guide resuscitation and is essential in the calculation of fluid requirements. Conventional methods of estimation can often lead to large discrepancies in burn percentage estimation. We aimed to compare a new method of TBSAB estimation using a three-dimensional smart-phone application named 3D Burn Resuscitation (3D Burn) against the conventional methods of estimation: the Rule of Palm, the Rule of Nines and the Lund and Browder chart. Three volunteer subjects were moulaged with simulated burn injuries of 25%, 30% and 35% total body surface area (TBSA), respectively. Various healthcare workers were invited to use both the 3D Burn application and the conventional methods stated above to estimate the volunteer subjects' burn percentages. Collective relative estimations across the groups showed that the Rule of Palm, the Rule of Nines and the Lund and Browder chart over-estimated the burn area by an average of 10.6%, 19.7% and 8.3% TBSA, respectively, while the 3D Burn application under-estimated burns by an average of 1.9%. There was a statistically significant difference between the 3D Burn application estimations and all three other modalities (p < 0.05). The time taken to use the application was significantly longer than with the traditional methods of estimation. The 3D Burn application, although slower, allowed more accurate TBSAB measurement compared with conventional methods. This validation study has shown that the 3D Burn application is useful in improving the accuracy of TBSAB measurement. Further studies are warranted, and there are plans to repeat the above study in a different centre overseas as part of a multi-centre study, with a view to progressing to a prospective study comparing the accuracy of the 3D Burn application against conventional methods in actual burn patients.
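Of the conventional methods compared above, the Rule of Nines is the most mechanical: standard adult region percentages are summed over the regions judged burned. A minimal sketch (standard adult values; pediatric charts differ):

    # Adult Rule of Nines region percentages (they sum to 100% TBSA)
    RULE_OF_NINES = {
        "head_and_neck": 9.0, "anterior_trunk": 18.0, "posterior_trunk": 18.0,
        "right_arm": 9.0, "left_arm": 9.0,
        "right_leg": 18.0, "left_leg": 18.0, "perineum": 1.0,
    }

    def tbsa_rule_of_nines(burned_regions):
        return sum(RULE_OF_NINES[r] for r in burned_regions)

    # Example: burns over the anterior trunk and one full arm
    print(tbsa_rule_of_nines(["anterior_trunk", "right_arm"]), "% TBSA")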
A refined method for multivariate meta-analysis and meta-regression
Jackson, Daniel; Riley, Richard D
2014-01-01
Making inferences about the average treatment effect using the random effects model for meta-analysis is problematic in the common situation where there is a small number of studies. This is because estimates of the between-study variance are not precise enough to accurately apply the conventional methods for testing and deriving a confidence interval for the average effect. We have found that a refined method for univariate meta-analysis, which applies a scaling factor to the estimated effects’ standard error, provides more accurate inference. We explain how to extend this method to the multivariate scenario and show that our proposal for refined multivariate meta-analysis and meta-regression can provide more accurate inferences than the more conventional approach. We explain how our proposed approach can be implemented using standard output from multivariate meta-analysis software packages and apply our methodology to two real examples. © 2013 The Authors. Statistics in Medicine published by John Wiley & Sons, Ltd. PMID:23996351
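A minimal sketch of the univariate version of such a refinement, in the style of the Hartung-Knapp adjustment (a scaling factor applied to the conventional standard error, with a t reference distribution on k-1 degrees of freedom); the paper's multivariate extension is not reproduced here:

    import numpy as np
    from scipy import stats

    def refined_meta(y, v):
        """y: study effect estimates; v: their within-study variances."""
        y, v = np.asarray(y, float), np.asarray(v, float)
        k = len(y)
        # DerSimonian-Laird between-study variance
        w = 1.0 / v
        mu_fe = np.sum(w * y) / np.sum(w)
        Q = np.sum(w * (y - mu_fe) ** 2)
        tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
        w = 1.0 / (v + tau2)
        mu = np.sum(w * y) / np.sum(w)
        # Scaling factor q applied to the conventional variance 1/sum(w)
        q = np.sum(w * (y - mu) ** 2) / (k - 1)
        se = np.sqrt(q / np.sum(w))
        half = stats.t.ppf(0.975, k - 1) * se
        return mu, (mu - half, mu + half)

    print(refined_meta([0.3, 0.1, 0.5, 0.25], [0.04, 0.05, 0.06, 0.03]))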
Shamata, Awatif; Thompson, Tim
2018-05-10
Non-contact three-dimensional (3D) surface scanning has been applied in forensic medicine and has been shown to mitigate the shortcomings of traditional documentation methods. The aim of this paper is to assess the efficiency of structured-light 3D surface scanning in recording traumatic injuries in living subjects in clinical forensic medicine. The work was conducted at the Medico-Legal Centre in Benghazi, Libya. A structured-light 3D surface scanner and an ordinary digital camera with a close-up lens were used to produce 3D and two-dimensional (2D) records of the same injuries. Two different types of comparison were performed. First, the 3D wound documentation was compared with the 2D documentation on the basis of subjective visual assessment. Additionally, 3D wound measurements were compared with conventional measurements to determine whether there was a statistically significant difference between them; the Friedman test was used for this purpose. The study established that the 3D wound documentation had extra features over the 2D documentation, and the 3D scanning method was able to overcome the main deficiencies of digital photography. No statistically significant difference was found between the 3D and conventional wound measurements, and Spearman's correlation established a strong, positive correlation between the two measurement methods. Although 3D surface scanning of the injuries of living subjects faced some difficulties, the 3D results were appreciated, and the validity of 3D measurements based on structured-light 3D scanning was established. Further work will be undertaken in forensic pathology to scan open injuries with depth information. Crown Copyright © 2018. Published by Elsevier Ltd. All rights reserved.
Saha, Sonali; Jaiswal, JN; Samadi, Firoza
2014-01-01
Aim: The present study was undertaken to clinically evaluate and compare the effectiveness of a transcutaneous electrical nerve stimulator (TENS) and a comfort control syringe (CCS) in various pediatric dental procedures as alternatives to the conventional method of local anesthesia (LA) administration. Materials and methods: Ninety healthy children aged 6 to 10 years, each having at least one deciduous molar indicated for extraction in either the maxillary right or left quadrant, were randomly divided into three equal groups of 30 subjects each. Group I: LA administration using a conventional syringe; group II: LA administration using TENS along with the conventional syringe; group III: LA administration using CCS. After LA by the three techniques, pain, anxiety and heart rate were measured. Statistical analysis: The observations thus obtained were subjected to statistical analysis using analysis of variance (ANOVA), Student's t-test and the paired t-test. Results: The mean pain score was highest in group I followed by group II, while group III, where LA was administered using CCS, showed the lowest pain. The mean anxiety score was highest in group I followed by group II, with group III showing the lowest score. Mean heart rate was highest in group I, followed in descending order by groups II and III. Conclusion: The study supports the belief that CCS could be a viable alternative to the other two methods of LA delivery in children. How to cite this article: Bansal N, Saha S, Jaiswal JN, Samadi F. Pain Elimination during Injection with Newer Electronic Devices: A Comparative Evaluation in Children. Int J Clin Pediatr Dent 2014;7(2):71-76. PMID:25356003
Digital versus conventional techniques for pattern fabrication of implant-supported frameworks
Alikhasi, Marzieh; Rohanian, Ahmad; Ghodsi, Safoura; Kolde, Amin Mohammadpour
2018-01-01
Objective: The aim of this experimental study was to compare the retention of frameworks cast from wax patterns fabricated by three different methods. Materials and Methods: Thirty-six implant analogs connected to one-piece abutments were divided randomly into three groups according to the wax-pattern fabrication method (n = 12). A computer-aided design/computer-aided manufacturing (CAD/CAM) milling machine, a three-dimensional printer, and the conventional technique were used for fabrication of the wax patterns. All laboratory procedures were performed by a single experienced technician to eliminate operator bias. The wax patterns were cast, finished, and seated on the related abutment analogs. The number of adjustments required was recorded and analyzed by the Kruskal-Wallis test. Frameworks were cemented on the corresponding analogs with zinc phosphate cement, and a tensile resistance test was used to measure retention. Statistical Analysis Used: One-way analysis of variance (ANOVA) and post hoc Tukey tests were used for statistical analysis. The level of significance was set at P < 0.05. Results: Mean retentive values of 680.36 ± 21.93 N, 440.48 ± 85.98 N, and 407.23 ± 67.48 N were recorded for the CAD/CAM, rapid prototyping, and conventional groups, respectively. One-way ANOVA revealed significant differences among the three groups (P < 0.001). The post hoc Tukey test showed significantly higher retention for the CAD/CAM group (P < 0.001), while there was no significant difference between the two other groups (P = 0.54). The CAD/CAM group required significantly more adjustments (P < 0.001). Conclusions: CAD/CAM-fabricated wax patterns showed significantly higher retention for implant-supported cement-retained frameworks; this could be a valuable help when there are limitations on the retention of single-unit implant restorations. PMID:29657528
Ghorbanpour, Arsalan; Azghani, Mahmoud Reza; Taghipour, Mohammad; Salahzadeh, Zahra; Ghaderi, Fariba; Oskouei, Ali E
2018-04-01
[Purpose] The aim of this study was to compare the effects of McGill stabilization exercises and conventional physiotherapy on pain, functional disability, and active back flexion and extension range of motion in patients with chronic non-specific low back pain. [Subjects and Methods] Thirty-four patients with chronic non-specific low back pain were randomly assigned to a McGill stabilization exercises group (n=17) or a conventional physiotherapy group (n=17). In both groups, patients performed the corresponding exercises for six weeks. The visual analog scale (VAS), the Quebec Low Back Pain Disability Scale questionnaire, and an inclinometer were used to measure pain, functional disability, and active back flexion and extension range of motion, respectively. [Results] Statistically significant improvements were observed in pain, functional disability, and active back extension range of motion in the McGill stabilization exercises group. However, active back flexion range of motion was the only clinical outcome that increased significantly in patients who performed conventional physiotherapy. There were no significant differences in clinical characteristics between the two groups. [Conclusion] The results of this study indicate that McGill stabilization exercises and conventional physiotherapy provide approximately similar improvements in pain, functional disability, and active back range of motion in patients with chronic non-specific low back pain. However, it appears that McGill stabilization exercises provide an additional benefit to patients with chronic non-specific low back pain, especially in pain and functional disability improvement.
Precision of guided scanning procedures for full-arch digital impressions in vivo.
Zimmermann, Moritz; Koller, Christina; Rumetsch, Moritz; Ender, Andreas; Mehl, Albert
2017-11-01
System-specific scanning strategies have been shown to influence the accuracy of full-arch digital impressions. Special guided scanning procedures have been implemented for specific intraoral scanning systems with particular regard to the digital orthodontic workflow. The aim of this study was to evaluate the precision of guided scanning procedures compared to a conventional impression technique in vivo. Two intraoral scanning systems with implemented full-arch guided scanning procedures (Cerec Omnicam Ortho; Ormco Lythos) were included along with one conventional impression technique with irreversible hydrocolloid material (alginate). Full-arch impressions were taken three times each from 5 participants (n = 15). Impressions were then compared within the test groups using a point-to-surface distance method after best-fit model matching (OraCheck). Precision was calculated using the (90%-10%)/2 quantile, and statistical analysis with one-way repeated measures ANOVA and post hoc Bonferroni test was performed. The conventional impression technique with alginate showed the lowest precision for full-arch impressions, at 162.2 ± 71.3 µm. Both guided scanning procedures performed statistically significantly better than the conventional impression technique (p < 0.05). Mean values were 74.5 ± 39.2 µm for group Cerec Omnicam Ortho and 91.4 ± 48.8 µm for group Ormco Lythos. The in vivo precision of guided scanning procedures exceeds that of conventional impression techniques with the irreversible hydrocolloid material alginate. Guided scanning procedures may be highly promising for clinical applications, especially for digital orthodontic workflows.
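The (90%-10%)/2 quantile used above as the precision measure is straightforward to compute once the signed point-to-surface deviations between two superimposed models are available. A minimal sketch assuming the deviations have been exported as a flat array (OraCheck's internal processing is not public):

```python
import numpy as np

def precision_quantile(deviations_um: np.ndarray) -> float:
    """Half the spread between the 90th and 10th percentile of
    signed point-to-surface deviations (in micrometres)."""
    q10, q90 = np.percentile(deviations_um, [10, 90])
    return (q90 - q10) / 2.0

# Hypothetical deviations from one pair of repeated full-arch scans.
rng = np.random.default_rng(1)
deviations = rng.normal(0.0, 90.0, 50_000)  # µm
print(f"precision: {precision_quantile(deviations):.1f} µm")
```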
Adaptive filtering in biological signal processing.
Iyer, V K; Ploysongsang, Y; Ramamoorthy, P A
1990-01-01
The high dependence of conventional optimal filtering methods on the a priori knowledge of the signal and noise statistics renders them ineffective in dealing with signals whose statistics cannot be predetermined accurately. Adaptive filtering methods offer a better alternative, since the a priori knowledge of statistics is less critical, real time processing is possible, and the computations are less expensive for this approach. Adaptive filtering methods compute the filter coefficients "on-line", converging to the optimal values in the least-mean-square (LMS) error sense. Adaptive filtering is therefore apt for dealing with the "unknown" statistics situation and has been applied extensively in areas like communication, speech, radar, sonar, seismology, and biological signal processing and analysis for channel equalization, interference and echo canceling, line enhancement, signal detection, system identification, spectral analysis, beamforming, modeling, control, etc. In this review article adaptive filtering in the context of biological signals is reviewed. An intuitive approach to the underlying theory of adaptive filters and its applicability are presented. Applications of the principles in biological signal processing are discussed in a manner that brings out the key ideas involved. Current and potential future directions in adaptive biological signal processing are also discussed.
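The LMS update at the heart of most adaptive filters reviewed here takes only a few lines. A minimal noise-cancellation sketch, with a reference noise input x and a primary signal d = signal + correlated noise (the step size and filter length below are illustrative choices):

```python
import numpy as np

def lms_filter(x, d, n_taps=32, mu=0.01):
    """Least-mean-square adaptive filter.
    x: reference input, d: desired (primary) signal.
    Returns the error signal e = d - y, which is the cleaned output
    in a noise-cancellation configuration."""
    w = np.zeros(n_taps)
    e = np.zeros(len(x))
    for n in range(n_taps, len(x)):
        u = x[n - n_taps:n][::-1]        # most recent samples first
        y = w @ u                        # filter output
        e[n] = d[n] - y                  # estimation error
        w += 2 * mu * e[n] * u           # LMS coefficient update
    return e

# Toy example: recover a slow sinusoid buried in correlated noise.
rng = np.random.default_rng(0)
t = np.arange(5000)
noise = rng.standard_normal(5000)
signal = np.sin(2 * np.pi * t / 500)
d = signal + np.convolve(noise, [0.5, 0.3, 0.2], mode="same")
cleaned = lms_filter(noise, d)
```

Convergence and misadjustment trade off through mu: larger steps adapt faster but leave more residual coefficient noise.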
Hydrogen peroxide test for intraoperative bile leak detection.
Trehan, V; Rao, Pankaj P; Naidu, C S; Sharma, Anuj K; Singh, A K; Sharma, Sanjay; Gaur, Amit; Kulkarni, S V; Pathak, N
2017-07-01
Bile leakage (BL) is a common complication following liver surgery, ranging from 3 to 27% in different series. Various intraoperative tests have long been applied to reduce the incidence of post-operative BL, but no method is foolproof and each has its own limitations. In this study we used a relatively simple technique to detect BL intra-operatively. Topical application of 1.5% diluted hydrogen peroxide (H2O2) was used to detect BL from the cut surface of the liver, and we compared it with the conventional saline method to assess its efficacy. A total of 31 patients who underwent liver resection or donor hepatectomy as part of living donor liver transplantation were included. After complete liver resection, the conventional saline test followed by the topical diluted 1.5% H2O2 test was performed on all patients. A BL was demonstrated in 11 patients (35.48%) by the conventional saline method and in 19 patients (61.29%) by the H2O2 method. Comparison by the Wilcoxon signed-rank test showed a significant difference for the minor liver resections group (P = 0.014) and the major liver resections group (P = 0.002). The topical application of H2O2 is a simple and effective method for detecting BL from the cut surface of the liver. It is an easy, non-invasive, cheap, less time consuming, reproducible, and sensitive technique with no obvious disadvantages.
Falgreen, Steffen; Laursen, Maria Bach; Bødker, Julie Støve; Kjeldsen, Malene Krag; Schmitz, Alexander; Nyegaard, Mette; Johnsen, Hans Erik; Dybkær, Karen; Bøgsted, Martin
2014-06-05
Background In vitro generated dose-response curves of human cancer cell lines are widely used to develop new therapeutics. The curves are summarised by simplified statistics that ignore the conventionally used dose-response curves’ dependency on drug exposure time and growth kinetics. This may lead to suboptimal exploitation of data and biased conclusions on the potential of the drug in question. Therefore we set out to improve the dose-response assessments by eliminating the impact of time dependency. Results First, a mathematical model for drug induced cell growth inhibition was formulated and used to derive novel dose-response curves and improved summary statistics that are independent of time under the proposed model. Next, a statistical analysis workflow for estimating the improved statistics was suggested consisting of 1) nonlinear regression models for estimation of cell counts and doubling times, 2) isotonic regression for modelling the suggested dose-response curves, and 3) resampling based method for assessing variation of the novel summary statistics. We document that conventionally used summary statistics for dose-response experiments depend on time so that fast growing cell lines compared to slowly growing ones are considered overly sensitive. The adequacy of the mathematical model is tested for doxorubicin and found to fit real data to an acceptable degree. Dose-response data from the NCI60 drug screen were used to illustrate the time dependency and demonstrate an adjustment correcting for it. The applicability of the workflow was illustrated by simulation and application on a doxorubicin growth inhibition screen. The simulations show that under the proposed mathematical model the suggested statistical workflow results in unbiased estimates of the time independent summary statistics. Variance estimates of the novel summary statistics are used to conclude that the doxorubicin screen covers a significant diverse range of responses ensuring it is useful for biological interpretations. Conclusion Time independent summary statistics may aid the understanding of drugs’ action mechanism on tumour cells and potentially renew previous drug sensitivity evaluation studies. PMID:24902483
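Step 2 of this workflow, the monotone dose-response fit, maps directly onto off-the-shelf isotonic regression, and the step 3 resampling is a bootstrap over wells. A minimal sketch with simulated viability data (the paper defines its own summary statistics; the area under the curve below is an illustrative stand-in):

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

rng = np.random.default_rng(0)
log_dose = np.repeat(np.linspace(-3, 2, 11), 6)   # 11 doses, 6 replicates
true = 1 / (1 + np.exp(2.0 * (log_dose + 0.5)))   # decreasing response
viability = np.clip(true + rng.normal(0, 0.05, true.size), 0, None)

# Monotone (non-increasing) dose-response curve.
iso = IsotonicRegression(increasing=False)
fit = iso.fit_transform(log_dose, viability)

def auc(x, y):
    """Trapezoidal area under the curve, one possible summary statistic."""
    o = np.argsort(x)
    x, y = x[o], y[o]
    return float(np.sum((y[1:] + y[:-1]) / 2 * np.diff(x)))

# Bootstrap over replicate wells to assess the variance of the statistic.
boots = []
for _ in range(500):
    idx = rng.integers(0, log_dose.size, log_dose.size)
    f = IsotonicRegression(increasing=False).fit_transform(
        log_dose[idx], viability[idx])
    boots.append(auc(log_dose[idx], f))
print(f"AUC = {auc(log_dose, fit):.3f} ± {np.std(boots):.3f}")
```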
JIANG, QUAN; ZHANG, YUAN; CHEN, JIAN; ZHANG, YUN-XIAO; HE, ZHU
2014-01-01
The aim of this study was to investigate the diagnostic value of the Virtual Touch™ tissue quantification (VTQ) and elastosonography technologies in benign and malignant breast tumors. Routine preoperative ultrasound, elastosonography and VTQ examinations were performed on 86 patients with breast lesions. The elastosonography score and VTQ speed grouping of each lesion were measured and compared with the pathological findings. The difference in the elastosonography score between the benign and malignant breast tumors was statistically significant (P<0.05). The detection rate for an elastosonography score of 1–3 points in benign tumors was 68.09% and that for an elastosonography score of 4–5 points in malignant tumors was 82.05%. The difference in VTQ speed values between the benign and malignant tumors was also statistically significant (P<0.05). In addition, the diagnostic accuracy of conventional ultrasound, elastosonography, VTQ technology and the combined methods showed statistically significant differences (P<0.05). The use of the three technologies in combination significantly improved the diagnostic accuracy to 91.86%. In conclusion, the combination of conventional ultrasound, elastosonography and VTQ technology can significantly improve accuracy in the diagnosis of breast cancer. PMID:25187797
Extending Local Canonical Correlation Analysis to Handle General Linear Contrasts for fMRI Data
Jin, Mingwu; Nandy, Rajesh; Curran, Tim; Cordes, Dietmar
2012-01-01
Local canonical correlation analysis (CCA) is a multivariate method that has been proposed to more accurately determine activation patterns in fMRI data. In its conventional formulation, CCA has several drawbacks that limit its usefulness in fMRI. A major drawback is that, unlike the general linear model (GLM), a test of general linear contrasts of the temporal regressors has not been incorporated into the CCA formalism. To overcome this drawback, a novel directional test statistic was derived using the equivalence of multivariate multiple regression (MVMR) and CCA. This extension will allow CCA to be used for inference of general linear contrasts in more complicated fMRI designs without reparameterization of the design matrix and without reestimating the CCA solutions for each particular contrast of interest. With the proper constraints on the spatial coefficients of CCA, this test statistic can yield a more powerful test on the inference of evoked brain regional activations from noisy fMRI data than the conventional t-test in the GLM. The quantitative results from simulated and pseudoreal data and activation maps from fMRI data were used to demonstrate the advantage of this novel test statistic. PMID:22461786
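The starting point for the MVMR-CCA equivalence used above is the ordinary canonical correlation between the temporal design matrix and a local voxel neighbourhood. A minimal numpy sketch of that base computation via QR and SVD (the directional contrast statistic itself follows the paper's derivation and is not reproduced here):

```python
import numpy as np

def canonical_correlations(X, Y):
    """Canonical correlations between the column spaces of X and Y.
    X: (T, p) temporal regressors, Y: (T, q) voxel time series."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    Qx, _ = np.linalg.qr(Xc)
    Qy, _ = np.linalg.qr(Yc)
    s = np.linalg.svd(Qx.T @ Qy, compute_uv=False)
    return np.clip(s, 0.0, 1.0)

# Toy data: 200 time points, 3 regressors, 9-voxel neighbourhood.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 3))
Y = 0.5 * X @ rng.standard_normal((3, 9)) + rng.standard_normal((200, 9))
print(canonical_correlations(X, Y))
```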
Model independent approach to the single photoelectron calibration of photomultiplier tubes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saldanha, R.; Grandi, L.; Guardincerri, Y.
2017-08-01
The accurate calibration of photomultiplier tubes is critical in a wide variety of applications in which it is necessary to know the absolute number of detected photons or precisely determine the resolution of the signal. Conventional calibration methods rely on fitting the photomultiplier response to a low intensity light source with analytical approximations to the single photoelectron distribution, often leading to biased estimates due to the inability to accurately model the full distribution, especially at low charge values. In this paper we present a simple statistical method to extract the relevant single photoelectron calibration parameters without making any assumptions about the underlying single photoelectron distribution. We illustrate the use of this method through the calibration of a Hamamatsu R11410 photomultiplier tube and study the accuracy and precision of the method using Monte Carlo simulations. The method is found to have significantly reduced bias compared to conventional methods and works under a wide range of light intensities, making it suitable for simultaneously calibrating large arrays of photomultiplier tubes.
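A model-independent estimate of this kind can be illustrated with simple moment relations: if the number of photoelectrons per trigger is Poisson with mean λ, the measured charge is a compound-Poisson sum, so the pedestal-subtracted mean and variance of the spectrum yield the SPE mean and width without assuming any SPE shape. A hedged sketch under those textbook assumptions (the paper's exact estimator and occupancy determination may differ):

```python
import numpy as np

rng = np.random.default_rng(0)
n_events, lam = 200_000, 0.1          # low occupancy: ~0.1 PE per trigger
mu1, sig1 = 1.6e6, 0.5e6              # "true" SPE mean and width (electrons)
ped_sig = 0.1e6                       # pedestal (noise) width

# Simulated illuminated spectrum: pedestal noise + compound-Poisson signal.
n_pe = rng.poisson(lam, n_events)
charge = rng.normal(0.0, ped_sig, n_events)
charge += np.where(
    n_pe > 0,
    rng.normal(n_pe * mu1, np.sqrt(np.maximum(n_pe, 1)) * sig1),
    0.0)
pedestal = rng.normal(0.0, ped_sig, n_events)   # dark (no-light) run

# Occupancy from the zero-PE fraction; the threshold sits well above the
# pedestal noise, so only a small tail of 1-PE events leaks below it.
thr = 5 * ped_sig
lam_hat = -np.log(np.mean(charge < thr))

# Compound-Poisson moments: mean = lam*mu1, var = ped_var + lam*(sig1^2+mu1^2).
mu1_hat = (charge.mean() - pedestal.mean()) / lam_hat
var_spe = (charge.var() - pedestal.var()) / lam_hat - mu1_hat**2
print(f"lambda {lam_hat:.3f}, SPE mean {mu1_hat:.3e}, "
      f"SPE width {np.sqrt(var_spe):.3e}")
```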
Theory of atomic spectral emission intensity
NASA Astrophysics Data System (ADS)
Yngström, Sten
1994-07-01
The theoretical derivation of a new spectral line intensity formula for atomic radiative emission is presented. The theory is based on first principles of quantum physics, electrodynamics, and statistical physics. Quantum rules lead to revision of the conventional principle of local thermal equilibrium of matter and radiation. Study of electrodynamics suggests absence of spectral emission from fractions of the numbers of atoms and ions in a plasma due to radiative inhibition caused by electromagnetic force fields. Statistical probability methods are extended by the statement: A macroscopic physical system develops in the most probable of all conceivable ways consistent with the constraining conditions for the system. The crucial role of statistical physics in transforming quantum logic into common sense logic is stressed. The theory is strongly supported by experimental evidence.
Holographic Refraction and the Measurement of Spherical Ametropia.
Nguyen, Nicholas Hoai Nam
2016-10-01
To evaluate the performance of a holographic logMAR chart for the subjective spherical refraction of the human eye. Bland-Altman analysis was used to assess the level of agreement between subjective spherical refraction using the holographic logMAR chart and conventional autorefraction and subjective spherical refraction. The 95% limits of agreement (LoA) were calculated between holographic refraction and the two standard methods (subjective and autorefraction). Holographic refraction has a lower mean spherical refraction when compared to conventional refraction (LoA 0.11 ± 0.65 D) and when compared to autorefraction (LoA 0.36 ± 0.77 D). After correcting for systematic bias, this is comparable to the agreement between autorefraction and conventional subjective refraction (LoA 0.45 ± 0.79 D). After correcting for differences in vergence distance and chromatic aberration between holographic and conventional refraction, approximately 65% (group 1) of measurements between holography and conventional subjective refraction were similar (MD = 0.13 D, SD = 0.00 D). The remaining 35% (group 2) had a mean difference of 0.45 D (SD = 0.12 D) between the two subjective methods. Descriptive statistics showed group 2's mean age (21 years, SD = 13 years) was considerably lower than group 1's mean age (41 years, SD = 17 years), suggesting accommodation may have had a role in the greater mean difference of group 2. Overall, holographic refraction has good agreement with conventional refraction and is a viable alternative for spherical subjective refraction. A larger bias between holographic and conventional refraction was found in younger subjects than older subjects, suggesting an association between accommodation and myopic over-correction during holographic refraction.
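The Bland-Altman agreement reported above reduces to the mean paired difference (bias) and the 95% limits of agreement, bias ± 1.96 SD of the differences. A minimal sketch with hypothetical paired spherical refraction values:

```python
import numpy as np

def bland_altman(a, b):
    """Bias and 95% limits of agreement between paired measurements."""
    diff = np.asarray(a) - np.asarray(b)
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)
    return bias, bias - half_width, bias + half_width

# Hypothetical spherical refractions (D): holographic vs conventional.
holo = np.array([-1.25, -0.50, -2.00, 0.25, -3.25, -0.75])
conv = np.array([-1.50, -0.75, -2.25, 0.00, -3.25, -1.00])
bias, lo, hi = bland_altman(holo, conv)
print(f"bias {bias:+.2f} D, 95% LoA [{lo:+.2f}, {hi:+.2f}] D")
```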
On the establishment and maintenance of a modern conventional terrestrial reference system
NASA Technical Reports Server (NTRS)
Bock, Y.; Zhu, S. Y.
1982-01-01
The frame of the Conventional Terrestrial Reference System (CTS) is defined by an adopted set of coordinates, at a fundamental epoch, of a global network of stations which constitute the vertices of a fundamental polyhedron. A method to estimate this set of coordinates using a combination of modern three dimensional geodetic systems is presented. Once established, the function of the CTS is twofold. The first is to monitor the external (or global) motions of the polyhedron with respect to the frame of a Conventional Inertial Reference System, i.e., those motions common to all stations. The second is to monitor the internal motions (or deformations) of the polyhedron, i.e., those motions that are not common to all stations. Two possible estimators for use in earth deformation analysis are given and their statistical and physical properties are described.
Research on ionospheric tomography based on variable pixel height
NASA Astrophysics Data System (ADS)
Zheng, Dunyong; Li, Peiqing; He, Jie; Hu, Wusheng; Li, Chaokui
2016-05-01
A novel ionospheric tomography technique based on variable pixel height was developed for the tomographic reconstruction of the ionospheric electron density distribution. The method considers the height of each pixel as an unknown variable, which is retrieved during the inversion process together with the electron density values. In contrast to conventional computerized ionospheric tomography (CIT), which parameterizes the model with a fixed pixel height, the variable-pixel-height computerized ionospheric tomography (VHCIT) model applies a disturbance to the height of each pixel. In comparison with conventional CIT models, the VHCIT technique achieved superior results in a numerical simulation. A careful validation of the reliability and superiority of VHCIT was performed. According to the results of the statistical analysis of the average root mean square errors, the proposed model offers an improvement of 15% compared with conventional CIT models.
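Both conventional CIT and VHCIT ultimately invert a large sparse linear system relating slant TEC observations to pixel electron densities; VHCIT additionally appends the per-pixel height perturbations to the unknown vector. A minimal sketch of the conventional inversion step using Kaczmarz's algebraic reconstruction technique (ART), one standard CIT solver (the VHCIT height update is specific to the paper and not reproduced):

```python
import numpy as np

def art(A, y, n_sweeps=50, relax=0.5):
    """Kaczmarz / ART solver for A x ~ y, where rows of A are ray paths,
    columns are pixels, and x holds the electron densities."""
    x = np.zeros(A.shape[1])
    row_norm2 = (A * A).sum(axis=1)
    for _ in range(n_sweeps):
        for i in range(A.shape[0]):
            if row_norm2[i] == 0:
                continue
            r = y[i] - A[i] @ x
            x += relax * r / row_norm2[i] * A[i]
        x = np.maximum(x, 0.0)      # electron density is non-negative
    return x

# Toy system: 120 rays through a 10 x 10 pixel grid.
rng = np.random.default_rng(0)
A = rng.random((120, 100)) * (rng.random((120, 100)) < 0.1)
x_true = rng.random(100)
x_hat = art(A, A @ x_true)
```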
Precipitate statistics in an Al-Mg-Si-Cu alloy from scanning precession electron diffraction data
NASA Astrophysics Data System (ADS)
Sunde, J. K.; Paulsen, Ø.; Wenner, S.; Holmestad, R.
2017-09-01
The key microstructural feature providing strength to age-hardenable Al alloys is nanoscale precipitates. Alloy development requires a reliable statistical assessment of these precipitates, in order to link the microstructure with material properties. Here, it is demonstrated that scanning precession electron diffraction combined with computational analysis enables the semi-automated extraction of precipitate statistics in an Al-Mg-Si-Cu alloy. Among the main findings is the precipitate number density, which agrees well with a conventional method based on manual counting and measurements. By virtue of its data analysis objectivity, our methodology is therefore seen as an advantageous alternative to existing routines, offering reproducibility and efficiency in alloy statistics. Additional results include improved qualitative information on phase distributions. The developed procedure is generic and applicable to any material containing nanoscale precipitates.
Vojdani, M; Torabi, K; Farjood, E; Khaledi, Aar
2013-09-01
Metal-ceramic crowns are the most commonly used complete-coverage restorations in daily clinical practice. The disadvantages of conventional hand-made wax patterns have motivated alternative approaches based on CAD/CAM technologies. This study compares the marginal and internal fit of copings cast from CAD/CAM and conventionally fabricated wax patterns. Twenty-four standardized brass dies were prepared and randomly divided into 2 groups according to the wax-pattern fabrication method (CAD/CAM technique and conventional method) (n=12). All the wax patterns were fabricated in a standard fashion with respect to contour, thickness and internal relief (M1-M12: CAD/CAM group; C1-C12: conventional group). A CAD/CAM milling machine (Cori TEC 340i; imes-icore GmbH, Eiterfeld, Germany) was used to fabricate the CAD/CAM group wax patterns. The copings cast from the 24 wax patterns were cemented to the corresponding dies. For all coping-die assemblies, a cross-sectional technique was used to evaluate the marginal and internal fit at 15 points. Student's t-test was used for statistical analysis (α=0.05). The overall mean (SD) absolute marginal discrepancy (AMD) was 254.46 (25.10) µm for the CAD/CAM group and 88.08 (10.67) µm for the conventional group (control). The overall mean internal gap total (IGT) was 110.77 (5.92) µm for the CAD/CAM group and 76.90 (10.17) µm for the conventional group. Student's t-test revealed significant differences between the 2 groups. Marginal and internal gaps were significantly higher at all measured areas in the CAD/CAM group than in the conventional group (p< 0.001). Within the limitations of this study, the conventional method of wax-pattern fabrication produced copings with significantly better marginal and internal fit than the CAD/CAM (machine-milled) technique. All factors were standardized between the 2 groups except the wax-pattern fabrication technique; therefore, only the conventional group resulted in copings with clinically acceptable margins of less than 120 µm.
Biolik, A; Heide, S; Lessig, R; Hachmann, V; Stoevesandt, D; Kellner, J; Jäschke, C; Watzke, S
2018-04-01
One option for improving the quality of medical post mortem examinations is intensified training of medical students, especially in countries where such examinations are required regardless of the area of specialisation. For this reason, new teaching and learning methods on this topic have recently been introduced, including e-learning modules and SkillsLab stations; one way to objectify the resulting learning outcomes is the OSCE process. However, despite offering several advantages, this examination format also requires considerable resources, in particular with regard to medical examiners. For this reason, many clinical disciplines have already implemented computer-based OSCE examination formats. This study investigates whether the conventional exam format for the OSCE forensic "Death Certificate" station could be replaced with a computer-based approach in future. For this study, 123 students completed the OSCE "Death Certificate" station in both a computer-based and a conventional format, with half starting with the computer-based approach and the other half with the conventional approach in their OSCE rotation. Assignment of examination cases was random. The examination results for the two stations were compared, and both the overall results and the individual items of the exam checklist were analysed by means of inferential statistics. Following statistical analysis of examination cases of varying difficulty levels and correction for the repeated-measures effect, the results of the two examination formats appear to be comparable. Thus, in the descriptive item analysis, while there were some significant differences between the computer-based and conventional OSCE stations, these differences were not reflected in the overall results after a correction factor was applied (e.g. point deductions for assistance from the medical examiner were possible only at the conventional station). Thus, we demonstrate that the computer-based OSCE "Death Certificate" station is a cost-efficient and standardised examination format that yields results comparable to those from a conventional format exam. Moreover, the examination results also indicate the need to optimise both the test itself (adjusting the degree of difficulty of the case vignettes) and the corresponding instructional and learning methods (including, for example, the use of computer programmes to complete the death certificate in small-group formats in the SkillsLab). Copyright © 2018 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
Statistical Methods for Generalized Linear Models with Covariates Subject to Detection Limits.
Bernhardt, Paul W; Wang, Huixia J; Zhang, Daowen
2015-05-01
Censored observations are a common occurrence in biomedical data sets. Although a large amount of research has been devoted to estimation and inference for data with censored responses, very little research has focused on proper statistical procedures when predictors are censored. In this paper, we consider statistical methods for dealing with multiple predictors subject to detection limits within the context of generalized linear models. We investigate and adapt several conventional methods and develop a new multiple imputation approach for analyzing data sets with predictors censored due to detection limits. We establish the consistency and asymptotic normality of the proposed multiple imputation estimator and suggest a computationally simple and consistent variance estimator. We also demonstrate that the conditional mean imputation method often leads to inconsistent estimates in generalized linear models, while several other methods are either computationally intensive or lead to parameter estimates that are biased or more variable compared to the proposed multiple imputation estimator. In an extensive simulation study, we assess the bias and variability of different approaches within the context of a logistic regression model and compare variance estimation methods for the proposed multiple imputation estimator. Lastly, we apply several methods to analyze the data set from a recently-conducted GenIMS study.
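The proposed multiple imputation estimator can be caricatured in three steps: draw each below-limit predictor from its distribution truncated above at the detection limit, fit the GLM on each completed data set, and pool with Rubin's rules. A deliberately simplified toy sketch for logistic regression, assuming a normal predictor and a known detection limit (the paper's imputation model conditions on the outcome and the other covariates, which this sketch omits, and the moment-based estimate of the predictor distribution below is a crude stand-in for a censored-likelihood fit):

```python
import numpy as np
from scipy import stats
import statsmodels.api as sm

rng = np.random.default_rng(0)
n, lod, m_imp = 500, -0.5, 20

x = rng.standard_normal(n)
y = rng.binomial(1, 1 / (1 + np.exp(-(0.5 + 1.0 * x))))
censored = x < lod                 # only "x below the LOD" is recorded

# Crude stand-in for the predictor distribution (biased; a censored
# maximum-likelihood fit would be preferred in practice).
mu, sd = x[~censored].mean(), x[~censored].std(ddof=1)

betas, variances = [], []
for _ in range(m_imp):
    xi = x.copy()
    # Draw censored values from N(mu, sd) truncated above at the LOD.
    b_std = (lod - mu) / sd
    xi[censored] = stats.truncnorm.rvs(-np.inf, b_std, loc=mu, scale=sd,
                                       size=censored.sum(), random_state=rng)
    fit = sm.Logit(y, sm.add_constant(xi)).fit(disp=0)
    betas.append(fit.params[1])
    variances.append(fit.bse[1] ** 2)

# Rubin's rules: pooled estimate and total variance.
beta = np.mean(betas)
total_var = np.mean(variances) + (1 + 1 / m_imp) * np.var(betas, ddof=1)
print(f"beta = {beta:.3f} ± {np.sqrt(total_var):.3f}")
```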
Optimal weighting of data to detect climatic change - Application to the carbon dioxide problem
NASA Technical Reports Server (NTRS)
Bell, T. L.
1982-01-01
It is suggested that a weighting of surface temperature data, using information about the expected level of warming in different seasons and geographical regions and statistical information about the amount of natural variability in surface temperature, can improve the chances of early detection of carbon dioxide concentration-induced climatic warming. A preliminary analysis of the optimal weighting method presented suggests that it is 25 per cent more effective in revealing surface warming than the conventional method, in virtue of the fact that 25 per cent more data must conventionally be analyzed in order to arrive at a similar probability of detection. An approximate calculation suggests that the warming ought to have already been detected, if the only sources of significant surface temperature variability had time scales of less than one year.
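The weighting described here is the classic matched-filter result: given an expected warming pattern s across regions and seasons and the covariance Σ of natural variability, weights w ∝ Σ⁻¹s maximize the signal-to-noise ratio of the weighted mean. A minimal sketch with an illustrative toy covariance (numbers are not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
k = 8                                    # regions/seasons
s = rng.uniform(0.5, 2.0, k)             # expected warming pattern (K)

# Toy covariance of natural variability with mild cross-correlation.
A = rng.standard_normal((k, k))
cov = 0.2 * A @ A.T + np.diag(rng.uniform(0.5, 1.5, k))

w = np.linalg.solve(cov, s)              # optimal (matched-filter) weights
w /= w @ s                               # normalize: unit response to signal

snr_opt = np.sqrt(s @ np.linalg.solve(cov, s))
uniform = np.ones(k) / k
snr_uni = (uniform @ s) / np.sqrt(uniform @ cov @ uniform)
print(f"SNR optimal {snr_opt:.2f} vs uniform {snr_uni:.2f}")
```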
Stabile, Sueli Aparecida Batista; Evangelista, Dilson Henrique Ramos; Talamonte, Valdely Helena; Lippi, Umberto Gazi; Lopes, Reginaldo Guedes Coelho
2012-01-01
To compare two cervical oncotic cytology techniques, conventional and liquid-based cytology, in patients at low risk for uterine cervical cancer. Comparative prospective study with 100 patients who came for their annual gynecological exam and underwent both techniques simultaneously. We used the McNemar test, with a significance level of p < 0.05, to compare the results regarding adequacy of smear quality, prevalence of descriptive diagnoses, guided biopsy confirmation and histology. Adequacy of the smear was similar for both methods. Representation of the squamocolumnar junction, present in 93% of conventional cytology and 84% of liquid-based cytology smears, differed with statistical significance. As for the diagnosis of atypical cells, they were detected in 3% of conventional cytology and in 10% of liquid-based cytology (p = 0.06). Atypical squamous cells of undetermined significance were the most prevalent abnormality. The liquid-based cytology performance was better when compared with colposcopy (guided biopsy), presenting sensitivity of 66.7% and specificity of 100%. There was no cytological-histological concordance for conventional cytology.
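The McNemar test used here compares paired binary outcomes, each patient contributing one result per technique, so only the discordant pairs drive the statistic. A minimal sketch with a hypothetical paired 2×2 table (counts are illustrative, not the study's data):

```python
import numpy as np
from statsmodels.stats.contingency_tables import mcnemar

# Rows: conventional (abnormal / normal); columns: liquid-based.
table = np.array([[2, 1],
                  [8, 89]])    # hypothetical paired counts, n = 100
result = mcnemar(table, exact=True)   # exact binomial on discordant pairs
print(f"statistic = {result.statistic}, p = {result.pvalue:.3f}")
```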
Ahmad, Aqeel; Negri, Ignacio; Oliveira, Wladecir; Brown, Christopher; Asiimwe, Peter; Sammons, Bernard; Horak, Michael; Jiang, Changjian; Carson, David
2016-02-01
As part of an environmental risk assessment, the potential impact of genetically modified (GM) maize MON 87411 on non-target arthropods (NTAs) was evaluated in the field. MON 87411 confers resistance to corn rootworm (CRW; Diabrotica spp.) by expressing an insecticidal double-stranded RNA (dsRNA) transcript and the Cry3Bb1 protein and tolerance to the herbicide glyphosate by producing the CP4 EPSPS protein. Field trials were conducted at 14 sites providing high geographic and environmental diversity within maize production areas from three geographic regions including the U.S., Argentina, and Brazil. MON 87411, the conventional control, and four commercial conventional reference hybrids were evaluated for NTA abundance and damage. Twenty arthropod taxa met minimum abundance criteria for valid statistical analysis. Nine of these taxa occurred in at least two of the three regions and in at least four sites across regions. These nine taxa included: aphid, predatory earwig, lacewing, ladybird beetle, leafhopper, minute pirate bug, parasitic wasp, sap beetle, and spider. In addition to wide regional distribution, these taxa encompass the ecological functions of herbivores, predators and parasitoids in maize agro-ecosystems. Thus, the nine arthropods may serve as representative taxa of maize agro-ecosystems, and thereby support the conclusion that analysis of relevant data generated in one region can be transportable for the risk assessment of the same or similar GM crop products in another region. Across the 20 taxa analyzed, no statistically significant differences in abundance were detected between MON 87411 and the conventional control for 123 of the 128 individual-site comparisons (96.1%). For the nine widely distributed taxa, no statistically significant differences in abundance were detected between MON 87411 and the conventional control. Furthermore, no statistically significant differences were detected between MON 87411 and the conventional control for 53 out of 56 individual-site comparisons (94.6%) of NTA pest damage to the crop. In each case where a significant difference was observed in arthropod abundance or damage, the mean value for MON 87411 was within the reference range and/or the difference was not consistently observed across collection methods and/or sites. Thus, the differences were not representative of an adverse effect unfamiliar to maize and/or were not indicative of a consistent plant response associated with the GM traits. Results from this study support a conclusion of no adverse environmental impact of MON 87411 on NTAs compared to conventional maize and demonstrate the utility of relevant transportable data across regions for the ERA of GM crops.
Effectiveness of feature and classifier algorithms in character recognition systems
NASA Astrophysics Data System (ADS)
Wilson, Charles L.
1993-04-01
At the first Census Optical Character Recognition Systems Conference, NIST generated accuracy data for a large number of character recognition systems. Most systems were tested on the recognition of isolated digits and upper and lower case alphabetic characters. The recognition experiments were performed on sample sizes of 58,000 digits, and 12,000 upper and lower case alphabetic characters. The algorithms used by the 26 conference participants included rule-based methods, image-based methods, statistical methods, and neural networks. The neural network methods included Multi-Layer Perceptrons, Learning Vector Quantization, Neocognitrons, and cascaded neural networks. In this paper 11 different systems are compared using correlations between the answers of different systems, comparing the decrease in error rate as a function of confidence of recognition, and comparing the writer dependence of recognition. This comparison shows that methods that used different algorithms for feature extraction and recognition performed with very high levels of correlation. This is true for neural network systems, hybrid systems, and statistically based systems, and leads to the conclusion that neural networks have not yet demonstrated a clear superiority to more conventional statistical methods. Comparison of these results with the models of Vapnik (for estimation problems), MacKay (for Bayesian statistical models), Moody (for effective parameterization), and Boltzmann models (for information content) demonstrates that as the limits of training data variance are approached, all classifier systems have similar statistical properties. The limiting condition can only be approached for sufficiently rich feature sets because the accuracy limit is controlled by the available information content of the training set, which must pass through the feature extraction process prior to classification.
Price, Charlotte; Stallard, Nigel; Creton, Stuart; Indans, Ian; Guest, Robert; Griffiths, David; Edwards, Philippa
2010-01-01
Acute inhalation toxicity of chemicals has conventionally been assessed by the median lethal concentration (LC50) test (Organisation for Economic Co-operation and Development (OECD) TG 403). Two new methods, the recently adopted acute toxic class method (ATC; OECD TG 436) and a proposed fixed concentration procedure (FCP), have been considered, but statistical evaluations of these methods did not investigate the influence of differential sensitivity between male and female rats on the outcomes. This paper presents an analysis of data from the assessment of acute inhalation toxicity for 56 substances. Statistically significant differences between the LC50 for males and females were found for 16 substances, with greater than 10-fold differences in the LC50 for two substances. The paper also reports a statistical evaluation of the three test methods in the presence of unanticipated gender differences. With TG 403, a gender difference leads to a slightly greater chance of under-classification. This is also the case for the ATC method, but more pronounced than for TG 403, with misclassification of nearly all substances from Globally Harmonised System (GHS) class 3 into class 4. As the FCP uses females only, if females are more sensitive, the classification is unchanged. If males are more sensitive, the procedure may lead to under-classification. Additional research on modification of the FCP is thus proposed. PMID:20488841
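An LC50 of the kind discussed above is typically estimated by fitting a sigmoid mortality curve against log concentration and reading off the concentration giving 50% mortality. A minimal sketch using a two-parameter log-logistic fit on simulated group data (real TG 403 designs prescribe specific group sizes and separate sexes, which this sketch ignores):

```python
import numpy as np
from scipy.optimize import curve_fit

def mortality(log_c, log_lc50, slope):
    """Two-parameter log-logistic dose-mortality curve."""
    return 1 / (1 + np.exp(-slope * (log_c - log_lc50)))

# Simulated group mortality at 5 concentrations (mg/L), n = 10 per group.
conc = np.array([100, 300, 1000, 3000, 10000], dtype=float)
died = np.array([0, 2, 5, 8, 10]) / 10

popt, pcov = curve_fit(mortality, np.log(conc), died,
                       p0=[np.log(1000), 1.0])
lc50 = np.exp(popt[0])
print(f"LC50 ~ {lc50:.0f} mg/L")
```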
Alsharbaty, Mohammed Hussein M; Alikhasi, Marzieh; Zarrati, Simindokht; Shamshiri, Ahmed Reza
2018-02-09
To evaluate the accuracy of a digital implant impression technique using a TRIOS 3Shape intraoral scanner (IOS) compared to conventional implant impression techniques (pick-up and transfer) in clinical situations. Thirty-six patients who had two implants (Implantium, internal connection) ranging in diameter between 3.8 and 4.8 mm in posterior regions participated in this study after signing a consent form. Thirty-six reference models (RM) were fabricated by attaching two impression copings intraorally, splinted with autopolymerizing acrylic resin, verified by sectioning through the middle of the index, and rejoined again with freshly mixed autopolymerizing acrylic resin pattern (Pattern Resin) with the brush bead method. After that, the splinted assemblies were attached to implant analogs (DANSE) and impressed with type III dental stone (Gypsum Microstone) in standard plastic die lock trays. Thirty-six working casts were fabricated for each conventional impression technique (i.e., pick-up and transfer). Thirty-six digital impressions were made with a TRIOS 3Shape IOS. Eight of the digitally scanned files were damaged; 28 digital scan files were retrieved to STL format. A coordinate-measuring machine (CMM) was used to record linear displacement measurements (x, y, and z-coordinates), interimplant distances, and angular displacements for the RMs and conventionally fabricated working casts. CATIA 3D evaluation software was used to assess the digital STL files for the same variables as the CMM measurements. CMM measurements made on the RMs and conventionally fabricated working casts were compared with 3D software measurements made on the digitally scanned files. Data were statistically analyzed using the generalized estimating equation (GEE) with an exchangeable correlation matrix and linear method, followed by the Bonferroni method for pairwise comparisons (α = 0.05). The results showed significant differences between the pick-up and digital groups in all of the measured variables (p < 0.001). Concerning the transfer and digital groups, the results were statistically significant in angular displacement (p < 0.001), distance measurements (p = 0.01), and linear displacement (p = 0.03); however, between the pick-up and transfer groups, there was no statistical significance in all of the measured variables (interimplant distance deviation, linear displacement, and angular displacement deviations). According to the results of this study, the digital implant impression technique had the least accuracy. Based on the study outcomes, distance and angulation errors associated with the intraoral digital implant impressions were too large to fabricate well-fitting restorations for partially edentulous patients. The pick-up implant impression technique was the most accurate, and the transfer technique revealed comparable accuracy to it. © 2018 by the American College of Prosthodontists.
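The linear, interimplant-distance, and angular deviations compared in this study can all be derived from each implant's measured platform position and axis direction. A minimal vector-math sketch for one implant pair, assuming the CMM and CATIA exports share a common reference frame (all coordinates below are hypothetical):

```python
import numpy as np

def deviations(p_ref, a_ref, p_test, a_test, p2_ref, p2_test):
    """p*: implant platform centres (mm); a*: implant axis vectors.
    Returns linear displacement, interimplant-distance error,
    and angular deviation (degrees)."""
    linear = np.linalg.norm(p_test - p_ref)
    d_ref = np.linalg.norm(p2_ref - p_ref)
    d_test = np.linalg.norm(p2_test - p_test)
    distance_err = abs(d_test - d_ref)
    cosang = a_ref @ a_test / (np.linalg.norm(a_ref) * np.linalg.norm(a_test))
    angular = np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
    return linear, distance_err, angular

# Hypothetical coordinates for implant 1 (implant 2 only for the distance).
lin, dist, ang = deviations(
    np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0]),
    np.array([0.05, -0.02, 0.01]), np.array([0.02, 0.0, 0.9998]),
    np.array([12.0, 0.0, 0.0]), np.array([12.08, 0.01, 0.02]))
print(f"linear {lin:.3f} mm, distance error {dist:.3f} mm, angle {ang:.2f} deg")
```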
Muko, Soyoka; Shimatani, Ichiro K; Nozawa, Yoko
2014-07-01
Spatial distributions of individuals are conventionally analysed by representing objects as dimensionless points, in which spatial statistics are based on centre-to-centre distances. However, if organisms expand without overlapping and show size variations, such as is the case for encrusting corals, interobject spacing is crucial for spatial associations where interactions occur. We introduced new pairwise statistics using minimum distances between objects and demonstrated their utility when examining encrusting coral community data. We also calculated the conventional point process statistics and the grid-based statistics to clarify the advantages and limitations of each spatial statistical method. For simplicity, coral colonies were approximated by disks in these demonstrations. Focusing on short-distance effects, the use of minimum distances revealed that almost all coral genera were aggregated at a scale of 1-25 cm. However, when fragmented colonies (ramets) were treated as a genet, a genet-level analysis indicated weak or no aggregation, suggesting that most corals were randomly distributed and that fragmentation was the primary cause of colony aggregations. In contrast, point process statistics showed larger aggregation scales, presumably because centre-to-centre distances included both intercolony spacing and colony sizes (radius). The grid-based statistics were able to quantify the patch (aggregation) scale of colonies, but the scale was strongly affected by the colony size. Our approach quantitatively showed repulsive effects between an aggressive genus and a competitively weak genus, while the grid-based statistics (covariance function) also showed repulsion although the spatial scale indicated from the statistics was not directly interpretable in terms of ecological meaning. The use of minimum distances together with previously proposed spatial statistics helped us to extend our understanding of the spatial patterns of nonoverlapping objects that vary in size and the associated specific scales. © 2013 The Authors. Journal of Animal Ecology © 2013 British Ecological Society.
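For disk-approximated colonies, the minimum (edge-to-edge) distance between two objects is simply the centre-to-centre distance minus both radii, floored at zero for touching colonies; the new pairwise statistics are built from these spacings rather than centre-to-centre distances. A minimal sketch of the distance computation (no edge correction, which the published statistics would require):

```python
import numpy as np

def min_distances(centres, radii):
    """Pairwise minimum (edge-to-edge) distances between disks."""
    d = np.linalg.norm(centres[:, None, :] - centres[None, :, :], axis=-1)
    gap = d - radii[:, None] - radii[None, :]
    iu = np.triu_indices(len(radii), k=1)
    return np.maximum(gap[iu], 0.0)

# Toy colony map: 50 disks in a 2 m x 2 m plot.
rng = np.random.default_rng(0)
centres = rng.uniform(0, 200, (50, 2))        # cm
radii = rng.uniform(2, 10, 50)                # cm
gaps = min_distances(centres, radii)
# Fraction of pairs separated by less than 25 cm, the upper end of the
# aggregation scale reported above.
print(f"P(gap < 25 cm) = {(gaps < 25).mean():.2f}")
```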
Lagrangian Particle Tracking Simulation for Warm-Rain Processes in Quasi-One-Dimensional Domain
NASA Astrophysics Data System (ADS)
Kunishima, Y.; Onishi, R.
2017-12-01
Conventional cloud simulations are based on the Euler method and compute each microphysics process in a stochastic way assuming infinite numbers of particles within each numerical grid. They therefore cannot provide the Lagrangian statistics of individual particles in cloud microphysics (i.e., aerosol particles, cloud particles, and rain drops) nor discuss the statistical fluctuations due to the finite number of particles. We here simulate the entire precipitation process of warm rain while tracking individual particles. We use the Lagrangian Cloud Simulator (LCS), which is based on the Euler-Lagrangian framework. In that framework, flow motion and scalar transport are computed with the Euler method, and particle motion with the Lagrangian one. The LCS tracks particle motions and collision events individually, considering the hydrodynamic interaction between approaching particles with a superposition method; that is, it can directly represent the collisional growth of cloud particles. Taking account of the hydrodynamic interaction is essential for trustworthy collision detection. In this study, we newly developed a stochastic model based on the Twomey cloud condensation nuclei (CCN) activation for the Lagrangian tracking simulation and integrated it into the LCS. Coupled with the Euler computation of the water vapour and temperature fields, the initiation and condensational growth of water droplets were computed in the Lagrangian way. We applied the integrated LCS to a kinematic simulation of warm-rain processes in a vertically-elongated domain of, at largest, 0.03 × 0.03 × 3000 m³ with horizontal periodicity. Aerosol particles with a realistic number density, 5×10⁷ m⁻³, were evenly distributed over the domain at the initial state. A prescribed updraft at the early stage initiated the development of a precipitating cloud. We have confirmed that the obtained bulk statistics fairly agree with those from a conventional spectral-bin scheme for a vertical column domain. The centre of the discussion will be the Lagrangian statistics collected from the individual behaviour of the tracked particles.
Oppugning the assumptions of spatial averaging of segment and joint orientations.
Pierrynowski, Michael Raymond; Ball, Kevin Arthur
2009-02-09
Movement scientists frequently calculate "arithmetic averages" when examining body segment or joint orientations. Such calculations appear routinely, yet are fundamentally flawed. Three-dimensional orientation data are computed as matrices, yet three ordered Euler/Cardan/Bryant angle parameters are frequently used for interpretation. These parameters are not geometrically independent; thus, the conventional process of averaging each parameter is incorrect. The process of arithmetic averaging also assumes that the distances between data are linear (Euclidean); however, for orientation data these distances are geodesically curved (Riemannian). Therefore we question (oppugn) whether the conventional averaging approach is an appropriate statistic. Fortunately, exact methods of averaging orientation data have been developed which both circumvent the parameterization issue and explicitly acknowledge the Euclidean or Riemannian distance measures. The details of these matrix-based averaging methods are presented and their theoretical advantages discussed. The Euclidean and Riemannian approaches offer appealing advantages over the conventional technique. With respect to practical biomechanical relevancy, examinations of simulated data suggest that for sets of orientation data possessing low dispersion, an isotropic distribution, and second and third angle parameters below 30 degrees, discrepancies with the conventional approach are less than 1.1 degrees. However, beyond these limits, arithmetic averaging can have substantive non-linear inaccuracies in all three parameterized angles. The biomechanics community is encouraged to recognize that limitations exist with the use of the conventional method of averaging orientations. Investigations requiring more robust spatial averaging over a broader range of orientations may benefit from the use of matrix-based Euclidean or Riemannian calculations.
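The Euclidean (chordal) matrix average advocated above can be computed by taking the arithmetic mean of the rotation matrices and projecting it back onto the rotation group with an SVD. A minimal numpy sketch (the Riemannian mean additionally requires an iterative log/exp scheme not shown here):

```python
import numpy as np

def chordal_mean(rotations):
    """Project the arithmetic mean of rotation matrices back onto SO(3)."""
    M = np.mean(rotations, axis=0)
    U, _, Vt = np.linalg.svd(M)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
    return U @ D @ Vt          # nearest rotation in the Frobenius sense

def rot_z(deg):
    c, s = np.cos(np.radians(deg)), np.sin(np.radians(deg))
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

# Averaging rotations of 5, 10 and 15 degrees about z recovers ~10 degrees;
# naive per-angle averaging fails once parameters wrap or couple.
Rs = np.stack([rot_z(d) for d in (5, 10, 15)])
R_mean = chordal_mean(Rs)
print(np.degrees(np.arctan2(R_mean[1, 0], R_mean[0, 0])))
```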
NASA Technical Reports Server (NTRS)
Lin, Shian-Jiann; DaSilva, Arlindo; Atlas, Robert (Technical Monitor)
2001-01-01
Toward the development of a finite-volume Data Assimilation System (fvDAS), a consistent finite-volume methodology is developed for interfacing the NASA/DAO's Physical Space Statistical Analysis System (PSAS) to the joint NASA/NCAR finite volume CCM3 (fvCCM3). To take advantage of the Lagrangian control-volume vertical coordinate of the fvCCM3, a novel "shaving" method is applied to the lowest few model layers to reflect the surface pressure changes as implied by the final analysis. Analysis increments (from PSAS) to the upper air variables are then consistently put onto the Lagrangian layers as adjustments to the volume-mean quantities during the analysis cycle. This approach is demonstrated to be superior to the conventional method of using independently computed "tendency terms" for surface pressure and upper air prognostic variables.
Balasubramanian, Sasikala; Paneerselvam, Elavenil; Guruprasad, T; Pathumai, M; Abraham, Simin; Krishnakumar Raja, V. B.
2017-01-01
Objective: The aim of this randomized clinical trial was to assess the efficacy of exclusive lingual nerve block (LNB) in achieving selective lingual soft-tissue anesthesia in comparison with conventional inferior alveolar nerve block (IANB). Materials and Methods: A total of 200 patients indicated for the extraction of lower premolars were recruited for the study. The samples were allocated by randomization into control and study groups. Lingual soft-tissue anesthesia was achieved by IANB and exclusive LNB in the control and study group, respectively. The primary outcome variable studied was anesthesia of ipsilateral lingual mucoperiosteum, floor of mouth and tongue. The secondary variables assessed were (1) taste sensation immediately following administration of local anesthesia and (2) mouth opening and lingual nerve paresthesia on the first postoperative day. Results: Data analysis for descriptive and inferential statistics was performed using SPSS (IBM SPSS Statistics for Windows, Version 22.0, Armonk, NY: IBM Corp. Released 2013) and a P < 0.05 was considered statistically significant. In comparison with the control group, the study group (LNB) showed statistically significant anesthesia of the lingual gingiva of incisors, molars, anterior floor of the mouth, and anterior tongue. Conclusion: Exclusive LNB is superior to IAN nerve block in achieving selective anesthesia of lingual soft tissues. It is technically simple and associated with minimal complications as compared to IAN block. PMID:29264294
Gill, Vikas; Reddy, Y. N. N.; Sanadhya, Sudhanshu; Aapaliya, Pankaj; Sharma, Nidhi
2014-01-01
Background: The debonding procedure is time consuming and damaging to the enamel if performed with improper technique. Debonding methods include the conventional methods that use pliers or wrenches, an ultrasonic method, electrothermal devices, air pressure impulse devices, diamond burs that grind the brackets off the tooth surface, and lasers. Among these, the use of debonding pliers is the most convenient and effective method but has been reported to cause damage to the teeth. Recently, a New Debonding Instrument designed specifically for ceramic and composite brackets has been introduced. As this is a new instrument, little information is available on its efficacy. The purpose of this study was to evaluate the debonding characteristics of both the conventional debonding pliers and the New Debonding Instrument when removing ceramic, composite and metallic brackets. Materials and Methods: One hundred thirty-eight extracted maxillary premolar teeth were collected and divided into two groups, Group A and Group B (n = 69 each). They were further divided into 3 subgroups (n = 23 each) according to the type of bracket to be bonded: stainless steel (subgroups A1 and B1), ceramic (A2 and B2) and composite (A3 and B3) adhesive precoated maxillary premolar brackets. All the teeth were etched using 37% phosphoric acid for 15 seconds and the brackets were bonded using Transbond XT primer. Brackets were debonded using the conventional debonding pliers (Group A) and the New Debonding Instrument (Group B). After debonding, the enamel surface of each tooth was examined under a stereo microscope (10× magnification). A modified adhesive remnant index (ARI) was used to quantify the amount of adhesive remaining on each tooth. Results: The observations demonstrate that the results of the New Debonding Instrument for debonding metal, ceramic and composite brackets were statistically significantly different (p = 0.04) from, and superior to, the results of the conventional debonding pliers. Conclusion: The debonding efficiency of the New Debonding Instrument is better than that of the conventional debonding pliers for metal, ceramic and composite brackets. PMID:25177639
Apical Negative Pressure irrigation presents tissue compatibility in immature teeth
Pucinelli, Carolina Maschietto; da Silva, Léa Assed Bezerra; Cohenca, Nestor; Romualdo, Priscilla Coutinho; da Silva, Raquel Assed Bezerra; Consolaro, Alberto; de Queiroz, Alexandra Mussolino; Nelson, Paulo
2017-01-01
Aim: To compare apical negative pressure irrigation (ANP) with conventional irrigation in the teeth of immature dogs with apical periodontitis. Methods: Fifty-two immature pre-molar root canals were randomly assigned to 4 groups: ANP (n = 15); conventional irrigation (n = 17); healthy teeth (control) (n = 10); and teeth with untreated apical periodontitis (control) (n = 10). After induction of apical periodontitis, teeth were instrumented using EndoVac® (apical negative pressure irrigation) or conventional irrigation. The animals were euthanized after 90 days. The sections were stained by HE and analyzed under conventional and fluorescence microscopy. TRAP histoenzymology was also performed. Statistical analyses were performed with the significance level set at 5%. Results: There was a difference in the histopathological parameters between the ANP and conventional groups (p<0.05). The ANP group showed a predominance of low magnitude inflammatory infiltrate, a smaller periodontal ligament, and lower mineralized tissue resorption. There were no differences in the periapical lesion extensions between the ANP and conventional groups (p>0.05). However, a lower number of osteoclasts was observed in the ANP group (p<0.05). Conclusion: The EndoVac® irrigation system presented better biological results and a more advanced repair process in immature teeth with apical periodontitis than the conventional irrigation system, confirming the hypothesis. PMID:29211282
Kim, Sung Jae; Kim, Sung Hwan; Kim, Young Hwan; Chun, Yong Min
2015-01-01
The authors have observed a failure to achieve secure fixation in elderly patients when inserting a half-pin at the anteromedial surface of the tibia. The purpose of this study was to compare two methods for inserting a half-pin at tibia diaphysis in elderly patients. Twenty cadaveric tibias were divided into Group C or V. A half-pin was inserted into the tibias of Group C via the conventional method, from the anteromedial surface to the interosseous border of the tibia diaphysis, and into the tibias of Group V via the vertical method, from the anterior border to the posterior surface at the same level. The maximum insertion torque was measured during the bicortical insertion with a torque driver. The thickness of the cortex was measured by micro-computed tomography. The relationship between the thickness of the cortex engaged and the insertion torque was investigated. The maximum insertion torque and the thickness of the cortex were significantly higher in Group V than Group C. Both groups exhibited a statistically significant linear correlation between torque and thickness by Spearman's rank correlation analysis. Half-pins inserted by the vertical method achieved purchase of more cortex than those inserted by the conventional method. Considering that cortical thickness and insertion torque in Group V were significantly greater than those in Group C, we suggest that the vertical method of half-pin insertion may be an alternative to the conventional method in elderly patients.
Pandey, Pinki; Dixit, Alok; Tanwar, Aparna; Sharma, Anuradha; Mittal, Sanjeev
2014-01-01
Introduction: Our study presents a new deparaffinizing and hematoxylin and eosin (H and E) staining method that involves the use of easily available, nontoxic and eco-friendly diluted liquid dish washing soap (DWS), completely eliminating expensive and hazardous xylene and alcohol from deparaffinization and rehydration prior to staining, from staining, and from dehydration prior to mounting. The aim was to evaluate and compare the quality of liquid DWS treated xylene and alcohol free (XAF) sections with that of conventional H and E sections. Materials and Methods: A total of 100 paraffin embedded tissue blocks from different tissues were included. From each tissue block, one section was stained with the conventional H and E (normal sections) and the other with the XAF H and E (soapy sections) staining method. Slides were scored using five parameters: nuclear, cytoplasmic, clarity, uniformity, and crispness of staining. The Z-test was used for statistical analysis. Results: Soapy sections scored better for cytoplasmic (90%) and crisp staining (95%), with a statistically significant difference. For uniformity of staining, normal sections (88%) scored over soapy sections (72%) (Z = 2.82, P < 0.05). For nuclear (90%) and clarity of staining (90%), total scores favored soapy sections, but the difference was not statistically significant. About 84% of normal sections stained adequately for diagnosis compared with 86% of soapy sections (Z = 0.396, P > 0.05). Conclusion: Liquid DWS is a safe and efficient alternative to xylene and alcohol in deparaffinization and routine H and E staining. We document this project so that it can be used as a model for other histology laboratories. PMID:25328332
Using machine learning to assess covariate balance in matching studies.
Linden, Ariel; Yarnold, Paul R
2016-12-01
In order to assess the effectiveness of matching approaches in observational studies, investigators typically present summary statistics for each observed pre-intervention covariate, with the objective of showing that matching reduces the difference in means (or proportions) between groups to as close to zero as possible. In this paper, we introduce a new approach to distinguish between study groups based on their distributions of the covariates using a machine-learning algorithm called optimal discriminant analysis (ODA). Assessing covariate balance using ODA as compared with the conventional method has several key advantages: the ability to ascertain how individuals self-select based on optimal (maximum-accuracy) cut-points on the covariates; the application to any variable metric and number of groups; its insensitivity to skewed data or outliers; and the use of accuracy measures that can be widely applied to all analyses. Moreover, ODA accepts analytic weights, thereby extending the assessment of covariate balance to any study design where weights are used for covariate adjustment. By comparing the two approaches using empirical data, we are able to demonstrate that using measures of classification accuracy as balance diagnostics produces highly consistent results to those obtained via the conventional approach (in our matched-pairs example, ODA revealed a weak statistically significant relationship not detected by the conventional approach). Thus, investigators should consider ODA as a robust complement, or perhaps alternative, to the conventional approach for assessing covariate balance in matching studies. © 2016 John Wiley & Sons, Ltd.
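At its core, the ODA balance check asks what classification accuracy the best single cut-point on a covariate can achieve in separating the two study groups; accuracy near chance indicates balance. A minimal sketch of that maximum-accuracy search for an unweighted two-group case (ODA proper optimizes a chance-corrected accuracy measure and adds permutation p-values; plain accuracy is used here for brevity):

```python
import numpy as np

def max_accuracy_cutpoint(x, group):
    """Search all cut-points on covariate x for the one that best
    separates group 0 from group 1; return (cutpoint, accuracy)."""
    x, group = np.asarray(x, float), np.asarray(group)
    best = (np.nan, 0.0)
    for c in np.unique(x):
        for rule in (x <= c, x > c):          # try both directions
            acc = np.mean(rule == (group == 1))
            if acc > best[1]:
                best = (c, acc)
    return best

# Toy example: a covariate slightly imbalanced between equal-sized groups.
rng = np.random.default_rng(0)
g = np.repeat([0, 1], 100)
x = rng.standard_normal(200) + 0.2 * g
cut, acc = max_accuracy_cutpoint(x, g)
print(f"best cut {cut:.2f}, accuracy {acc:.2%} (50% = balanced)")
```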
A Generic multi-dimensional feature extraction method using multiobjective genetic programming.
Zhang, Yang; Rockett, Peter I
2009-01-01
In this paper, we present a generic feature extraction method for pattern classification using multiobjective genetic programming. This not only evolves the (near-)optimal set of mappings from a pattern space to a multi-dimensional decision space, but also simultaneously optimizes the dimensionality of that decision space. The presented framework evolves vector-to-vector feature extractors that maximize class separability. We demonstrate the efficacy of our approach by making statistically-founded comparisons with a wide variety of established classifier paradigms over a range of datasets and find that for most of the pairwise comparisons, our evolutionary method delivers statistically smaller misclassification errors. At very worst, our method displays no statistical difference in a few pairwise comparisons with established classifier/dataset combinations; crucially, none of the misclassification results produced by our method is worse than any comparator classifier. Although principally focused on feature extraction, feature selection is also performed as an implicit side effect; we show that both feature extraction and selection are important to the success of our technique. The presented method has the practical consequence of obviating the need to exhaustively evaluate a large family of conventional classifiers when faced with a new pattern recognition problem in order to attain a good classification accuracy.
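A full multiobjective GP run is beyond a short sketch, but the fitness evaluation at its heart, scoring a candidate pattern-space-to-decision-space mapping on misclassification error and decision-space dimensionality, can be illustrated as follows; the toy mapping stands in for an evolved GP tree and is purely hypothetical:

    import numpy as np
    from sklearn.datasets import load_iris
    from sklearn.model_selection import cross_val_score
    from sklearn.neighbors import KNeighborsClassifier

    def objectives(extract, X, y):
        """Score one candidate feature mapping on the two competing objectives."""
        Z = np.apply_along_axis(extract, 1, X)   # pattern space -> decision space
        err = 1.0 - cross_val_score(KNeighborsClassifier(1), Z, y, cv=5).mean()
        return err, Z.shape[1]                   # (misclassification, dimensionality)

    X, y = load_iris(return_X_y=True)
    toy_tree = lambda v: np.array([v[2] * v[3], v[0] - v[1]])  # stand-in for an evolved GP tree
    print(objectives(toy_tree, X, y))            # a Pareto-style search would minimize both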
Metal ion transport quantified by ICP-MS in intact cells
Figueroa, Julio A. Landero; Stiner, Cory A.; Radzyukevich, Tatiana L.; Heiny, Judith A.
2016-01-01
The use of ICP-MS to measure metal ion content in biological tissues offers a highly sensitive means to study metal-dependent physiological processes. Here we describe the application of ICP-MS to measure membrane transport of Rb and K ions by the Na,K-ATPase in mouse skeletal muscles and human red blood cells. The ICP-MS method provides greater precision and statistical power than possible with conventional tracer flux methods. The method is widely applicable to studies of other metal ion transporters and metal-dependent processes in a range of cell types and conditions. PMID:26838181
Unconventional tail configurations for transport aircraft
NASA Astrophysics Data System (ADS)
Sánchez-Carmona, A.; Cuerno-Rejado, C.; García-Hernández, L.
2017-06-01
This article presents the bases of a methodology for sizing unconventional tail configurations for transport aircraft. The case study of this paper is a V-tail configuration. Firstly, an aerodynamic study is developed for determining stability derivatives and aerodynamic forces. The objective is to size a tail such that it provides at least the same static stability derivatives as a conventional reference aircraft. The optimum is obtained by minimizing its weight. The weight is estimated through two methods: an adapted Farrar's method and a statistical method. The solution reached is heavier than the reference, but it reduces the wetted area.
Ocular Biocompatibility of Nitinol Intraocular Clips
Velez-Montoya, Raul; Erlanger, Michael
2012-01-01
Purpose. To evaluate the tolerance and biocompatibility of a preformed nitinol intraocular clip in an animal model after anterior segment surgery. Methods. Yucatan mini-pigs were used. A 30-gauge prototype injector was used to attach a shape memory nitinol clip to the iris of five pigs. Another five eyes received conventional polypropylene suture with a modified Seipser slip knot. The authors compared the surgical time of each technique. All eyes underwent standard full-field electroretinogram at baseline and 8 weeks after surgery. The animals were euthanized and eyes collected for histologic analysis at 70 days (10 weeks) postsurgery. The corneal thickness, corneal endothelial cell counts, specular microscopy parameters, retina cell counts, and electroretinogram parameters were compared between the groups. A two-sample t-test for means and a P value of 0.05 were used for assessing statistical differences between measurements. Results. The injection of the nitinol clip was 15 times faster than conventional suturing. There were no statistical differences between the groups for corneal thickness, endothelial cell counts, specular microscopy parameters, retina cell counts, and electroretinogram measurements. Conclusions. The nitinol clip prototype is well tolerated and showed no evidence of toxicity in the short-term. The injectable delivery system was faster and technically less challenging than conventional suture techniques. PMID:22064995
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harkness, A. L.
1977-09-01
Nine elements from each batch of fuel elements manufactured for the EBR-II reactor have been analyzed for ²³⁵U content by NDA methods. These values, together with those of the manufacturer, are used to estimate the product variance and the variances of the two measuring methods. These variances are compared with the variances computed from the stipulations of the contract. A method is derived for resolving the several variances into their within-batch and between-batch components. Some of these variance components have also been estimated by independent and more familiar conventional methods for comparison.
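A minimal sketch of the within-/between-batch decomposition for balanced batches, using the standard one-way ANOVA method-of-moments estimator (assumed here; the report's exact derivation may differ):

    import numpy as np

    def variance_components(batches):
        """One-way ANOVA decomposition into within- and between-batch components.
        `batches` is a list of 1-D arrays, one per batch (balanced: equal n)."""
        k, n = len(batches), len(batches[0])
        grand = np.mean(np.concatenate(batches))
        msw = np.mean([np.var(b, ddof=1) for b in batches])                 # within-batch mean square
        msb = n * np.sum([(np.mean(b) - grand) ** 2 for b in batches]) / (k - 1)
        var_within = msw
        var_between = max((msb - msw) / n, 0.0)   # method-of-moments estimate
        return var_within, var_between

    batches = [np.array([10.1, 10.3, 9.9]), np.array([10.6, 10.4, 10.8]), np.array([9.8, 10.0, 9.7])]
    print(variance_components(batches))           # hypothetical assay values, not EBR-II data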
Quality evaluation of no-reference MR images using multidirectional filters and image statistics.
Jang, Jinseong; Bang, Kihun; Jang, Hanbyol; Hwang, Dosik
2018-09-01
This study aimed to develop a fully automatic, no-reference image-quality assessment (IQA) method for MR images. New quality-aware features were obtained by applying multidirectional filters to MR images and examining the feature statistics. A histogram of these features was then fitted to a generalized Gaussian distribution function for which the shape parameters yielded different values depending on the type of distortion in the MR image. Standard feature statistics were established through a training process based on high-quality MR images without distortion. Subsequently, the feature statistics of a test MR image were calculated and compared with the standards. The quality score was calculated as the difference between the shape parameters of the test image and the undistorted standard images. The proposed IQA method showed a >0.99 correlation with the conventional full-reference assessment methods; accordingly, this proposed method yielded the best performance among no-reference IQA methods for images containing six types of synthetic, MR-specific distortions. In addition, for authentically distorted images, the proposed method yielded the highest correlation with subjective assessments by human observers, thus demonstrating its superior performance over other no-reference IQAs. Our proposed IQA was designed to consider MR-specific features and outperformed other no-reference IQAs designed mainly for photographic images. Magn Reson Med 80:914-924, 2018. © 2018 International Society for Magnetic Resonance in Medicine.
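The fitting step described above can be sketched with scipy's generalized Gaussian (gennorm); the filter-response data and the "standard" shape parameter below are assumptions for illustration only:

    import numpy as np
    from scipy.stats import gennorm

    rng = np.random.default_rng(0)
    feats_test = rng.laplace(0.0, 1.0, 10_000)   # stand-in for multidirectional filter outputs

    beta_test, loc, scale = gennorm.fit(feats_test)
    BETA_STANDARD = 2.0                          # assumed standard learned from undistorted images
    quality_score = abs(beta_test - BETA_STANDARD)
    print(beta_test, quality_score)              # Laplace-like input gives a shape parameter near 1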
Mokhtari, Negar; Shirazi, Alireza-Sarraf
2017-01-01
Background Techniques with adequate accuracy of working length determination, along with shorter duration of treatment in pulpectomy procedures, seem to be essential in pediatric dentistry. The aim of the present study was to evaluate the accuracy of root canal length measurement with the Root ZX II apex locator and a rotary system in pulpectomy of primary teeth. Material and Methods In this randomized controlled clinical trial, complete pulpectomy was performed on 80 mandibular primary molars in 80 children aged 4-6 years. The study population was randomly divided into case and control groups. In the control group, conventional pulpectomy was performed; in the case group, working length was determined by the electronic apex locator Root ZX II and canals were instrumented with Mtwo rotary files. Statistical evaluation was performed using Mann-Whitney and Chi-square tests (P<0.05). Results There were no significant differences between the electronic apex locator Root ZX II and the conventional method in accuracy of root canal length determination. However, significantly less time was needed for instrumenting with rotary files (P<0.001). Conclusions Considering the comparable accuracy of root canal length determination and the considerably shorter instrumentation time with the Root ZX II apex locator and rotary system, they may be suggested for pulpectomy in primary molar teeth. Key words: Rotary technique, conventional technique, pulpectomy, primary teeth. PMID:29302280
Vina, Andres; Peters, Albert J.; Ji, Lei
2003-01-01
There is a global concern about the increase in atmospheric concentrations of greenhouse gases. One method being discussed to encourage greenhouse gas mitigation efforts is based on a trading system whereby carbon emitters can buy effective mitigation efforts from farmers implementing conservation tillage practices. These practices sequester carbon from the atmosphere, and such a trading system would require a low-cost and accurate method of verification. Remote sensing technology can offer such a verification technique. This paper is focused on the use of standard image processing procedures applied to a multispectral Ikonos image, to determine whether it is possible to validate that farmers have complied with agreements to implement conservation tillage practices. A principal component analysis (PCA) was performed in order to isolate image variance in cropped fields. Analyses of variance (ANOVA) statistical procedures were used to evaluate the capability of each Ikonos band and each principal component to discriminate between conventional and conservation tillage practices. A logistic regression model was implemented on the principal component most effective in discriminating between conventional and conservation tillage, in order to produce a map of the probability of conventional tillage. The Ikonos imagery, in combination with ground-reference information, proved to be a useful tool for verification of conservation tillage practices.
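A compact sketch of the processing chain (PCA on band values, then a logistic model for the probability of conventional tillage) using scikit-learn; the pixel values and labels are simulated placeholders, not Ikonos data:

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LogisticRegression

    # pixels: n x 4 Ikonos band values for ground-referenced fields (simulated here);
    # tillage: 1 = conventional, 0 = conservation (also simulated).
    rng = np.random.default_rng(0)
    pixels = rng.normal(size=(500, 4))
    tillage = (pixels[:, 0] + rng.normal(0, 0.5, 500) > 0).astype(int)

    pcs = PCA(n_components=4).fit_transform(pixels)
    best_pc = pcs[:, [0]]                                 # component that best discriminates (per ANOVA)
    model = LogisticRegression().fit(best_pc, tillage)
    p_conventional = model.predict_proba(best_pc)[:, 1]   # probability-of-conventional-tillage map
    print(p_conventional[:5])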
A comparison of problem-based learning and conventional teaching in nursing ethics education.
Lin, Chiou-Fen; Lu, Meei-Shiow; Chung, Chun-Chih; Yang, Che-Ming
2010-05-01
The aim of this study was to compare the learning effectiveness of peer tutored problem-based learning and conventional teaching of nursing ethics in Taiwan. The study adopted an experimental design. The peer tutored problem-based learning method was applied to an experimental group and the conventional teaching method to a control group. The study sample consisted of 142 senior nursing students who were randomly assigned to the two groups. All the students were tested for their nursing ethical discrimination ability both before and after the educational intervention. A learning satisfaction survey was also administered to both groups at the end of each course. After the intervention, both groups showed a significant increase in ethical discrimination ability. There was a statistically significant difference between the ethical discrimination scores of the two groups (P < 0.05), with the experimental group on average scoring higher than the control group. There were significant differences in satisfaction with self-motivated learning and critical thinking between the groups. Peer tutored problem-based learning and lecture-type conventional teaching were both effective for nursing ethics education, but problem-based learning was shown to be more effective. Peer tutored problem-based learning has the potential to enhance the efficacy of teaching nursing ethics in situations in which there are personnel and resource constraints.
Data-adaptive test statistics for microarray data.
Mukherjee, Sach; Roberts, Stephen J; van der Laan, Mark J
2005-09-01
An important task in microarray data analysis is the selection of genes that are differentially expressed between different tissue samples, such as healthy and diseased. However, microarray data contain an enormous number of dimensions (genes) and very few samples (arrays), a mismatch which poses fundamental statistical problems for the selection process that have defied easy resolution. In this paper, we present a novel approach to the selection of differentially expressed genes in which test statistics are learned from data using a simple notion of reproducibility in selection results as the learning criterion. Reproducibility, as we define it, can be computed without any knowledge of the 'ground-truth', but takes advantage of certain properties of microarray data to provide an asymptotically valid guide to expected loss under the true data-generating distribution. We are therefore able to indirectly minimize expected loss, and obtain results substantially more robust than conventional methods. We apply our method to simulated and oligonucleotide array data. The software is available by request to the corresponding author.
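The reproducibility criterion can be caricatured as split-half overlap of selected gene lists; the sketch below measures how stable a candidate test statistic's top-k selection is across two independent halves (data simulated, statistic chosen only for illustration):

    import numpy as np

    def reproducibility(X1, X2, stat, k=100):
        """Overlap of the top-k gene lists selected independently in two half-datasets."""
        top1 = np.argsort(stat(X1))[-k:]
        top2 = np.argsort(stat(X2))[-k:]
        return len(np.intersect1d(top1, top2)) / k

    # one-sample t-like statistic per gene (genes x arrays matrix of contrasts)
    abs_t = lambda X: np.abs(X.mean(axis=1)) / (X.std(axis=1, ddof=1) / np.sqrt(X.shape[1]))

    rng = np.random.default_rng(0)
    X1, X2 = rng.normal(size=(2, 1000, 8))  # two independent halves: 1000 genes, 8 arrays each
    X1[:50] += 1.0; X2[:50] += 1.0          # 50 truly induced genes
    print(reproducibility(X1, X2, abs_t))   # chance level is k/N = 0.1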
Neurological and mental health outcomes among conventional and organic farmers in Indiana, USA.
Khan, Khalid M; Baidya, Retushi; Aryal, Ashamsa; Farmer, James R; Valliant, Julia
2018-06-20
Every farming method, whether conventional or organic, has been associated with some sort of risky behaviors leading to health issues among farmers. Substantial evidence is not available in the literature to determine whether the magnitudes of health outcomes vary between conventional and organic farmers. The study investigated whether self-reported neurological and mental health symptoms differ between conventional and organic farmers living in Indiana, USA. A self-reported questionnaire survey collected information from 200 conventional and 157 organic farmers of Indiana on demographic characteristics, depression and neurological symptoms. Statistical analyses were conducted to observe the differences in self-reported symptoms by groups of farmers. It was observed that the conventional farmers had significantly higher age-adjusted mean neurological symptom score (p<0.01) than the organic farmers. Regression models revealed positive and significant associations of conventional farming with total (β =1.34; p=0.02), sensory (β =0.83; p=0.001) and behavioural (β =0.09; p=0.03) symptoms after accounting for age, income, education and years in farming. Positive but non-significant associations were also observed in conventional farmers with cognitive and motor symptoms, and with all subscales of depression symptoms in the adjusted models. The findings obtained suggest the importance of a larger study to further explain the difference in mental and neurological health effects in these two categories of farmers.
Sunada, Katsuhisa
2015-01-01
Background Conventional anesthetic nerve block injections into the mandibular foramen risk causing nerve damage. This study aimed to compare the efficacy and safety of the anterior technique (AT) of inferior alveolar nerve block using felypressin-propitocaine with a conventional nerve block technique (CT) using epinephrine and lidocaine for anesthesia via the mandibular foramen. Methods Forty healthy university students with no recent dental work were recruited as subjects and assigned to two groups: right side CT or right side AT. Anesthesia was evaluated in terms of success rate, duration of action, and injection pain. These parameters were assessed at the first incisor, premolar, and molar, 60 min after injection. Chi-square and unpaired t-tests were used for statistical comparisons, with a P value of < 0.05 designating significance. Results The two nerve block techniques generated comparable success rates for the right mandible, with rates of 65% (CT) and 60% (AT) at both the first molar and premolar, and rates of 60% (CT) and 50% (AT) at the lateral incisor. The duration of anesthesia using the CT was 233 ± 37 min, which was approximately 40 min shorter than using the AT. This difference was statistically significant (P < 0.05). Injection pain using the AT was rated as milder compared with the CT. This difference was also statistically significant (P < 0.05). Conclusions The AT is no less successful than the CT for inducing anesthesia, and has the added benefits of a significantly longer duration of action and significantly less pain. PMID:28879260
NIRS-SPM: statistical parametric mapping for near infrared spectroscopy
NASA Astrophysics Data System (ADS)
Tak, Sungho; Jang, Kwang Eun; Jung, Jinwook; Jang, Jaeduck; Jeong, Yong; Ye, Jong Chul
2008-02-01
Even though a powerful statistical parametric mapping (SPM) tool exists for fMRI, similar public domain tools are not available for near infrared spectroscopy (NIRS). In this paper, we describe a new public domain statistical toolbox called NIRS-SPM for quantitative analysis of NIRS signals. Specifically, NIRS-SPM statistically analyzes NIRS data using the general linear model (GLM) and makes inferences based on the excursion probability of random fields interpolated from the sparse measurements. In order to obtain correct inference, NIRS-SPM offers pre-coloring and pre-whitening methods for temporal correlation estimation. For simultaneous recording of NIRS signals with fMRI, the spatial mapping between the fMRI image and real coordinates from a 3-D digitizer is estimated using Horn's algorithm. These tools allow super-resolution localization of brain activation, which is not possible using conventional NIRS analysis tools.
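A bare-bones version of the GLM inference step (ignoring the pre-coloring/pre-whitening temporal-correlation corrections the toolbox provides) might look like this; the boxcar design and simulated HbO signal are placeholders:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 300
    task = (np.arange(n) % 60 < 30).astype(float)          # boxcar stimulus regressor
    X = np.column_stack([np.ones(n), task])                # design matrix: intercept + task
    y = X @ np.array([0.2, 0.8]) + rng.normal(0, 1.0, n)   # simulated HbO channel

    b, res, *_ = np.linalg.lstsq(X, y, rcond=None)
    sigma2 = res[0] / (n - X.shape[1])                     # residual variance
    cov = sigma2 * np.linalg.inv(X.T @ X)
    t_task = b[1] / np.sqrt(cov[1, 1])                     # t-statistic for the task regressor
    print(f"beta = {b[1]:.2f}, t = {t_task:.1f}")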
Timing Calibration in PET Using a Time Alignment Probe
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moses, William W.; Thompson, Christopher J.
2006-05-05
We evaluate the Scanwell Time Alignment Probe for performing the timing calibration for the LBNL Prostate-Specific PET Camera. We calibrate the time delay correction factors for each detector module in the camera using two methods--using the Time Alignment Probe (which measures the time difference between the probe and each detector module) and using the conventional method (which measures the timing difference between all module-module combinations in the camera). These correction factors, which are quantized in 2 ns steps, are compared on a module-by-module basis. The values are in excellent agreement--of the 80 correction factors, 62 agree exactly, 17 differ by 1 step, and 1 differs by 2 steps. We also measure on-time and off-time counting rates when the two sets of calibration factors are loaded into the camera and find that they agree within statistical error. We conclude that the performances of the Time Alignment Probe and conventional methods are equivalent.
Alarcón, Gonzalo; Barraza, Gabriela; Vera, Andrea; Wozniak, Aniela; García, Patricia
2016-02-01
Trichomonas vaginalis, Mycoplasma hominis and Ureaplasma spp. are microorganisms responsible for genitourinary and pregnancy pathologies. Nucleic acid amplification methods have shown several advantages, but have not been widely studied for the detection of these microorganisms. Our aims were to implement a conventional polymerase chain reaction (PCR) for the detection of these microorganisms and to compare its results with the methods currently used at our laboratory. A total of 91 available samples were processed by PCR, culture (M. hominis and Ureaplasma spp.) and wet mount (T. vaginalis). Results were compared and statistically analyzed by the kappa agreement test. A total of 85, 80 and 87 samples resulted in agreement for the detection of M. hominis, Ureaplasma spp. and T. vaginalis, respectively. For M. hominis and Ureaplasma spp., agreement was substantial, whereas for T. vaginalis it was moderate; however, for the latter, PCR detected more cases than wet mount. We recommend the implementation of PCR for the detection of T. vaginalis, whereas the culture kit is still a useful method for the other microorganisms.
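The kappa agreement analysis is straightforward to reproduce; the paired detection results below are hypothetical, chosen only to show the call and the "moderate" range:

    from sklearn.metrics import cohen_kappa_score

    # Agreement between PCR and wet mount for T. vaginalis across samples
    # (1 = detected, 0 = not detected); the values are hypothetical.
    pcr       = [1, 0, 0, 1, 1, 0, 1, 0, 0, 0]
    wet_mount = [1, 0, 0, 0, 1, 0, 0, 0, 0, 0]
    print(cohen_kappa_score(pcr, wet_mount))  # ~0.55 here, i.e. "moderate" agreement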
Gowda, Dhananjaya; Airaksinen, Manu; Alku, Paavo
2017-09-01
Recently, a quasi-closed phase (QCP) analysis of speech signals for accurate glottal inverse filtering was proposed. However, the QCP analysis, which belongs to the family of temporally weighted linear prediction (WLP) methods, uses the conventional forward type of sample prediction. This may not be the best choice, especially in computing WLP models with a hard-limiting weighting function. A sample-selective minimization of the prediction error in WLP reduces the effective number of samples available within a given window frame. To counter this problem, a modified quasi-closed phase forward-backward (QCP-FB) analysis is proposed, wherein each sample is predicted based on its past as well as future samples, thereby utilizing the available number of samples more effectively. Formant detection and estimation experiments on synthetic vowels generated using a physical modeling approach, as well as natural speech utterances, show that the proposed QCP-FB method yields statistically significant improvements over the conventional linear prediction and QCP methods.
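A simplified numerical sketch of weighted linear prediction with both forward and backward error terms, in the spirit of QCP-FB though not the authors' implementation; the weighting function here is an arbitrary stand-in for the quasi-closed-phase weighting:

    import numpy as np

    def wlp_fb(s, order=10, w=None):
        """Weighted linear prediction using forward and backward error terms."""
        n = len(s)
        w = np.ones(n) if w is None else np.asarray(w, float)
        rows, targets, weights = [], [], []
        for t in range(order, n):                      # forward: predict s[t] from the past
            rows.append(s[t - order:t][::-1]); targets.append(s[t]); weights.append(w[t])
        for t in range(n - order):                     # backward: predict s[t] from the future
            rows.append(s[t + 1:t + order + 1]); targets.append(s[t]); weights.append(w[t])
        X, y = np.asarray(rows), np.asarray(targets)
        sw = np.sqrt(np.asarray(weights))
        a, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
        return a                                       # predictor coefficients a_1..a_order

    rng = np.random.default_rng(0)
    s = np.sin(0.3 * np.arange(400)) + 0.05 * rng.normal(size=400)
    w = np.ones(400); w[::80] = 0.1                    # arbitrary stand-in weighting
    print(wlp_fb(s, order=4, w=w))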
Profile changes after conventional and chin shield genioplasty
Singh, Stuti; Mehrotra, Divya; Mohammad, S.
2014-01-01
Introduction The aim of this study was to compare the profile changes after conventional and chin shield genioplasty. Material and method 20 patients with retruded chin were randomly allocated to two different groups. The experimental group had chin shield osteotomy with interposition of hydroxyapatite collagen graft soaked in platelet rich plasma, while the controls had a conventional genioplasty. The outcome variables evaluated were lip seal, chin thickness, mandibular base length, SNB, labiomental angle, anterior lower facial height, transverse chin shift, and complications. Results There was an increase in chin thickness among all, but a significant increase in anterior lower facial height was seen in the experimental group only. There was no statistically significant difference in satisfaction score in both groups. Conclusion Chin shield genioplasty provides horizontal as well as vertical lengthening of chin without deepening of the mentolabial fold. Hydroxyapatite collagen bone graft and platelet rich plasma promote healing, induce bone formation and reduce bone resorption. PMID:25737921
Effects of quantum coherence on work statistics
NASA Astrophysics Data System (ADS)
Xu, Bao-Ming; Zou, Jian; Guo, Li-Sha; Kong, Xiang-Mu
2018-05-01
In the conventional two-point measurement scheme of quantum thermodynamics, quantum coherence is destroyed by the first measurement. However, coherence plays an important role in quantum thermodynamic processes, and how to describe the work statistics of a quantum coherent process is still an open question. In this paper, we use the full counting statistics method to investigate the effects of quantum coherence on work statistics. First, we give a general discussion and show that for a quantum coherent process, work statistics are very different from those of the two-point measurement scheme; specifically, the average work can be increased or decreased and the work fluctuation can be decreased by quantum coherence, depending strongly on the relative phase, the energy level structure, and the external protocol. Then, we concretely consider a quenched one-dimensional transverse Ising model and show that quantum coherence has a more significant influence on work statistics in the ferromagnetic regime than in the paramagnetic regime, so that due to the presence of quantum coherence the work statistics can exhibit critical phenomena even at high temperature.
Cross, Alan; Collard, Mark; Nelson, Andrew
2008-01-01
The conventional method of estimating heat balance during locomotion in humans and other hominins treats the body as an undifferentiated mass. This is problematic because the segments of the body differ with respect to several variables that can affect thermoregulation. Here, we report a study that investigated the impact on heat balance during locomotion of inter-segment differences in three of these variables: surface area, skin temperature and rate of movement. The approach adopted in the study was to generate heat balance estimates with the conventional method and then compare them with heat balance estimates generated with a method that takes into account inter-segment differences in surface area, skin temperature and rate of movement. We reasoned that, if the hypothesis that inter-segment differences in surface area, skin temperature and rate of movement affect heat balance during locomotion is correct, the estimates yielded by the two methods should be statistically significantly different. Anthropometric data were collected on seven adult male volunteers. The volunteers then walked on a treadmill at 1.2 m/s while 3D motion capture cameras recorded their movements. Next, the conventional and segmented methods were used to estimate the volunteers' heat balance while walking in four ambient temperatures. Lastly, the estimates produced with the two methods were compared with the paired t-test. The estimates of heat balance during locomotion yielded by the two methods are significantly different. Those yielded by the segmented method are significantly lower than those produced by the conventional method. Accordingly, the study supports the hypothesis that inter-segment differences in surface area, skin temperature and rate of movement impact heat balance during locomotion. This has important implications not only for current understanding of heat balance during locomotion in hominins but also for how future research on this topic should be approached. PMID:18560580
Accuracy and Landmark Error Calculation Using Cone-Beam Computed Tomography–Generated Cephalograms
Grauer, Dan; Cevidanes, Lucia S. H.; Styner, Martin A.; Heulfe, Inam; Harmon, Eric T.; Zhu, Hongtu; Proffit, William R.
2010-01-01
Objective To evaluate systematic differences in landmark position between cone-beam computed tomography (CBCT)-generated cephalograms and conventional digital cephalograms and to estimate how much variability should be taken into account when both modalities are used within the same longitudinal study. Materials and Methods Landmarks on homologous CBCT-generated cephalograms and conventional digital cephalograms of 46 patients were digitized, registered, and compared via the Hotelling T2 test. Results There were no systematic differences between modalities in the position of most landmarks. Three landmarks showed statistically significant differences but did not reach clinical significance. A method for error calculation while combining both modalities in the same individual is presented. Conclusion In a longitudinal follow-up for assessment of treatment outcomes and growth of one individual, the error due to the combination of the two modalities might be larger than previously estimated. PMID:19905853
Dissecting effects of complex mixtures: who's afraid of informative priors?
Thomas, Duncan C; Witte, John S; Greenland, Sander
2007-03-01
Epidemiologic studies commonly investigate multiple correlated exposures, which are difficult to analyze appropriately. Hierarchical modeling provides a promising approach for analyzing such data by adding a higher-level structure or prior model for the exposure effects. This prior model can incorporate additional information on similarities among the correlated exposures and can be parametric, semiparametric, or nonparametric. We discuss the implications of applying these models and argue for their expanded use in epidemiology. While a prior model adds assumptions to the conventional (first-stage) model, all statistical methods (including conventional methods) make strong intrinsic assumptions about the processes that generated the data. One should thus balance prior modeling assumptions against assumptions of validity, and use sensitivity analyses to understand their implications. In doing so - and by directly incorporating into our analyses information from other studies or allied fields - we can improve our ability to distinguish true causes of disease from noise and bias.
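For the normal-normal case, the effect of such a prior model reduces to a simple precision-weighted shrinkage of first-stage estimates toward the prior mean; the sketch below assumes a known prior variance tau2, which in practice must be justified or estimated:

    import numpy as np

    def shrink(beta_hat, se, tau2, mu0=0.0):
        """Shrink first-stage exposure estimates toward a second-stage prior mean.
        Normal-normal semi-Bayes estimator; tau2 is the assumed prior variance
        placed on the true effects (an assumption the analyst must justify)."""
        w = tau2 / (tau2 + se ** 2)            # weight on the data
        return w * beta_hat + (1 - w) * mu0    # posterior means

    beta_hat = np.array([0.40, 0.10, -0.25])   # hypothetical correlated-exposure effects
    se = np.array([0.20, 0.15, 0.30])
    print(shrink(beta_hat, se, tau2=0.05))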
Controlling false-negative errors in microarray differential expression analysis: a PRIM approach.
Cole, Steve W; Galic, Zoran; Zack, Jerome A
2003-09-22
Theoretical considerations suggest that current microarray screening algorithms may fail to detect many true differences in gene expression (Type II analytic errors). We assessed 'false negative' error rates in differential expression analyses by conventional linear statistical models (e.g. t-test), microarray-adapted variants (e.g. SAM, Cyber-T), and a novel strategy based on hold-out cross-validation. The latter approach employs the machine-learning algorithm Patient Rule Induction Method (PRIM) to infer minimum thresholds for reliable change in gene expression from Boolean conjunctions of fold-induction and raw fluorescence measurements. Monte Carlo analyses based on four empirical data sets show that conventional statistical models and their microarray-adapted variants overlook more than 50% of genes showing significant up-regulation. Conjoint PRIM prediction rules recover approximately twice as many differentially expressed transcripts while maintaining strong control over false-positive (Type I) errors. As a result, experimental replication rates increase and total analytic error rates decline. RT-PCR studies confirm that gene inductions detected by PRIM but overlooked by other methods represent true changes in mRNA levels. PRIM-based conjoint inference rules thus represent an improved strategy for high-sensitivity screening of DNA microarrays. Freestanding JAVA application at http://microarray.crump.ucla.edu/focus
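The conjunction rule PRIM infers can be expressed as a simple Boolean AND of thresholds; the threshold values below are hypothetical stand-ins for what the algorithm would learn from training data:

    import numpy as np

    # Hypothetical thresholds standing in for values PRIM would learn from data:
    FOLD_MIN, FLUOR_MIN = 1.8, 200.0

    def called_changed(fold, fluor):
        """Boolean conjunction rule: a transcript is called differentially
        expressed only if fold-induction AND raw fluorescence clear thresholds."""
        return (np.asarray(fold) >= FOLD_MIN) & (np.asarray(fluor) >= FLUOR_MIN)

    print(called_changed([2.1, 1.9, 3.0], [150.0, 420.0, 800.0]))  # -> [False, True, True]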
A simple blind placement of the left-sided double-lumen tubes.
Zong, Zhi Jun; Shen, Qi Ying; Lu, Yao; Li, Yuan Hai
2016-11-01
One-lung ventilation (OLV) is commonly provided using a double-lumen tube (DLT). Previous reports have indicated a high incidence of inappropriate DLT positioning with conventional maneuvers. After obtaining approval from the medical ethics committee of First Affiliated Hospital of Anhui Medical University and written consent from patients, 88 adult patients of American Society of Anesthesiologists (ASA) physical status grade I or II, undergoing elective thoracic surgery requiring a left-sided DLT for OLV, were enrolled in this prospective, single-blind, randomized controlled study. Patients were randomly allocated to 1 of 2 groups: simple maneuver group or conventional maneuver group. The simple maneuver relies on partially inflating the bronchial balloon and recreating the effect of a carinal hook on the DLT to indicate orientation and depth. After the induction of anesthesia, the patients were intubated with a left-sided Robertshaw DLT using one of the 2 intubation techniques. After intubation of each DLT, an anesthesiologist used flexible bronchoscopy to evaluate the patient while the patient lay in a supine position. The number of optimal positions and the time required to place the DLT in the correct position were recorded. Intubation of the DLT took 100 ± 16.2 seconds (mean ± SD) in the simple maneuver group and 95.1 ± 20.8 seconds in the conventional maneuver group; the difference was not statistically significant (P = 0.221). Fiberoptic bronchoscopy (FOB) took 22 ± 4.8 seconds in the simple maneuver group, statistically faster than in the conventional maneuver group (43.6 ± 23.7 seconds, P < 0.001). Nearly 98% of the 44 intubations in the simple maneuver group were considered in optimal position, compared with only 52% of the 44 intubations in the conventional maneuver group; the difference was statistically significant (P < 0.001). This simple maneuver is more rapid and more accurate for positioning left-sided DLTs, and it may be substituted for FOB during positioning of a left-sided DLT in conditions where FOB is unavailable or inapplicable.
Alavi, Shiva; Kachuie, Marzie
2017-01-01
Background: This study was conducted to assess the hardness of orthodontic brackets produced by metal injection molding (MIM) and conventional methods and of different orthodontic wires (stainless steel, nickel-titanium [Ni-Ti], and beta-titanium alloys) for better clinical results. Materials and Methods: A total of 15 specimens from each brand of orthodontic brackets and wires were examined. The brackets (Elite Opti-Mim, which is produced by the MIM process, and Ultratrimm, which is produced by the conventional brazing method) and the wires (stainless steel, Ni-Ti, and beta-titanium) were embedded in epoxy resin, followed by grinding, polishing, and coating. Then, X-ray energy dispersive spectroscopy (EDS) microanalysis was applied to assess their elemental composition. The same specimen surfaces were repolished and used for Vickers microhardness assessment. Hardness was statistically analyzed with the Kruskal–Wallis test, followed by the Mann–Whitney test at the 0.05 level of significance. Results: The X-ray EDS analysis revealed different ferrous or cobalt-based alloys in each bracket. The maximum mean hardness values of the wires were achieved for stainless steel (SS) (529.85 Vickers hardness [VHN]) versus the minimum values for beta-titanium (334.65 VHN). Among the brackets, Elite Opti-Mim exhibited significantly higher VHN values (262.66 VHN) compared to Ultratrimm (206.59 VHN). VHN values of wire alloys were significantly higher than those of the brackets. Conclusion: MIM orthodontic brackets exhibited hardness values much lower than those of SS orthodontic archwires and were more compatible with NiTi and beta-titanium archwires. A wide range of microhardness values has been reported for conventional orthodontic brackets, and it should be considered that the manufacturing method might be only one of the factors affecting the mechanical properties of orthodontic brackets, including hardness. PMID:28928783
Simulation methods to estimate design power: an overview for applied research
2011-01-01
Background Estimating the required sample size and statistical power for a study is an integral part of study design. For standard designs, power equations provide an efficient solution to the problem, but they are unavailable for many complex study designs that arise in practice. For such complex study designs, computer simulation is a useful alternative for estimating study power. Although this approach is well known among statisticians, in our experience many epidemiologists and social scientists are unfamiliar with the technique. This article aims to address this knowledge gap. Methods We review an approach to estimate study power for individual- or cluster-randomized designs using computer simulation. This flexible approach arises naturally from the model used to derive conventional power equations, but extends those methods to accommodate arbitrarily complex designs. The method is universally applicable to a broad range of designs and outcomes, and we present the material in a way that is approachable for quantitative, applied researchers. We illustrate the method using two examples (one simple, one complex) based on sanitation and nutritional interventions to improve child growth. Results We first show how simulation reproduces conventional power estimates for simple randomized designs over a broad range of sample scenarios to familiarize the reader with the approach. We then demonstrate how to extend the simulation approach to more complex designs. Finally, we discuss extensions to the examples in the article, and provide computer code to efficiently run the example simulations in both R and Stata. Conclusions Simulation methods offer a flexible option to estimate statistical power for standard and non-traditional study designs and parameters of interest. The approach we have described is universally applicable for evaluating study designs used in epidemiologic and social science research. PMID:21689447
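The core simulation loop for the simple randomized design can be written in a dozen lines; the article provides R and Stata code, so the Python translation below is an illustrative equivalent under standard normal-outcome assumptions:

    import numpy as np
    from scipy import stats

    def simulated_power(n_per_arm, effect, sd, n_sims=5000, alpha=0.05, seed=1):
        """Estimate power of a two-arm individually randomized trial by simulation."""
        rng = np.random.default_rng(seed)
        hits = 0
        for _ in range(n_sims):
            ctrl = rng.normal(0.0, sd, n_per_arm)
            trt = rng.normal(effect, sd, n_per_arm)
            hits += stats.ttest_ind(trt, ctrl).pvalue < alpha
        return hits / n_sims

    print(simulated_power(n_per_arm=50, effect=0.5, sd=1.0))  # about 0.70, matching power formulas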
Virtual and stereoscopic anatomy: when virtual reality meets medical education.
de Faria, Jose Weber Vieira; Teixeira, Manoel Jacobsen; de Moura Sousa Júnior, Leonardo; Otoch, Jose Pinhata; Figueiredo, Eberval Gadelha
2016-11-01
OBJECTIVE The authors sought to construct, implement, and evaluate an interactive and stereoscopic resource for teaching neuroanatomy, accessible from personal computers. METHODS Forty fresh brains (80 hemispheres) were dissected. Images of areas of interest were captured using a manual turntable and processed and stored in a 5337-image database. Pedagogic evaluation was performed with 84 graduate medical students, divided into 3 groups: Group 1 (conventional method), Group 2 (interactive nonstereoscopic), and Group 3 (interactive and stereoscopic). The method was evaluated through a written theory test and a lab practicum. RESULTS Groups 2 and 3 showed the highest mean scores in pedagogic evaluations and differed significantly from Group 1 (p < 0.05). Group 2 did not differ statistically from Group 3 (p > 0.05). Effect sizes, measured as differences in scores before and after lectures, indicate the effectiveness of the method. ANOVA results showed a significant difference (p < 0.05) between groups, and the Tukey test showed statistical differences between Group 1 and the other 2 groups (p < 0.05). No statistical differences between Groups 2 and 3 were found in the practicum. However, there were significant differences when Groups 2 and 3 were compared with Group 1 (p < 0.05). CONCLUSIONS The authors conclude that this method promoted further improvement in knowledge for students and fostered significantly higher learning when compared with traditional teaching resources.
Choudhary, Garima; Gill, Vikas; Reddy, Y N N; Sanadhya, Sudhanshu; Aapaliya, Pankaj; Sharma, Nidhi
2014-07-01
The debonding procedure is time-consuming and can damage the enamel if performed with improper technique. Various debonding methods include the conventional methods that use pliers or wrenches, an ultrasonic method, electrothermal devices, air pressure impulse devices, diamond burs to grind the brackets off the tooth surface, and lasers. Among all these methods, using debonding pliers is the most convenient and effective but has been reported to cause damage to the teeth. Recently, a new debonding instrument designed specifically for ceramic and composite brackets has been introduced. As this is a new instrument, little information is available on its efficacy. The purpose of this study was to evaluate the debonding characteristics of both the conventional debonding pliers and the new debonding instrument when removing ceramic, composite and metallic brackets. One hundred thirty-eight extracted maxillary premolar teeth were collected and divided into two groups, Group A and Group B (n = 69 each), which were further divided into 3 subgroups (n = 23) according to the type of bracket to be bonded: adhesive pre-coated stainless steel brackets in subgroups A1 and B1, ceramic in A2 and B2, and composite in A3 and B3. All the teeth were etched using 37% phosphoric acid for 15 seconds and the brackets were bonded using Transbond XT primer. Brackets were debonded using the conventional debonding pliers (Group A) and the new debonding instrument (Group B). After debonding, the enamel surface of each tooth was examined under a stereo microscope (10X magnification). A modified adhesive remnant index (ARI) was used to quantify the amount of adhesive remaining on each tooth. The observations demonstrate that the results of the new debonding instrument for debonding of metal, ceramic and composite brackets were statistically significantly different (p = 0.04) from, and superior to, the results of the conventional debonding pliers. The debonding efficiency of the new debonding instrument is better than that of the conventional debonding pliers for metal, ceramic and composite brackets.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Çildağ, Mehmet Burak, E-mail: mbcildag@yahoo.com; Çildağ, Songül, E-mail: songulcildag@yahoo.com; Köseoğlu, Ömer Faruk Kutsi, E-mail: kutsikoseoglu@yahoo.com
Objective: The aim of this study is to investigate the potential association of the neutrophil–lymphocyte ratio (NLR) with primary patency of percutaneous transluminal angioplasty (PTA) in hemodialysis arteriovenous fistula stenosis and with the type (conventional and drug-eluting) of balloon used in PTA. Material-Method: This retrospective study consists of 78 patients with significant arteriovenous fistula stenosis who were treated with PTA using a Drug-Eluting Balloon (DEB) (n = 29) or Conventional Balloon (CB) (n = 49). NLR was calculated from preinterventional blood samples. All patients were classified into two groups: Group A, primary patency <12 months (43/78); Group B, primary patency ≥12 months (35/78). Cox regression analysis and the Kaplan–Meier method were used, respectively, to determine independent factors affecting primary patency and to compare primary patency for the two balloon types. Results: The NLR and balloon type of the two groups were significantly different (p = 0.002, p = 0.010). The cut-off value of NLR was 3.18 for determination of primary patency, with sensitivity of 81.4% and specificity of 51.4%. Primary patency rates between PTA with DEB and CB displayed statistically significant differences (p < 0.05). The cut-off value was 3.28 for determination of 12-month primary patency in the conventional balloon group; sensitivity was 81.8% and specificity was 81.3%. There was no statistical relation between NLR levels and 12-month primary patency in the drug-eluting balloon group (p = 0.927). Conclusion: An increased level of NLR may be a risk factor in the development of early AVF restenosis after successful PTA. Preferring a drug-eluting balloon at an increased level of NLR can be beneficial to prolong patency.
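The NLR cut-off analysis described above is typically done with an ROC curve and Youden's J; the sketch below shows the mechanics on hypothetical values, not the study data:

    import numpy as np
    from sklearn.metrics import roc_curve

    # nlr: preinterventional neutrophil-lymphocyte ratios; restenosis: 1 if primary
    # patency < 12 months. The values are hypothetical placeholders.
    nlr = np.array([2.1, 3.4, 4.0, 2.8, 3.9, 1.9, 3.2, 4.5])
    restenosis = np.array([0, 1, 0, 1, 1, 0, 0, 1])

    fpr, tpr, thr = roc_curve(restenosis, nlr)
    best = np.argmax(tpr - fpr)               # Youden's J statistic
    print(f"cut-off {thr[best]:.2f}, sensitivity {tpr[best]:.2f}, specificity {1 - fpr[best]:.2f}")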
2013-01-01
Background Ultrasonic bone-cutting surgery has been introduced as a feasible alternative to the conventional sharp instruments used in craniomaxillofacial surgery because of its precision and safety. The piezosurgery medical device allows the efficient cutting of mineralized tissues with minimal trauma to soft tissues. The piezoelectric osteotome has found its role in surgically assisted rapid maxillary expansion (SARME), a procedure well established to correct transverse maxillary discrepancies. The advantages include minimal risk to critical anatomic structures. The purpose of this clinical comparative study (CIS 2007-237-M) was to present the advantages of the piezoelectric cut as a minimally invasive device in surgically assisted rapid maxillary expansion by protecting the maxillary sinus mucosal lining. Methods Thirty patients (18 females and 12 males) aged 18 to 54 underwent a surgically assisted palatal expansion of the maxilla with a combined orthodontic and surgical approach. The patients were randomly divided into two separate treatment groups. While Group 1 received conventional surgery using an oscillating saw, Group 2 was treated with piezosurgery. The following parameters were examined: blood pressure, blood values, required medication, bleeding level in the maxillary sinus, duration of inpatient stay, duration of surgery and height of body temperature. Results The results displayed no statistically significant differences between the two groups regarding laboratory blood values and inpatient stay. The duration of surgery revealed a significant discrepancy: deploying piezosurgery took the surgeon an average of 10 minutes longer than working with a conventional-saw technique. However, the observation of the bleeding level in the paranasal sinus presented a major and statistically significant advantage of piezosurgery: on average the bleeding level was one category better than that of the conventionally treated patients. Conclusion This method of piezoelectric surgery with all its advantages is going to replace many conventional operating procedures in oral and maxillofacial surgery. Trial registration CIS 2007-237-M PMID:23414112
Nedelcu, R; Olsson, P; Nyström, I; Rydén, J; Thor, A
2018-02-01
To evaluate a novel methodology using industrial scanners as a reference, and to assess the in vivo accuracy of 3 intraoral scanners (IOS) and conventional impressions. Further, to evaluate IOS precision in vivo. Four reference-bodies were bonded to the buccal surfaces of upper premolars and incisors in five subjects. After three reference scans with ATOS Core 80 (ATOS), subjects were scanned three times with three IOS systems: 3M True Definition (3M), CEREC Omnicam (OMNI) and Trios 3 (TRIOS). One conventional impression (IMPR) was taken with 3M Impregum Penta Soft, and poured models were digitized with the laboratory scanner 3shape D1000 (D1000). Best-fit alignment of reference-bodies and 3D Compare Analysis was performed. Precision of ATOS and D1000 was assessed for quantitative evaluation and comparison. Accuracy of IOS and IMPR was analyzed using ATOS as reference. Precision of IOS was evaluated through intra-system comparison. Precision of the ATOS reference scanner (mean 0.6 μm) and D1000 (mean 0.5 μm) was high. Pairwise multiple comparisons of reference-bodies located in different tooth positions displayed a statistically significant difference in accuracy between two scanner groups, 3M and TRIOS, over OMNI (p value range 0.0001 to 0.0006). IMPR did not show any statistically significant difference from IOS; however, deviations of IOS and IMPR were of a similar magnitude. No statistical difference was found for IOS precision. The methodology can be used for assessing accuracy of IOS and IMPR in vivo in up to five units bilaterally from the midline. 3M and TRIOS had a higher accuracy than OMNI. IMPR overlapped both groups. Intraoral scanners can be used as a replacement for conventional impressions when restoring up to ten units without extended edentulous spans. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
Fetit, Ahmed E; Novak, Jan; Peet, Andrew C; Arvanitits, Theodoros N
2015-09-01
The aim of this study was to assess the efficacy of three-dimensional texture analysis (3D TA) of conventional MR images for the classification of childhood brain tumours in a quantitative manner. The dataset comprised pre-contrast T1- and T2-weighted MRI series obtained from 48 children diagnosed with brain tumours (medulloblastoma, pilocytic astrocytoma and ependymoma). 3D and 2D TA were carried out on the images using first-, second- and higher order statistical methods. Six supervised classification algorithms were trained with the most influential 3D and 2D textural features, and their performances in the classification of tumour types, using the two feature sets, were compared. Model validation was carried out using the leave-one-out cross-validation (LOOCV) approach, as well as stratified 10-fold cross-validation, in order to provide additional reassurance. McNemar's test was used to test the statistical significance of any improvements demonstrated by 3D-trained classifiers. Supervised learning models trained with 3D textural features showed improved classification performances compared with those trained with conventional 2D features. For instance, a neural network classifier showed 12% improvement in area under the receiver operator characteristics curve (AUC) and 19% in overall classification accuracy. These improvements were statistically significant for four of the tested classifiers, as per McNemar's tests. This study shows that 3D textural features extracted from conventional T1- and T2-weighted images can improve the diagnostic classification of childhood brain tumours. Long-term benefits of accurate, yet non-invasive, diagnostic aids include a reduction in surgical procedures, improvement in surgical and therapy planning, and support of discussions with patients' families. It remains necessary, however, to extend the analysis to a multicentre cohort in order to assess the scalability of the techniques used. Copyright © 2015 John Wiley & Sons, Ltd.
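The McNemar comparison of 2D- versus 3D-trained classifiers can be reproduced from a paired 2x2 agreement table; the counts below are hypothetical, chosen only to illustrate the call:

    import numpy as np
    from statsmodels.stats.contingency_tables import mcnemar

    # 2x2 table of per-case agreement between classifiers trained with 2D vs 3D
    # features: rows = 2D correct/incorrect, cols = 3D correct/incorrect.
    table = np.array([[30, 2],
                      [9, 7]])
    print(mcnemar(table, exact=True).pvalue)  # ~0.065 with these toy counts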
Mapping urban environmental noise: a land use regression method.
Xie, Dan; Liu, Yi; Chen, Jining
2011-09-01
Forecasting and preventing urban noise pollution are major challenges in urban environmental management. Most existing efforts, including experiment-based models, statistical models, and noise mapping, however, have limited capacity to explain the association between urban growth and corresponding noise change. Therefore, these conventional methods can hardly forecast urban noise at a given outlook of development layout. This paper, for the first time, introduces a land use regression method, which has been applied for simulating urban air quality for a decade, to construct an urban noise model (LUNOS) in Dalian Municipality, Northwest China. The LUNOS model describes noise as a dependent variable of surrounding various land areas via a regressive function. The results suggest that a linear model performs better in fitting monitoring data, and there is no significant difference of the LUNOS's outputs when applied to different spatial scales. As the LUNOS facilitates a better understanding of the association between land use and urban environmental noise in comparison to conventional methods, it can be regarded as a promising tool for noise prediction for planning purposes and aid smart decision-making.
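A land use regression of this kind is, at its core, an ordinary linear model from buffer-level land-use areas to measured noise; below is a minimal sketch with placeholder monitoring data (the real LUNOS covariates and coefficients are in the paper):

    import numpy as np
    from sklearn.linear_model import LinearRegression

    # X: areas (m^2) of land-use classes within a buffer around each monitoring
    # site, e.g. columns = [road, commercial, industrial, green]; y: measured dB(A).
    # All numbers are hypothetical placeholders.
    X = np.array([[1200, 300,   0,  50],
                  [ 800, 100, 200, 400],
                  [1500, 500, 100,   0],
                  [ 300,  50,   0, 900],
                  [1000, 250,  50, 200],
                  [ 600,  80, 300, 500]], dtype=float)
    y = np.array([68.2, 63.5, 70.1, 55.8, 66.0, 61.2])

    lunos_like = LinearRegression().fit(X, y)
    print(lunos_like.coef_, lunos_like.intercept_)
    print(lunos_like.predict([[1000, 200, 50, 300]]))  # noise at a planned development layout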
NASA Technical Reports Server (NTRS)
Kim, Hakil; Swain, Philip H.
1990-01-01
An axiomatic approach to interval-valued (IV) probabilities is presented, where the IV probability is defined by a pair of set-theoretic functions which satisfy some pre-specified axioms. On the basis of this approach, the representation of statistical evidence and the combination of multiple bodies of evidence are emphasized. Although IV probabilities provide an innovative means for the representation and combination of evidential information, they make the decision process rather complicated, entailing more intelligent strategies for making decisions. The development of decision rules over IV probabilities is discussed from the viewpoint of statistical pattern recognition. The proposed method, the so-called evidential reasoning method, is applied to the ground-cover classification of a multisource data set consisting of Multispectral Scanner (MSS) data, Synthetic Aperture Radar (SAR) data, and digital terrain data such as elevation, slope, and aspect. By treating the data sources separately, the method is able to capture both parametric and nonparametric information and to combine them. The method is then applied to two separate cases of classifying multiband data obtained by a single sensor. In each case a set of multiple sources is obtained by dividing the dimensionally huge data into smaller and more manageable pieces based on global statistical correlation information. By a divide-and-combine process, the method is able to utilize more features than the conventional maximum likelihood method.
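Interval-valued combination rules are more involved, but their point-valued relative, Dempster's rule, conveys the flavor of combining bodies of evidence from separate sources; the masses and class labels below are toy values, not the paper's data:

    # Toy combination of two bodies of evidence with Dempster's rule. The paper
    # works with interval-valued probabilities, of which this point-valued rule
    # is only a simplified relative.
    def dempster(m1, m2):
        combined, conflict = {}, 0.0
        for a in m1:
            for b in m2:
                inter = tuple(sorted(set(a) & set(b)))
                mass = m1[a] * m2[b]
                if inter:
                    combined[inter] = combined.get(inter, 0.0) + mass
                else:
                    conflict += mass
        return {k: v / (1.0 - conflict) for k, v in combined.items()}

    mss = {('forest',): 0.6, ('forest', 'water'): 0.4}                   # evidence from MSS bands
    sar = {('forest',): 0.3, ('water',): 0.5, ('forest', 'water'): 0.2}  # evidence from SAR
    print(dempster(mss, sar))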
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
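The flavor of the PFA computation, propagating parameter uncertainty and unit-to-unit scatter through a failure model to a failure-probability estimate, can be conveyed by a toy Monte Carlo; the lognormal forms and all numbers are assumptions for illustration, not the documented models:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000
    life_param = rng.lognormal(np.log(2e5), 0.4, n)  # uncertain fatigue-model parameter
    scatter = rng.lognormal(0.0, 0.3, n)             # unit-to-unit scatter
    cycles_to_failure = life_param * scatter
    service_life = 5e4                               # required cycles before retirement
    print("P(failure) ~", np.mean(cycles_to_failure < service_life))  # about 0.003 here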
An adaptive multi-feature segmentation model for infrared image
NASA Astrophysics Data System (ADS)
Zhang, Tingting; Han, Jin; Zhang, Yi; Bai, Lianfa
2016-04-01
Active contour models (ACMs) have been extensively applied to image segmentation, but conventional region-based active contour models utilize only global or local single-feature information to minimize the energy functional that drives the contour evolution. Considering the limitations of the original ACMs, an adaptive multi-feature segmentation model is proposed to handle infrared images with blurred boundaries and low contrast. In the proposed model, several essential local statistical features are introduced to construct a multi-feature signed pressure function (MFSPF). In addition, an adaptive weight coefficient is used to modify the level set formulation, which is formed by integrating the MFSPF, built from local statistical features, with a signed pressure function carrying global information. Experimental results demonstrate that the proposed method makes up for the inadequacy of the original methods and achieves desirable results in segmenting infrared images.
Li, Ke; Chen, Peng
2011-01-01
Structural faults, such as unbalance, misalignment and looseness, etc., often occur in the shafts of rotating machinery. These faults may cause serious machine accidents and lead to great production losses. This paper proposes an intelligent method for diagnosing structural faults of rotating machinery using ant colony optimization (ACO) and relative ratio symptom parameters (RRSPs) in order to detect faults and distinguish fault types at an early stage. New symptom parameters called “relative ratio symptom parameters” are defined for reflecting the features of vibration signals measured in each state. Synthetic detection index (SDI) using statistical theory has also been defined to evaluate the applicability of the RRSPs. The SDI can be used to indicate the fitness of a RRSP for ACO. Lastly, this paper also compares the proposed method with the conventional neural networks (NN) method. Practical examples of fault diagnosis for a centrifugal fan are provided to verify the effectiveness of the proposed method. The verification results show that the structural faults often occurring in the centrifugal fan, such as unbalance, misalignment and looseness states are effectively identified by the proposed method, while these faults are difficult to detect using conventional neural networks. PMID:22163833
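As a hedged sketch of how a symptom parameter's discriminating power might be scored, the code below computes a standard two-class separability index of the kind the SDI represents, on invented vibration-feature samples; the exact SDI formula used in the paper is not quoted here, and this form is an assumption.

```python
import numpy as np
from scipy.stats import norm

def sdi(x_normal, x_faulty):
    """Separation of a symptom parameter's distributions in two states."""
    mu1, mu2 = np.mean(x_normal), np.mean(x_faulty)
    s1, s2 = np.std(x_normal, ddof=1), np.std(x_faulty, ddof=1)
    return abs(mu1 - mu2) / np.sqrt(s1**2 + s2**2)

rng = np.random.default_rng(0)
p_normal = rng.normal(1.0, 0.2, 200)   # symptom parameter, normal state
p_unbal  = rng.normal(1.8, 0.3, 200)   # same parameter, unbalance state

index = sdi(p_normal, p_unbal)
detection_rate = norm.cdf(index)       # larger index, better separability
print(f"SDI = {index:.2f}, detection rate ~ {detection_rate:.3f}")
```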
Ashnagar, Sajjad; Monzavi, Abbas; Abbasi, Mehdi; Aghajani, Mahdi; Chiniforush, Nasim
2017-01-01
Introduction: Today, bleaching is a routine noninvasive alternative for the treatment of discolored teeth. The aim of this study was to determine whether conventional or laser-activated bleaching predisposes teeth to develop caries. Methods: Sixty human molars were mounted on acrylic cylinders and their Knoop microhardness (KHN) as well as DIAGNOdent (DD) values were recorded. They were divided into 4 experimental groups: G1) conventional bleaching with 40% hydrogen peroxide gel; G2) diode laser-assisted bleaching with the same gel; G3) Nd:YAG laser-assisted bleaching with the same gel; G4) control group. After bleaching, all samples were subjected to a three-day pH cycling regimen and then KHN and DD values were measured again. Results: All groups had significant reductions in KHN values. There was no statistically significant difference in enamel microhardness changes among the sample groups; all groups changed by a similar amount. Reductions in DD scores were significant in the diode laser and conventional groups, whereas changes in the Nd:YAG laser and control groups were not significant. Changes in DD values followed a similar pattern among groups, except for the G1-G4 and G2-G4 pairs: the conventional and diode laser groups showed a meaningful difference in the reduction of DD values in comparison with the control group. Conclusion: It can be concluded that bleaching, whether conventional or laser-activated, does not make teeth more vulnerable to developing carious lesions.
Kawata, Masaaki; Sato, Chikara
2007-06-01
In determining the three-dimensional (3D) structure of macromolecular assemblies in single particle analysis, a large representative dataset of two-dimensional (2D) average images derived from a huge number of raw images is key to high resolution. Because alignments prior to averaging are computationally intensive, currently available multireference alignment (MRA) software does not survey every possible alignment. This leads to misaligned images, creating blurred averages and reducing the quality of the final 3D reconstruction. We present a new method in which multireference alignment is harmonized with classification (multireference multiple alignment: MRMA). This method enables a statistical comparison of multiple alignment peaks, reflecting the similarities between each raw image and a set of reference images. Among the selected alignment candidates for each raw image, misaligned images are statistically excluded, based on the principle that aligned raw images of similar projections have a dense distribution around the correctly aligned coordinates in image space. The newly developed method was examined for accuracy and speed using model image sets with various signal-to-noise ratios, and with electron microscope images of the Transient Receptor Potential C3 channel and the sodium channel. In every data set, the newly developed method outperformed conventional methods in robustness against noise and in speed, creating 2D average images of higher quality. This statistically harmonized alignment-classification combination should greatly improve the quality of single particle analysis.
Kawashima, Hiroki; Hayashi, Norio; Ohno, Naoki; Matsuura, Yukihiro; Sanada, Shigeru
2015-08-01
To evaluate the patient identification ability of radiographers, previous and current chest radiographs were assessed in an observer study utilizing receiver operating characteristic (ROC) analysis. This study included portable and conventional chest radiographs from 43 same and 43 different patients. The dataset used in this study was divided into the following three groups: (1) a pair of portable radiographs, (2) a pair of conventional radiographs, and (3) a combination of the two types of radiograph. Seven observers participated in this ROC study, which aimed to identify same or different patients using these datasets. ROC analysis was conducted to calculate the average area under the ROC curve obtained by each observer (AUCave), and a statistical test was performed using the multi-reader multi-case method. Comparable results were obtained with pairs of portable (AUCave: 0.949) and conventional radiographs (AUCave: 0.951); in comparisons within the same modality, there were no significant differences. In contrast, the ability to identify patients by comparing a portable and a conventional radiograph (AUCave: 0.873) was lower than with the matched datasets (p=0.002 and p=0.004, respectively). In conclusion, the use of different imaging modalities reduces radiographers' ability to identify their patients.
Effect of Robotic-Assisted Gait Training in Patients With Incomplete Spinal Cord Injury
Shin, Ji Cheol; Kim, Ji Yong; Park, Han Kyul
2014-01-01
Objective To determine the effect of robotic-assisted gait training (RAGT) compared to conventional overground training. Methods Sixty patients with motor incomplete spinal cord injury (SCI) were included in a prospective, randomized clinical trial comparing RAGT to conventional overground training. The RAGT group received RAGT in three 40-minute sessions per week, together with regular physiotherapy, for 4 weeks. The conventional group underwent regular physiotherapy twice a day, 5 times a week. Main outcomes were the lower extremity motor score of the American Spinal Injury Association impairment scale (LEMS), the ambulatory motor index (AMI), the Spinal Cord Independence Measure III mobility section (SCIM3-M), and the walking index for spinal cord injury version II (WISCI-II) scale. Results At the end of rehabilitation, both groups showed significant improvement in LEMS, AMI, SCIM3-M, and WISCI-II. On WISCI-II, statistically significantly greater improvement was observed in the RAGT group; for the remaining variables, no between-group difference was found. Conclusion RAGT combined with conventional physiotherapy could yield more improvement in ambulatory function than conventional therapy alone. RAGT should be considered as one additional tool to provide neuromuscular reeducation in patients with incomplete SCI. PMID:25566469
Cochand-Priollet, Béatrix; Cartier, Isabelle; de Cremoux, Patricia; Le Galès, Catherine; Ziol, Marianne; Molinié, Vincent; Petitjean, Alain; Dosda, Anne; Merea, Estelle; Biaggi, Annonciade; Gouget, Isabelle; Arkwright, Sylviane; Vacher-Lavenu, Marie-Cécile; Vielh, Philippe; Coste, Joël
2005-11-01
Many articles concerning conventional Pap smears, ThinPrep liquid-based cytology (LBC) and the Hybrid Capture II HPV test (HC II) have been published. This study, carried out by the French Society of Clinical Cytology, is noteworthy for several reasons: it was financially independent; it compared the efficiency of the conventional Pap smear and LBC, and of the conventional Pap smear and HC II, and included an economic study based on real costs; for all the women, a "gold standard" reference method, colposcopy, was available and biopsies were performed whenever a lesion was detected; and the conventional Pap smear, the LBC (split-sample technique), the colposcopy, and the biopsies were done at the same time. This study included 2,585 women divided into two groups: group A, a high-risk population, and group B, a screening population. The statistical analysis of the results showed that conventional Pap smears consistently had sensitivity and specificity superior or equivalent to LBC for lesions at the threshold of CIN-I (cervical intraepithelial neoplasia) or CIN-II or higher. It also underlined the low specificity of the HC II. Finally, the mean cost of LBC was never covered by the Social Security tariff.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Siversson, Carl, E-mail: carl.siversson@med.lu.se; Nordström, Fredrik; Department of Radiation Physics, Skåne University Hospital, Lund 214 28
2015-10-15
Purpose: In order to enable a magnetic resonance imaging (MRI) only workflow in radiotherapy treatment planning, methods are required for generating Hounsfield unit (HU) maps (i.e., synthetic computed tomography, sCT) for dose calculations, directly from MRI. The Statistical Decomposition Algorithm (SDA) is a method for automatically generating sCT images from a single MR image volume, based on automatic tissue classification in combination with a model trained using a multimodal template material. This study compares dose calculations between sCT generated by the SDA and conventional CT in the male pelvic region. Methods: The study comprised ten prostate cancer patients, for whom a 3D T2 weighted MRI and a conventional planning CT were acquired. For each patient, sCT images were generated from the acquired MRI using the SDA. In order to decouple the effect of variations in patient geometry between imaging modalities from the effect of uncertainties in the SDA, the conventional CT was nonrigidly registered to the MRI to assure that their geometries were well aligned. For each patient, a volumetric modulated arc therapy plan was created for the registered CT (rCT) and recalculated for both the sCT and the conventional CT. The results were evaluated using several methods, including mean average error (MAE), a set of dose-volume histogram parameters, and a restrictive gamma criterion (2% local dose/1 mm). Results: The MAE within the body contour was 36.5 ± 4.1 (1 s.d.) HU between sCT and rCT. Average mean absorbed dose difference to target was 0.0% ± 0.2% (1 s.d.) between sCT and rCT, whereas it was −0.3% ± 0.3% (1 s.d.) between CT and rCT. The average gamma pass rate was 99.9% for sCT vs rCT, whereas it was 90.3% for CT vs rCT. Conclusions: The SDA enables a highly accurate MRI only workflow in prostate radiotherapy planning. The dosimetric uncertainties originating from the SDA appear negligible and are notably lower than the uncertainties introduced by variations in patient geometry between imaging sessions.
Tsirogiannis, Panagiotis; Reissmann, Daniel R; Heydecke, Guido
2016-09-01
In existing published reports, some studies indicate the superiority of digital impression systems in terms of the marginal accuracy of ceramic restorations, whereas others show that the conventional method provides restorations with better marginal fit than fully digital fabrication. Which impression method provides the lowest mean values for marginal adaptation is inconclusive. The findings from those studies cannot be easily generalized, and in vivo studies that could provide valid and meaningful information are limited in the existing publications. The purpose of this study was to systematically review existing reports and evaluate the marginal fit of ceramic single-tooth restorations after either digital or conventional impression methods by combining the available evidence in a meta-analysis. The search strategy for this systematic review of the publications was based on a Population, Intervention, Comparison, and Outcome (PICO) framework. For the statistical analysis, the mean marginal fit values of each study were extracted and categorized according to the impression method to calculate the mean value, together with the 95% confidence interval (CI), of each category, and to evaluate the impact of each impression method on marginal adaptation by comparing digital and conventional techniques separately for in vitro and in vivo studies. Twelve studies were included in the meta-analysis from the 63 records identified by database searching. For the in vitro studies, where ceramic restorations were fabricated after conventional impressions, the mean value of the marginal fit was 58.9 μm (95% CI: 41.1-76.7 μm), whereas after digital impressions, it was 63.3 μm (95% CI: 50.5-76.0 μm). In the in vivo studies, the mean marginal discrepancy of the restorations after digital impressions was 56.1 μm (95% CI: 46.3-65.8 μm), whereas after conventional impressions, it was 79.2 μm (95% CI: 59.6-98.9 μm). No significant difference was observed regarding the marginal discrepancy of single-unit ceramic restorations fabricated after digital or conventional impressions. Copyright © 2016 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
Postoperative discomfort after Nd:YAG laser and conventional frenectomy: comparison of both genders.
Akpınar, A; Toker, H; Lektemur Alpan, A; Çalışır, M
2016-03-01
Evidence has suggested that males and females experience and report pain differently. The aim of this study was to determine the postoperative perception levels of both females and males after neodymium-doped yttrium aluminum garnet (Nd:YAG) laser frenectomy and conventional frenectomy, and to compare the perceptions between genders. Eighty-nine patients requiring frenectomy were randomly assigned to treatment with either conventional frenectomy or the Nd:YAG laser. Postoperative discomfort (pain, chewing, talking) was recorded using a visual analog scale (VAS) on the operation day and on postoperative days 1, 3, 7 and 10. According to the female VAS scores, pain, chewing and speaking discomfort were statistically higher in the conventional group than in the laser group on the operation day and on the first and third postoperative days. In males, pain discomfort was statistically higher in the conventional group than in the laser group on the operation day, and speaking discomfort was statistically higher in the conventional group on the operation day and the first postoperative day. The present study indicated that Nd:YAG laser treatment used for frenectomies provides better postoperative comfort than the conventional procedure for each gender up to the seventh postoperative day, especially in females in terms of pain, chewing and speaking. According to our results, the Nd:YAG laser may provide safe, bloodless, painless surgery and an impressive alternative for frenectomy operations. © 2015 Australian Dental Association.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ballbè, Montse; Catalan Network of Smoke-free Hospitals, L'Hospitalet de Llobregat, Barcelona; Cancer Prevention and Control Group, Institut d'Investigació Biomèdica de Bellvitge – IDIBELL, L’Hospitalet de Llobregat, Barcelona
Background: There is scarce evidence about passive exposure to the vapour released or exhaled from electronic cigarettes (e-cigarettes) under real conditions. The aim of this study is to characterise passive exposure to nicotine from e-cigarettes' vapour and conventional cigarettes' smoke at home among non-smokers under real-use conditions. Methods: We conducted an observational study with 54 non-smoker volunteers from different homes: 25 living at home with conventional smokers, 5 living with nicotine e-cigarette users, and 24 from control homes (using neither conventional cigarettes nor e-cigarettes). We measured airborne nicotine at home and biomarkers (cotinine in saliva and urine). We calculated geometric means (GM) and geometric standard deviations (GSD). We also performed ANOVA and Student's t tests for the log-transformed data. We used Bonferroni-corrected t-tests to control the family error rate for multiple comparisons at 5%. Results: The GMs of airborne nicotine were 0.74 μg/m³ (GSD=4.05) in the smokers' homes, 0.13 μg/m³ (GSD=2.4) in the e-cigarette users' homes, and 0.02 μg/m³ (GSD=3.51) in the control homes. The GMs of salivary cotinine were 0.38 ng/ml (GSD=2.34) in the smokers' homes, 0.19 ng/ml (GSD=2.17) in the e-cigarette users' homes, and 0.07 ng/ml (GSD=1.79) in the control homes. Salivary cotinine concentrations of the non-smokers exposed to e-cigarette vapour at home (all exposed ≥2 h/day) were statistically significantly different from those found in non-smokers exposed to second-hand smoke ≥2 h/day and in non-smokers from control homes. Conclusions: The airborne markers were statistically higher in conventional cigarette homes than in e-cigarette homes (5.7 times higher). However, concentrations of both biomarkers among non-smokers exposed to conventional cigarettes and e-cigarettes' vapour were statistically similar (only 2 and 1.4 times higher, respectively). The levels of airborne nicotine and cotinine concentrations in the homes with e-cigarette users were higher than in control homes (differences statistically significant). Our results show that non-smokers passively exposed to e-cigarettes absorb nicotine. - Highlights: • This is the first study of e-cigarette exposure at home under real-use conditions. • Airborne nicotine in homes with smokers was 5.7 times higher than in e-cig homes. • Cotinine of non-smokers exposed to e-cig and conventional cigarettes was similar. • Airborne nicotine in homes with e-cig users was higher than in control homes. • Cotinine of non-smokers exposed to e-cig users was higher than in those not exposed.
Demystifying the Enigma of Smoking – An Observational Comparative Study on Tobacco Smoking
Nallakunta, Rajesh; Reddy, Sudhakara Reddy; Chennoju, Sai Kiran
2016-01-01
Introduction Smoking is a hazardous habit which causes definite changes in the oral cavity, and the palatal mucosa is the first to be affected. The present study determines the palatal status in reverse smokers and conventional smokers. Aim To study and compare the clinical, cytological and histopathological changes in the palatal mucosa of reverse and conventional smokers. Materials and Methods The study sample was categorized into two groups: Group 1 comprised 20 subjects with the habit of reverse smoking and Group 2 comprised 20 subjects with the habit of conventional smoking. Initially, the clinical appearance of the palatal mucosa was recorded, followed by a cytological smear and a biopsy of the involved area in all subjects. The findings were studied clinically, and the specimens were analysed cytologically and histopathologically and compared between the two groups. Results The clinical changes of the palatal mucosa among reverse smokers were statistically significantly more severe than those of conventional smokers. There was no statistically significant difference in cytological staging between the groups (p-value of 0.35). The histopathological changes in the two groups showed a significant difference, with a p-value of 0.02. A significant positive correlation was observed between the clinical appearance and the cytological and histopathological changes. Conclusion Profoundly aggressive clinical changes were observed in Group 1 compared to Group 2. Dysplastic changes of notable severity were detected in a few subjects on histopathological examination even in the absence of prominent clinical and cytological changes in the two groups. PMID:27190962
Garoushi, Sufyan K.; Hatem, Marwa; Lassila, Lippo V. J.; Vallittu, Pekka K.
2015-01-01
Objectives: To determine the marginal microleakage of Class II restorations made with different composite base materials and the static load-bearing capacity of direct composite onlay restorations. Methods: Class II cavities were prepared in 40 extracted molars. They were divided into five groups (n = 8/group) depending on the composite base material used (everX Posterior, SDR, Tetric EvoFlow). After the Class II restorations were completed, specimens were sectioned mid-sagittally. For each group, sectioned restorations were immersed in dye, specimens were viewed under a stereo-microscope, and the percentage of cavity leakage was calculated. Ten groups of onlay restorations were fabricated (n = 8/group); groups were made with composite base materials (everX Posterior, SDR, Tetric EvoFlow, Gradia Direct LoFlo) and covered by a 1 mm layer of conventional (Tetric N-Ceram) or bulk-fill (Tetric EvoCeram Bulk Fill) composite. Groups made only from conventional, bulk-fill and short fiber composites were used as controls. Specimens were statically loaded until fracture. Data were analyzed using ANOVA (p = 0.05). Results: Microleakage of restorations made of plain conventional composite or with a short fiber composite base showed statistically (p < 0.05) lower values compared to the other groups. ANOVA revealed that onlay restorations made with short fiber-reinforced composite (FRC) as a base or as the entire restoration had statistically significantly higher load-bearing capacity (1593 N) (p < 0.05) than the other restorations. Conclusion: Restorations combining a base of short FRC and a surface layer of conventional composite displayed promising performance with respect to microleakage and load-bearing capacity. PMID:28642894
Deterministic annealing for density estimation by multivariate normal mixtures
NASA Astrophysics Data System (ADS)
Kloppenburg, Martin; Tavan, Paul
1997-03-01
An approach to maximum-likelihood density estimation by mixtures of multivariate normal distributions for large high-dimensional data sets is presented. Conventionally, that problem is tackled by notoriously unstable expectation-maximization (EM) algorithms. We remove these instabilities by the introduction of soft constraints, enabling deterministic annealing. Our developments are motivated by the proof that algorithmically stable fuzzy clustering methods derived from statistical physics analogs are special cases of EM procedures.
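A minimal sketch of the deterministic-annealing idea follows, for a 1-D Gaussian mixture: EM responsibilities are tempered by an inverse temperature beta that is annealed up to 1, which smooths the likelihood surface in early iterations. The schedule, data, and 1-D setting are illustrative assumptions, not the authors' algorithm, which additionally introduces soft constraints.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-2, 0.5, 300), rng.normal(3, 1.0, 700)])

K = 2
w = np.full(K, 1.0 / K)          # mixture weights
mu = rng.choice(x, K)            # component means (random init)
var = np.full(K, np.var(x))      # component variances

def log_gauss(x, mu, var):
    return -0.5 * (np.log(2 * np.pi * var) + (x[:, None] - mu) ** 2 / var)

for beta in (0.1, 0.3, 0.6, 1.0):     # annealing schedule (inverse temperature)
    for _ in range(50):                # EM sweeps at fixed temperature
        # E-step: tempered responsibilities
        log_r = beta * (np.log(w) + log_gauss(x, mu, var))
        log_r -= log_r.max(axis=1, keepdims=True)
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: standard weighted updates
        nk = r.sum(axis=0)
        w = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk

print("weights:", w.round(3), "means:", mu.round(3), "vars:", var.round(3))
```

At small beta the responsibilities are nearly uniform, so all components track the global data mean; as beta rises toward 1 the components split, which is what protects the procedure from the poor local optima that destabilize plain EM.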
Statistical Analysis of Hit/Miss Data (Preprint)
2012-07-01
HDBK-1823A, 2009). Other agencies and industries have also made use of this guidance (Gandossi et al., 2010) and (Drury et al., 2006). It should ... better accounting of false call rates such that the POD curve doesn't converge to 0 for small flaw sizes. The difficulty with conventional methods ... 2002. Drury, Ghylin, and Holness, Error Analysis and Threat Magnitude for Carry-on Bag Inspection, Proceedings of the Human Factors and Ergonomic
Web-based surveys as an alternative to traditional mail methods.
Fleming, Christopher M; Bowden, Mark
2009-01-01
Environmental economists have long used surveys to gather information about people's preferences. A recent innovation in survey methodology has been the advent of web-based surveys. While the Internet appears to offer a promising alternative to conventional survey administration modes, concerns exist over potential sampling biases associated with web-based surveys and the effect these may have on valuation estimates. This paper compares results obtained from a travel cost questionnaire of visitors to Fraser Island, Australia, that was conducted using two alternative survey administration modes: conventional mail and web-based. It is found that the response rates and socio-demographic make-up of respondents to the two survey modes are not statistically different. Moreover, both modes yield similar consumer surplus estimates.
The multiple decrement life table: a unifying framework for cause-of-death analysis in ecology.
Carey, James R
1989-01-01
The multiple decrement life table is used widely in the human actuarial literature and provides statistical expressions for mortality in three different forms: i) the life table from all causes-of-death combined; ii) the life table disaggregated into selected cause-of-death categories; and iii) the life table with particular causes and combinations of causes eliminated. The purpose of this paper is to introduce the multiple decrement life table to the ecological literature by applying the methods to published death-by-cause information on Rhagoletis pomonella. Interrelations between the current approach and conventional tools used in basic and applied ecology are discussed, including the conventional life table, Key Factor Analysis, and Abbott's Correction as used in toxicological bioassay.
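A hedged sketch of the three forms of the table follows, with invented daily death-by-cause counts (not the published R. pomonella data) and a simplified independent-competing-risks assumption for the cause-eliminated form:

```python
import numpy as np

# deaths[d, c] = deaths on day d from cause c (predation, parasitism, other)
deaths = np.array([[10,  5, 2],
                   [15, 10, 4],
                   [20, 12, 6],
                   [12,  8, 9]], dtype=float)
n0 = 200.0                                   # initial cohort size

# (i) all-cause life table: survivors alive at the start of each day
all_cause = deaths.sum(axis=1)
lx = n0 - np.concatenate(([0.0], np.cumsum(all_cause)))[:-1]

# (ii) disaggregation by cause: cause-specific daily mortality rates
qx_cause = deaths / lx[:, None]

# (iii) cause-eliminated survivorship (eliminate cause 0), assuming the
# remaining causes keep their cause-specific rates (independent risks)
q_minus = (all_cause - deaths[:, 0]) / lx
l_elim = n0 * np.cumprod(np.concatenate(([1.0], 1.0 - q_minus)))[:-1]

print("all-cause survivors at day start:", lx)
print("cause-specific daily rates:\n", qx_cause.round(3))
print("survivors with cause 0 eliminated:", l_elim.round(1))
```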
Task-based data-acquisition optimization for sparse image reconstruction systems
NASA Astrophysics Data System (ADS)
Chen, Yujia; Lou, Yang; Kupinski, Matthew A.; Anastasio, Mark A.
2017-03-01
Conventional wisdom dictates that imaging hardware should be optimized by use of an ideal observer (IO) that exploits full statistical knowledge of the class of objects to be imaged, without consideration of the reconstruction method to be employed. However, accurate and tractable models of the complete object statistics are often difficult to determine in practice. Moreover, in imaging systems that employ compressive sensing concepts, imaging hardware and (sparse) image reconstruction are innately coupled technologies. We have previously proposed a sparsity-driven ideal observer (SDIO) that can be employed to optimize hardware by use of a stochastic object model that describes object sparsity. The SDIO and sparse reconstruction method can therefore be "matched" in the sense that they both utilize the same statistical information regarding the class of objects to be imaged. To efficiently compute SDIO performance, the posterior distribution is estimated by use of computational tools developed recently for variational Bayesian inference. Subsequently, the SDIO test statistic can be computed semi-analytically. The advantages of employing the SDIO instead of a Hotelling observer are systematically demonstrated in case studies in which magnetic resonance imaging (MRI) data acquisition schemes are optimized for signal detection tasks.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson, William; Krakowiak, Konrad J.; Ulm, Franz-Josef, E-mail: ulm@mit.edu
2014-01-15
According to recent developments in cement clinker engineering, the optimization of chemical substitutions in the main clinker phases offers a promising approach to improve both reactivity and grindability of clinkers. Thus, monitoring the chemistry of the phases may become part of the quality control at the cement plants, along with the usual measurements of the abundance of the mineralogical phases (quantitative X-ray diffraction) and the bulk chemistry (X-ray fluorescence). This paper presents a new method to assess these three complementary quantities with a single experiment. The method is based on electron microprobe spot analyses, performed over a grid located on a representative surface of the sample and interpreted with advanced statistical tools. This paper describes the method and the experimental program performed on industrial clinkers to establish the accuracy in comparison to conventional methods. -- Highlights: •A new method of clinker characterization •Combination of electron probe technique with cluster analysis •Simultaneous assessment of phase abundance, composition and bulk chemistry •Experimental validation performed on industrial clinkers.
Multi-template tensor-based morphometry: Application to analysis of Alzheimer's disease
Koikkalainen, Juha; Lötjönen, Jyrki; Thurfjell, Lennart; Rueckert, Daniel; Waldemar, Gunhild; Soininen, Hilkka
2012-01-01
In this paper methods for using multiple templates in tensor-based morphometry (TBM) are presented and compared to the conventional single-template approach. TBM analysis requires non-rigid registrations which are often subject to registration errors. When using multiple templates and, therefore, multiple registrations, it can be assumed that the registration errors are averaged and eventually compensated. Four different methods are proposed for multi-template TBM. The methods were evaluated using magnetic resonance (MR) images of healthy controls, patients with stable or progressive mild cognitive impairment (MCI), and patients with Alzheimer's disease (AD) from the ADNI database (N=772). The performance of TBM features in classifying images was evaluated both quantitatively and qualitatively. Classification results show that the multi-template methods are statistically significantly better than the single-template method. The overall classification accuracy was 86.0% for the classification of control and AD subjects, and 72.1% for the classification of stable and progressive MCI subjects. The statistical group-level difference maps produced using multi-template TBM were smoother, formed larger continuous regions, and had larger t-values than the maps obtained with single-template TBM. PMID:21419228
NASA Astrophysics Data System (ADS)
Zhang, Zhu; Li, Hongbin; Tang, Dengping; Hu, Chen; Jiao, Yang
2017-10-01
Metering performance is the key parameter of an electronic voltage transformer (EVT), and it requires high accuracy. The conventional off-line calibration method using a standard voltage transformer is not suitable for the key equipment in a smart substation, which needs on-line monitoring. In this article, we propose a method for monitoring the metering performance of an EVT on-line based on cyber-physical correlation analysis. Exploiting the electrical and physical property that a substation runs in three-phase symmetry, the principal component analysis method is used to separate the metering deviation caused by primary-side fluctuation from that caused by an EVT anomaly. The characteristic statistics of the measured data during operation are extracted, and the metering performance of the EVT is evaluated by analyzing the change in these statistics. The experimental results show that the method accurately monitors the metering deviation of a Class 0.2 EVT. The method demonstrates accurate on-line evaluation of the metering performance of an EVT without a standard voltage transformer.
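A hedged sketch of the separation idea: under three-phase symmetry, a primary-side fluctuation moves all three EVT readings together and is captured by the first principal component, while a single transformer's metering drift shows up in the residual. The signal model and all numbers below are invented for illustration and are not the authors' algorithm.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 2000
primary = 1.0 + 0.01 * rng.standard_normal(n)      # shared bus fluctuation
noise = 1e-4 * rng.standard_normal((n, 3))
readings = np.outer(primary, np.ones(3)) + noise   # three EVT outputs (p.u.)

drift = np.linspace(0.0, 5e-3, n)                  # slow gain drift, phase A
readings[:, 0] += drift

X = readings - readings.mean(axis=0)
_, s, Vt = np.linalg.svd(X, full_matrices=False)

# The first PC captures the symmetric (common-mode) fluctuation; project it
# out and monitor residual statistics per phase to flag the drifting EVT.
residual = X - np.outer(X @ Vt[0], Vt[0])
print("residual std by phase:", residual.std(axis=0))
```

Running this, the residual standard deviation of phase A stands out by roughly an order of magnitude, which is the kind of characteristic statistic the abstract describes tracking over time.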
Yang, Hyeri; Na, Jihye; Jang, Won-Hee; Jung, Mi-Sook; Jeon, Jun-Young; Heo, Yong; Yeo, Kyung-Wook; Jo, Ji-Hoon; Lim, Kyung-Min; Bae, SeungJin
2015-05-05
The mouse local lymph node assay (LLNA, OECD TG429) is an alternative test replacing conventional guinea pig tests (OECD TG406) for skin sensitization testing, but its use of a radioisotopic agent, (3)H-thymidine, deters its active dissemination. A new non-radioisotopic LLNA, LLNA:BrdU-FCM, employs a non-radioisotopic analog, 5-bromo-2'-deoxyuridine (BrdU), and flow cytometry. For an analogous method, the OECD TG429 performance standard (PS) advises that two reference compounds be tested repeatedly and that the ECt (threshold) values obtained fall within acceptable ranges to prove within- and between-laboratory reproducibility. However, these criteria are somewhat arbitrary, and the sample size for ECt is less than 5, raising concerns about insufficient reliability. Here, we explored various statistical methods to evaluate the reproducibility of LLNA:BrdU-FCM with the stimulation index (SI), the raw data for ECt calculation, produced by 3 laboratories. Descriptive statistics along with graphical representations of SI were presented. For inferential statistics, parametric and non-parametric methods were applied to test the reproducibility of the SI of a concurrent positive control, and the robustness of the results was investigated. Descriptive statistics and graphical representation of SI alone could illustrate the within- and between-laboratory reproducibility. Inferential statistics employing parametric and nonparametric methods drew similar conclusions. While all labs passed the within- and between-laboratory reproducibility criteria given by the OECD TG429 PS based on ECt values, statistical evaluation based on SI values showed that only two labs succeeded in achieving within-laboratory reproducibility. For the two labs that satisfied within-lab reproducibility, between-laboratory reproducibility could also be attained based on inferential as well as descriptive statistics. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
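The kind of analysis described can be sketched as follows: SI values of a concurrent positive control from three labs are compared with a parametric one-way ANOVA and its nonparametric counterpart, the Kruskal-Wallis test. The SI values below are invented for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
lab1 = rng.normal(3.2, 0.5, 6)   # SI of concurrent positive control, lab 1
lab2 = rng.normal(3.4, 0.6, 6)   # lab 2
lab3 = rng.normal(4.1, 0.7, 6)   # lab 3

f, p_anova = stats.f_oneway(lab1, lab2, lab3)        # parametric
h, p_kw = stats.kruskal(lab1, lab2, lab3)            # nonparametric
print(f"ANOVA p = {p_anova:.3f}; Kruskal-Wallis p = {p_kw:.3f}")

# Concordant conclusions from the two families of tests are what supports
# the robustness claim made in the abstract.
```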
Simulation methods to estimate design power: an overview for applied research.
Arnold, Benjamin F; Hogan, Daniel R; Colford, John M; Hubbard, Alan E
2011-06-20
Estimating the required sample size and statistical power for a study is an integral part of study design. For standard designs, power equations provide an efficient solution to the problem, but they are unavailable for many complex study designs that arise in practice. For such complex study designs, computer simulation is a useful alternative for estimating study power. Although this approach is well known among statisticians, in our experience many epidemiologists and social scientists are unfamiliar with the technique. This article aims to address this knowledge gap. We review an approach to estimate study power for individual- or cluster-randomized designs using computer simulation. This flexible approach arises naturally from the model used to derive conventional power equations, but extends those methods to accommodate arbitrarily complex designs. The method is universally applicable to a broad range of designs and outcomes, and we present the material in a way that is approachable for quantitative, applied researchers. We illustrate the method using two examples (one simple, one complex) based on sanitation and nutritional interventions to improve child growth. We first show how simulation reproduces conventional power estimates for simple randomized designs over a broad range of sample scenarios to familiarize the reader with the approach. We then demonstrate how to extend the simulation approach to more complex designs. Finally, we discuss extensions to the examples in the article, and provide computer code to efficiently run the example simulations in both R and Stata. Simulation methods offer a flexible option to estimate statistical power for standard and non-traditional study designs and parameters of interest. The approach we have described is universally applicable for evaluating study designs used in epidemiologic and social science research.
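A minimal sketch of the approach, written in Python rather than the R/Stata code supplied with the article: simulate a simple two-arm individually randomized design many times and estimate power as the rejection rate at alpha = 0.05. The effect size, SD, and sample size are illustrative.

```python
import numpy as np
from scipy import stats

def simulated_power(n_per_arm, effect, sd, n_sims=2000, alpha=0.05, seed=0):
    """Monte Carlo power estimate for a two-arm t-test design."""
    rng = np.random.default_rng(seed)
    rejections = 0
    for _ in range(n_sims):
        control = rng.normal(0.0, sd, n_per_arm)
        treated = rng.normal(effect, sd, n_per_arm)
        _, p = stats.ttest_ind(control, treated)
        rejections += p < alpha
    return rejections / n_sims

# For this standard design the result should approximate the closed-form
# power (about 0.80 for d = 0.5 with 64 per arm), which is the sanity check
# the article recommends before extending the simulation to complex designs.
print(simulated_power(n_per_arm=64, effect=0.5, sd=1.0))
```

The payoff of the simulation route is that the data-generating function and the analysis step can be swapped for arbitrarily complex ones (clustering, attrition, covariate adjustment) without needing a new power formula.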
Quantitative analysis of tympanic membrane perforation: a simple and reliable method.
Ibekwe, T S; Adeosun, A A; Nwaorgu, O G
2009-01-01
Accurate assessment of the features of tympanic membrane perforation, especially size, site, duration and aetiology, is important, as it enables optimum management. To describe a simple, cheap and effective method of quantitatively analysing tympanic membrane perforations. The system described comprises a video-otoscope (capable of generating still and video images of the tympanic membrane), adapted via a universal serial bus box to a computer screen, with images analysed using the Image J geometrical analysis software package. The reproducibility of results and their correlation with conventional otoscopic methods of estimation were tested statistically with the paired t-test and correlational tests, using the Statistical Package for the Social Sciences version 11 software. The following equation was generated: P/T × 100 per cent = percentage perforation, where P is the area (in pixels²) of the tympanic membrane perforation and T is the total area (in pixels²) of the entire tympanic membrane (including the perforation). Illustrations are shown. Comparison of blinded data on tympanic membrane perforation area obtained independently from assessments by two trained otologists, of comparable years of experience, using the video-otoscopy system described, showed similar findings, with strong correlations devoid of inter-observer error (p = 0.000, r = 1). Comparison with conventional otoscopic assessment also indicated significant correlation, comparing results for the two trained otologists, but some inter-observer variation was present (p = 0.000, r = 0.896). Correlation between the two methods for each of the otologists was also highly significant (p = 0.000). A computer-adapted video-otoscope, with images analysed by Image J software, represents a cheap, reliable, technology-driven, clinical method of quantitative analysis of tympanic membrane perforations and injuries.
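The quoted equation is simple enough to state as code; the pixel areas below are example values only, as they would be read off Image J measurements.

```python
def percentage_perforation(p_pixels: float, t_pixels: float) -> float:
    """P/T x 100 per cent, where P is the perforation area and T the total
    tympanic membrane area (both in pixels squared, perforation included)."""
    return p_pixels / t_pixels * 100.0

# Hypothetical Image J readings: 12,500 px^2 perforation, 98,000 px^2 total
print(f"{percentage_perforation(12_500, 98_000):.1f}% perforation")
```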
Impact of virtual microscopy with conventional microscopy on student learning in dental histology.
Hande, Alka Harish; Lohe, Vidya K; Chaudhary, Minal S; Gawande, Madhuri N; Patil, Swati K; Zade, Prajakta R
2017-01-01
In dental histology, the assimilation of histological features of different dental hard and soft tissues is traditionally done by conventional microscopy. This traditional method of learning prevents the students from screening the entire slide and changing magnification. To address these drawbacks, modifications of conventional microscopy have evolved and become the motivation for changing the learning tool. Virtual microscopy is a technique in which the microscopic glass slide is completely digitized and can be analyzed on a computer. This research was designed to evaluate the effectiveness of virtual microscopy compared with conventional microscopy on student learning in dental histology. A cohort of 105 students was included and randomized into three groups: A, B, and C. Group A students studied the microscopic features of oral histologic lesions by conventional microscopy, Group B by virtual microscopy, and Group C by both conventional and virtual microscopy. The students' understanding of the subject was evaluated by a prepared questionnaire. The effectiveness of the study designs on knowledge gains and satisfaction levels was assessed by statistical assessment of differences in mean test scores. The difference in scores between Groups A, B, and C from pre- to post-test was highly significant. This enhanced understanding of the subject may be due to the benefits of using virtual microscopy in teaching histology. Augmenting conventional microscopy with virtual microscopy enhances understanding of the subject compared to the use of conventional microscopy or virtual microscopy alone.
SADEGHI, ROYA; SEDAGHAT, MOHAMMAD MEHDI; SHA AHMADI, FARAMARZ
2014-01-01
Introduction: Blended learning, a new approach in educational planning, is defined as applying more than one method, strategy, technique or medium in education. Today, due to the development of Internet infrastructure and the access of most students to it, the Internet can be utilized along with traditional and conventional training methods. The aim of this study was to compare students' learning and satisfaction under a combination of lecture and e-learning versus conventional lecture methods. Methods: This quasi-experimental study was conducted among sophomore students of the Public Health School, Tehran University of Medical Sciences, in 2012-2013. Four classes of the school were randomly selected and divided into two groups. Education in two classes (45 students) was in the form of the lecture method, and in the other two classes (48 students) it was a blended method combining e-learning and lectures. The students' knowledge about tuberculosis in the two groups was measured using a pre- and post-test. This step was done by sending self-reported electronic questionnaires to the students' email addresses through the Google Document software. At the end of the educational program, students' satisfaction with and comments about the two methods were also collected by questionnaire. Statistical analyses including descriptive methods, paired t-test, independent t-test and ANOVA were done in SPSS 14, and p≤0.05 was considered a significant difference. Results: The mean scores of the lecture and blended groups were 13.18±1.37 and 13.35±1.36, respectively; the difference between the pre-test scores of the two groups was not statistically significant (p=0.535). Knowledge scores increased in both groups after training, and the mean and standard deviation of knowledge scores of the lecture and blended groups were 16.51±0.69 and 16.18±1.06, respectively. The difference between the post-test scores of the two groups was not statistically significant (p=0.112). Students' satisfaction with the blended learning method was higher than with the lecture method. Conclusion: The results revealed that the blended method is effective in increasing the students' learning rate. E-learning can be used to teach some courses and may be advantageous from an economic perspective. Since the majority of students at the country's universities of medical sciences have access to the Internet and an email address, e-learning could be used as a supplement to traditional teaching methods, or sometimes as an alternative, given that this method of teaching increases students' knowledge, satisfaction and attention. PMID:25512938
Horng, Huann-Cheng; Yuan, Chiou-Chung; Chao, Kuan-Chong; Cheng, Ming-Huei; Wang, Peng-Hui
2007-06-01
To evaluate the efficacy and acceptability of the Port-A-Cath (PAC) insertion method with (conventional method, group II) and without (modified method, group I) the aid of intraoperative fluoroscopy or other localizing devices. A total of 158 women with various kinds of gynecological cancers warranting PAC insertion (n = 86 in group I and n = 72 in group II) were evaluated. Data for analysis included patient age, main disease, dislocation site, surgical time, complications, and catheter outcome. There was no statistical difference between the two groups in terms of age, main disease, complications, or catheter patency. However, the rate of appropriate positioning in the superior vena cava (SVC) (100% in group I and 82% in group II) differed significantly between the two groups (P = 0.001). In addition, the surgical time in group I was statistically shorter than that in group II (P < 0.001). The modified method for inserting the PAC offered the following benefits: avoiding X-ray exposure for both the operator and the patient, defining the appropriate position in the SVC, and less surgical time. (c) 2007 Wiley-Liss, Inc.
Gurelik, Gokhan; Hasanreisoglu, Berati
2012-01-01
Purpose To compare the efficacy and safety of 23-gauge transconjunctival vitrectomy with the conventional 20-gauge method in idiopathic epiretinal membrane and macular hole surgery. Methods Sixty-one consecutive patients undergoing vitrectomy for idiopathic epiretinal membrane or macular hole were recruited to either the 20- or the 23-gauge vitrectomy group and prospectively evaluated. Surgical success rates, operating time, surgery-related complications, long-term visual outcomes, and postoperative ocular surface problems were compared between the two groups. Results There were 31 eyes in the 20-gauge group and 33 eyes in the 23-gauge group. The macular hole closure rate after the first surgery was 83% and 90.9% in the 20-gauge and 23-gauge groups, respectively, with no significant difference between groups (p = 0.59). The success rate for idiopathic epiretinal membrane cases was 100% in both groups. There was no statistically significant difference between overall surgical times (p = 0.90). None of the patients in either group experienced the postoperative complications of severe postoperative hypotony, vitreous hemorrhage or endophthalmitis, except one eye in the 20-gauge group, which was found to have retinal detachment. In both groups, statistically significant improvement in visual acuity was achieved 1 month postoperatively (p = 0.002) and thereafter at all postoperative visits (p < 0.05). The mean ocular surface scores were significantly lower in the 23-gauge group at all postoperative visits compared with the 20-gauge group scores (p = 0.001). Conclusions Transconjunctival 23-gauge vitrectomy appears to be as effective and safe as conventional 20-gauge vitrectomy in idiopathic epiretinal membrane and macular hole surgeries. PMID:23060720
A Non-Stationary Approach for Estimating Future Hydroclimatic Extremes Using Monte-Carlo Simulation
NASA Astrophysics Data System (ADS)
Byun, K.; Hamlet, A. F.
2017-12-01
There is substantial evidence that observed hydrologic extremes (e.g. floods, extreme stormwater events, and low flows) are changing and that climate change will continue to alter the probability distributions of hydrologic extremes over time. These non-stationary risks imply that conventional approaches for designing hydrologic infrastructure (or making other climate-sensitive decisions) based on retrospective analysis and stationary statistics will become increasingly problematic through time. To develop a framework for assessing risks in a non-stationary environment, our study develops a new approach using a super ensemble of simulated hydrologic extremes based on Monte Carlo (MC) methods. Specifically, using statistically downscaled future GCM projections from the CMIP5 archive (using the Hybrid Delta (HD) method), we extract daily precipitation (P) and temperature (T) at 1/16 degree resolution based on a group of moving 30-yr windows within a given design lifespan (e.g. 10, 25, 50-yr). Using these T and P scenarios we simulate daily streamflow with the Variable Infiltration Capacity (VIC) model for each year of the design lifespan and fit a Generalized Extreme Value (GEV) probability distribution to the simulated annual extremes. MC experiments are then used to construct a random series of 10,000 realizations of the design lifespan, sampling annual extremes from the estimated GEV parameters unique to each individual year of the design lifespan. Our preliminary results for two watersheds in the Midwest show that there are considerable differences between the extreme values for a given percentile under the conventional MC and the non-stationary MC approaches. Design standards based on our non-stationary approach are also directly dependent on the design lifespan of the infrastructure, a sensitivity which is notably absent from conventional approaches based on retrospective analysis. The experimental approach can be applied to a wide range of hydroclimatic variables of interest.
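A hedged sketch of the super-ensemble construction (all parameter values invented): each year of a design lifespan gets its own GEV distribution for the annual extreme, one extreme is drawn per year per realization, and the ensemble of realizations characterizes the non-stationary lifetime maximum.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(11)
lifespan = 50                           # design lifespan in years
# Hypothetical trend: location and scale of the annual maximum grow in time
locs   = np.linspace(100.0, 130.0, lifespan)
scales = np.linspace(20.0, 26.0, lifespan)
shape  = -0.1                           # scipy's c (c < 0: heavy upper tail)

n_real = 10_000
annual = np.empty((n_real, lifespan))
for y in range(lifespan):
    # year y has its own fitted GEV; draw one annual extreme per realization
    annual[:, y] = genextreme.rvs(shape, loc=locs[y], scale=scales[y],
                                  size=n_real, random_state=rng)

lifetime_max = annual.max(axis=1)
print("99th percentile of lifetime maximum:",
      np.percentile(lifetime_max, 99).round(1))
```

A stationary analysis would reuse year 1's parameters for all 50 years; rerunning the sketch that way shows directly how the design quantile shifts, and why the result depends on the chosen design lifespan.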
Taguchi optimization of bismuth-telluride based thermoelectric cooler
NASA Astrophysics Data System (ADS)
Anant Kishore, Ravi; Kumar, Prashant; Sanghadasa, Mohan; Priya, Shashank
2017-07-01
In the last few decades, considerable effort has been made to enhance the figure-of-merit (ZT) of thermoelectric (TE) materials. However, the performance of commercial TE devices still remains low due to the fact that the module figure-of-merit depends not only on the material ZT, but also on the operating conditions and configuration of the TE modules. This study takes into account a comprehensive set of parameters to conduct a numerical performance analysis of the thermoelectric cooler (TEC) using the Taguchi optimization method. The Taguchi method is a statistical tool that predicts the optimal performance with far fewer experimental runs than conventional experimental techniques. The Taguchi results are also compared with the optimized parameters obtained by a full factorial optimization method, which reveals that the Taguchi method provides an optimum or near-optimum TEC configuration using only 25 experiments against the 3125 experiments needed by the conventional optimization method. This study also shows that environmental factors such as ambient temperature and cooling coefficient do not significantly affect the optimum geometry and optimum operating temperature of TECs. The optimum TEC configuration for simultaneous optimization of cooling capacity and coefficient of performance is also provided.
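For the analysis step of the Taguchi method, a toy sketch follows: each run of an orthogonal array yields a signal-to-noise (S/N) ratio, and averaging S/N per factor level identifies the best level of each factor. An L9 array with a larger-is-better response is used here for brevity; the study above used a larger design (25 runs), and all response values are invented.

```python
import numpy as np

# L9 orthogonal array: 9 runs, 3 factors at 3 levels each (levels 0..2)
L9 = np.array([[0, 0, 0], [0, 1, 1], [0, 2, 2],
               [1, 0, 1], [1, 1, 2], [1, 2, 0],
               [2, 0, 2], [2, 1, 0], [2, 2, 1]])
# Hypothetical response per run, e.g. cooling capacity in watts
response = np.array([3.1, 3.8, 4.0, 3.5, 4.4, 3.2, 4.1, 3.0, 3.9])

# Larger-is-better S/N ratio (single observation per run)
sn = -10 * np.log10(1.0 / response**2)

for f in range(L9.shape[1]):
    means = [sn[L9[:, f] == lvl].mean() for lvl in range(3)]
    best = int(np.argmax(means))
    print(f"factor {f}: mean S/N per level = {np.round(means, 2)}, "
          f"best level = {best}")
```

The orthogonality of the array is what lets 9 (or 25) runs stand in for the full factorial: each level of each factor appears equally often against the levels of every other factor, so the per-level S/N averages are balanced comparisons.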
Generalized Reich-Moore R-matrix approximation
NASA Astrophysics Data System (ADS)
Arbanas, Goran; Sobes, Vladimir; Holcomb, Andrew; Ducru, Pablo; Pigni, Marco; Wiarda, Dorothea
2017-09-01
A conventional Reich-Moore approximation (RMA) of R-matrix theory is generalized into a manifestly unitary form by introducing a set of resonant capture channels treated explicitly in a generalized, reduced R-matrix. The dramatic reduction of channel space witnessed in conventional RMA, from the Nc × Nc full R-matrix to an Np × Np reduced R-matrix, where Nc = Np + Nγ, with Np and Nγ denoting the number of particle and γ-ray channels, respectively, is due to Np ≪ Nγ. The corresponding reduction of channel space in generalized RMA (GRMA) is from the Nc × Nc full R-matrix to Ñ × Ñ, where Ñ = Np + Ñγ, and where Ñγ is the number of capture channels defined in GRMA. We show that Ñγ = Nλ, where Nλ is the number of R-matrix levels. This reduction in channel space, although not as dramatic as in conventional RMA, could be significant for medium and heavy nuclides, where Nλ ≪ Nγ. The resonant capture channels defined by GRMA accommodate the level-level interference (via capture channels) neglected in conventional RMA. The expression for the total capture cross section in GRMA is formally equal to that of the full Nc × Nc R-matrix. This suggests that GRMA could yield improved nuclear data evaluations in the resolved resonance range at the cost of introducing Nλ(Nλ − 1)/2 resonant capture width parameters relative to conventional RMA. The manifest unitarity of GRMA justifies a method advocated by Fröhner and implemented in the SAMMY nuclear data evaluation code for enforcing unitarity of conventional RMA. Capture widths of GRMA are exactly convertible into alternative R-matrix parameters via the Brune transform. Application of idealized statistical methods to GRMA shows that the variance among conventional RMA capture widths in extant RMA evaluations could be used to estimate the variance among the off-diagonal elements neglected by conventional RMA. Significant departure of capture widths from an idealized distribution may indicate the presence of underlying doorway states.
Qiu, Jin; Cheng, Jiajing; Wang, Qingying; Hua, Jie
2014-01-01
Background The aim of this study was to compare the effects of the levonorgestrel-releasing intrauterine system (LNG-IUS) with conventional medical treatment in reducing heavy menstrual bleeding. Material/Methods Relevant studies were identified by a search of MEDLINE, EMBASE, the Cochrane Central Register of Controlled Trials, and clinical trials registries (from inception to April 2014). Randomized controlled trials comparing the LNG-IUS with conventional medical treatment (mefenamic acid, tranexamic acid, norethindrone, medroxyprogesterone acetate injection, or combined oral contraceptive pills) in patients with menorrhagia were included. Results Eight randomized controlled trials that included 1170 women (LNG-IUS, n=562; conventional medical treatment, n=608) met inclusion criteria. The LNG-IUS was superior to conventional medical treatment in reducing menstrual blood loss (as measured by the alkaline hematin method or estimated by pictorial bleeding assessment chart scores). More women were satisfied with the LNG-IUS than with the use of conventional medical treatment (odds ratio [OR] 5.19, 95% confidence interval [CI] 2.73–9.86). Compared with conventional medical treatment, the LNG-IUS was associated with a lower rate of discontinuation (14.6% vs. 28.9%, OR 0.39, 95% CI 0.20–0.74) and fewer treatment failures (9.2% vs. 31.0%, OR 0.18, 95% CI 0.10–0.34). Furthermore, quality of life assessment favored LNG-IUS over conventional medical treatment, although use of various measurements limited our ability to pool the data for more powerful evidence. Serious adverse events were statistically comparable between treatments. Conclusions The LNG-IUS was the more effective first choice for management of menorrhagia compared with conventional medical treatment. Long-term, randomized trials are required to further investigate patient-based outcomes and evaluate the cost-effectiveness of the LNG-IUS and other medical treatments. PMID:25245843
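As a hedged illustration of the pooling behind summary statements such as "OR 5.19, 95% CI 2.73-9.86", the sketch below performs a fixed-effect inverse-variance meta-analysis of log odds ratios from invented 2x2 tables; the review's actual models may differ (e.g., random effects).

```python
import numpy as np

# (events_trt, n_trt, events_ctl, n_ctl) for each hypothetical trial
trials = [(40, 60, 25, 62), (55, 70, 30, 65), (33, 50, 20, 55)]

log_ors, weights = [], []
for a, n1, c, n2 in trials:
    b, d = n1 - a, n2 - c
    log_or = np.log(a * d / (b * c))
    var = 1 / a + 1 / b + 1 / c + 1 / d   # Woolf variance of the log OR
    log_ors.append(log_or)
    weights.append(1.0 / var)

log_ors, weights = np.array(log_ors), np.array(weights)
pooled = (weights * log_ors).sum() / weights.sum()
se = np.sqrt(1.0 / weights.sum())
ci = np.exp([pooled - 1.96 * se, pooled + 1.96 * se])
print(f"pooled OR = {np.exp(pooled):.2f}, 95% CI {ci.round(2)}")
```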
Quinn, Frank; Keogh, Paul; McDonald, Ailbhe; Hussey, David
2003-02-01
The use of virtual reality (VR) in the training of operative dentistry is a recent innovation, and little research has been published on its efficacy compared to conventional training methods. To evaluate possible benefits, junior undergraduate dental students were randomly assigned to one of three groups: group 1 was taught by conventional means only; group 2 was trained by conventional means combined with VR repetition and reinforcement (with access to a human instructor for operative advice); and group 3 was trained by conventional means combined with VR repetition and reinforcement, but without instructor evaluation/advice, which was supplied only via the VR-associated software. At the end of the research period, all groups executed two class 1 preparations that were evaluated blindly by 'expert' trainers against traditional criteria (outline, retention, smoothness, depth, wall angulation and cavity margin index). Analyses of the resulting scores indicated a lack of significant differences between the three groups, except for scores in the category of 'outline form', where group 2 produced significantly lower (i.e. better) scores than the conventionally trained group. A statistical comparison between the scores of two 'expert' examiners indicated a lack of agreement, despite identical written and visual criteria being used for evaluation by both; both examiners, however, generally showed similar trends in evaluation. An anonymous questionnaire suggested that students recognized the benefits of VR training (e.g. ready access to assessment, error identification and how errors can be corrected), but the majority (95%) felt that it would not replace conventional training methods, although participants recognized the potential for development of VR systems in dentistry. The most common reasons cited for the preference for conventional training were excessive critical feedback (55%), lack of personal contact (50%) and technical hardware difficulties (20%) associated with VR-based training.
Kaleli, Necati; Saraç, Duygu
2017-07-01
Most studies evaluating dental laser sintering systems have focused on the marginal accuracy of the restorations. However, the bond strength at the metal-ceramic interface is another important factor that affects the survival of restorations, and currently, few studies focus on this aspect. The purpose of this in vitro study was to compare the porcelain bond strength of cobalt-chromium (Co-Cr) metal frameworks prepared by using the conventional lost-wax technique, milling, direct metal laser sintering (DMLS), and laser cusing, a direct process powder-bed system. A total of 96 metal frameworks (n=24 in each group) were prepared by using conventional lost-wax (group C), milling (group M), DMLS (group LS), and direct process powder-bed (group LC) methods according to International Organization for Standardization standard ISO 9693-1. After porcelain application, a 3-point bend test was applied to each specimen by using a universal testing machine. Data were statistically analyzed using 1-way ANOVA and Tukey honest significant difference tests (α=.05). Failure types at the metal-ceramic interfaces were examined using stereomicroscopy. Additionally, 1 specimen from each group was prepared for scanning electron microscopy analysis to evaluate the surface topography of metal frameworks. The mean bond strength was 38.08 ±3.82 MPa for group C, 39.29 ±3.51 MPa for group M, 40.73 ±3.58 MPa for group LS, and 41.24 ±3.75 MPa for group LC. Statistically significant differences were observed among the 4 groups (P=.016). All groups, except for LS, exhibited adhesive and mixed type bond failure. Both of the laser sintering methods were found to be successful in terms of metal-ceramic bond strength. Copyright © 2016 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
Simultaneous 3D MR elastography of the in vivo mouse brain
NASA Astrophysics Data System (ADS)
Kearney, Steven P.; Majumdar, Shreyan; Royston, Thomas J.; Klatt, Dieter
2017-10-01
The feasibility of sample interval modulation (SLIM) magnetic resonance elastography (MRE) for the in vivo mouse brain is assessed, and an alternative SLIM-MRE encoding method is introduced. In SLIM-MRE, the phase accumulation for each motion direction is encoded simultaneously by varying either the start time of the motion encoding gradient (MEG), SLIM-phase constant (SLIM-PC), or the initial phase of the MEG, SLIM-phase varying (SLIM-PV). SLIM-PC provides gradient moment nulling, but the mutual gradient shift necessitates an increased echo time (TE). SLIM-PV requires no increased TE, but exhibits non-uniform flow compensation. Comparison was made to conventional MRE using six C57BL/6 mice. For SLIM-PC, the Spearman's rank correlations to conventional MRE for the shear storage and loss modulus images were 80% and 76%, respectively; likewise for SLIM-PV, 73% and 69%, respectively. The results of the Wilcoxon rank sum test showed no statistically significant differences between the spatially averaged shear moduli derived from conventional-MRE, SLIM-PC, and SLIM-PV acquisitions. Both SLIM approaches were comparable to conventional MRE scans, with Spearman's rank correlations of 69%-80% and a threefold reduction in scan time. The SLIM-PC method had the best correlation, and SLIM-PV may be a useful tool in experimental conditions where both measurement time and T2 relaxation are critical.
Ultrasonography with color Doppler and power Doppler in the diagnosis of periapical lesions
Goel, Sumit; Nagendrareddy, Suma Gundareddy; Raju, Manthena Srinivasa; Krishnojirao, Dayashankara Rao Jingade; Rastogi, Rajul; Mohan, Ravi Prakash Sasankoti; Gupta, Swati
2011-01-01
Aim: To evaluate the efficacy of ultrasonography (USG) with color Doppler and power Doppler applications over conventional radiography in the diagnosis of periapical lesions. Materials and Methods: Thirty patients having inflammatory periapical lesions of the maxillary or mandibular anterior teeth and requiring endodontic surgery were selected for inclusion in this study. All patients consented to participate in the study. We used conventional periapical radiographs as well as USG with color Doppler and power Doppler for the diagnosis of these lesions. Their diagnostic performances were compared against histopathologic examination. All data were compared and statistically analyzed. Results: USG examination with color Doppler and power Doppler identified 29 (19 cysts and 10 granulomas) of 30 periapical lesions accurately, with a sensitivity of 100% for cysts and 90.91% for granulomas and a specificity of 90.91% for cysts and 100% for granulomas. In comparison, conventional intraoral radiography identified only 21 lesions (sensitivity of 78.9% for cysts and 45.4% for granulomas and specificity of 45.4% for cysts and 78.9% for granulomas). There was a definite correlation between the echotexture of the lesions and the histopathological features, except in one case. Conclusions: USG imaging with color Doppler and power Doppler is superior to conventional intraoral radiographic methods for diagnosing the nature of periapical lesions in the anterior jaws. This study reveals the potential of USG examination in the study of other jaw lesions. PMID:22223940
Statistical framework and noise sensitivity of the amplitude radial correlation contrast method.
Kipervaser, Zeev Gideon; Pelled, Galit; Goelman, Gadi
2007-09-01
A statistical framework for the amplitude radial correlation contrast (RCC) method, which integrates a conventional pixel threshold approach with cluster-size statistics, is presented. The RCC method uses functional MRI (fMRI) data to group neighboring voxels in terms of their degree of temporal cross correlation and compares coherences in different brain states (e.g., stimulation OFF vs. ON). By defining the RCC correlation map as the difference between two RCC images, the map distribution of two OFF states is shown to be normal, enabling the definition of the pixel cutoff. The empirical cluster-size null distribution obtained after the application of the pixel cutoff is used to define a cluster-size cutoff that allows 5% false positives. Assuming that the fMRI signal equals the task-induced response plus noise, an analytical expression of amplitude-RCC dependency on noise is obtained and used to define the pixel threshold. In vivo and ex vivo data obtained during rat forepaw electric stimulation are used to fine-tune this threshold. Calculating the spatial coherences within in vivo and ex vivo images shows enhanced coherence in the in vivo data, but no dependency on the anesthesia method, magnetic field strength, or depth of anesthesia, strengthening the generality of the proposed cutoffs. Copyright (c) 2007 Wiley-Liss, Inc.
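The pixel-threshold-plus-cluster-size logic described above can be illustrated with a generic simulation. Below is a minimal sketch, assuming normally distributed null difference maps (as the abstract reports for OFF-OFF comparisons); the map size, number of null simulations, z threshold, and 4-connectivity are illustrative choices, not the authors' settings.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)

def cluster_sizes(binary_map):
    # Label 4-connected clusters of supra-threshold pixels and return their sizes.
    labels, n = ndimage.label(binary_map)
    return ndimage.sum(binary_map, labels, index=range(1, n + 1))

# Per the abstract, the difference of two OFF-state RCC images is ~normal,
# so simulate many null maps to build an empirical cluster-size distribution.
pixel_cutoff = 1.96  # assumed pixel-level z threshold
null_max_sizes = []
for _ in range(500):
    null_map = rng.standard_normal((64, 64)) > pixel_cutoff
    sizes = cluster_sizes(null_map)
    null_max_sizes.append(sizes.max() if len(sizes) else 0)

# Cluster-size cutoff that limits family-wise false positives to ~5%.
size_cutoff = np.quantile(null_max_sizes, 0.95)
print(f"cluster-size cutoff: {size_cutoff:.0f} pixels")
```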
Alavi, Shiva; Kachuie, Marzie
2017-01-01
This study was conducted to assess the hardness of orthodontic brackets produced by metal injection molding (MIM) and conventional methods, and of different orthodontic wires (stainless steel, nickel-titanium [Ni-Ti], and beta-titanium alloys), for better clinical results. A total of 15 specimens from each brand of orthodontic brackets and wires were examined. The brackets (Elite Opti-Mim, produced by the MIM process, and Ultratrimm, produced by a conventional brazing method) and the wires (stainless steel, Ni-Ti, and beta-titanium) were embedded in epoxy resin, followed by grinding, polishing, and coating. X-ray energy dispersive spectroscopy (EDS) microanalysis was then applied to assess their elemental composition. The same specimen surfaces were repolished and used for Vickers microhardness assessment. Hardness was statistically analyzed with the Kruskal-Wallis test, followed by the Mann-Whitney test at the 0.05 level of significance. The X-ray EDS analysis revealed different ferrous or Co-based alloys in each bracket. The maximum mean hardness value among the wires was found for stainless steel (SS) (529.85 Vickers hardness number [VHN]) and the minimum for beta-titanium (334.65 VHN). Among the brackets, Elite Opti-Mim exhibited significantly higher VHN values (262.66 VHN) than Ultratrimm (206.59 VHN). VHN values of the wire alloys were significantly higher than those of the brackets. MIM orthodontic brackets exhibited hardness values much lower than those of SS orthodontic archwires and were more compatible with Ni-Ti and beta-titanium archwires. A wide range of microhardness values has been reported for conventional orthodontic brackets, and it should be considered that the manufacturing method may be only one of the factors affecting the mechanical properties of orthodontic brackets, including hardness.
Atalay, Altay; Koc, Ayse Nedret; Suel, Ahmet; Sav, Hafize; Demir, Gonca; Elmali, Ferhan; Cakir, Nuri; Seyedmousavi, Seyedmojtaba
2016-09-01
Aspergillus species cause a wide range of diseases in humans, including allergies, localized infections, and fatal disseminated diseases. Rapid detection and identification of Aspergillus spp. facilitate effective patient management. In the current study we compared conventional morphological methods with PCR sequencing, rep-PCR, and matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) for the identification of Aspergillus strains. A total of 24 consecutive clinical isolates of Aspergillus were collected during 2012-2014. Conventional morphology and rep-PCR were performed in our Mycology Laboratory. The identification, evaluation, and reporting of strains using MALDI-TOF-MS were performed by BioMérieux Diagnostic, Inc. in Istanbul. DNA sequence analysis of the clinical isolates was performed by the BMLabosis laboratory in Ankara. Samples consisted of 18 (75%) lower respiratory tract specimens, 3 (12.5%) otomycosis ear tissues, 1 sample from keratitis, and 1 sample from a cutaneous wound. According to DNA sequence analysis, 12 (50%) specimens were identified as A. fumigatus, 8 (33.3%) as A. flavus, 3 (12.5%) as A. niger, and 1 (4.2%) as A. terreus. Statistically, there was good agreement between DNA sequencing and each of the conventional morphology, rep-PCR, and MALDI-TOF-MS methods; the kappa values were κ = 0.869, 0.871, and 0.916, respectively (P < 0.001). This good level of agreement could be due to the fact that the Aspergillus strains identified were commonly encountered species. Therefore, it was concluded that studies with a larger number of isolates, including other Aspergillus species, are required. © 2016 Wiley Periodicals, Inc.
Sadeghi, Roya; Sedaghat, Mohammad Mehdi; Sha Ahmadi, Faramarz
2014-10-01
Blended learning, a new approach in educational planning, is defined as applying more than one method, strategy, technique or medium in education. Today, given the development of Internet infrastructure and the access most students have to it, the Internet can be utilized along with traditional and conventional methods of training. The aim of this study was to compare students' learning and satisfaction in a combination of lecture and e-learning with conventional lecture methods. This quasi-experimental study was conducted among sophomore students of the Public Health School, Tehran University of Medical Sciences, in 2012-2013. Four classes of the school were randomly selected and divided into two groups. Education in two classes (45 students) was delivered by the lecture method, and in the other two classes (48 students) by a blended method combining e-learning and lectures. The students' knowledge about tuberculosis in the two groups was collected and measured using pre- and post-tests. This step was done by sending self-reported electronic questionnaires to the students' email addresses through Google Document software. At the end of the educational program, students' satisfaction with and comments about the two methods were also collected by questionnaires. Descriptive statistics, the paired t-test, the independent t-test and ANOVA were performed through SPSS 14 software, and p≤0.05 was considered significant. The mean scores of the lecture and blended groups were 13.18±1.37 and 13.35±1.36, respectively; the difference between the pre-test scores of the two groups was not statistically significant (p=0.535). Knowledge scores increased in both groups after training, and the mean and standard deviation of knowledge scores of the lecture and blended groups were 16.51±0.69 and 16.18±1.06, respectively. The difference between the post-test scores of the two groups was not statistically significant (p=0.112). Students' satisfaction with the blended learning method was higher than with the lecture method. The results revealed that the blended method is effective in increasing students' learning. E-learning can be used to teach some courses and may offer economic advantages. Since the majority of students at the country's universities of medical sciences have Internet access and an email address, e-learning could serve as a supplement to traditional teaching methods, or at times as an alternative, because this method of teaching increases students' knowledge, satisfaction and attention.
Malissiova, Eleni; Papadopoulos, Theofilos; Kyriazi, Aikaterini; Mparda, Maria; Sakorafa, Christina; Katsioulis, Antonios; Katsiaflaka, Anna; Kyritsi, Maria; Zdragas, Antonios; Hadjichristodoulou, Christos
2017-05-01
The aim of this study was to examine differences in the microbiological profile and antimicrobial resistance of bacteria isolated from milk from organic and conventional sheep and goat farms. Twenty-five organic and 25 conventional sheep and goat farms in the region of Thessaly, Greece, participated in this study. A standardised detailed questionnaire was used to describe farming practices. A total of 50 samples were collected and analysed for total viable count (TVC), total coliform count (TCC) and somatic cell count (SCC), while Staphylococcus aureus and Escherichia coli were isolated using standard methods. Isolates were identified at species level by Api-test and Matrix-Assisted Laser Desorption/Ionisation-Time of Flight Mass Spectrometry (MALDI-TOF MS). Susceptibility to a panel of 20 antimicrobials for E. coli and 16 for S. aureus was determined by the agar dilution method. Pulsed Field Gel Electrophoresis (PFGE) was performed for S. aureus and E. coli isolates to determine predominant clones. Lower counts of TVC, TCC and SCC were identified in milk from the organic farms, possibly due to differences in the hygienic farming practices found on those farms. Api-tests and MALDI-TOF MS showed no significant differences in the S. aureus and E. coli isolates. Overall, antimicrobial resistance rates were low, but a statistically higher percentage of resistant strains originated from conventional farms than from organic farms, possibly due to the restriction of antibiotic use in organic farming. PFGE revealed diversity among S. aureus and E. coli populations on both organic and conventional farms, indicating circulation of 2-3 main clones changing slightly during their evolution. Consequently, there is evidence that milk from the organic farms presents a better microbiological profile than milk from conventional farms.
Target attribute-based false alarm rejection in small infrared target detection
NASA Astrophysics Data System (ADS)
Kim, Sungho
2012-11-01
Infrared search and track is an important research area in military applications. Although many works address small infrared target detection, these methods are difficult to apply in the field because of the high false alarm rates caused by clutter. This paper presents a novel target attribute extraction and machine learning-based target discrimination method. Eight kinds of target features are extracted and analyzed statistically. Learning-based classifiers such as SVM and AdaBoost are developed and compared with conventional classifiers on real infrared images. In addition, the generalization capability is examined for various infrared clutters.
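As a rough illustration of the learning-based discrimination stage, the sketch below trains the two classifier families named in the abstract on per-candidate attribute vectors. The data, the 8-dimensional feature layout, and all parameters are synthetic stand-ins, not the paper's actual features.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 400
X = rng.standard_normal((n, 8))  # 8 target attributes per detection candidate
# Synthetic labels: true target vs. clutter, driven by two of the features.
y = (X[:, 0] + 0.5 * X[:, 3] + 0.3 * rng.standard_normal(n)) > 0

for name, clf in [("SVM (RBF)", SVC(kernel="rbf", C=1.0)),
                  ("AdaBoost", AdaBoostClassifier(n_estimators=100))]:
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy = {acc:.3f}")
```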
Fundamental Vocabulary Selection Based on Word Familiarity
NASA Astrophysics Data System (ADS)
Sato, Hiroshi; Kasahara, Kaname; Kanasugi, Tomoko; Amano, Shigeaki
This paper proposes a new method for selecting fundamental vocabulary. We are presently constructing the Fundamental Vocabulary Knowledge-base of Japanese, which contains integrated information on syntax, semantics and pragmatics for the purposes of advanced natural language processing. This database mainly consists of a lexicon and a treebank: Lexeed (a Japanese Semantic Lexicon) and the Hinoki Treebank. Fundamental vocabulary selection is the first step in the construction of Lexeed. The vocabulary should include sufficient words to describe general concepts for self-expandability, yet should not be prohibitively large to construct and maintain. There are two conventional methods for selecting fundamental vocabulary. The first is intuition-based selection by experts, the traditional method for making dictionaries; its weak point is that the selection depends strongly on personal intuition. The second is corpus-based selection, which is superior in objectivity to intuition-based selection; however, it is difficult to compile a sufficiently balanced corpus. We propose a psychologically motivated selection method that adopts word familiarity as the selection criterion. Word familiarity is a rating that represents the familiarity of a word as a real number ranging from 1 (least familiar) to 7 (most familiar). We determined the word familiarity ratings statistically, based on psychological experiments with 32 subjects. We selected about 30,000 words as the fundamental vocabulary, based on a minimum word familiarity threshold of 5. We also evaluated the vocabulary by comparing its word coverage with that of conventional intuition-based and corpus-based selection over dictionary definition sentences and novels, and demonstrated the superior coverage of our lexicon. Based on this, we conclude that the proposed method is superior to conventional methods for fundamental vocabulary selection.
Nonlinear dynamic analysis of voices before and after surgical excision of vocal polyps
NASA Astrophysics Data System (ADS)
Zhang, Yu; McGilligan, Clancy; Zhou, Liang; Vig, Mark; Jiang, Jack J.
2004-05-01
Phase space reconstruction, correlation dimension, and second-order entropy, methods from nonlinear dynamics, are used to analyze sustained vowels generated by patients before and after surgical excision of vocal polyps. Two conventional acoustic perturbation parameters, jitter and shimmer, are also employed to analyze voices before and after surgery. Presurgical and postsurgical analyses of jitter, shimmer, correlation dimension, and second-order entropy are statistically compared. Correlation dimension and second-order entropy show a statistically significant decrease after surgery, indicating reduced complexity and higher predictability of postsurgical voice dynamics. There is not a significant postsurgical difference in shimmer, although jitter shows a significant postsurgical decrease. The results suggest that jitter and shimmer should be applied to analyze disordered voices with caution; however, nonlinear dynamic methods may be useful for analyzing abnormal vocal function and quantitatively evaluating the effects of surgical excision of vocal polyps.
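For reference, the two conventional perturbation measures used above have standard definitions: jitter is the mean absolute difference between consecutive pitch periods relative to the mean period, and shimmer is the analogous quantity for peak amplitudes. A minimal sketch, assuming the per-cycle periods and amplitudes have already been extracted from the sustained vowel (the nonlinear measures are beyond this sketch):

```python
import numpy as np

def jitter_percent(periods):
    # Mean absolute difference of consecutive periods, relative to the mean period.
    periods = np.asarray(periods, dtype=float)
    return 100.0 * np.mean(np.abs(np.diff(periods))) / np.mean(periods)

def shimmer_percent(amplitudes):
    # Same construction applied to consecutive cycle peak amplitudes.
    amplitudes = np.asarray(amplitudes, dtype=float)
    return 100.0 * np.mean(np.abs(np.diff(amplitudes))) / np.mean(amplitudes)

periods = [7.52, 7.48, 7.55, 7.50, 7.47]  # ms, hypothetical glottal cycles
amps = [0.81, 0.79, 0.84, 0.80, 0.78]     # arbitrary units, hypothetical
print(f"jitter = {jitter_percent(periods):.2f}%, shimmer = {shimmer_percent(amps):.2f}%")
```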
Chapman, Benjamin P.; Weiss, Alexander; Duberstein, Paul
2016-01-01
Statistical learning theory (SLT) is the statistical formulation of machine learning theory, a body of analytic methods common in “big data” problems. Regression-based SLT algorithms seek to maximize predictive accuracy for some outcome, given a large pool of potential predictors, without overfitting the sample. Research goals in psychology may sometimes call for high dimensional regression. One example is criterion-keyed scale construction, where a scale with maximal predictive validity must be built from a large item pool. Using this as a working example, we first introduce a core principle of SLT methods: minimization of expected prediction error (EPE). Minimizing EPE is fundamentally different than maximizing the within-sample likelihood, and hinges on building a predictive model of sufficient complexity to predict the outcome well, without undue complexity leading to overfitting. We describe how such models are built and refined via cross-validation. We then illustrate how three common SLT algorithms–Supervised Principal Components, Regularization, and Boosting—can be used to construct a criterion-keyed scale predicting all-cause mortality, using a large personality item pool within a population cohort. Each algorithm illustrates a different approach to minimizing EPE. Finally, we consider broader applications of SLT predictive algorithms, both as supportive analytic tools for conventional methods, and as primary analytic tools in discovery phase research. We conclude that despite their differences from the classic null-hypothesis testing approach—or perhaps because of them–SLT methods may hold value as a statistically rigorous approach to exploratory regression. PMID:27454257
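As a concrete illustration of the regularization route described above, the sketch below fits an L1-penalized logistic regression with a cross-validated penalty, so that model complexity is chosen to minimize an estimate of EPE rather than to maximize the within-sample likelihood. The item pool, outcome, and all settings are synthetic placeholders, not the cohort data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegressionCV

rng = np.random.default_rng(2)
n_people, n_items = 500, 120
items = rng.integers(1, 6, size=(n_people, n_items)).astype(float)  # Likert 1-5
true_beta = np.zeros(n_items)
true_beta[:6] = 0.8  # only 6 items are truly predictive in this toy setup
outcome = (items @ true_beta + 3.0 * rng.standard_normal(n_people)
           > np.median(items @ true_beta))  # binary criterion (e.g., an endpoint)

# Cross-validation picks the penalty strength with the best out-of-sample
# performance, i.e., an EPE estimate, and the L1 penalty zeroes out weak items.
model = LogisticRegressionCV(Cs=10, cv=5, penalty="l1", solver="liblinear")
model.fit(items, outcome)
kept = np.flatnonzero(model.coef_)
print(f"items retained in the criterion-keyed scale: {kept}")
```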
NASA Astrophysics Data System (ADS)
Miyazawa, Arata; Hong, Young-Joo; Makita, Shuichi; Kasaragod, Deepa K.; Miura, Masahiro; Yasuno, Yoshiaki
2017-02-01
Local statistics are widely utilized for quantification and image processing of OCT. For example, the local mean is used to reduce speckle, and the local variation of the polarization state (degree of polarization uniformity, DOPU) is used to visualize melanin. Conventionally, these statistics are calculated in a rectangular kernel whose size is uniform over the image. However, the fixed size and shape of the kernel result in a tradeoff between image sharpness and statistical accuracy. A superpixel is a cluster of pixels generated by grouping image pixels based on spatial proximity and similarity of signal values. Superpixels have variable sizes and flexible shapes that preserve tissue structure. Here we demonstrate a new superpixel method tailored for multifunctional Jones matrix OCT (JM-OCT). This method forms superpixels by clustering image pixels in a 6-dimensional (6-D) feature space (two spatial dimensions and four optical-feature dimensions). All image pixels were clustered based on their spatial proximity and optical-feature similarity. The optical features are scattering, OCT-A, birefringence, and DOPU. The method is applied to retinal OCT. The generated superpixels preserve tissue structures such as retinal layers, sclera, vessels, and the retinal pigment epithelium. Hence, a superpixel can be utilized as a local-statistics kernel that is more suitable than a uniform rectangular kernel. The superpixelized image can also be used for further image processing and analysis; since it reduces the number of pixels to be analyzed, it reduces the computational cost of such processing.
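The core idea, clustering pixels jointly on spatial coordinates and optical features, can be sketched generically. Below, k-means stands in for the authors' clustering algorithm, the feature images are random placeholders, and the spatial weight is an assumed knob balancing proximity against feature similarity.

```python
import numpy as np
from sklearn.cluster import MiniBatchKMeans

rng = np.random.default_rng(3)
h, w = 64, 64
yy, xx = np.mgrid[0:h, 0:w]
feats = rng.standard_normal((h, w, 4))  # scattering, OCT-A, birefringence, DOPU

spatial_weight = 0.1  # assumed: balances spatial proximity vs. feature similarity
X = np.column_stack([spatial_weight * yy.ravel(),
                     spatial_weight * xx.ravel(),
                     feats.reshape(-1, 4)])  # each row is a pixel in 6-D space

labels = MiniBatchKMeans(n_clusters=200, n_init=3).fit_predict(X)
superpixels = labels.reshape(h, w)  # each label region acts as a statistics kernel
print(superpixels.shape, superpixels.max() + 1)
```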
Dasgupta, Aritra; Lee, Joon-Yong; Wilson, Ryan; Lafrance, Robert A; Cramer, Nick; Cook, Kristin; Payne, Samuel
2017-01-01
Combining interactive visualization with automated analytical methods like statistics and data mining facilitates data-driven discovery. These visual analytic methods are beginning to be instantiated within mixed-initiative systems, where humans and machines collaboratively influence evidence-gathering and decision-making. But an open research question is whether, when domain experts analyze their data, they can completely trust the outputs and operations on the machine side. Visualization potentially leads to a transparent analysis process, but do domain experts always trust what they see? To address these questions, we present results from the design and evaluation of a mixed-initiative, visual analytics system for biologists, focusing on the relationship between familiarity with an analysis medium and domain experts' trust. We propose a trust-augmented design of the visual analytics system that explicitly takes into account domain-specific tasks, conventions, and preferences. For evaluation, we present the results of a controlled user study with 34 biologists in which we compare the variation in the level of trust across conventional and visual analytic mediums and explore the influence of familiarity and task complexity on trust. We find that despite being unfamiliar with a visual analytic medium, scientists seem to have an average level of trust comparable with that in a conventional analysis medium. In fact, for complex sense-making tasks, we find that the visual analytic system is able to inspire greater trust than other mediums. We summarize the implications of our findings with directions for future research on the trustworthiness of visual analytic systems.
Duc, Anh Nguyen; Wolbers, Marcel
2017-02-10
Composite endpoints are widely used as primary endpoints of randomized controlled trials across clinical disciplines. A common critique of the conventional analysis of composite endpoints is that all disease events are weighted equally, whereas their clinical relevance may differ substantially. We address this by introducing a framework for the weighted analysis of composite endpoints and interpretable test statistics, which are applicable to both binary and time-to-event data. To cope with the difficulty of selecting an exact set of weights, we propose a method for constructing simultaneous confidence intervals and tests that asymptotically preserve the family-wise type I error in the strong sense across families of weights satisfying flexible inequality or order constraints, based on the theory of chi-bar-squared distributions. We show that the method achieves the nominal simultaneous coverage rate with substantial efficiency gains over Scheffé's procedure in a simulation study, and we apply it to trials in cardiovascular disease and enteric fever. © 2016 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
He, Xiulan; Sonnenborg, Torben O.; Jørgensen, Flemming; Jensen, Karsten H.
2017-03-01
Stationarity has traditionally been a requirement of geostatistical simulations. A common way to deal with non-stationarity is to divide the system into stationary sub-regions and subsequently merge the realizations for each region. Recently, the so-called partition approach, which has the flexibility to model non-stationary systems directly, was developed for multiple-point statistics simulation (MPS). The objective of this study is to apply the MPS partition method with conventional borehole logs and high-resolution airborne electromagnetic (AEM) data for simulation of a real-world non-stationary geological system characterized by a network of connected buried valleys that incise deeply into layered Miocene sediments (a case study in Denmark). The results show that, based on fragmented information on the formation boundaries, the MPS partition method is able to simulate a non-stationary system including valley structures embedded in a layered Miocene sequence in a single run. In addition, statistical information retrieved from the AEM data improved the simulation of the geology significantly, especially for the deep-seated buried valley sediments where borehole information is sparse.
NASA Astrophysics Data System (ADS)
Havens, Timothy C.; Cummings, Ian; Botts, Jonathan; Summers, Jason E.
2017-05-01
The linear ordered statistic (LOS) is a parameterized ordered statistic (OS) that is a weighted average of a rank-ordered sample. LOS operators are useful generalizations of aggregation as they can represent any linear aggregation, from minimum to maximum, including conventional aggregations, such as mean and median. In the fuzzy logic field, these aggregations are called ordered weighted averages (OWAs). Here, we present a method for learning LOS operators from training data, viz., data for which you know the output of the desired LOS. We then extend the learning process with regularization, such that a lower complexity or sparse LOS can be learned. Hence, we discuss what 'lower complexity' means in this context and how to represent that in the optimization procedure. Finally, we apply our learning methods to the well-known constant-false-alarm-rate (CFAR) detection problem, specifically for the case of background levels modeled by long-tailed distributions, such as the K-distribution. These backgrounds arise in several pertinent imaging problems, including the modeling of clutter in synthetic aperture radar and sonar (SAR and SAS) and in wireless communications.
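A minimal sketch of the two ingredients named above: evaluating an LOS/OWA as a weighted average of a sorted sample, and learning its weights from training pairs by regularized least squares under nonnegativity and sum-to-one constraints. The data, the L2 regularization form, and the solver choice are assumptions, not the paper's exact formulation.

```python
import numpy as np
from scipy.optimize import minimize

def los(x, w):
    # OWA/LOS: weighted average of the sample sorted in descending order.
    return np.sort(x)[::-1] @ w

rng = np.random.default_rng(4)
n, d = 200, 5
X = rng.standard_normal((n, d))
w_true = np.array([0.5, 0.2, 0.15, 0.1, 0.05])  # hypothetical target operator
y = np.array([los(row, w_true) for row in X])   # desired LOS outputs (training data)

Xs = np.sort(X, axis=1)[:, ::-1]                # pre-sorted design matrix
lam = 0.01                                      # assumed regularization strength
obj = lambda w: np.mean((Xs @ w - y) ** 2) + lam * np.sum(w ** 2)
res = minimize(obj, x0=np.full(d, 1.0 / d), bounds=[(0, None)] * d,
               constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}])
print(np.round(res.x, 3))                       # approaches w_true as lam -> 0
```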
Statistical Analysis of Time-Series from Monitoring of Active Volcanic Vents
NASA Astrophysics Data System (ADS)
Lachowycz, S.; Cosma, I.; Pyle, D. M.; Mather, T. A.; Rodgers, M.; Varley, N. R.
2016-12-01
Despite recent advances in the collection and analysis of time-series from volcano monitoring, and the resulting insights into volcanic processes, challenges remain in forecasting and interpreting activity from near real-time analysis of monitoring data. Statistical methods have potential to characterise the underlying structure and facilitate intercomparison of these time-series, and so inform interpretation of volcanic activity. We explore the utility of multiple statistical techniques that could be widely applicable to monitoring data, including Shannon entropy and detrended fluctuation analysis, by their application to various data streams from volcanic vents during periods of temporally variable activity. Each technique reveals changes through time in the structure of some of the data that were not apparent from conventional analysis. For example, we calculate the Shannon entropy (a measure of the randomness of a signal) of time-series from the recent dome-forming eruptions of Volcán de Colima (Mexico) and Soufrière Hills (Montserrat). The entropy of real-time seismic measurements and the count rate of certain volcano-seismic event types from both volcanoes is found to be temporally variable, with these data generally having higher entropy during periods of lava effusion and/or larger explosions. In some instances, the entropy shifts prior to or coincident with changes in seismic or eruptive activity, some of which were not clearly recognised by real-time monitoring. Comparison with other statistics demonstrates the sensitivity of the entropy to the data distribution, but that it is distinct from conventional statistical measures such as coefficient of variation. We conclude that each analysis technique examined could provide valuable insights for interpretation of diverse monitoring time-series.
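One of the statistics named above is easy to sketch: a sliding-window Shannon entropy computed from a histogram estimate of the signal distribution. The window length, bin count, and the synthetic two-phase series below are illustrative assumptions, not the authors' settings.

```python
import numpy as np

def shannon_entropy(window, bins=16):
    # Histogram estimate of the distribution, then H = -sum(p * log2 p).
    counts, _ = np.histogram(window, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(5)
# Synthetic monitoring series: quiet background, then a noisier effusive phase.
series = np.concatenate([rng.normal(1.0, 0.1, 2000),
                         rng.normal(1.5, 0.6, 2000)])

win = 250  # non-overlapping windows
entropy = [shannon_entropy(series[i:i + win])
           for i in range(0, len(series) - win, win)]
print(np.round(entropy, 2))  # entropy rises in the noisier phase
```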
TOY SAFETY SURVEILLANCE FROM ONLINE REVIEWS
Winkler, Matt; Abrahams, Alan S.; Gruss, Richard; Ehsani, Johnathan P.
2016-01-01
Toy-related injuries account for a significant number of childhood injuries and the prevention of these injuries remains a goal for regulatory agencies and manufacturers. Text-mining is an increasingly prevalent method for uncovering the significance of words using big data. This research sets out to determine the effectiveness of text-mining in uncovering potentially dangerous children’s toys. We develop a danger word list, also known as a ‘smoke word’ list, from injury and recall text narratives. We then use the smoke word lists to score over one million Amazon reviews, with the top scores denoting potential safety concerns. We compare the smoke word list to conventional sentiment analysis techniques, in terms of both word overlap and effectiveness. We find that smoke word lists are highly distinct from conventional sentiment dictionaries and provide a statistically significant method for identifying safety concerns in children’s toy reviews. Our findings indicate that text-mining is, in fact, an effective method for the surveillance of safety concerns in children’s toys and could be a gateway to effective prevention of toy-product-related injuries. PMID:27942092
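The scoring step can be sketched generically: each review is scored by the summed weights of the smoke words it contains, normalized by length, and the top-scoring reviews are flagged for inspection. The word list and weights below are invented for illustration and are not the paper's list.

```python
import re

# Hypothetical smoke words with danger weights (illustrative only).
smoke_words = {"choked": 3.0, "choke": 3.0, "swallowed": 2.5, "sharp": 2.0,
               "broke": 1.5, "hazard": 2.5, "hospital": 3.0}

def smoke_score(review):
    # Length-normalized sum of the weights of smoke words in the review.
    tokens = re.findall(r"[a-z']+", review.lower())
    return sum(smoke_words.get(t, 0.0) for t in tokens) / max(len(tokens), 1)

reviews = ["My toddler almost choked on the small wheel that broke off!",
           "Great colors, my kids love it."]
ranked = sorted(reviews, key=smoke_score, reverse=True)
print([(round(smoke_score(r), 3), r[:40]) for r in ranked])
```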
Neither fixed nor random: weighted least squares meta-analysis.
Stanley, T D; Doucouliagos, Hristos
2015-06-15
This study challenges two core conventional meta-analysis methods: fixed effect and random effects. We show how and explain why an unrestricted weighted least squares estimator is superior to conventional random-effects meta-analysis when there is publication (or small-sample) bias and better than a fixed-effect weighted average if there is heterogeneity. Statistical theory and simulations of effect sizes, log odds ratios and regression coefficients demonstrate that this unrestricted weighted least squares estimator provides satisfactory estimates and confidence intervals that are comparable to random effects when there is no publication (or small-sample) bias and identical to fixed-effect meta-analysis when there is no heterogeneity. When there is publication selection bias, the unrestricted weighted least squares approach dominates random effects; when there is excess heterogeneity, it is clearly superior to fixed-effect meta-analysis. In practical applications, an unrestricted weighted least squares weighted average will often provide superior estimates to both conventional fixed and random effects. Copyright © 2015 John Wiley & Sons, Ltd.
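A compact sketch of an unrestricted WLS average in the spirit described above: inverse-variance weights as in fixed-effect meta-analysis, with a multiplicative dispersion term that widens the interval under heterogeneity. The effect sizes are made up, and the details of the authors' exact estimator may differ.

```python
import numpy as np

effects = np.array([0.30, 0.45, 0.10, 0.52, 0.25])  # hypothetical study effects
se = np.array([0.10, 0.15, 0.08, 0.20, 0.12])       # their standard errors

w = 1.0 / se**2
beta = np.sum(w * effects) / np.sum(w)              # same point estimate as fixed effect
k = len(effects)
phi = np.sum(w * (effects - beta) ** 2) / (k - 1)   # multiplicative dispersion (~1 if homogeneous)
se_wls = np.sqrt(phi / np.sum(w))                   # widens with excess heterogeneity
print(f"WLS estimate: {beta:.3f}, 95% CI half-width: {1.96 * se_wls:.3f}")
```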
NASA Astrophysics Data System (ADS)
Chlebda, Damian K.; Majda, Alicja; Łojewski, Tomasz; Łojewska, Joanna
2016-11-01
Differentiation of written text can be performed with a non-invasive and non-contact tool that combines conventional imaging methods with spectroscopy. Hyperspectral imaging (HSI) is a relatively new and rapid analytical technique that can be applied in forensic science disciplines. It allows an image of the sample to be acquired with full spectral information within every pixel. For this paper, HSI and three statistical methods (hierarchical cluster analysis, principal component analysis, and spectral angle mapper) were used to distinguish between traces of modern black gel pen inks. Non-invasiveness and high efficiency are among the unquestionable advantages of ink differentiation using HSI. It is also less time-consuming than traditional methods such as chromatography. In this study, a set of 45 modern gel pen ink marks deposited on a paper sheet was recorded. The spectral characteristics embodied in every pixel were extracted from the image and analysed using statistical methods, both externally and directly on the hypercube. As a result, different black gel inks deposited on paper can be distinguished and classified into several groups in a non-invasive manner.
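Of the three statistical methods named, the spectral angle mapper is the simplest to sketch: each pixel spectrum is compared with a reference ink spectrum via the angle between them, which is insensitive to overall brightness. The spectra and the decision threshold below are placeholders, not measured HSI data.

```python
import numpy as np

def spectral_angle(s, ref):
    # Angle between two spectra in radians; smaller means more similar.
    cos = np.dot(s, ref) / (np.linalg.norm(s) * np.linalg.norm(ref))
    return np.arccos(np.clip(cos, -1.0, 1.0))

rng = np.random.default_rng(6)
reference_ink = rng.random(100)   # placeholder reference spectrum, 100 bands
pixels = rng.random((500, 100))   # placeholder pixel spectra from the hypercube

angles = np.array([spectral_angle(p, reference_ink) for p in pixels])
same_ink = angles < 0.10          # assumed classification threshold
print(f"{same_ink.sum()} pixels matched the reference ink")
```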
Chen, Y-J; Chen, S-K; Huang, H-W; Yao, C-C; Chang, H-F
2004-09-01
To compare cephalometric landmark identification on softcopy and hardcopy of direct digital cephalography acquired by a storage-phosphor (SP) imaging system. Ten digital cephalograms and their conventional counterparts, hardcopies on transparent blue film, were obtained with an SP imaging system and a dye sublimation printer. Twelve orthodontic residents identified 19 cephalometric landmarks on monitor-displayed SP digital images with a computer-aided method and on their hardcopies with the conventional method. The x- and y-coordinates for each landmark, indicating the horizontal and vertical positions, were analysed to assess the reliability of landmark identification and to evaluate the concordance of landmark locations between softcopy and hardcopy of SP digital cephalometric radiography. For each of the 19 landmarks, the location differences, as well as their horizontal and vertical components, were statistically significant between SP digital cephalometric radiography and its hardcopy. Smaller interobserver errors on SP digital images than on their hardcopies were noted for all landmarks, except point Go in the vertical direction. The scatter-plots demonstrate the characteristic distribution of the interobserver error in both horizontal and vertical directions. Generally, the dispersion of interobserver error on SP digital cephalometric radiography is less than that on its hardcopy with the conventional method. SP digital cephalometric radiography yielded a better or comparable level of performance in landmark identification relative to its hardcopy, except for point Go in the vertical direction.
Khalil, Rowaida; Gomaa, Mohamed
2014-01-01
This is a pioneering study in Egypt that provides an assessment of the microbiological quality of conventional and organic leafy green vegetables, which constitute an essential component of the Egyptian daily diet. A total of 380 samples of unpackaged whole conventional and 84 packaged whole organic leafy greens were collected from retail markets in Alexandria and analyzed for total aerobic mesophilic count (AMC) and total E. coli count (ECC) using the standard spread plate method. Mean AMC values for organic samples were statistically lower (p < 0.05) than those of the corresponding conventional samples. Conventional radish and organic parsley samples had the highest AMC of 7.17 and 7.68 log CFU/g, respectively, while conventional green cabbage and organic basil had the lowest AMC of 3.63 and 3.23 log CFU/g, respectively. The presence of E. coli in 100% of the studied leafy greens was indicative of potential fecal contamination, in view of the open, unhygienic environmental conditions and unsanitary handling to which leafy green items sold by street vendors are exposed. The unsatisfactory AMC and ECC levels encountered in the studied samples warrant future investigations to determine the potential prevalence of foodborne pathogens and to identify the sources of the dominating microorganisms, which could make a contribution to the field of food safety.
2014-01-01
Background The purpose of this study was to compare two impression techniques from the perspective of patient preferences and treatment comfort. Methods Twenty-four (12 male, 12 female) subjects who had no previous experience with either conventional or digital impressions participated in this study. Conventional impressions of maxillary and mandibular dental arches were taken with a polyether impression material (Impregum, 3M ESPE), and bite registrations were made with polysiloxane bite registration material (Futar D, Kettenbach). Two weeks later, digital impressions and bite scans were performed using an intra-oral scanner (CEREC Omnicam, Sirona). Immediately after the impressions were made, the subjects' attitudes, preferences and perceptions towards the impression techniques were evaluated using a standardized questionnaire. The perceived source of stress was evaluated using the State-Trait Anxiety Scale. Processing steps of the impression techniques (tray selection, working time, etc.) were recorded in seconds. Statistical analyses were performed with the Wilcoxon rank test, and p < 0.05 was considered significant. Results There were significant differences between the groups (p < 0.05) in terms of total working time and processing steps. Patients stated that digital impressions were more comfortable than conventional techniques. Conclusions Digital impressions were more time-efficient than conventional impressions. Patients preferred the digital impression technique over conventional techniques. PMID:24479892
Weak-value amplification and optimal parameter estimation in the presence of correlated noise
NASA Astrophysics Data System (ADS)
Sinclair, Josiah; Hallaji, Matin; Steinberg, Aephraim M.; Tollaksen, Jeff; Jordan, Andrew N.
2017-11-01
We analytically and numerically investigate the performance of weak-value amplification (WVA) and related parameter estimation methods in the presence of temporally correlated noise. WVA is a special instance of a general measurement strategy that involves sorting data into separate subsets based on the outcome of a second "partitioning" measurement. Using a simplified correlated noise model that can be analyzed exactly together with optimal statistical estimators, we compare WVA to a conventional measurement method. We find that WVA indeed yields a much lower variance of the parameter of interest than the conventional technique does, optimized in the absence of any partitioning measurements. In contrast, a statistically optimal analysis that employs partitioning measurements, incorporating all partitioned results and their known correlations, is found to yield an improvement—typically slight—over the noise reduction achieved by WVA. This result occurs because the simple WVA technique is not tailored to any specific noise environment and therefore does not make use of correlations between the different partitions. We also compare WVA to traditional background subtraction, a familiar technique where measurement outcomes are partitioned to eliminate unknown offsets or errors in calibration. Surprisingly, for the cases we consider, background subtraction turns out to be a special case of the optimal partitioning approach, possessing a similar typically slight advantage over WVA. These results give deeper insight into the role of partitioning measurements (with or without postselection) in enhancing measurement precision, which some have found puzzling. They also resolve previously made conflicting claims about the usefulness of weak-value amplification to precision measurement in the presence of correlated noise. We finish by presenting numerical results to model a more realistic laboratory situation of time-decaying correlations, showing that our conclusions hold for a wide range of statistical models.
Pandey, Pinki; Dixit, Alok; Tanwar, Aparna; Sharma, Anuradha; Mittal, Sanjeev
2014-07-01
Our study presents a new deparaffinizing and hematoxylin and eosin (H and E) staining method that involves the use of easily available, nontoxic and eco-friendly diluted liquid dish washing soap (DWS), completely eliminating expensive and hazardous xylene and alcohol from deparaffinization and rehydration prior to staining, from staining itself, and from dehydration prior to mounting. The aim was to evaluate and compare the quality of liquid DWS-treated xylene- and alcohol-free (XAF) sections with that of conventional H and E sections. A total of 100 paraffin-embedded tissue blocks from different tissues were included. From each tissue block, one section was stained with conventional H and E (normal sections) and the other with the XAF H and E (soapy sections) staining method. Slides were scored using five parameters: nuclear staining, cytoplasmic staining, clarity, uniformity, and crispness of staining. The Z-test was used for statistical analysis. Soapy sections scored better for cytoplasmic (90%) and crisp staining (95%), with a statistically significant difference, whereas for uniformity of staining, normal sections (88%) scored over soapy sections (72%) (Z = 2.82, P < 0.05). For nuclear staining (90%) and clarity of staining (90%), the total scores favored soapy sections, but the difference was not statistically significant. About 84% of normal sections stained adequately for diagnosis, compared with 86% of soapy sections (Z = 0.396, P > 0.05). Liquid DWS is a safe and efficient alternative to xylene and alcohol in deparaffinization and the routine H and E staining procedure. We document this project so that it can be used as a model for other histology laboratories.
Damage localization by statistical evaluation of signal-processed mode shapes
NASA Astrophysics Data System (ADS)
Ulriksen, M. D.; Damkilde, L.
2015-07-01
Due to their inherent ability to provide structural information on a local level, mode shapes and their derivatives are utilized extensively for structural damage identification. Typically, more or less advanced mathematical methods are implemented to identify damage-induced discontinuities in the spatial mode shape signals, hereby potentially facilitating damage detection and/or localization. However, by being based on distinguishing damage-induced discontinuities from other signal irregularities, an intrinsic deficiency in these methods is the high sensitivity towards measurement noise. The present article introduces a damage localization method which, compared to the conventional mode shape-based methods, has greatly enhanced robustness towards measurement noise. The method is based on signal processing of spatial mode shapes by means of continuous wavelet transformation (CWT) and subsequent application of a generalized discrete Teager-Kaiser energy operator (GDTKEO) to identify damage-induced mode shape discontinuities. In order to evaluate whether the identified discontinuities are in fact damage-induced, outlier analysis of principal components of the signal-processed mode shapes is conducted on the basis of T2-statistics. The proposed method is demonstrated in the context of analytical work with a free-vibrating Euler-Bernoulli beam under noisy conditions.
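The processing chain, CWT followed by a Teager-Kaiser-type energy operator, can be sketched on a synthetic beam mode shape. Here a single-scale Mexican-hat CWT is implemented directly with numpy, the plain discrete TKEO stands in for the paper's generalized operator, the T2 outlier stage is omitted, and the damage model is an assumed localized deviation.

```python
import numpy as np

def mexican_hat(points, scale):
    # Mexican-hat (Ricker) wavelet sampled over `points` samples.
    t = np.linspace(-4 * scale, 4 * scale, points)
    return (1 - (t / scale) ** 2) * np.exp(-0.5 * (t / scale) ** 2)

def tkeo(x):
    # Discrete Teager-Kaiser energy: psi[n] = x[n]^2 - x[n-1]*x[n+1].
    return x[1:-1] ** 2 - x[:-2] * x[2:]

x = np.linspace(0.0, 1.0, 400)
mode = np.sin(np.pi * x)                # first bending mode of a simple beam
mode[195:205] -= 0.01 * np.hanning(10)  # assumed localized damage signature

coeffs = np.convolve(mode, mexican_hat(41, 4.0), mode="same")  # single-scale CWT
energy = tkeo(coeffs)
peak = 50 + np.argmax(np.abs(energy[50:-50]))  # search away from convolution edges
print(f"peak energy near sample {peak} (damage centered near 200)")
```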
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
Lee, Ki Song; Choe, Young Chan; Park, Sung Hee
2015-10-01
This study examined the structural variables affecting the environmental effects of organic farming compared to those of conventional farming. A meta-analysis based on 107 studies and 360 observations published from 1977 to 2012 compared energy efficiency (EE) and greenhouse gas emissions (GHGE) for organic and conventional farming. The meta-analysis systematically analyzed the results of earlier comparative studies and used logistic regression to identify the structural variables that contributed to differences in the effects of organic and conventional farming on the environment. The statistical evidence identified characteristics that differentiated the environmental effects of organic and conventional farming, which is controversial. The results indicated that data sources, sample size and product type significantly affected EE, whereas product type, cropping pattern and measurement unit significantly affected the GHGE of organic farming compared to conventional farming. Superior effects of organic farming on the environment were more likely to appear for larger samples, primary data rather than secondary data, monocropping rather than multicropping, and crops other than fruits and vegetables. The environmental effects of organic farming were not affected by the study period, geographic location, farm size, cropping pattern, or measurement method. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
Jeon, Young-Chan; Jeong, Chang-Mo
2017-01-01
PURPOSE The purpose of this study was to compare the fit of cast gold crowns fabricated from the conventional and the digital impression technique. MATERIALS AND METHODS An artificial tooth in a master model and abutment teeth in ten patients were restored with cast gold crowns fabricated from the digital and the conventional impression technique. The forty silicone replicas were cut into three sections; each section was evaluated at nine points. The measurement was carried out using a measuring microscope and i-Solution software. Data from the silicone replicas were analyzed, and all tests were performed at an α-level of 0.05. RESULTS 1. The average gaps of cast gold crowns fabricated from the digital impression technique were significantly larger than those of the conventional impression technique. 2. In marginal and internal axial gaps of cast gold crowns, no statistical differences were found between the two impression techniques. 3. The internal occlusal gaps of cast gold crowns fabricated from the digital impression technique were significantly larger than those of the conventional impression technique. CONCLUSION Both types of prostheses presented clinically acceptable fit. The prostheses fabricated from the digital impression technique showed larger gaps at the occlusal surface. PMID:28243386
Digholkar, Shruti; Madhav, V. N. V.; Palaskar, Jayant
2016-01-01
Purpose: The purpose of this study was to evaluate and compare the flexural strength and microhardness of provisional restorative materials fabricated using rapid prototyping (RP), computer-assisted designing and computer-assisted milling (CAD-CAM), and the conventional method. Materials and Methods: Twenty specimens of dimensions 25 mm × 2 mm × 2 mm (ADA-ANSI specification #27) were fabricated using each of: (1) three-dimensional (3D) printed light-cured micro-hybrid filled composite (RP resin group), (2) milled polymethyl methacrylate (PMMA) using CAD-CAM (CC resin group), and (3) conventionally fabricated heat-activated polymerized PMMA (CH resin group). Flexural strength and microhardness were measured and the values obtained were evaluated. Results: The measured mean flexural strength values (in MPa) were 79.54 (RP resin group), 104.20 (CC resin group), and 95.58 (CH resin group). The measured mean microhardness values (Knoop hardness number) were 32.77 (RP resin group), 25.33 (CC resin group), and 27.36 (CH resin group). The analysis of variance (ANOVA) test showed a statistically significant difference in the flexural strength values of the three groups (P < 0.05). According to the pairwise comparison of Tukey's honest significant difference (HSD) test, the flexural strength values of the CC resin group and the CH resin group were significantly higher than those of the RP resin group (P < 0.05). However, there was no significant difference between the flexural strength values of the CC resin and CH resin groups (P = 0.64). The difference in microhardness values of the three groups was statistically significant according to ANOVA as well as the intergroup comparison done using Tukey's HSD (post hoc) test (P < 0.05). Conclusions: CAD-CAM-milled PMMA had the highest flexural strength, whereas the RP-based 3D-printed light-cured micro-hybrid filled composite had the highest microhardness. PMID:27746595
Shivasakthy, M; Asharaf Ali, Syed
2013-10-01
A new material has been proposed in dentistry in the form of strips for producing gingival retraction; its clinical efficacy remains untested. This study aimed to determine whether polyvinyl acetate strips can effectively displace the gingival tissues in comparison with the conventional retraction cord. Complete metal-ceramic preparations with supra-gingival margins were performed on fourteen maxillary incisors, and gingival retraction was performed using Merocel strips and conventional retraction cords alternately, with a 2-week interval between them. The amount of displacement was compared using a digital vernier caliper with 0.01 mm accuracy. Results were analyzed statistically using the paired Student's t-test. The statistical analysis of the data revealed that both the conventional retraction cord and the Merocel strip produce significant retraction. Of the two materials, Merocel proved significantly more effective; the Merocel strip produced more gingival displacement than the conventional retraction cord.
Garaguso, Ivana; Nardini, Mirella
2015-07-15
Wine exerts beneficial effects on human health when drunk in moderation. Nevertheless, wine may also contain components that negatively affect human health. Among these, sulfites may induce adverse effects after ingestion. We examined the total polyphenol and flavonoid content, the phenolics profile and the antioxidant activity of eight organic red wines produced without added sulfur dioxide/sulfites in comparison with eight conventional red wines. Polyphenol and flavonoid contents were slightly higher in the organic wines than in the conventional wines; however, the differences did not reach statistical significance. The phenolic acid profile was quite similar in both groups of wines. Antioxidant activity was higher in the organic wines than in the conventional wines, although the differences were not statistically significant. Our results indicate that organic red wines produced without added sulfur dioxide/sulfites are comparable to conventional red wines with regard to total polyphenol and flavonoid content, phenolics profile and antioxidant activity. Copyright © 2015 Elsevier Ltd. All rights reserved.
Study of optimum methods of optical communication
NASA Technical Reports Server (NTRS)
Harger, R. O.
1972-01-01
Optimum methods of optical communication are described that account for the effects of the turbulent atmosphere and of quantum mechanics, treated both by the semi-classical method and by the fully quantum-theoretical model. A concerted effort is discussed to apply the techniques of communication theory to the novel problems of optical communication through careful study of realistic models and their statistical descriptions, derivation of appropriate optimum structures, calculation of their performance and, insofar as possible, comparison with conventional and other suboptimal systems. In this unified way the bounds on performance and the structure of optimum communication systems for transmission of information, imaging, tracking, and estimation can be determined for optical channels.
Quantifying Cancer Risk from Radiation.
Keil, Alexander P; Richardson, David B
2017-12-06
Complex statistical models fitted to data from studies of atomic bomb survivors are used to estimate the human health effects of ionizing radiation exposure. We describe and illustrate an approach to estimating population risks from ionizing radiation exposure that relaxes many assumptions about radiation-related mortality. The approach draws on developments in methods for causal inference. The results offer a different way to quantify radiation's effects and show that conventional estimates of the population burden of excess cancer at high radiation doses are driven strongly by projection outside the range of the current data. Summary results obtained using the proposed approach are similar in magnitude to those obtained using conventional methods, although estimates of radiation-related excess cancers differ for many age, sex, and dose groups. At low doses relevant to typical exposures, the strength of evidence in the data is surprisingly weak. Statements regarding human health effects at low doses rely strongly on modeling assumptions. © 2017 Society for Risk Analysis.
A single test for rejecting the null hypothesis in subgroups and in the overall sample.
Lin, Yunzhi; Zhou, Kefei; Ganju, Jitendra
2017-01-01
In clinical trials, some patient subgroups are likely to demonstrate larger effect sizes than other subgroups. For example, the effect size, or informally the benefit with treatment, is often greater in patients with a moderate condition of a disease than in those with a mild condition. A limitation of the usual method of analysis is that it does not incorporate this ordering of effect size by patient subgroup. We propose a test statistic which supplements the conventional test by including this information and simultaneously tests the null hypothesis in pre-specified subgroups and in the overall sample. It results in more power than the conventional test when the differences in effect sizes across subgroups are at least moderately large; otherwise it loses power. The method involves combining p-values from models fit to pre-specified subgroups and the overall sample in a manner that assigns greater weight to subgroups in which a larger effect size is expected. Results are presented for randomized trials with two and three subgroups.
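A minimal sketch of this kind of weighted combination, using the inverse-normal (Stouffer) rule; the weights, the independence simplification, and the example p-values are illustrative choices, not the paper's specification:

```python
# Weighted inverse-normal (Stouffer) combination of one-sided p-values
# from pre-specified subgroups and the overall sample. Larger weights go
# to subgroups expected to show larger effects. For simplicity this
# sketch treats the p-values as independent; the overall sample and its
# subgroups are of course correlated, which the paper's method handles.
import numpy as np
from scipy.stats import norm

def combined_test(p_values, weights):
    """Combine one-sided p-values with weights; returns a combined p."""
    z = norm.isf(np.asarray(p_values))           # small p -> large z
    w = np.asarray(weights, dtype=float)
    z_comb = np.sum(w * z) / np.sqrt(np.sum(w ** 2))
    return norm.sf(z_comb)

# Example: the moderate-condition subgroup is expected to benefit more,
# so it receives more weight than the mild subgroup.
p_overall, p_moderate, p_mild = 0.04, 0.01, 0.30
print(combined_test([p_overall, p_moderate, p_mild], weights=[1.0, 0.8, 0.4]))
```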
Amidžić Klarić, Daniela; Klarić, Ilija; Mornar, Ana; Velić, Darko; Velić, Natalija
2015-08-01
This study presents data on the content of 21 minerals and heavy metals in 15 blackberry wines made from conventionally and organically grown blackberries. The objective was to classify the blackberry wine samples based on their mineral composition and the cultivation method of the starting raw material by using chemometric analysis. The metal content of the Croatian blackberry wine samples was determined by AAS after dry ashing. The comparison between the organic and conventional groups of investigated blackberry wines showed statistically significant differences in the concentrations of Si and Li, with the organic group containing higher concentrations of these elements. According to multivariate data analysis, the model based on the original metal content data set finally included seven original variables (K, Fe, Mn, Cu, Ba, Cd and Cr) and gave a satisfactory separation of the two cultivation methods of the starting raw material.
Self-learning Monte Carlo method and cumulative update in fermion systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Junwei; Shen, Huitao; Qi, Yang
2017-06-07
In this study, we develop the self-learning Monte Carlo (SLMC) method, a general-purpose numerical method recently introduced to simulate many-body systems, for studying interacting fermion systems. Our method uses a highly efficient update algorithm, which we design and dub "cumulative update", to generate new candidate configurations in the Markov chain based on a self-learned bosonic effective model. From a general analysis and a numerical study of the double exchange model as an example, we find that the SLMC with cumulative update drastically reduces the computational cost of the simulation, while remaining statistically exact. Remarkably, its computational complexity is far less than that of the conventional algorithm with local updates.
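A toy sketch of the SLMC idea, with a 1D Ising chain standing in for the fermion problem; the couplings, the linear effective model, and the fitting step are illustrative assumptions, not the authors' cumulative update:

```python
# Toy self-learning Monte Carlo (SLMC) sketch: a 1D Ising chain with
# first- and second-neighbor couplings is the "expensive" model; a
# nearest-neighbor effective model is fitted to configurations from
# ordinary local updates and then used to propose long evolved moves,
# which an exact Metropolis test corrects so the sampling stays exact.
import numpy as np

rng = np.random.default_rng(0)
L, beta, J1, J2 = 64, 0.5, 1.0, 0.3

def E_exact(s):
    return -J1 * np.sum(s * np.roll(s, 1)) - J2 * np.sum(s * np.roll(s, 2))

def E_eff(s, Jeff):
    return -Jeff * np.sum(s * np.roll(s, 1))

def local_sweep(s, energy):
    for i in rng.integers(0, L, size=L):      # random-scan single-spin flips
        trial = s.copy()
        trial[i] = -trial[i]
        if rng.random() < np.exp(-beta * (energy(trial) - energy(s))):
            s = trial
    return s

# Step 1: learn the effective coupling by least squares on local-update data.
s = rng.choice([-1, 1], size=L)
feats, targs = [], []
for _ in range(200):
    s = local_sweep(s, E_exact)
    feats.append(-np.sum(s * np.roll(s, 1)))
    targs.append(E_exact(s))
Jeff = np.linalg.lstsq(np.array(feats, float)[:, None],
                       np.array(targs, float), rcond=None)[0][0]

# Step 2: propose by evolving the effective model; accept with the ratio of
# exact-minus-effective energy differences (valid because the proposal
# chain is reversible with respect to the effective Boltzmann weight).
def slmc_step(s):
    prop = s.copy()
    for _ in range(10):
        prop = local_sweep(prop, lambda c: E_eff(c, Jeff))
    dlog = -beta * ((E_exact(prop) - E_eff(prop, Jeff))
                    - (E_exact(s) - E_eff(s, Jeff)))
    return prop if rng.random() < np.exp(dlog) else s

for _ in range(100):
    s = slmc_step(s)
print("Jeff =", round(float(Jeff), 3), " final exact energy:", E_exact(s))
```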
Fukuda, Haruhisa; Kuroki, Manabu
2016-03-01
To develop and internally validate a surgical site infection (SSI) prediction model for Japan. Retrospective observational cohort study. We analyzed surveillance data submitted to the Japan Nosocomial Infections Surveillance system for patients who had undergone target surgical procedures from January 1, 2010, through December 31, 2012. Logistic regression analyses were used to develop statistical models for predicting SSIs. An SSI prediction model was constructed for each of the procedure categories by statistically selecting the appropriate risk factors from among the collected surveillance data and determining their optimal categorization. Standard bootstrapping techniques were applied to assess potential overfitting. The C-index was used to compare the predictive performances of the new statistical models with those of models based on conventional risk index variables. The study sample comprised 349,987 cases from 428 participant hospitals throughout Japan, and the overall SSI incidence was 7.0%. The C-indices of the new statistical models were significantly higher than those of the conventional risk index models in 21 (67.7%) of the 31 procedure categories (P<.05). No significant overfitting was detected. Japan-specific SSI prediction models were shown to generally have higher accuracy than conventional risk index models. These new models may have applications in assessing hospital performance and identifying high-risk patients in specific procedure categories.
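A hedged sketch of this pipeline, with invented predictors standing in for the surveillance variables: a logistic model, the C-index (equal to the ROC AUC for a binary outcome), and a Harrell-style bootstrap optimism check in place of the paper's exact bootstrap scheme:

```python
# Illustrative SSI-model pipeline: fit a logistic regression, compute the
# apparent C-index, then estimate optimism (overfitting) by refitting on
# bootstrap resamples. All variables and data are invented stand-ins.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 2000
X = np.column_stack([
    rng.integers(0, 4, n),         # e.g., ASA score category (assumed)
    rng.normal(120, 40, n),        # e.g., operation duration, minutes
    rng.integers(0, 2, n),         # e.g., wound class indicator
]).astype(float)
logit = -3.0 + 0.4 * X[:, 0] + 0.008 * X[:, 1] + 0.6 * X[:, 2]
y = rng.random(n) < 1 / (1 + np.exp(-logit))

model = LogisticRegression(max_iter=1000).fit(X, y)
apparent_c = roc_auc_score(y, model.predict_proba(X)[:, 1])

# Harrell-style optimism: bootstrap-sample AUC minus original-data AUC.
optimism = []
for _ in range(100):
    idx = rng.integers(0, n, n)
    m = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
    auc_boot = roc_auc_score(y[idx], m.predict_proba(X[idx])[:, 1])
    auc_orig = roc_auc_score(y, m.predict_proba(X)[:, 1])
    optimism.append(auc_boot - auc_orig)

print("apparent C-index:", round(apparent_c, 3),
      " optimism-corrected:", round(apparent_c - np.mean(optimism), 3))
```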
Multi-mounted X-ray cone-beam computed tomography
NASA Astrophysics Data System (ADS)
Fu, Jian; Wang, Jingzheng; Guo, Wei; Peng, Peng
2018-04-01
As a powerful nondestructive inspection technique, X-ray computed tomography (X-CT) has been widely applied in clinical diagnosis, industrial production and cutting-edge research. Imaging efficiency is currently one of the major obstacles to the wider application of X-CT. In this paper, a multi-mounted three-dimensional cone-beam X-CT (MM-CBCT) method is reported. It consists of a novel multi-mounted cone-beam scanning geometry and a corresponding three-dimensional statistical iterative reconstruction algorithm. The scanning geometry is the defining feature of the design and differs significantly from current CBCT systems. By permitting the cone-beam scanning of multiple objects simultaneously, the proposed approach has the potential to achieve an imaging efficiency orders of magnitude greater than conventional methods. Although multiple objects can also be bundled together and scanned simultaneously by conventional CBCT methods, doing so increases the penetration thickness and introduces signal crosstalk; MM-CBCT largely avoids these problems. This work comprises a numerical study of the method and its experimental verification using a dataset measured with a developed MM-CBCT prototype system. This technique may provide a solution for CT inspection on a large scale.
Quantum tomography for collider physics: illustrations with lepton-pair production
NASA Astrophysics Data System (ADS)
Martens, John C.; Ralston, John P.; Takaki, J. D. Tapia
2018-01-01
Quantum tomography is a method to experimentally extract all that is observable about a quantum mechanical system. We introduce quantum tomography to collider physics with the illustration of the angular distribution of lepton pairs. The tomographic method bypasses much of the field-theoretic formalism to concentrate on what can be observed with experimental data. We provide a practical, experimentally driven guide to model-independent analysis using density matrices at every step. Comparison with traditional methods of analyzing angular correlations of inclusive reactions finds many advantages in the tomographic method, which include manifest Lorentz covariance, direct incorporation of positivity constraints, exhaustively complete polarization information, and new invariants free from frame conventions. For example, experimental data can determine the entanglement entropy of the production process. We give reproducible numerical examples and provide a supplemental standalone computer code that implements the procedure. We also highlight a property of complex positivity that guarantees in a least-squares type fit that a local minimum of a χ2 statistic will be a global minimum: there are no isolated local minima. This property, with an automated implementation of positivity, promises to mitigate issues relating to multiple minima and convention dependence that have been problematic in previous work on angular distributions.
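The paper's standalone code is not reproduced here; the following toy fit of a 2×2 density matrix merely illustrates the positivity-by-construction idea via a Cholesky-style parametrization (the observables, noise model, and starting point are assumptions):

```python
# Fit a density matrix with positivity built in: parametrize
# rho = T T^dagger / tr(T T^dagger) with T lower-triangular, so every
# candidate visited by the least-squares fit is automatically positive
# semidefinite with unit trace.
import numpy as np
from scipy.optimize import least_squares

pauli = [np.array([[0, 1], [1, 0]], complex),
         np.array([[0, -1j], [1j, 0]]),
         np.array([[1, 0], [0, -1]], complex)]

def rho_from_params(t):
    T = np.array([[t[0], 0], [t[2] + 1j * t[3], t[1]]])
    R = T @ T.conj().T
    return R / np.trace(R).real

def residuals(t, measured):
    r = rho_from_params(t)
    model = [np.trace(r @ P).real for P in pauli]   # <X>, <Y>, <Z>
    return np.array(model) - measured

true = rho_from_params(np.array([1.0, 0.5, 0.3, -0.2]))
measured = np.array([np.trace(true @ P).real for P in pauli])
measured += np.random.default_rng(2).normal(0, 0.02, 3)   # noisy "data"

fit = least_squares(residuals, x0=np.array([1.0, 1.0, 0.0, 0.0]),
                    args=(measured,))
rho_hat = rho_from_params(fit.x)
print(np.linalg.eigvalsh(rho_hat))   # eigenvalues >= 0 by construction
```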
Sangkharak, Kanokphorn
2011-11-01
The present study investigated the development of high sugar production by optimization of an enzymatic hydrolysis process using both conventional and statistical methods, as well as the production of ethanol from the selected wastepaper source. Among four sources of pretreated wastepaper (office paper, newspaper, handbills and cardboard), office paper gave the highest contents of cellulose (87.12%) and holocellulose (89.07%). The effects of the amount of wastepaper, the pretreatment method and the type of enzyme on reducing-sugar production from office paper were studied using conventional methods. The highest reducing-sugar production (1851.28 µg L(-1); 37.03% conversion of glucose) was obtained under the optimal condition of 40 mg of office paper, pretreated with steam explosion and hydrolysed with a combination of cellulases from Aspergillus niger and Trichoderma viride at a fixed loading rate of 20 FPU g(-1) sample. The interaction effects of wastepaper amount, enzyme concentration and incubation time were then studied by a statistical method using a central composite design. The optimal conditions were 43.97 µg L(-1) wastepaper, 28.14 FPU g(-1) sample enzyme concentration and 53.73 h incubation time, giving the highest sugar production (2184.22 µg L(-1)) and glucose conversion (43.68%). The ethanol production from pretreated office paper using Saccharomyces cerevisiae in a simultaneous saccharification and fermentation process was 21.02 g L(-1) after 36 h of cultivation, corresponding to an ethanol volumetric production rate of 0.58 g ethanol L(-1) h(-1).
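A sketch of the statistical step, assuming a standard three-factor central composite design and an invented response; it fits a second-order response surface by least squares and solves for the stationary point:

```python
# Central-composite-design optimization sketch: 8 factorial points,
# 6 axial points, 6 center replicates; fit a full quadratic model and
# locate the stationary point of the fitted surface. Design levels and
# the simulated "sugar yield" are illustrative, not the study's runs.
import numpy as np

# Coded factors: x1 = wastepaper amount, x2 = enzyme loading, x3 = time.
X = np.array([[a, b, c] for a in (-1, 1) for b in (-1, 1) for c in (-1, 1)]
             + [[p * 1.682, 0, 0] for p in (-1, 1)]
             + [[0, p * 1.682, 0] for p in (-1, 1)]
             + [[0, 0, p * 1.682] for p in (-1, 1)]
             + [[0, 0, 0]] * 6, dtype=float)
rng = np.random.default_rng(3)
y = (2000 - 50 * X[:, 0] ** 2 - 80 * X[:, 1] ** 2 - 30 * X[:, 2] ** 2
     + 40 * X[:, 0] * X[:, 1] + rng.normal(0, 20, len(X)))

def design_matrix(X):
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1 * x2, x1 * x3, x2 * x3,
                            x1 ** 2, x2 ** 2, x3 ** 2])

beta, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)

# Stationary point: set the gradient of the fitted quadratic to zero.
b = beta[1:4]
B = np.array([[2 * beta[7], beta[4], beta[5]],
              [beta[4], 2 * beta[8], beta[6]],
              [beta[5], beta[6], 2 * beta[9]]])
print("stationary point (coded units):", np.linalg.solve(B, -b))
```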
Faraji, Hakim; Helalizadeh, Masoumeh; Kordi, Mohammad Reza
2018-01-01
A rapid, simple, and sensitive approach to the analysis of trihalomethanes (THMs) in swimming pool water samples has been developed. The main goal of this study was to overcome or improve upon the shortcomings of conventional dispersive liquid-liquid microextraction (DLLME) and to maximize the realization of green analytical chemistry principles. The method involves a simple vortex-assisted microextraction step, in the absence of dispersive solvent, followed by a salting-out effect that eliminates the centrifugation step. A bell-shaped device and a solidifiable solvent were used to simplify collection of the extraction solvent after phase separation. Optimization of the independent variables was performed using chemometric methods in three steps, and the method was statistically validated against authoritative guidance documents. The extraction was completed in less than 8 min, and the limits of detection were in the range of 4-72 ng L(-1). Using this method, good linearity and precision were achieved. THM determination in different real samples showed that in some cases the concentration of total THMs exceeded the threshold values set by accredited healthcare organizations. The method showed satisfactory analytical figures of merit. Graphical Abstract: A novel green microextraction technique for overcoming the challenges of conventional DLLME. The proposed procedure complies with the principles of green/sustainable analytical chemistry, comprising decreasing the sample size, easing automation of the process, reducing organic waste, diminishing energy consumption, replacing toxic reagents with safer ones, and enhancing operator safety.
Aslanimehr, Masoomeh; Rezvani, Shirin; Mahmoudi, Ali; Moosavi, Najmeh
2017-03-01
Candida species are believed to play an important role in the initiation and progression of denture stomatitis. The type of denture material also influences the adhesion of Candida and the development of stomatitis. The aim of this study was to compare the adherence of Candida albicans to conventional and injection-molding acrylic denture base materials. Twenty injection-molding and 20 conventional pressure-pack acrylic discs (10×10×2 mm) were prepared according to their manufacturers' instructions. Immediately before the study, samples were placed in sterile water for 3 days to remove residual monomers and then sterilized using an ultraviolet light unit for 10 minutes. A 1×10^8 CFU/ml suspension of C. albicans ATCC-10231 was prepared from a 48-h culture of the organism on Sabouraud dextrose agar plates incubated at 37°C, and 100 μL of this suspension was placed on the surface of each disc. After incubation at 37°C for 1 hour, the samples were washed with normal saline to remove non-adherent cells. Attached cells were counted using the colony-count method after shaking at 3000 rpm for 20 seconds. Each group was tested 108 times and the data were statistically analyzed by t-test. Quantitative analysis revealed that the difference in mean colony counts of C. albicans adhering to the conventional acrylic material (8.3×10^3) and to the injection-molding acrylic resin (6×10^3) was statistically significant (p < 0.001). The significant reduction of C. albicans adherence to the injection-molded acrylic resin makes it valuable for patients at high risk of denture stomatitis.
Park, Young Mi; Fornage, Bruno D; Benveniste, Ana Paula; Fox, Patricia S; Bassett, Roland L; Yang, Wei Tse
2014-12-01
The purpose of this study was to determine the diagnostic value of strain elastography (SE) alone and in combination with gray-scale ultrasound in the diagnosis of benign versus metastatic disease for abnormal axillary lymph nodes in breast cancer patients. Patients with breast cancer and axillary lymph nodes suspicious for metastatic disease on conventional ultrasound who underwent SE of the suspicious node before ultrasound-guided fine-needle aspiration biopsy (FNAB) were included in this study. On conventional ultrasound, the long- and short-axis diameters, long-axis-to-short-axis ratio, cortical echogenicity, thickness, and evenness were documented. The nodal vascularity was assessed on power Doppler imaging. Elastograms were evaluated for the percentage of black (hard) areas in the lymph node, and the SE-ultrasound size ratio was calculated. Two readers assessed the images independently and then in consensus in cases of disagreement. ROC AUCs were calculated for conventional ultrasound, SE, and both methods combined. Interreader reliability was assessed using kappa statistics. A total of 101 patients with 104 nodes were examined; 35 nodes were benign, and 69 had metastases. SE alone showed a significantly lower AUC (62%) than did conventional ultrasound (92%) (p<0.001). There was no difference between the AUC of conventional ultrasound and the AUC of the combination of conventional ultrasound and SE (93%) (p=0.16). Interreader reliability was moderate for all variables (κ≥0.60) except the SE-ultrasound size ratio (κ=0.35). Added SE does not improve the diagnostic ability of conventional ultrasound when evaluating abnormal axillary lymph nodes.
Combining statistical inference and decisions in ecology
Williams, Perry J.; Hooten, Mevin B.
2016-01-01
Statistical decision theory (SDT) is a sub-field of decision theory that formally incorporates statistical investigation into a decision-theoretic framework to account for uncertainties in a decision problem. SDT provides a unifying analysis of three types of information: statistical results from a data set, knowledge of the consequences of potential choices (i.e., loss), and prior beliefs about a system. SDT links the theoretical development of a large body of statistical methods including point estimation, hypothesis testing, and confidence interval estimation. The theory and application of SDT have mainly been developed and published in the fields of mathematics, statistics, operations research, and other decision sciences, but have had limited exposure in ecology. Thus, we provide an introduction to SDT for ecologists and describe its utility for linking the conventionally separate tasks of statistical investigation and decision making in a single framework. We describe the basic framework of both Bayesian and frequentist SDT, its traditional use in statistics, and discuss its application to decision problems that occur in ecology. We demonstrate SDT with two types of decisions: Bayesian point estimation, and an applied management problem of selecting a prescribed fire rotation for managing a grassland bird species. Central to SDT, and decision theory in general, are loss functions. Thus, we also provide basic guidance and references for constructing loss functions for an SDT problem.
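A minimal sketch of the point-estimation example central to SDT: the Bayes action under a given loss minimizes expected posterior loss, recovering the posterior mean under squared error and the posterior median under absolute error. The Beta posterior and survey numbers are illustrative assumptions:

```python
# Bayes estimators under two loss functions, via Monte Carlo draws from
# a Beta posterior (Beta(1,1) prior; 7 detections in 20 surveys assumed).
import numpy as np
from scipy import stats

posterior = stats.beta(1 + 7, 1 + 13)
draws = posterior.rvs(size=100_000, random_state=4)

def bayes_estimate(draws, loss):
    """Pick the action minimizing expected posterior loss on a grid."""
    grid = np.linspace(0, 1, 501)
    risk = [np.mean(loss(a, draws)) for a in grid]
    return grid[int(np.argmin(risk))]

sq = bayes_estimate(draws, lambda a, th: (a - th) ** 2)
ab = bayes_estimate(draws, lambda a, th: np.abs(a - th))
print("squared-error action:", sq, " ~ posterior mean", posterior.mean())
print("absolute-error action:", ab, " ~ posterior median", posterior.median())
```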
Utility of Modified Ultrafast Papanicolaou Stain in Cytological Diagnosis
Arakeri, Surekha Ulhas
2017-01-01
Introduction: The need for minimal turnaround time in assessing Fine Needle Aspiration Cytology (FNAC) has encouraged innovations in staining techniques that require less staining time while preserving unequivocal cell morphology. The standard protocol for the conventional Papanicolaou (PAP) stain requires about 40 minutes. To overcome this, the Ultrafast Papanicolaou (UFP) stain was introduced, which reduces staining time to 90 seconds and also enhances quality. However, the reagents required for it were not easily available; hence, the Modified Ultrafast Papanicolaou (MUFP) stain was introduced subsequently. Aim: To assess the efficacy of MUFP staining by comparing the quality of the MUFP stain with the conventional PAP stain. Materials and Methods: FNAC was performed using a 10 ml disposable syringe and a 22-23 G needle. A total of 131 FNAC cases were studied: lymph node (30), thyroid (38), breast (22), skin and soft tissue (24), salivary gland (11) and visceral organs (6). Two smears were prepared and stained with MUFP and conventional PAP stains. Scores were given on four parameters: background of smears, overall staining pattern, cell morphology and nuclear staining. A Quality Index (QI) was calculated as the ratio of the total score achieved to the maximum score possible. Statistical analysis using the chi-square test was applied to each of the four parameters before obtaining the QI for both stains. Student's t-test was applied to evaluate the efficacy of MUFP in comparison with the conventional PAP stain. Results: The QI of MUFP for thyroid, breast, lymph node, skin and soft tissue, salivary gland and visceral organs was 0.89, 0.85, 0.89, 0.83, 0.92 and 0.78, respectively. Compared with the conventional PAP stain, the QI of MUFP smears was better in all except the visceral organ cases, and the difference was statistically significant. MUFP showed a clear red-blood-cell background, transparent cytoplasm and crisp nuclear features. Conclusion: MUFP is fast, reliable and can be performed with locally available reagents with unequivocal morphology, which is the need of the hour for a cytopathology set-up. PMID:28511391
Adwan, Ghaleb; Salameh, Yousef; Adwan, Kamel; Barakat, Ali
2012-01-01
Objective: To determine the anticandidal activity of nine toothpastes containing sodium fluoride, sodium monofluorophosphate and herbal extracts as active ingredients against 45 oral and non-oral Candida albicans (C. albicans) isolates. Methods: The antifungal activity of these toothpaste formulations was determined using a standard agar well diffusion method. Statistical analysis was performed with SPSS for Windows version 15, comparing mean values using one-way ANOVA with the post-hoc least significant difference (LSD) method. A P value of less than 0.05 was considered significant. Results: All toothpastes studied were effective in inhibiting the growth of all C. albicans isolates. The highest anticandidal activity was obtained from the toothpaste containing both herbal extracts and sodium fluoride as active ingredients, while the lowest activity was obtained from the toothpaste containing sodium monofluorophosphate as the active ingredient. The antifungal activity of Parodontax toothpaste was significantly different (P < 0.001) against C. albicans isolates compared with toothpastes containing sodium fluoride or herbal products. Conclusions: This study demonstrates that toothpaste containing both herbal extracts and sodium fluoride as active ingredients is more effective in controlling C. albicans, while toothpaste containing monofluorophosphate as the active ingredient is less effective. Some of the herbal toothpaste formulations studied appear to be as effective as the fluoride formulations and can be used as an alternative to conventional formulations by individuals interested in naturally based products. Our results may provide invaluable information for dental professionals. PMID:23569933
Kim, Ki-Baek; Kim, Woong-Chul; Kim, Hae-Young; Kim, Ji-Hwan
2013-07-01
This in vitro study aimed to evaluate and compare the marginal fit of three-unit fixed dental prostheses (FDPs) fabricated using a newly developed direct metal laser sintering (DMLS) system with that of three-unit FDPs made by the conventional lost-wax (LW) method. Ten cobalt-chromium alloy three-unit FDPs were fabricated using the DMLS system and another ten nickel-chromium alloy FDPs using the LW method. Marginal fit was examined using a light-body silicone. After setting, the silicone film was cut into four parts and the thickness of the silicone layer was measured at 160× magnification using a digital microscope to determine the absolute marginal discrepancy (AMD), marginal gap (MG) and internal gap (IG). A repeated-measures ANOVA was performed using the SPSS statistical package version 12.0 (α=0.05). The mean values of AMD, MG and IG were significantly larger in the DMLS group than in the LW group (p<0.001). Means of AMD, MG and IG in the first molars were 83.3, 80.0 and 82.0 μm in the LW group, and 128.0, 112.0 and 159.5 μm in the DMLS group, respectively. No significant difference between measurements for premolars and molars was found (p>0.05). The marginal fit of the DMLS system appeared significantly inferior to that of the conventional LW method and slightly larger than the acceptable range. Further improvement of the DMLS system may be required for clinical application. Copyright © 2013 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
Catto, James W F; Linkens, Derek A; Abbod, Maysam F; Chen, Minyou; Burton, Julian L; Feeley, Kenneth M; Hamdy, Freddie C
2003-09-15
New techniques for the prediction of tumor behavior are needed, because statistical analysis has poor accuracy and is not applicable to the individual. Artificial intelligence (AI) may provide suitable methods. Whereas artificial neural networks (ANN), the best-studied form of AI, have been used successfully, their hidden workings remain an obstacle to their acceptance. Neuro-fuzzy modeling (NFM), another AI method, has a transparent functional layer and is without many of the drawbacks of ANN. We compared the predictive accuracies of NFM, ANN and traditional statistical methods for the behavior of bladder cancer. Experimental molecular biomarkers, including p53 and the mismatch repair proteins, and conventional clinicopathological data were studied in a cohort of 109 patients with bladder cancer. For all three methods, models were produced to predict the presence and timing of tumor relapse. Both AI methods predicted relapse with an accuracy ranging from 88% to 95%, superior to statistical methods (71-77%; P < 0.0006). NFM appeared better than ANN at predicting the timing of relapse (P = 0.073). The use of AI can accurately predict cancer behavior, and NFM has similar or superior predictive accuracy to ANN. However, unlike the impenetrable "black box" of a neural network, the rules of NFM are transparent, enabling validation from clinical knowledge and the manipulation of input variables to allow exploratory predictions. This technique could be used widely in a variety of areas of medicine.
Renormalization Group Tutorial
NASA Technical Reports Server (NTRS)
Bell, Thomas L.
2004-01-01
Complex physical systems sometimes have statistical behavior characterized by power-law dependence on the parameters of the system and spatial variability with no particular characteristic scale as the parameters approach critical values. The renormalization group (RG) approach was developed in the fields of statistical mechanics and quantum field theory to derive quantitative predictions of such behavior in cases where conventional methods of analysis fail. Techniques based on these ideas have since been extended to treat problems in many different fields, and in particular, the behavior of turbulent fluids. This lecture will describe a relatively simple but nontrivial example of the RG approach applied to the diffusion of photons out of a stellar medium when the photons have wavelengths near that of an emission line of atoms in the medium.
Approximate sample size formulas for the two-sample trimmed mean test with unequal variances.
Luh, Wei-Ming; Guo, Jiin-Huarng
2007-05-01
Yuen's two-sample trimmed mean test statistic is one of the most robust methods to apply when variances are heterogeneous. The present study develops formulas for the sample size required for the test. The formulas are applicable for the cases of unequal variances, non-normality and unequal sample sizes. Given the specified alpha and the power (1-beta), the minimum sample size needed by the proposed formulas under various conditions is less than is given by the conventional formulas. Moreover, given a specified size of sample calculated by the proposed formulas, simulation results show that Yuen's test can achieve statistical power which is generally superior to that of the approximate t test. A numerical example is provided.
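A sketch of Yuen's test itself, written from the standard textbook definitions (20% trimming, winsorized variances, Welch-style degrees of freedom); the data are simulated, and the code is not taken from the paper:

```python
# Yuen's two-sample trimmed-mean test: trimmed means compared using
# winsorized variances and approximate (Welch-type) degrees of freedom.
import numpy as np
from scipy import stats

def yuen_test(x, y, trim=0.2):
    def pieces(a):
        a = np.sort(np.asarray(a, dtype=float))
        n = len(a)
        g = int(np.floor(trim * n))
        h = n - 2 * g                                # effective sample size
        tmean = a[g:n - g].mean()                    # trimmed mean
        w = a.copy()
        w[:g], w[n - g:] = a[g], a[n - g - 1]        # winsorize the tails
        d = (n - 1) * w.var(ddof=1) / (h * (h - 1))
        return tmean, d, h

    m1, d1, h1 = pieces(x)
    m2, d2, h2 = pieces(y)
    t = (m1 - m2) / np.sqrt(d1 + d2)
    df = (d1 + d2) ** 2 / (d1 ** 2 / (h1 - 1) + d2 ** 2 / (h2 - 1))
    return t, df, 2 * stats.t.sf(abs(t), df)

rng = np.random.default_rng(5)
x = rng.standard_t(df=3, size=25) + 0.8     # heavy-tailed, shifted
y = rng.standard_t(df=3, size=40) * 2.0     # unequal variance and n
print(yuen_test(x, y))
```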
Redefining the lower statistical limit in x-ray phase-contrast imaging
NASA Astrophysics Data System (ADS)
Marschner, M.; Birnbacher, L.; Willner, M.; Chabior, M.; Fehringer, A.; Herzen, J.; Noël, P. B.; Pfeiffer, F.
2015-03-01
Phase-contrast x-ray computed tomography (PCCT) is currently investigated and developed as a potentially very interesting extension of conventional CT, because it promises to provide high soft-tissue contrast for weakly absorbing samples. For data acquisition several images at different grating positions are combined to obtain a phase-contrast projection. For short exposure times, which are necessary for lower radiation dose, the photon counts in a single stepping position are very low. In this case, the currently used phase-retrieval does not provide reliable results for some pixels. This uncertainty results in statistical phase wrapping, which leads to a higher standard deviation in the phase-contrast projections than theoretically expected. For even lower statistics, the phase retrieval breaks down completely and the phase information is lost. New measurement procedures rely on a linear approximation of the sinusoidal phase stepping curve around the zero crossings. In this case only two images are acquired to obtain the phase-contrast projection. The approximation is only valid for small phase values. However, typically nearly all pixels are within this regime due to the differential nature of the signal. We examine the statistical properties of a linear approximation method and illustrate by simulation and experiment that the lower statistical limit can be redefined using this method. That means that the phase signal can be retrieved even with very low photon counts and statistical phase wrapping can be avoided. This is an important step towards enhanced image quality in PCCT with very low photon counts.
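A numerical sketch contrasting the two retrievals described above, under assumed stepping-curve parameters: conventional phase stepping recovers the phase from the first Fourier coefficient, while the linear approximation uses two exposures near the zero crossings and small-phase linearization:

```python
# Phase retrieval from a sinusoidal stepping curve. Offset, visibility,
# phase, and the low Poisson counts are assumed values chosen to mimic
# the low-dose regime discussed above, where both estimates get noisy.
import numpy as np

rng = np.random.default_rng(6)
a, b, phi_true, N = 100.0, 40.0, 0.05, 8   # offset, amplitude, phase, steps

# Conventional retrieval: N steps over one period, phase from FFT bin 1.
k = np.arange(N)
counts = rng.poisson(a + b * np.cos(2 * np.pi * k / N + phi_true))
phi_fft = np.angle(np.fft.fft(counts)[1])

# Linear approximation: two exposures at the steepest points of the curve
# (offsets +pi/2 and -pi/2). With a and b known from a reference scan and
# phi small, sin(phi) ~ phi, so phi ~ (I2 - I1) / (2 b).
I1 = rng.poisson(a + b * np.cos(np.pi / 2 + phi_true))    # ~ a - b*sin(phi)
I2 = rng.poisson(a + b * np.cos(-np.pi / 2 + phi_true))   # ~ a + b*sin(phi)
phi_lin = (I2 - I1) / (2 * b)

print(f"true {phi_true:+.3f}   phase stepping {phi_fft:+.3f}   "
      f"two-point linear {phi_lin:+.3f}")
```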
New U.S. Geological Survey Method for the Assessment of Reserve Growth
Klett, Timothy R.; Attanasi, E.D.; Charpentier, Ronald R.; Cook, Troy A.; Freeman, P.A.; Gautier, Donald L.; Le, Phuong A.; Ryder, Robert T.; Schenk, Christopher J.; Tennyson, Marilyn E.; Verma, Mahendra K.
2011-01-01
Reserve growth is defined as the estimated increases in quantities of crude oil, natural gas, and natural gas liquids that have the potential to be added to remaining reserves in discovered accumulations through extension, revision, improved recovery efficiency, and additions of new pools or reservoirs. A new U.S. Geological Survey method was developed to assess the reserve-growth potential of technically recoverable crude oil and natural gas to be added to reserves under proven technology currently in practice within the trend or play, or which reasonably can be extrapolated from geologically similar trends or plays. This method currently is in use to assess potential additions to reserves in discovered fields of the United States. The new approach involves (1) individual analysis of selected large accumulations that contribute most to reserve growth, and (2) conventional statistical modeling of reserve growth in remaining accumulations. This report will focus on the individual accumulation analysis. In the past, the U.S. Geological Survey estimated reserve growth by statistical methods using historical recoverable-quantity data. Those statistical methods were based on growth rates averaged by the number of years since accumulation discovery. Accumulations in mature petroleum provinces with volumetrically significant reserve growth, however, bias statistical models of the data; therefore, accumulations with significant reserve growth are best analyzed separately from those with less significant reserve growth. Large (greater than 500 million barrels) and older (with respect to year of discovery) oil accumulations increase in size at greater rates late in their development history in contrast to more recently discovered accumulations that achieve most growth early in their development history. Such differences greatly affect the statistical methods commonly used to forecast reserve growth. The individual accumulation-analysis method involves estimating the in-place petroleum quantity and its uncertainty, as well as the estimated (forecasted) recoverability and its respective uncertainty. These variables are assigned probabilistic distributions and are combined statistically to provide probabilistic estimates of ultimate recoverable quantities. Cumulative production and remaining reserves are then subtracted from the estimated ultimate recoverable quantities to provide potential reserve growth. In practice, results of the two methods are aggregated to various scales, the highest of which includes an entire country or the world total. The aggregated results are reported along with the statistically appropriate uncertainties.
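A minimal Monte Carlo sketch of the individual-accumulation calculation described above, with invented distributions and volumes standing in for the assessed inputs:

```python
# Combine a probabilistic in-place volume with a probabilistic recovery
# factor to get ultimate recoverable quantities, then subtract known
# cumulative production and remaining reserves to get reserve growth.
# All distributions and numbers are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

in_place = rng.lognormal(mean=np.log(2000), sigma=0.3, size=n)  # MMBO
recovery = rng.beta(8, 12, size=n)             # recovery efficiency, 0-1
ultimate = in_place * recovery                 # ultimate recoverable

cum_production, remaining_reserves = 450.0, 150.0   # MMBO, assumed known
growth = ultimate - cum_production - remaining_reserves

p = np.percentile(growth, [5, 50, 95])
print(f"reserve growth (MMBO): F95={p[0]:.0f}  F50={p[1]:.0f}  F5={p[2]:.0f}")
```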
Baek, Hye Jin; Kim, Dong Wook; Ryu, Ji Hwa; Lee, Yoo Jin
2013-09-01
There has been no study comparing the diagnostic accuracy of an experienced radiologist with that of a trainee for nasal bone fracture. The aims were to compare the diagnostic accuracy of conventional radiography and computed tomography (CT) for the identification of nasal bone fractures and to evaluate the interobserver reliability between a staff radiologist and a trainee. A total of 108 patients who underwent conventional radiography and CT after acute nasal trauma were included in this retrospective study. Two readers, a staff radiologist and a second-year resident, independently assessed the results of the imaging studies. Of the 108 patients, the presence of a nasal bone fracture was confirmed in 88 (81.5%). The number of non-depressed fractures was higher than the number of depressed fractures. In nine (10.2%) patients, nasal bone fractures were identified only on conventional radiography, including three depressed and six non-depressed fractures. CT was more accurate than conventional radiography for the identification of nasal bone fractures as determined by both readers (P < 0.05); all diagnostic indices of the experienced radiologist were similar to or higher than those of the trainee; and κ statistics showed moderate agreement between the two diagnostic tools for both readers. There was no statistical difference in interobserver reliability for either imaging modality in the identification of nasal bone fractures. For the identification of nasal bone fractures, CT was significantly superior to conventional radiography. Although the staff radiologist showed better values than the trainee in identifying nasal bone fractures and differentiating between depressed and non-depressed fractures, there was no statistically significant difference between the radiologist and the trainee in the interpretation of conventional radiography and CT.
Effects of Item Exposure for Conventional Examinations in a Continuous Testing Environment.
ERIC Educational Resources Information Center
Hertz, Norman R.; Chinn, Roberta N.
This study explored the effect of item exposure on two conventional examinations administered as computer-based tests. A principal hypothesis was that item exposure would have little or no effect on average difficulty of the items over the course of an administrative cycle. This hypothesis was tested by exploring conventional item statistics and…
Liang, Shanshan; Yuan, Fusong; Luo, Xu; Yu, Zhuoren; Tang, Zhihui
2018-04-05
Marginal discrepancy is key to evaluating the accuracy of fixed dental prostheses, and an improved method of evaluating marginal discrepancy is needed. The purpose of this in vitro study was to evaluate the absolute marginal discrepancy of ceramic crowns fabricated using conventional and digital methods with a digital method for the quantitative evaluation of absolute marginal discrepancy. The novel method was based on 3-dimensional scanning, iterative closest point registration techniques, and reverse engineering theory. Six standard tooth preparations for the right maxillary central incisor, right maxillary second premolar, right maxillary second molar, left mandibular lateral incisor, left mandibular first premolar, and left mandibular first molar were selected. Ten conventional ceramic crowns and 10 CEREC crowns were fabricated for each tooth preparation. A dental cast scanner was used to obtain 3-dimensional data of the preparations and ceramic crowns, and the data were registered with the "virtual seating" iterative closest point technique. Reverse engineering software used edge sharpening and other functional modules to extract the margins of the preparations and crowns. Finally, quantitative evaluation of the absolute marginal discrepancy of the ceramic crowns was obtained from the 2-dimensional cross-sectional straight-line distance between points on the margins of the ceramic crowns and the standard preparations, based on the circumferential function module along the long axis. The absolute marginal discrepancy of the ceramic crowns fabricated using conventional methods was 115 ±15.2 μm, and that of the crowns fabricated using the digital technique was 110 ±14.3 μm. ANOVA showed no statistical difference between the 2 methods or among ceramic crowns for different teeth (P>.05). A digital quantitative evaluation method for the absolute marginal discrepancy of ceramic crowns was established, and the evaluations determined that the absolute marginal discrepancies were within a clinically acceptable range. This method is acceptable for the digital evaluation of the accuracy of complete crowns. Copyright © 2017 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
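A compact sketch of the registration step, assuming a standard point-to-point iterative closest point loop with the SVD (Kabsch) solution for the rigid transform; the point clouds are synthetic stand-ins for the scanned crown and preparation:

```python
# Point-to-point ICP in the spirit of "virtual seating": iteratively
# match crown points to their nearest preparation points and solve for
# the best rigid transform until the clouds are aligned.
import numpy as np
from scipy.spatial import cKDTree

def best_rigid(P, Q):
    """Rigid (R, t) minimizing ||R @ p + t - q|| over matched pairs."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cq - R @ cp

def icp(source, target, iters=30):
    tree = cKDTree(target)
    src = source.copy()
    for _ in range(iters):
        _, idx = tree.query(src)      # nearest-neighbor correspondences
        R, t = best_rigid(src, target[idx])
        src = src @ R.T + t
    return src

rng = np.random.default_rng(8)
prep = rng.normal(size=(500, 3))      # stand-in "preparation" cloud
theta = 0.1
Rz = np.array([[np.cos(theta), -np.sin(theta), 0],
               [np.sin(theta), np.cos(theta), 0],
               [0, 0, 1]])
crown = prep @ Rz.T + np.array([0.5, -0.2, 0.1])   # displaced "crown"
seated = icp(crown, prep)
print("RMS after virtual seating:", np.sqrt(np.mean((seated - prep) ** 2)))
```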
Demodulation of messages received with low signal to noise ratio
NASA Astrophysics Data System (ADS)
Marguinaud, A.; Quignon, T.; Romann, B.
The implementation of this all-digital demodulator is derived from maximum-likelihood considerations applied to an analytical representation of the received signal. Traditional matched filters and phase-locked loops are replaced by minimum-variance estimators and hypothesis tests. These statistical tests become very simple when working on the phase signal. These methods, combined with rigorous control of the data representation, allow significant computation savings compared with conventional realizations. Nominal operation has been verified down to an energy signal-to-noise ratio of -3 dB on a QPSK demodulator.
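A minimal sketch of a phase-domain decision of the kind described: for QPSK in additive white Gaussian noise, the maximum-likelihood symbol test reduces to choosing the nearest candidate phase. The signal model and SNR handling are assumptions, not the demodulator's actual design:

```python
# QPSK symbol decision as a hypothesis test on the received phase:
# pick whichever of the four constellation phases is angularly closest.
import numpy as np

rng = np.random.default_rng(9)
phases = np.array([1, 3, 5, 7]) * np.pi / 4          # QPSK constellation
symbols = rng.integers(0, 4, size=10_000)
snr_db = -3.0
sigma = np.sqrt(1 / (2 * 10 ** (snr_db / 10)))       # noise std per axis

received = np.exp(1j * phases[symbols]) + sigma * (
    rng.normal(size=symbols.size) + 1j * rng.normal(size=symbols.size))

# Wrap-aware angular distance to each candidate phase, then argmin.
diff = np.angle(received)[:, None] - phases[None, :]
decided = np.argmin(np.abs((diff + np.pi) % (2 * np.pi) - np.pi), axis=1)
print("symbol error rate at -3 dB Es/N0:", np.mean(decided != symbols))
```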
Exploring the Connection Between Sampling Problems in Bayesian Inference and Statistical Mechanics
NASA Technical Reports Server (NTRS)
Pohorille, Andrew
2006-01-01
The Bayesian and statistical mechanical communities often share the same objective in their work: estimating and integrating probability distribution functions (pdfs) describing stochastic systems, models or processes. Frequently, these pdfs are complex functions of random variables exhibiting multiple, well-separated local minima. Conventional strategies for sampling such pdfs are inefficient, sometimes leading to an apparent non-ergodic behavior. Several recently developed techniques for handling this problem have been successfully applied in statistical mechanics. In the multicanonical and Wang-Landau Monte Carlo (MC) methods, the correct pdfs are recovered from uniform sampling of the parameter space by iteratively establishing proper weighting factors connecting these distributions. Trivial generalizations allow for sampling from any chosen pdf. The closely related transition matrix method relies on estimating transition probabilities between different states. All these methods proved to generate estimates of pdfs with high statistical accuracy. In another MC technique, parallel tempering, several random walks, each corresponding to a different value of a parameter (e.g. "temperature"), are generated and occasionally exchanged using the Metropolis criterion. This method can be considered as a statistically correct version of simulated annealing. An alternative approach is to represent the set of independent variables as a Hamiltonian system. Considerable progress has been made in understanding how to ensure that the system obeys the equipartition theorem or, equivalently, that coupling between the variables is correctly described. Then a host of techniques developed for dynamical systems can be used. Among them, probably the most powerful is the Adaptive Biasing Force method, in which thermodynamic integration and biased sampling are combined to yield very efficient estimates of pdfs. The third class of methods deals with transitions between states described by rate constants; these problems are isomorphic with chemical kinetics problems, and several efficient techniques for this purpose have recently been developed based on the approach originally proposed by Gillespie. Although the utility of the techniques mentioned above for Bayesian problems has not been determined, further research along these lines is warranted.
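A minimal parallel-tempering sketch for a bimodal density of the kind that defeats a single chain; the target, temperature ladder, and proposal scale are toy choices:

```python
# Parallel tempering: Metropolis walkers at several "temperatures"
# occasionally swap states via the Metropolis criterion, letting the
# cold chain hop between widely separated modes.
import numpy as np

rng = np.random.default_rng(10)
log_p = lambda x: np.logaddexp(-0.5 * ((x - 4) / 0.5) ** 2,
                               -0.5 * ((x + 4) / 0.5) ** 2)  # two far modes
temps = np.array([1.0, 3.0, 9.0, 27.0])
x = rng.normal(size=len(temps))
samples = []

for step in range(20_000):
    # Within-chain Metropolis move; chain k targets p(x)^(1/T_k).
    prop = x + rng.normal(scale=1.0, size=len(temps))
    accept = np.log(rng.random(len(temps))) < (log_p(prop) - log_p(x)) / temps
    x = np.where(accept, prop, x)
    # Attempt a swap between a random adjacent pair of temperatures.
    i = rng.integers(0, len(temps) - 1)
    dlog = (log_p(x[i + 1]) - log_p(x[i])) * (1 / temps[i] - 1 / temps[i + 1])
    if np.log(rng.random()) < dlog:
        x[i], x[i + 1] = x[i + 1], x[i]
    samples.append(x[0])                 # keep only the T=1 chain

samples = np.array(samples[2000:])       # discard burn-in
print("fraction in right mode:", np.mean(samples > 0))   # ~0.5 if mixing
```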
Comparing the effectiveness of laser vs. conventional endoforehead lifting.
Chang, Cheng-Jen; Yu, De-Yi; Chang, Shu-Ying; Hsiao, Yen-Chang
2018-04-01
The objective of this study was to compare the efficacy and safety of laser versus conventional endoforehead lifting. Over a period of 12 years (January 2000-January 2012), a total of 110 patients with hyperactive muscles over the frontal region have been collected for a retrospective study. The SurgiLase 150XJ CO 2 laser system, in conjunction with the flexible FIBERLASE, was used. The endoscope was 4 mm in diameter with an angle of 30°. The primary efficacy measurement was the assessment of the final outcome for using laser vs. conventional methods. Both groups were observed at three weeks, six weeks and six months after surgery. The most common complication in early convalescence (three weeks) was swelling. This was followed by local paraesthesia, ecchymosis, localized hematomas and scar with alopecia. All these problems disappeared completely after the 6-month study period. Based on a chi-square analysis, there were clinically and statistically significant differences favouring the laser endoforehead surgery in the operative time, early and late complications. All patients achieved significant improvement after both laser and conventional endoforehead surgery in the final outcome. However, the early and late complications indicated a greater difference in the laser group.
NASA Astrophysics Data System (ADS)
Jeong, Jeong-Won; Kim, Tae-Seong; Shin, Dae-Chul; Do, Synho; Marmarelis, Vasilis Z.
2004-04-01
Recently it was shown that soft tissue can be differentiated with spectral unmixing and detection methods that utilize multi-band information obtained from a High-Resolution Ultrasonic Transmission Tomography (HUTT) system. In this study, we focus on tissue differentiation using the spectral target detection method based on Constrained Energy Minimization (CEM). We have developed a new tissue differentiation method called "CEM filter bank". Statistical inference on the output of each CEM filter of a filter bank is used to make a decision based on the maximum statistical significance rather than the magnitude of each CEM filter output. We validate this method through 3-D inter/intra-phantom soft tissue classification where target profiles obtained from an arbitrary single slice are used for differentiation in multiple tomographic slices. Also spectral coherence between target and object profiles of an identical tissue at different slices and phantoms is evaluated by conventional cross-correlation analysis. The performance of the proposed classifier is assessed using Receiver Operating Characteristic (ROC) analysis. Finally we apply our method to classify tiny structures inside a beef kidney such as Styrofoam balls (~1mm), chicken tissue (~5mm), and vessel-duct structures.
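A sketch of the CEM filter-bank construction, using the standard closed form w = R^-1 d / (d^T R^-1 d); the band count, data, signatures, and the reduction of the statistical decision layer to a simple argmax are all illustrative assumptions:

```python
# Constrained energy minimization (CEM) filter bank: each filter passes
# its own spectral target signature with unit gain while minimizing the
# output energy over the multi-band data.
import numpy as np

rng = np.random.default_rng(11)
bands, n_pix = 16, 5000
X = rng.normal(size=(n_pix, bands)) + 5.0          # multi-band "pixels"
targets = rng.normal(size=(3, bands))              # 3 tissue signatures

R = (X.T @ X) / n_pix                              # sample correlation matrix
Rinv = np.linalg.inv(R + 1e-6 * np.eye(bands))     # small ridge for stability

def cem_filter(d):
    """w = R^-1 d / (d^T R^-1 d): unit response to d, minimum energy."""
    w = Rinv @ d
    return w / (d @ Rinv @ d)

filters = np.array([cem_filter(d) for d in targets])
pixel = targets[1] + 0.3 * rng.normal(size=bands)  # noisy target-1 pixel
outputs = filters @ pixel
print("filter outputs:", outputs, "-> class", int(np.argmax(outputs)))
```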
Lopes, Lawrence Gonzaga; Franco, Eduardo Batista; Pereira, José Carlos; Mondelli, Rafael Francisco Lia
2008-01-01
The aim of this study was to evaluate the polymerization shrinkage and shrinkage stress of composites polymerized with a LED and a quartz tungsten halogen (QTH) light sources. The LED was used in a conventional mode (CM) and the QTH was used in both conventional and pulse-delay modes (PD). The composite resins used were Z100, A110, SureFil and Bisfil 2B (chemical-cured). Composite deformation upon polymerization was measured by the strain gauge method. The shrinkage stress was measured by photoelastic analysis. The polymerization shrinkage data were analyzed statistically using two-way ANOVA and Tukey test (p≤0.05), and the stress data were analyzed by one-way ANOVA and Tukey's test (p≤0.05). Shrinkage and stress means of Bisfil 2B were statistically significant lower than those of Z100, A110 and SureFil. In general, the PD mode reduced the contraction and the stress values when compared to CM. LED generated the same stress as QTH in conventional mode. Regardless of the activation mode, SureFil produced lower contraction and stress values than the other light-cured resins. Conversely, Z100 and A110 produced the greatest contraction and stress values. As expected, the chemically cured resin generated lower shrinkage and stress than the light-cured resins. In conclusion, The PD mode effectively decreased contraction stress for Z100 and A110. Development of stress in light-cured resins depended on the shrinkage value. PMID:19089287
Keystroke dynamics in the pre-touchscreen era
Ahmad, Nasir; Szymkowiak, Andrea; Campbell, Paul A.
2013-01-01
Biometric authentication seeks to measure an individual’s unique physiological attributes for the purpose of identity verification. Conventionally, this task has been realized via analyses of fingerprints or signature iris patterns. However, whilst such methods effectively offer a superior security protocol compared with password-based approaches for example, their substantial infrastructure costs, and intrusive nature, make them undesirable and indeed impractical for many scenarios. An alternative approach seeks to develop similarly robust screening protocols through analysis of typing patterns, formally known as keystroke dynamics. Here, keystroke analysis methodologies can utilize multiple variables, and a range of mathematical techniques, in order to extract individuals’ typing signatures. Such variables may include measurement of the period between key presses, and/or releases, or even key-strike pressures. Statistical methods, neural networks, and fuzzy logic have often formed the basis for quantitative analysis on the data gathered, typically from conventional computer keyboards. Extension to more recent technologies such as numerical keypads and touch-screen devices is in its infancy, but obviously important as such devices grow in popularity. Here, we review the state of knowledge pertaining to authentication via conventional keyboards with a view toward indicating how this platform of knowledge can be exploited and extended into the newly emergent type-based technological contexts. PMID:24391568
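A minimal sketch of the timing features named above (dwell and flight times) with a simple z-score verification rule; the event timestamps, template, and threshold are invented for illustration:

```python
# Keystroke-dynamics features: dwell time (press-to-release) and flight
# time (release-to-next-press), checked against an enrolled template.
import numpy as np

# (key, press_time_ms, release_time_ms) for one typed sample
events = [("p", 0, 95), ("a", 180, 260), ("s", 340, 430), ("s", 520, 600)]

def features(events):
    dwell = [r - p for _, p, r in events]
    flight = [events[i + 1][1] - events[i][2] for i in range(len(events) - 1)]
    return np.array(dwell + flight, dtype=float)

# Enrolled template: per-feature mean and std from many genuine samples
# (faked here as noise around the sample above).
rng = np.random.default_rng(12)
genuine = np.array([features(events) + rng.normal(0, 8, 7) for _ in range(50)])
mu, sd = genuine.mean(axis=0), genuine.std(axis=0)

def verify(sample, threshold=2.5):
    """Accept if the mean absolute z-score across features is small."""
    z = np.abs((features(sample) - mu) / sd)
    return z.mean() < threshold

print("genuine accepted:", verify(events))
imposter = [(k, p * 1.6, r * 1.6) for k, p, r in events]   # slower typist
print("imposter accepted:", verify(imposter))
```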
In vitro evaluation of an alternative method to bond molar tubes
PINZAN-VERCELINO, Célia Regina Maio; PINZAN, Arnaldo; GURGEL, Júlio de Araújo; BRAMANTE, Fausto Silva; PINZAN, Luciana Maio
2011-01-01
Despite the advances in bonding materials, many clinicians today still prefer to place bands on molar teeth. Molar bonding procedures need improvement to be widely accepted clinically. Objective: The purpose of this study was to evaluate the shear bond strength when an additional adhesive layer was applied on the occlusal tooth/tube interface to provide reinforcement to molar tubes. Material and methods: Sixty third molars were selected and allocated to 3 groups: group 1 received a conventional direct bond followed by the application of an additional layer of adhesive on the occlusal tooth/tube interface, group 2 received a conventional direct bond, and group 3 received a conventional direct bond and an additional cure time of 10 s. The specimens were debonded in a universal testing machine. The results were analyzed statistically by ANOVA and Tukey's test (α=0.05). Results: Group 1 had a significantly higher (p<0.05) shear bond strength compared with groups 2 and 3. No difference was detected between groups 2 and 3 (p>0.05). Conclusions: The present in vitro findings indicate that the application of an additional layer of adhesive on the tooth/tube interface increased the shear bond strength of the bonded molar tubes. PMID:21437468
Hung, Chun-Chi; Li, Yuan-Ta; Chou, Yu-Ching; Chen, Jia-En; Wu, Chia-Chun; Shen, Hsain-Chung; Yeh, Tsu-Te
2018-05-03
Treating pelvic fractures remains a challenging task for orthopaedic surgeons. We aimed to evaluate the feasibility, accuracy, and effectiveness of three-dimensional (3D) printing technology and computer-assisted virtual surgery for pre-operative planning in anterior ring fractures of the pelvis. We hypothesized that using 3D printed models would reduce operation time and significantly improve the surgical outcomes of pelvic fracture repair. We retrospectively reviewed the records of 30 patients with pelvic fractures treated by anterior pelvic fixation with locking plates (14 patients, conventional locking plate fixation; 16 patients, pre-operative virtual simulation with 3D printing-assisted, pre-contoured locking plate fixation). We compared operative time, instrumentation time, blood loss, and post-surgical residual displacement, as evaluated on X-ray films, between the groups. Statistical analyses evaluated significant differences between the groups for each of these variables. The patients treated with the virtual simulation and 3D printing-assisted technique had significantly shorter internal fixation times, shorter surgery duration, and less blood loss (-57 minutes, -70 minutes, and -274 ml, respectively; P < 0.05) than patients in the conventional surgery group. However, the post-operative radiological results were similar between groups (P > 0.05). The complication rate was lower in the 3D printing group (1/16 patients) than in the conventional surgery group (3/14 patients). The 3D simulation and printing technique is an effective and reliable method for treating anterior pelvic ring fractures. With precise pre-operative planning and accurate execution of the procedures, this time-saving approach can provide a more personalized treatment plan, allowing for safer orthopaedic surgery.
Fouda, Sameh M; Mattout, Hala K
2017-06-01
Laser in situ keratomileusis (LASIK) is one of the commonest refractive procedures performed nowadays. Dry eye is nearly universal after LASIK and can be so troublesome that it sometimes precludes post-operative patient satisfaction. Conventional treatment includes the use of artificial tears; alternative methods such as punctal plugs and botulinum toxin injection can also be used for the management of post-LASIK dry eye. The aim of this study was to compare botulinum toxin injection into the orbicularis muscle with lacrimal punctal plugs for the control of post-LASIK dry eye manifestations. This prospective study included 60 patients who had LASIK surgery for correction of refractive errors. Patients were randomly assigned to one of three methods of dry eye management: conventional medical treatment with preservative-free tear substitutes only (group A: 20 patients = 40 eyes); intraoperative injection of botulinum toxin A (BTA) into the orbicularis muscle below the lower punctum of both eyes (group B: 20 patients = 40 eyes); and intraoperative insertion of temporary extended-duration silicone punctal plugs in the lower punctum of both eyes (group C: 20 patients = 40 eyes). At the first follow-up visit after 2 weeks, the two test groups (B, C) showed a statistically significant increase in both tear film break-up time (TBUT) and Schirmer test score, with a decrease in the OSDI score and the daily frequency of lubricant use, in comparison with control group A. These differences were maintained at the next follow-up visit but became statistically insignificant at the 3rd and 6th post-operative months. Complications were encountered more often with punctal plugs (60%) than with BTA (25%), a statistically significant difference. BTA injection to control dry eye symptoms by inducing temporary punctal ectropion is an effective method to improve patient satisfaction after LASIK, with a higher level of patient satisfaction and fewer complications than punctal plugs or standard topical dry eye treatment.
Chapman, Benjamin P; Weiss, Alexander; Duberstein, Paul R
2016-12-01
Statistical learning theory (SLT) is the statistical formulation of machine learning theory, a body of analytic methods common in "big data" problems. Regression-based SLT algorithms seek to maximize predictive accuracy for some outcome, given a large pool of potential predictors, without overfitting the sample. Research goals in psychology may sometimes call for high-dimensional regression. One example is criterion-keyed scale construction, where a scale with maximal predictive validity must be built from a large item pool. Using this as a working example, we first introduce a core principle of SLT methods: minimization of expected prediction error (EPE). Minimizing EPE is fundamentally different from maximizing the within-sample likelihood, and hinges on building a predictive model of sufficient complexity to predict the outcome well, without undue complexity leading to overfitting. We describe how such models are built and refined via cross-validation. We then illustrate how three common SLT algorithms (supervised principal components, regularization, and boosting) can be used to construct a criterion-keyed scale predicting all-cause mortality, using a large personality item pool within a population cohort. Each algorithm illustrates a different approach to minimizing EPE. Finally, we consider broader applications of SLT predictive algorithms, both as supportive analytic tools for conventional methods and as primary analytic tools in discovery-phase research. We conclude that despite their differences from the classic null-hypothesis testing approach, or perhaps because of them, SLT methods may hold value as a statistically rigorous approach to exploratory regression. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
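A sketch of the regularization variant among the three algorithms named above: an L1-penalized logistic model selects items from a simulated pool, with the penalty strength chosen by cross-validation to target prediction error rather than within-sample fit. The item pool, outcome, and settings are invented:

```python
# Criterion-keyed scale construction via cross-validated L1 (lasso)
# logistic regression: the penalty shrinks most item weights to zero,
# retaining a sparse, predictive subset of the pool.
import numpy as np
from sklearn.linear_model import LogisticRegressionCV

rng = np.random.default_rng(13)
n, n_items = 1000, 200
items = rng.integers(1, 6, size=(n, n_items)).astype(float)  # Likert items
true_beta = np.zeros(n_items)
true_beta[:10] = 0.4                                         # 10 signal items
logit = (items - 3) @ true_beta - 1.0
outcome = rng.random(n) < 1 / (1 + np.exp(-logit))

scale = LogisticRegressionCV(
    penalty="l1", solver="saga", Cs=10, cv=5, max_iter=5000
).fit(items, outcome)

kept = np.flatnonzero(scale.coef_[0])
print("items retained in the criterion-keyed scale:", kept[:20])
print("cross-validated choice of penalty C:", scale.C_[0])
```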
de França, Danilo Gonzaga; Morais, Maria Helena; das Neves, Flávio D; Carreiro, Adriana Fonte; Barbosa, Gustavo As
The aim of this study was to evaluate the effect of fabrication method (computer-aided design/computer-aided manufacture [CAD/CAM], copy-milling, and conventional casting) on the fit accuracy of three-unit, screw-retained fixed dental prostheses. Sixteen three-unit implant-supported screw-retained frameworks were fabricated to fit an in vitro model. Eight frameworks were fabricated using the CAD/CAM system, four in zirconia and four in cobalt-chromium; four zirconia frameworks were fabricated using the copy-milling system; and four were cast in cobalt-chromium using conventional casting with premachined abutments. The vertical and horizontal misfit at the implant-framework interface was measured using scanning electron microscopy at ×250. The results for vertical misfit were analyzed using Kruskal-Wallis and Mann-Whitney tests. The horizontal misfits were categorized as underextended, equally extended, or overextended, and statistical analysis established differences between groups according to the chi-square test (α = .05). The mean vertical misfit was 5.9 ± 3.6 μm for CAD/CAM-fabricated zirconia, 1.2 ± 2.2 μm for CAD/CAM-fabricated cobalt-chromium frameworks, 7.6 ± 9.2 μm for copy-milling-fabricated zirconia frameworks, and 11.8 ± 9.8 μm for conventionally fabricated frameworks. The Mann-Whitney test revealed significant differences between all but the zirconia-fabricated frameworks. A significant association was observed between horizontal misfit and fabrication method. The percentage of horizontal misfits that were underextended or overextended was highest in copy-milled zirconia (83.3%), followed by CAD/CAM cobalt-chromium (66.7%), cast cobalt-chromium (58.3%), and CAD/CAM zirconia (33.3%) frameworks. CAD/CAM-fabricated frameworks exhibit better vertical fit and lower variability than copy-milled and conventionally fabricated frameworks. The percentage of equally extended interfaces was highest when CAD/CAM and zirconia were used.
Win, Khin Thanda; Vegas, Juan; Zhang, Chunying; Song, Kihwan; Lee, Sanghyeob
2017-01-01
QTL mapping using NGS-assisted BSA was successfully applied to an F2 population for downy mildew resistance in cucumber. QTLs detected by NGS-assisted BSA were confirmed by conventional QTL analysis. Downy mildew (DM), caused by Pseudoperonospora cubensis, is one of the most destructive foliar diseases in cucumber. QTL mapping is a fundamental approach for understanding the genetic inheritance of DM resistance in cucumber. Recently, many studies have reported that a combination of bulked segregant analysis (BSA) and next-generation sequencing (NGS) can be a rapid and cost-effective way of mapping QTLs. In this study, we applied NGS-assisted BSA to QTL mapping of DM resistance in cucumber and confirmed the results by conventional QTL analysis. By sequencing two DNA pools, each consisting of ten individuals showing high resistance or high susceptibility to DM from an F2 population, we identified single nucleotide polymorphisms (SNPs) between the two pools. We employed a statistical method for QTL mapping based on these SNPs. Five QTLs, dm2.2, dm4.1, dm5.1, dm5.2, and dm6.1, were detected, and dm2.2 showed the largest effect on DM resistance. Conventional QTL analysis using the F2 population confirmed dm2.2 (R2 = 10.8-24%) and dm5.2 (R2 = 14-27.2%) as major QTLs and dm4.1 (R2 = 8%) as a minor QTL, but could not detect dm5.1 and dm6.1. A new QTL on chromosome 2, dm2.1 (R2 = 28.2%), was detected by the conventional QTL method using an F3 population. This study demonstrated the effectiveness of NGS-assisted BSA for mapping QTLs conferring DM resistance in cucumber and revealed the unique genetic inheritance of DM resistance in this population through two distinct major QTLs on chromosome 2 that mainly harbor DM resistance.
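A common statistic in NGS-assisted BSA is the SNP-index (the fraction of reads carrying the alternate allele in each bulk) and its difference between bulks; the paper's exact statistic is not given here, so the following Python fragment is only a generic illustration with hypothetical read counts:

    # Generic Delta(SNP-index) calculation for two sequenced bulks; positions
    # where |delta| approaches 1 are candidate QTL regions (significance is
    # usually judged against simulation-based confidence bands).
    def snp_index(alt_reads, total_reads):
        return alt_reads / total_reads if total_reads > 0 else float("nan")

    # Hypothetical (alt, total) read counts at three SNP positions per bulk.
    resistant_bulk = [(18, 20), (9, 21), (15, 19)]
    susceptible_bulk = [(3, 22), (10, 20), (4, 18)]

    for pos, (r, s) in enumerate(zip(resistant_bulk, susceptible_bulk), start=1):
        delta = snp_index(*r) - snp_index(*s)
        print(f"SNP {pos}: delta SNP-index = {delta:+.2f}")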
Modified neural networks for rapid recovery of tokamak plasma parameters for real time control
NASA Astrophysics Data System (ADS)
Sengupta, A.; Ranjan, P.
2002-07-01
Two modified neural network techniques are used to identify the equilibrium plasma parameters of the Superconducting Steady State Tokamak I from external magnetic measurements. This is expected to ultimately assist in real-time plasma control. Unlike the conventional structure, in which a single network with an optimal number of processing elements calculates the outputs, the first method uses a multinetwork system connected in parallel to perform the calculations. This network is called the double neural network. The accuracy of the recovered parameters is clearly higher than that of the conventional network. The other type of neural network used here is based on statistical function parametrization combined with a neural network. A principal component transformation removes linear dependences from the measurements, and a dimensional reduction process reduces the dimensionality of the input space. This reduced and transformed input set, rather than the entire set, is fed into the neural network. This is known as the principal component transformation-based neural network. The accuracy of the parameters recovered with this second type of modified network is a further improvement over that of the double neural network. This result differs from that obtained in an earlier work, where the double neural network showed better performance. The conventional network and function parametrization methods have also been used for comparison. The conventional network has been used to optimize the set of magnetic diagnostics. The effective set of sensors, as assessed by this network, is compared with that of the principal component-based network. Fault tolerance of the neural networks has been tested. The double neural network showed the maximum resistance to faults in the diagnostics, while the principal component-based network performed poorly. Finally, the processing times of the methods have been compared. The double network and the principal component network involve the minimum computation time, although the conventional network also performs well enough to be used in real time.
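The principal-component step can be sketched as follows (a toy Python illustration with random data, assuming scikit-learn; not the authors' code or the SST-1 diagnostics): decorrelate and reduce the sensor signals with PCA, then regress the plasma parameters with a small feed-forward network.

    # Sketch of the principal component transformation-based network idea.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(1)
    X = rng.normal(size=(2000, 60))          # hypothetical: 60 magnetic sensor signals
    y = X[:, :5] @ rng.normal(size=(5, 3))   # hypothetical: 3 equilibrium parameters

    pca = PCA(n_components=10).fit(X)        # dimensional reduction of the input space
    Z = pca.transform(X)                     # decorrelated, reduced inputs
    net = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000).fit(Z, y)
    print("R^2 on training data:", net.score(Z, y))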
Comparative study of signalling methods for high-speed backplane transceiver
NASA Astrophysics Data System (ADS)
Wu, Kejun
2017-11-01
A combined analysis of transient simulation and statistical methods is proposed for the comparative study of signalling methods applied to high-speed backplane transceivers. This method enables fast and accurate signal-to-noise ratio and symbol-error-rate estimation of a serial link over a four-dimensional design space comprising channel characteristics, noise scenarios, equalisation schemes, and signalling methods. The proposed combined analysis chooses an efficient sampling size for performance evaluation. A comparative study of non-return-to-zero (NRZ), PAM-4, and four-phase shifted sinusoid symbol (PSS-4) signalling using parameterised behaviour-level simulation shows that PAM-4 and PSS-4 have substantial advantages over conventional NRZ in most cases. A comparison between PAM-4 and PSS-4 shows that PAM-4 suffers significant bit-error-rate degradation as the noise level increases.
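For orientation, the relative noise margins of the two conventional formats can be checked against the textbook closed-form symbol error rate for M-PAM over an AWGN channel (a standard formula, not the paper's combined transient/statistical method):

    # SER of average-power-constrained M-PAM under AWGN:
    # SER = 2(1 - 1/M) Q( sqrt( 3*SNR / (M^2 - 1) ) ); NRZ is the M = 2 case.
    from math import erfc, sqrt

    def qfunc(x):                        # Gaussian tail probability Q(x)
        return 0.5 * erfc(x / sqrt(2.0))

    def ser_mpam(snr_linear, m):
        return 2.0 * (1.0 - 1.0 / m) * qfunc(sqrt(3.0 * snr_linear / (m * m - 1.0)))

    for snr_db in (10, 14, 18):
        snr = 10 ** (snr_db / 10.0)
        print(f"SNR {snr_db} dB: NRZ SER = {ser_mpam(snr, 2):.2e}, "
              f"PAM-4 SER = {ser_mpam(snr, 4):.2e}")

PAM-4 needs roughly 9.5 dB more SNR than NRZ for the same error rate but halves the Nyquist frequency, which is why its advantage depends on channel loss and noise level.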
A Discriminative Sentence Compression Method as Combinatorial Optimization Problem
NASA Astrophysics Data System (ADS)
Hirao, Tsutomu; Suzuki, Jun; Isozaki, Hideki
In the study of automatic summarization, the main research topic was once `important sentence extraction', but nowadays `sentence compression' is a hot research topic. Conventional sentence compression methods usually transform a given sentence into a parse tree or a dependency tree and modify it to obtain a shorter sentence. However, such methods are sometimes too rigid. In this paper, we regard sentence compression as a combinatorial optimization problem that extracts an optimal subsequence of words. Hori et al. also proposed a similar method, but they used only a small number of features, and their weights were tuned by hand. We introduce a large number of features, such as part-of-speech bigrams and word position in the sentence. Furthermore, we train the system by discriminative learning. According to our experiments, our method obtained statistically significantly better scores than other methods.
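The subsequence-extraction view can be illustrated with a deliberately tiny toy (hypothetical features and hand-set weights; real systems replace the brute-force search with dynamic programming or integer programming and learn the weights discriminatively):

    # Toy compression-as-subsequence-selection: score candidate subsequences
    # with a linear feature model and keep the best one under a length budget.
    from itertools import combinations

    sentence = "the committee finally approved the long delayed budget".split()
    content_words = {"committee", "approved", "budget"}      # hypothetical feature set

    def score(subseq):
        s = sum(1.0 for w in subseq if w in content_words)   # content-word reward
        s -= 0.2 * sum(1.0 for w in subseq if w == "the")    # function-word penalty
        return s

    best = max((c for n in range(3, 6)                       # budget: 3-5 words
                for c in combinations(sentence, n)), key=score)
    print(" ".join(best))   # combinations() preserves word order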
Childress, Carolyn J. Oblinger; Foreman, William T.; Connor, Brooke F.; Maloney, Thomas J.
1999-01-01
This report describes the U.S. Geological Survey National Water Quality Laboratory's approach for determining long-term method detection levels and establishing reporting levels, details relevant new reporting conventions, and provides preliminary guidance on interpreting data reported with the new conventions. At the long-term method detection level concentration, the risk of a false positive detection (analyte reported present at the long-term method detection level when not in sample) is no more than 1 percent. However, at the long-term method detection level, the risk of a false negative occurrence (analyte reported not present when present at the long-term method detection level concentration) is up to 50 percent. Because this false negative rate is too high for use as a default 'less than' reporting level, a more reliable laboratory reporting level is set at twice the determined long-term method detection level. For all methods, concentrations measured between the laboratory reporting level and the long-term method detection level will be reported as estimated concentrations. Non-detections will be censored to the laboratory reporting level. Adoption of the new reporting conventions requires a full understanding of how low-concentration data can be used and interpreted and places responsibility for using and presenting final data with the user rather than with the laboratory. Users must consider that (1) new laboratory reporting levels may differ from previously established minimum reporting levels, (2) long-term method detection levels and laboratory reporting levels may change over time, and (3) estimated concentrations are less certain than concentrations reported above the laboratory reporting level. The availability of uncensored but qualified low-concentration data for interpretation and statistical analysis is a substantial benefit to the user. A decision to censor data after they are reported from the laboratory may still be made by the user, if merited, on the basis of the intended use of the data.
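The reporting conventions described above can be codified directly; the sketch below is a simplified rendering (the laboratory's actual detection decision involves more than a threshold comparison):

    # LRL = 2 x LT-MDL; values at or above the LRL are reported directly, values
    # between the LT-MDL and the LRL are reported as estimated ("E"), and
    # non-detections are censored to "< LRL".
    def report(measured, lt_mdl):
        lrl = 2.0 * lt_mdl                   # laboratory reporting level
        if measured >= lrl:
            return f"{measured:g}"
        if measured >= lt_mdl:
            return f"E{measured:g}"          # estimated concentration
        return f"<{lrl:g}"                   # censored non-detection

    for value in (5.0, 1.5, 0.4):            # hypothetical results, LT-MDL = 1.0
        print(value, "->", report(value, 1.0))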
Rao, Goutham; Lopez-Jimenez, Francisco; Boyd, Jack; D'Amico, Frank; Durant, Nefertiti H; Hlatky, Mark A; Howard, George; Kirley, Katherine; Masi, Christopher; Powell-Wiley, Tiffany M; Solomonides, Anthony E; West, Colin P; Wessel, Jennifer
2017-09-05
Meta-analyses are becoming increasingly popular, especially in the fields of cardiovascular disease prevention and treatment. They are often considered to be a reliable source of evidence for making healthcare decisions. Unfortunately, problems among meta-analyses such as the misapplication and misinterpretation of statistical methods and tests are long-standing and widespread. The purposes of this statement are to review key steps in the development of a meta-analysis and to provide recommendations that will be useful for carrying out meta-analyses and for readers and journal editors, who must interpret the findings and gauge methodological quality. To make the statement practical and accessible, detailed descriptions of statistical methods have been omitted. Based on a survey of cardiovascular meta-analyses, published literature on methodology, expert consultation, and consensus among the writing group, key recommendations are provided. Recommendations reinforce several current practices, including protocol registration; comprehensive search strategies; methods for data extraction and abstraction; methods for identifying, measuring, and dealing with heterogeneity; and statistical methods for pooling results. Other practices should be discontinued, including the use of levels of evidence and evidence hierarchies to gauge the value and impact of different study designs (including meta-analyses) and the use of structured tools to assess the quality of studies to be included in a meta-analysis. We also recommend choosing a pooling model for conventional meta-analyses (fixed effect or random effects) on the basis of clinical and methodological similarities among studies to be included, rather than the results of a test for statistical heterogeneity. © 2017 American Heart Association, Inc.
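The recommended choice between pooling models is easier to appreciate with the two standard inverse-variance estimators side by side (generic textbook formulas applied to hypothetical effect sizes; not drawn from the statement itself):

    # Fixed-effect vs DerSimonian-Laird random-effects pooling.
    import numpy as np

    effects = np.array([0.30, 0.10, 0.45, 0.22])   # hypothetical study effects
    ses = np.array([0.10, 0.08, 0.15, 0.12])       # their standard errors

    w = 1.0 / ses**2
    fixed = np.sum(w * effects) / np.sum(w)

    # DerSimonian-Laird estimate of the between-study variance tau^2.
    q = np.sum(w * (effects - fixed) ** 2)
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)

    w_r = 1.0 / (ses**2 + tau2)
    random_effects = np.sum(w_r * effects) / np.sum(w_r)
    print(f"fixed: {fixed:.3f}  random: {random_effects:.3f}  tau^2: {tau2:.4f}")

The random-effects weights shrink toward equality as tau^2 grows, which is why the model should be chosen on clinical and methodological grounds rather than from a heterogeneity test alone.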
Mukhopadhyay, Nitai D; Sampson, Andrew J; Deniz, Daniel; Alm Carlsson, Gudrun; Williamson, Jeffrey; Malusek, Alexandr
2012-01-01
Correlated sampling Monte Carlo methods can shorten computing times in brachytherapy treatment planning. Monte Carlo efficiency is typically estimated via efficiency gain, defined as the reduction in computing time by correlated sampling relative to conventional Monte Carlo methods when equal statistical uncertainties have been achieved. The determination of the efficiency gain uncertainty arising from random effects, however, is not a straightforward task, especially when the error distribution is non-normal. The purpose of this study is to evaluate the applicability of the F distribution and standardized uncertainty propagation methods (widely used in metrology to estimate uncertainty of physical measurements) for predicting confidence intervals about efficiency gain estimates derived from single Monte Carlo runs using fixed-collision correlated sampling in a simplified brachytherapy geometry. A bootstrap-based algorithm was used to simulate the probability distribution of the efficiency gain estimates, and the shortest 95% confidence interval was estimated from this distribution. It was found that the corresponding relative uncertainty was as large as 37% for this particular problem. The uncertainty propagation framework predicted confidence intervals reasonably well; however, its main disadvantage was that uncertainties of input quantities had to be calculated in a separate run via a Monte Carlo method. The F distribution noticeably underestimated the confidence interval. These discrepancies were influenced by several photons with large statistical weights which made extremely large contributions to the scored absorbed dose difference. The mechanism of acquiring high statistical weights in the fixed-collision correlated sampling method was explained and a mitigation strategy was proposed. Copyright © 2011 Elsevier Ltd. All rights reserved.
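A generic bootstrap of an efficiency-gain-like ratio statistic looks as follows (illustrative Python with synthetic scores, not the study's correlated-sampling data):

    # Percentile bootstrap with a shortest-interval search for a variance-ratio
    # statistic standing in for the efficiency gain.
    import numpy as np

    rng = np.random.default_rng(42)
    conv = rng.exponential(1.0, size=5000)         # hypothetical per-history scores
    corr = rng.exponential(1.0, size=5000) * 0.6   # lower-variance alternative

    def gain(a, b):
        return np.var(a, ddof=1) / np.var(b, ddof=1)

    boots = np.sort([gain(rng.choice(conv, conv.size), rng.choice(corr, corr.size))
                     for _ in range(2000)])
    k = int(0.95 * boots.size)                     # shortest interval covering 95%
    i = int(np.argmin(boots[k:] - boots[:boots.size - k]))
    print(f"gain = {gain(conv, corr):.2f}, "
          f"shortest 95% CI = [{boots[i]:.2f}, {boots[i + k]:.2f}]")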
Mubeen; K.R., Vijayalakshmi; Bhuyan, Sanat Kumar; Panigrahi, Rajat G; Priyadarshini, Smita R; Misra, Satyaranjan; Singh, Chandravir
2014-01-01
Objectives: The identification and radiographic interpretation of periapical bone lesions is important for accurate diagnosis and treatment. The present study was undertaken to assess the feasibility and diagnostic accuracy of colour-coded digital radiographs in terms of the presence and size of lesions, and to compare the diagnostic accuracy of colour-coded digital images with direct digital images and conventional radiographs for assessing periapical lesions. Materials and Methods: Sixty dry human cadaver hemimandibles were obtained, and periapical lesions were created in first and second premolar teeth at the junction of cancellous and cortical bone using a micromotor handpiece and carbide burs of sizes 2, 4 and 6. After each successive use of the round burs, a conventional, an RVG, and a colour-coded image were taken for each specimen. All images were evaluated by three observers. The diagnostic accuracy for each bur size and image mode was calculated statistically. Results: Our results showed good interobserver agreement (kappa > 0.61) for the different radiographic techniques and bur sizes. Conventional radiography outperformed digital radiography in diagnosing periapical lesions made with the size 2 bur; both were equally diagnostic for lesions made with the larger bur sizes. The colour-coding method was the least accurate of all the techniques. Conclusion: Conventional radiography traditionally forms the backbone of the diagnosis, treatment planning and follow-up of periapical lesions. Direct digital imaging is an efficient technique in the diagnostic sense. Colour coding of digital radiographs was feasible but less accurate; however, this imaging technique, like any other, needs to be studied continuously, with emphasis on patient safety and the diagnostic quality of images. PMID:25584318
Outcome reporting following navigated high tibial osteotomy of the knee: a systematic review.
Yan, James; Musahl, Volker; Kay, Jeffrey; Khan, Moin; Simunovic, Nicole; Ayeni, Olufemi R
2016-11-01
This systematic review evaluates radiographic and clinical outcome reporting following navigated high tibial osteotomy (HTO). Conventional HTO was used as a control to compare outcomes and to investigate the quality of evidence in studies reporting outcomes for navigated HTO. It was hypothesized that navigated HTO would show superior clinical and radiographic outcomes compared to conventional HTO. Two independent reviewers searched the PubMed, Ovid (MEDLINE), EMBASE, and Cochrane databases for studies reporting outcomes following navigated HTO. Titles, abstracts, and full texts were screened in duplicate using a priori inclusion and exclusion criteria. Descriptive statistics were calculated using Minitab® statistical software. The Methodological Index for Nonrandomized Studies (MINORS) and Cochrane Risk of Bias scores were used to evaluate methodological quality. Thirty-four studies involving 2216 HTOs were analysed in this review: 1608 (72.6%) navigated HTOs and 608 (27.4%) conventional HTOs. The majority of studies (16) were of level IV evidence. Clinical outcomes were reported as knee and function scores or range-of-motion comparisons. Postoperative clinical and functional scores were improved by navigated HTO, although it was not demonstrated whether the improvement was significant compared to conventional HTO. The most commonly reported clinical outcome score was the Lysholm score (six studies), with postoperative scores of 87.8 (standard deviation 5.9) for conventional and 88.8 (standard deviation 5.9) for navigation-assisted HTO. Radiographic outcomes commonly reported were weight-bearing mechanical axis, coronal plane angle, and posterior tibial slope angle in the sagittal plane. Studies have shown that HTO gives significant correction of mechanical alignment and that navigated HTO produces significantly less change in posterior tibial slope postoperatively compared to conventional HTO. The mean MINORS score was 9/16 for the 17 non-comparative studies and 15/24 for the 14 non-randomized comparative studies. Navigated HTO results in improved mechanical axis alignment and demonstrates significantly better control over the change in tibial slope angle postoperatively compared to conventional methods; however, these improvements have not yet been reflected in clinical outcome scores. Overall, the studies report that HTO creates significantly improved knee scores and function compared to patients' preoperative ratings, regardless of technique. Future studies on HTO outcomes need to focus on consistency of outcome reporting. Level of evidence: IV.
Automated lithology prediction from PGNAA and other geophysical logs.
Borsaru, M; Zhou, B; Aizawa, T; Karashima, H; Hashimoto, T
2006-02-01
Different methods of lithology prediction from geophysical data have been developed over the last 15 years. The geophysical logs used for predicting lithology are the conventional logs: sonic, neutron-neutron, gamma (total natural gamma) and density (backscattered gamma-gamma). Prompt gamma neutron activation analysis (PGNAA) is another established geophysical logging technique for in situ element analysis of rocks in boreholes. The work described in this paper was carried out to investigate the application of PGNAA to lithology interpretation. The data interpretation was conducted using the automatic interpretation program LogTrans, which is based on statistical analysis. Limited testing suggests that PGNAA logging data can be used to predict lithology: a success rate of 73% for lithology prediction was achieved from PGNAA logging data alone. PGNAA can also be used in conjunction with the conventional geophysical logs to enhance the lithology prediction.
[Alternative medicine: faith or science?].
Pletscher, A
1990-04-21
For the success of both alternative and scientific (conventional) medicine, factors such as the psychological influence of the doctor, loving care, human affection, the patient's belief in the treatment, the suggestive power of attractive (even unproven) theories and dogmas, and chance events (e.g. spontaneous remissions) play a major role. Some practices of alternative medicine have a particularly strong appeal to the non-rational side of the human being. Conventional medicine includes a component which is based on scientific and statistical methods. The possibility that in alternative medicine principles and effects exist which are not (yet) known to scientific medicine, but which match up to scientific criteria, cannot be excluded; however, up to now this has not been convincingly proven. The difficulties which arise in the elucidation of this problem are discussed in the light of examples from the literature and some experiments of our own.
Kirgiz, Irina A; Calloway, Cassandra
2017-04-01
Tape lifting and FTA paper scraping methods were directly compared to traditional double swabbing for collecting touch DNA from car steering wheels (n = 70 cars). Touch DNA was collected from the left or right side of each steering wheel (randomized) using two sterile cotton swabs, while the other side was sampled using water-soluble tape or FTA paper cards. DNA was extracted and quantified in duplicate using qPCR. Quantifiable amounts of DNA were detected for 100% of the samples collected (n = 140), independent of the method. However, the DNA yield was dependent on the collection method. A statistically significant difference in DNA yield was observed between the FTA scraping and double swabbing methods (p = 0.0051), with FTA paper collecting a two-fold higher amount. Statistical analysis showed no significant difference in DNA yields between the double swabbing and tape lifting techniques (p = 0.21). Based on the DNA concentration required for 1 ng input, 47% of the samples collected using FTA paper would be expected to yield a short tandem repeat (STR) profile, compared to 30% and 23% using double swabbing or tape, respectively. Further, 55% and 77% of the samples collected using double swabbing or tape, respectively, did not yield a high enough DNA concentration for the 0.5 ng of DNA input recommended for conventional STR kits and would be expected to result in a partial or no profile, compared to 35% of the samples collected using FTA paper. STR analysis was conducted for a subset of the more concentrated samples to confirm that the DNA collected from the steering wheel was from the driver. Thirty-two samples were selected with DNA amounts of at least 1 ng total DNA (100 pg/μl when concentrated, if required). A mixed STR profile was observed for 26 samples (88%), and the last driver was the major DNA contributor for 29 samples (94%). For one sample, the last driver was the minor DNA contributor. A full STR profile of the last driver was observed for 21 samples (69%) and a partial profile was observed for nine samples (25%); STR analysis failed for two samples collected using tape (6%). In conclusion, we show that the FTA paper scraping method has the potential to collect higher DNA yields from touch DNA evidence deposited on non-porous surfaces often encountered in criminal cases compared to conventional methods. Copyright © 2017 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
Farida, Abesi; Maryam, Ehsani; Ali, Mirzapour; Ehsan, Moudi; Sajad, Yousefi; Soraya, Khafri
2013-01-01
Obtaining a correct working length is necessary for successful root canal treatment. The aim of this study was to compare conventional and digital radiography in measuring root canal working length. In this in vitro study, 20 mesiobuccal canals from maxillary first molars with moderate and severe curvature and 20 canals from anterior teeth with mild curvature were chosen, and their working lengths were measured with a number 15 K-file (Maillefer, DENTSPLY, Germany). For each canal, five radiographs were then taken: three conventional radiographs using three processing methods (manual, automatic, and monobath solution) and two digital radiographs using CCD and PSP receptors. Two independent observers measured the working length with each technique. Finally, the mean working length in each group was compared with the real working length using a paired t-test. A one-way ANOVA test was also used for comparing the two groups. The level of statistical significance was P < 0.05. The results showed high interobserver agreement on the measurements of the working length in conventional and digital radiography (P ≤ 0.001). There was also no significant difference between conventional and digital radiography in measuring working length (P > 0.05). It was therefore concluded that the accuracy of digital radiography is comparable with conventional radiography in measuring working length, so considering the advantages of digital radiography, it can be used for working length determination.
A NEW LOG EVALUATION METHOD TO APPRAISE MESAVERDE RE-COMPLETION OPPORTUNITIES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Albert Greer
2003-09-11
Artificial intelligence tools, fuzzy logic and neural networks, were used to evaluate the potential of the behind-pipe Mesaverde formation in BMG's Mancos formation wells. A fractal geostatistical mapping algorithm was also used to predict Mesaverde production. Additionally, a conventional geological study was conducted. To date, one Mesaverde completion has been performed. The Janet No. 3 Mesaverde completion was non-economic. Both the AI method and the geostatistical method predicted the failure of the Janet No. 3. The Gavilan No. 1 in the Mesaverde was completed during the course of the study and was an extremely good well. This well was not included in the statistical dataset. The AI method predicted very good production, while the fractal map predicted a poor producer.
Infrared face recognition based on LBP histogram and KW feature selection
NASA Astrophysics Data System (ADS)
Xie, Zhihua
2014-07-01
The conventional local binary pattern (LBP) histogram feature still has room for performance improvement. This paper focuses on dimension reduction of LBP micro-patterns and proposes an improved infrared face recognition method based on the LBP histogram representation. To extract robust local features from infrared face images, LBP is used to obtain the composition of micro-patterns in sub-blocks. Based on statistical test theory, a Kruskal-Wallis (KW) feature selection method is proposed to retain the LBP patterns best suited for infrared face recognition. The experimental results show that the combination of LBP and KW feature selection improves the performance of infrared face recognition; the proposed method outperforms traditional methods based on the LBP histogram, discrete cosine transform (DCT) or principal component analysis (PCA).
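The two stages can be sketched in a few lines of Python (synthetic images standing in for infrared face sub-blocks; scikit-image and SciPy assumed):

    # LBP histograms per sub-block, then Kruskal-Wallis ranking of histogram
    # bins across subjects; bins with the smallest p-values are retained.
    import numpy as np
    from skimage.feature import local_binary_pattern
    from scipy.stats import kruskal

    P, R, n_bins = 8, 1, 10                  # uniform LBP with P=8 yields 10 bins

    def lbp_hist(img):
        codes = local_binary_pattern(img, P, R, method="uniform")
        h, _ = np.histogram(codes, bins=n_bins, range=(0, n_bins), density=True)
        return h

    rng = np.random.default_rng(0)
    # Hypothetical data: 3 subjects x 20 noisy textured sub-blocks (16x16 pixels).
    feats = np.array([[lbp_hist(0.3 * rng.normal(size=(16, 16))
                                + np.sin(np.linspace(0, np.pi * (s + 1), 16)))
                       for _ in range(20)] for s in range(3)])

    pvals = np.array([kruskal(*feats[:, :, b]).pvalue for b in range(n_bins)])
    print("selected LBP bins:", np.argsort(pvals)[:5])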
Analysis of longitudinal data from animals with missing values using SPSS.
Duricki, Denise A; Soleman, Sara; Moon, Lawrence D F
2016-06-01
Testing of therapies for disease or injury often involves the analysis of longitudinal data from animals. Modern analytical methods have advantages over conventional methods (particularly when some data are missing), yet they are not used widely by preclinical researchers. Here we provide an easy-to-use protocol for the analysis of longitudinal data from animals, and we present a click-by-click guide for performing suitable analyses using the statistical package IBM SPSS Statistics software (SPSS). We guide readers through the analysis of a real-life data set obtained when testing a therapy for brain injury (stroke) in elderly rats. If a few data points are missing, as in this example data set (for example, because of animal dropout), repeated-measures analysis of covariance may fail to detect a treatment effect. An alternative analysis method, such as the use of linear models (with various covariance structures), and analysis using restricted maximum likelihood estimation (to include all available data) can be used to better detect treatment effects. This protocol takes 2 h to carry out.
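Although the protocol itself is written for SPSS, the same model can be fitted in other environments; a rough Python analogue with statsmodels (synthetic data, hypothetical effect sizes) shows the key property that animals with occasional missing time points are not dropped entirely:

    # Linear mixed model with a random intercept per animal, fitted by REML
    # (the statsmodels default), using all available observations.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(3)
    rows = []
    for animal in range(20):
        group = "treated" if animal < 10 else "control"
        for week in range(1, 9):
            if rng.random() < 0.1:            # ~10% missing observations
                continue
            slope = 0.5 if group == "treated" else 0.2
            rows.append(dict(animal=animal, group=group, week=week,
                             score=slope * week + rng.normal()))
    df = pd.DataFrame(rows)

    fit = smf.mixedlm("score ~ week * group", df, groups=df["animal"]).fit()
    print(fit.summary())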
Wald, Lawrence L; Polimeni, Jonathan R
2017-07-01
We review the components of time-series noise in fMRI experiments and the effect of image acquisition parameters on the noise. In addition to helping determine the total amount of signal and noise (and thus temporal SNR), the acquisition parameters have been shown to be critical in determining the ratio of thermal to physiologically induced noise components in the time series. Although limited attention has been given to this latter metric, we show that it determines the degree of spatial correlation seen in the time-series noise. The spatial correlations of the physiological noise component are well known, but recent studies have shown that they can lead to a higher than expected false-positive rate in cluster-wise inference based on the parametric statistical methods used by many researchers. Based on an understanding of the effect of acquisition parameters on the noise mixture, we propose several acquisition strategies that might help reduce this elevated false-positive rate, such as moving to high spatial resolution or using highly accelerated acquisitions where thermal sources dominate. We suggest that the spatial noise correlations at the root of the inflated false-positive rate problem can be limited with these strategies, and that the well-behaved spatial auto-correlation functions (ACFs) assumed by the conventional statistical methods are retained if the high-resolution data are smoothed to conventional resolutions. Copyright © 2017 Elsevier Inc. All rights reserved.
Citirik, Mehmet; Batman, Cosar; Bicer, Tolga; Zilelioglu, Orhan
2009-09-01
To assess the alterations in keratometric astigmatism following 25-gauge transconjunctival sutureless pars plana vitrectomy versus conventional pars plana vitrectomy. Sixteen consecutive patients were enrolled in the study. Conventional vitrectomy was performed in eight of the cases, and 25-gauge transconjunctival sutureless vitrectomy was performed in eight patients. Keratometry was performed before and after surgery. In the 25-gauge transconjunctival sutureless pars plana vitrectomy group, statistically significant changes in corneal curvature were not observed at any post-operative follow-up measurement (p > 0.05), whereas in the conventional pars plana vitrectomy group, statistically significant changes were observed on the first postoperative day (p = 0.01) and at the first postoperative month (p = 0.03). We noted that these changes returned to baseline within three months (p = 0.26). Both 25-gauge transconjunctival sutureless and conventional pars plana vitrectomy are effective surgical modalities for selected diseases of the posterior segment. Surgical procedures are critical for the visual rehabilitation of the patients. The post-operative corneal astigmatism of vitrectomised eyes can be accurately determined at least two months post-operatively.
Ullattuthodi, Sujana; Cherian, Kandathil Phillip; Anandkumar, R; Nambiar, M Sreedevi
2017-01-01
This in vitro study sought to evaluate and compare the marginal and internal fit of cobalt-chromium copings fabricated using conventional and direct metal laser sintering (DMLS) techniques. A master model of a prepared molar tooth was made using cobalt-chromium alloy. A silicone impression of the master model was made, and thirty standardized working models were then produced: twenty working models for the conventional lost-wax technique and ten working models for the DMLS technique. A total of twenty metal copings were fabricated using the two production techniques, conventional lost-wax and DMLS, with ten samples in each group. The conventional and DMLS copings were cemented to the working models using glass ionomer cement. The marginal gap of the copings was measured at four predetermined points. The dies with the cemented copings were sectioned in a standardized manner with a heavy-duty lathe. Each sectioned sample was then analyzed for the internal gap between the die and the metal coping using a metallurgical microscope. Digital photographs were taken at ×50 magnification and analyzed using measurement software. Statistical analysis was done by unpaired t-test and analysis of variance (ANOVA). The results of this study reveal no significant difference in the marginal gap of conventional and DMLS copings (P > 0.05) by ANOVA. The mean internal gap of the DMLS copings was significantly greater than that of the conventional copings (P < 0.05). Within the limitations of this in vitro study, it was concluded that the internal fit of conventional copings was superior to that of DMLS copings. The marginal fit of copings fabricated by the two techniques showed no significant difference.
Training Deep Spiking Neural Networks Using Backpropagation.
Lee, Jun Haeng; Delbruck, Tobi; Pfeiffer, Michael
2016-01-01
Deep spiking neural networks (SNNs) hold the potential for improving the latency and energy efficiency of deep neural networks through data-driven event-based computation. However, training such networks is difficult due to the non-differentiable nature of spike events. In this paper, we introduce a novel technique which treats the membrane potentials of spiking neurons as differentiable signals, where discontinuities at spike times are considered as noise. This enables an error backpropagation mechanism for deep SNNs that follows the same principles as in conventional deep networks but works directly on spike signals and membrane potentials. Compared with previous methods relying on indirect training and conversion, our technique has the potential to capture the statistics of spikes more precisely. We evaluate the proposed framework on artificially generated events from the original MNIST handwritten digit benchmark, and also on the N-MNIST benchmark recorded with an event-based dynamic vision sensor, in which the proposed method reduces the error rate by a factor of more than three compared to the best previous SNN and also achieves a higher accuracy than a conventional convolutional neural network (CNN) trained and tested on the same data. We demonstrate in the context of the MNIST task that, thanks to their event-driven operation, deep SNNs (both fully connected and convolutional) trained with our method achieve accuracy equivalent to that of conventional neural networks. In the N-MNIST example, equivalent accuracy is achieved with about five times fewer computational operations.
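The core trick can be condensed to a custom gradient for the spike nonlinearity; the following PyTorch fragment is an illustration of the principle (the threshold and gradient window are arbitrary choices here), not the authors' full training pipeline:

    # Forward: hard-threshold the membrane potential into a spike (0/1).
    # Backward: treat the membrane potential as the differentiable signal and
    # pass the gradient through near threshold, ignoring the discontinuity.
    import torch

    class SpikeFn(torch.autograd.Function):
        @staticmethod
        def forward(ctx, v_mem, threshold=1.0):
            ctx.save_for_backward(v_mem)
            ctx.threshold = threshold
            return (v_mem >= threshold).float()

        @staticmethod
        def backward(ctx, grad_out):
            (v_mem,) = ctx.saved_tensors
            near = (v_mem - ctx.threshold).abs() < 0.5   # gradient gate
            return grad_out * near.float(), None

    v = torch.randn(4, requires_grad=True)
    spikes = SpikeFn.apply(v)
    spikes.sum().backward()
    print(spikes, v.grad)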
Tsochatzis, Emmanouil D; Bladenopoulos, Konstantinos; Papageorgiou, Maria
2012-06-01
Tocotrienols and tocopherols (tocols) are important phytochemical compounds with antioxidant activity and potential benefits for human health. Among cereals, barley is a good source of tocols. In the present study the effect of two cultivation methods, organic and conventional, on the tocol content in 12 Greek barley varieties was investigated. A validated reverse phase high-performance liquid chromatography method (RP-HPLC) with fluorescence detection (excitation at 292 nm, emission at 335 nm) was applied along with direct solvent extraction with acetonitrile at a 1:30 (w/v) sample/solvent ratio for tocol quantification. The results showed statistically significant differences (P < 0.05) between the two cultivation methods (except for δ-tocopherol) as well as among varieties. In the case of organic cultivation the four homologues of tocotrienol (α-, β + γ- and δ-) increased, by 3.05-37.14% for α-tocotrienol, 15.51-41.09% for (β + γ)-tocotrienol and 30.45-196.61% for δ-tocotrienol, while those of tocopherol (α- and β + γ- but not δ-) decreased, by 5.90-36.34% for α-tocopherol and 2.84-46.49% for (β + γ)-tocopherol. A simple correlation analysis between tocols revealed a good correlation between (β + γ)-tocotrienol and δ-tocotrienol. Although there was a significant decrease in the important α-tocopherol in the varieties studied under organic cultivation, there was an overall increase in tocotrienol content. The cultivation method (organic or conventional) had an important effect on tocotrienol and tocopherol concentrations in barley. An overall increase in total tocol content and a clear increment in the tocotrienol/tocopherol ratio were observed. Copyright © 2012 Society of Chemical Industry.
Kohli, Divyata; Badakar, Chandrashekhar M; Vyavahare, Supriya S; Shah, Parin P; Gokhale, Niraj S; Patel, Punit M; Mundada, Madhura V
2017-01-01
Introduction: Early treatment of carious lesions in children is important for the maintenance of oral health. Multicoloured restorations could be the impetus for an extremely nervous or defiant child to accept dental treatment. Aim: The aim of this study was to assess and compare the clinical success of conventional composite and coloured compomer materials in the first permanent molars of children with mixed dentition. Materials and Methods: A total of sixty sites, divided into two groups of thirty subjects each using a split-mouth design, were chosen from among patients reporting to the Department of Pedodontics and Preventive Dentistry. In the control group, conventional composites were placed; in the experimental group, coloured compomers were placed under a standard operating protocol. Patients were recalled for assessment of clinical success in both the control and experimental groups at regular intervals of one, three, and six months of follow-up, based on the Modified Ryge criteria. Statistical analysis was done using the Chi-square test in SPSS version 20.0 (Chicago, USA). Results: Both conventional composites and coloured compomers had comparable retention rates in terms of anatomical form, marginal integrity, secondary caries and marginal discolouration. Conclusion: The coloured compomer material showed promising results in this six-month follow-up study in permanent molars and had properties comparable to those of conventional composites. PMID:28764297
Lages Barbosa, Guilherme; Almeida Gadelha, Francisca Daiane; Kublik, Natalya; Proctor, Alan; Reichelm, Lucas; Weissinger, Emily; Wohlleb, Gregory M.; Halden, Rolf U.
2015-01-01
The land, water, and energy requirements of hydroponics were compared to those of conventional agriculture by example of lettuce production in Yuma, Arizona, USA. Data were obtained from crop budgets and governmental agricultural statistics, and contrasted with theoretical data for hydroponic lettuce production derived by using engineering equations populated with literature values. Yields of lettuce per greenhouse unit (815 m2) of 41 ± 6.1 kg/m2/y had water and energy demands of 20 ± 3.8 L/kg/y and 90,000 ± 11,000 kJ/kg/y (±standard deviation), respectively. In comparison, conventional production yielded 3.9 ± 0.21 kg/m2/y of produce, with water and energy demands of 250 ± 25 L/kg/y and 1100 ± 75 kJ/kg/y, respectively. Hydroponics offered 11 ± 1.7 times higher yields but required 82 ± 11 times more energy compared to conventionally produced lettuce. To the authors’ knowledge, this is the first quantitative comparison of conventional and hydroponic produce production by example of lettuce grown in the southwestern United States. It identified energy availability as a major factor in assessing the sustainability of hydroponics, and it points to water-scarce settings offering an abundance of renewable energy (e.g., from solar, geothermal, or wind power) as particularly attractive regions for hydroponic agriculture. PMID:26086708
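The headline ratios follow directly from the stated means, as a quick arithmetic check shows:

    # Ratio check from the mean per-kg figures reported above.
    hydro = dict(yield_kg_m2_y=41.0, water_L_kg=20.0, energy_kJ_kg=90000.0)
    conv = dict(yield_kg_m2_y=3.9, water_L_kg=250.0, energy_kJ_kg=1100.0)
    print("yield ratio :", hydro["yield_kg_m2_y"] / conv["yield_kg_m2_y"])  # ~10.5x
    print("water ratio :", conv["water_L_kg"] / hydro["water_L_kg"])        # ~12.5x less water
    print("energy ratio:", hydro["energy_kJ_kg"] / conv["energy_kJ_kg"])    # ~82x more energy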
Natural air leak test without submergence for spontaneous pneumothorax.
Uramoto, Hidetaka; Tanaka, Fumihiro
2011-12-24
Postoperative air leaks are frequent complications after surgery for spontaneous pneumothorax (SP). We herein describe a new method to test for air leaks using a transparent film and thoracic tube in a closed system. Between 2005 and 2010, 35 patients underwent this novel method for evaluating air leaks without submergence, and their clinical records were retrospectively reviewed. Data on patient characteristics, surgical details, and perioperative outcomes were analyzed. Differences in clinical background and intraoperative factors did not reach statistical significance between the new and classical methods. The incidence of recurrence was also equivalent to that of the standard method. However, the length of the operation and the drainage period were significantly shorter in patients evaluated using the new method than the conventional method. Further, no postoperative complications were observed in patients evaluated using the new method. This simple technique is satisfactorily effective and does not result in any complications.
NASA Astrophysics Data System (ADS)
Kivalov, Sergey N.; Fitzjarrald, David R.
2018-02-01
Cloud shadows lead to alternating light and dark periods at the surface, with the most abrupt changes occurring in the presence of low-level forced cumulus clouds. We examine multiyear irradiance time series observed at a research tower in a midlatitude mixed deciduous forest (Harvard Forest, Massachusetts, USA: 42.53°N, 72.17°W) and at a similar tower in a tropical rain forest (Tapajós National Forest, Pará, Brazil: 2.86°S, 54.96°W). We link the durations of these periods statistically to conventional meteorological reports of sky type and cloud height at the two forests and present a method to synthesize the surface irradiance time series from sky-type information. Four classes of events describing distinct sequential irradiance changes at the transition between cloud shadow and direct sunlight are identified: sharp-to-sharp, slow-to-slow, sharp-to-slow, and slow-to-sharp. Lognormal and Weibull statistical distributions distinguish among cloudy-sky types. Observers' qualitative reports of `scattered' and `broken' clouds are quantitatively distinguished by a threshold value of the ratio of mean clear to cloudy period durations. Generated synthetic time series based on these statistics adequately simulate the temporal "radiative forcing" linked to sky type. Our results offer a quantitative way to connect the conventional meteorological sky type to the time series of irradiance experienced at the surface.
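The distribution-fitting step can be illustrated with SciPy (synthetic durations, not the Harvard Forest or Tapajós observations):

    # Fit lognormal and Weibull families to clear-period durations and compare
    # their log-likelihoods; the better-fitting family characterizes the sky type.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    durations = rng.lognormal(mean=3.0, sigma=0.8, size=500)   # seconds, hypothetical

    for name, dist in [("lognormal", stats.lognorm), ("Weibull", stats.weibull_min)]:
        params = dist.fit(durations, floc=0)
        loglik = np.sum(dist.logpdf(durations, *params))
        print(f"{name:9s} log-likelihood = {loglik:.1f}")

The scattered/broken threshold mentioned above would then be applied to the ratio of the fitted mean clear duration to the mean cloudy duration.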
Dinakaran, Shiji
2015-01-01
Background: Cervical lesions of anterior and posterior teeth are a common finding in routine dental practice. They are of much concern to the patient if present in esthetically sensitive regions. Adhesive tooth-colored restorative materials are generally recommended for treating such lesions. The aim of the present study was to evaluate and compare the effect of various food media (lime juice, tea, coffee, and Coca-Cola) on the marginal integrity of Class V compomer (Dyract®), conventional glass-ionomer (Fuji II) and resin-modified glass-ionomer (Fuji II LC Improved) restorations along their cemental and enamel margins, with saline as the control medium. Materials and Methods: After restoration of prepared Class V cavities in human premolars with the three different materials (n = 8), the teeth were immersed in the test media for 7 days and then stained with methylene blue dye. Buccolingual sections were prepared and examined under a stereomicroscope, and scores (0-2) were given. Results: Data were analyzed statistically using one-way analysis of variance in SPSS version 16.0. P < 0.05 was considered statistically significant. Conclusions: Among the three tested materials, the compomer (Dyract®) showed greater marginal integrity than the other two. Microleakage values of Fuji II and Fuji II LC Improved were statistically significant in acidic media (lime juice and Coca-Cola) compared to saline. Enamel margins showed better marginal adaptation than cemental margins. PMID:25878480
Watanabe, Hiroshi C; Kubillus, Maximilian; Kubař, Tomáš; Stach, Robert; Mizaikoff, Boris; Ishikita, Hiroshi
2017-07-21
In the condensed phase, quantum chemical properties such as many-body effects and intermolecular charge fluctuations are critical determinants of the solvation structure and dynamics. Thus, a quantum mechanical (QM) molecular description is required for both solute and solvent to incorporate these properties. However, it is challenging to conduct molecular dynamics (MD) simulations for condensed systems of sufficient scale when adopting QM potentials. To overcome this problem, we recently developed the size-consistent multi-partitioning (SCMP) quantum mechanics/molecular mechanics (QM/MM) method and realized stable and accurate MD simulations using the QM potential on a benchmark system. In the present study, as the first application of the SCMP method, we have investigated the structures and dynamics of Na+, K+, and Ca2+ solutions based on nanosecond-scale sampling, 100 times longer than that of conventional QM-based sampling. Furthermore, we have evaluated two dynamic properties, the diffusion coefficient and difference spectra, with high statistical certainty; the calculation of these properties has not previously been possible within the conventional QM/MM framework. Based on our analysis, we have quantitatively evaluated the quantum chemical solvation effects, which show distinct differences between the cations.
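A diffusion coefficient of the kind evaluated here is conventionally obtained from the slope of the mean-square displacement via the Einstein relation, MSD(t) ≈ 6Dt in three dimensions; the following Python fragment illustrates the estimator on a synthetic random walk (not the SCMP QM/MM trajectories):

    # Estimate D from the linear regime of the MSD of a single 3D trajectory.
    import numpy as np

    rng = np.random.default_rng(5)
    dt = 0.001                                    # ns per step, hypothetical
    traj = np.cumsum(rng.normal(scale=0.05, size=(10000, 3)), axis=0)  # nm

    lags = np.arange(1, 200)
    msd = np.array([np.mean(np.sum((traj[lag:] - traj[:-lag]) ** 2, axis=1))
                    for lag in lags])
    slope = np.polyfit(lags * dt, msd, 1)[0]      # fit MSD vs time
    print(f"D = {slope / 6:.3f} nm^2/ns")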
Bueno, Ana María; Marín, Miguel Ángel; Contento, Ana María; Ríos, Ángel
2016-02-01
A chromatographic method using amperometric detection for the sensitive determination of six representative mutagenic amines was developed. A glassy carbon electrode (GCE) modified with multiwall carbon nanotubes (GCE-CNTs) was prepared and its response compared to that of a conventional glassy carbon electrode. The chromatographic method (HPLC-GCE-CNTs) allowed the separation and determination of heterocyclic aromatic amines (HAAs) classified as mutagenic by the International Agency for Research on Cancer. The new electrode was systematically studied in terms of stability, sensitivity, and reproducibility. Statistical analysis of the obtained data demonstrated that the modified electrode provided better sensitivity than the conventional unmodified one. Detection limits were in the 3.0-7.5 ng/mL range, whereas quantification limits ranged between 9.5 and 25.0 ng/mL. The applicability of the method was demonstrated by determining the amines in several types of samples (water and food samples). Recoveries indicate very good agreement between amounts added and those found for all HAAs (recoveries in the 92-105% range). Copyright © 2015 Elsevier Ltd. All rights reserved.
Ramilo, Andrea; Navas, J Ignacio; Villalba, Antonio; Abollo, Elvira
2013-05-27
Bonamia ostreae and B. exitiosa have caused mass mortalities of various oyster species around the world and co-occur in some European areas. The World Organisation for Animal Health (OIE) has included infections with both species in the list of notifiable diseases. However, official methods for species-specific diagnosis of either parasite have certain limitations. In this study, new species-specific conventional PCR (cPCR) and real-time PCR techniques were developed to diagnose each parasite species. Moreover, a multiplex PCR method was designed to detect both parasites in a single assay. The analytical sensitivity and specificity of each new method were evaluated. These new procedures were compared with 2 OIE-recommended methods, viz. standard histology and PCR-RFLP. The new procedures showed higher sensitivity than the OIE recommended ones for the diagnosis of both species. The sensitivity of tests with the new primers was higher using oyster gills and gonad tissue, rather than gills alone. The lack of a 'gold standard' prevented accurate estimation of sensitivity and specificity of the new methods. The implementation of statistical tools (maximum likelihood method) for the comparison of the diagnostic tests showed the possibility of false positives with the new procedures, although the absence of a gold standard precluded certainty. Nevertheless, all procedures showed negative results when used for the analysis of oysters from a Bonamia-free area.
Bayesian-MCMC-based parameter estimation of stealth aircraft RCS models
NASA Astrophysics Data System (ADS)
Xia, Wei; Dai, Xiao-Xia; Feng, Yuan
2015-12-01
When modeling a stealth aircraft with low RCS (radar cross section), conventional parameter estimation methods may cause a deviation from the actual distribution, because the characteristic parameters are estimated by directly calculating the statistics of the RCS. The Bayesian-Markov chain Monte Carlo (Bayesian-MCMC) method is introduced herein to estimate the parameters so as to improve the fitting accuracy of fluctuation models. The parameter estimations of the lognormal and Legendre polynomial models are reformulated in the Bayesian framework. The MCMC algorithm is then adopted to calculate the parameter estimates. Numerical results show that the distribution curves obtained by the proposed method exhibit improved consistency with the actual ones, compared with those fitted by the conventional method. The fitting accuracy could be improved by no less than 25% for both fluctuation models, which implies that the Bayesian-MCMC method might be a good candidate among optimal parameter estimation methods for stealth aircraft RCS models. Project supported by the National Natural Science Foundation of China (Grant No. 61101173), the National Basic Research Program of China (Grant No. 613206), the National High Technology Research and Development Program of China (Grant No. 2012AA01A308), the State Scholarship Fund of the China Scholarship Council (CSC), the Oversea Academic Training Funds, and the University of Electronic Science and Technology of China (UESTC).
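A minimal Metropolis-Hastings sampler for the lognormal fluctuation model conveys the approach (illustrative Python with synthetic RCS samples; the paper's priors, proposal, and data are not reproduced here):

    # Random-walk Metropolis for the lognormal parameters (mu, sigma),
    # with a flat prior on mu and the constraint sigma > 0.
    import numpy as np

    rng = np.random.default_rng(11)
    logd = np.log(rng.lognormal(mean=-2.0, sigma=0.7, size=400))  # synthetic data

    def log_post(mu, sigma):
        if sigma <= 0:
            return -np.inf
        return -logd.size * np.log(sigma) - np.sum((logd - mu) ** 2) / (2 * sigma**2)

    mu, sigma, lp, chain = 0.0, 1.0, log_post(0.0, 1.0), []
    for _ in range(20000):
        mu_p, sig_p = mu + rng.normal(0, 0.05), sigma + rng.normal(0, 0.05)
        lp_p = log_post(mu_p, sig_p)
        if np.log(rng.random()) < lp_p - lp:       # accept/reject
            mu, sigma, lp = mu_p, sig_p, lp_p
        chain.append((mu, sigma))
    print("posterior means:", np.array(chain[5000:]).mean(axis=0))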
Parallel consensual neural networks.
Benediktsson, J A; Sveinsson, J R; Ersoy, O K; Swain, P H
1997-01-01
A new type of a neural-network architecture, the parallel consensual neural network (PCNN), is introduced and applied in classification/data fusion of multisource remote sensing and geographic data. The PCNN architecture is based on statistical consensus theory and involves using stage neural networks with transformed input data. The input data are transformed several times and the different transformed data are used as if they were independent inputs. The independent inputs are first classified using the stage neural networks. The output responses from the stage networks are then weighted and combined to make a consensual decision. In this paper, optimization methods are used in order to weight the outputs from the stage networks. Two approaches are proposed to compute the data transforms for the PCNN, one for binary data and another for analog data. The analog approach uses wavelet packets. The experimental results obtained with the proposed approach show that the PCNN outperforms both a conjugate-gradient backpropagation neural network and conventional statistical methods in terms of overall classification accuracy of test data.
Kim, Youngwoo; Hong, Byung Woo; Kim, Seung Ja; Kim, Jong Hyo
2014-07-01
A major challenge when distinguishing glandular tissues on mammograms, especially for area-based estimations, lies in determining a boundary on the hazy transition zone from adipose to glandular tissue. This stems from the nature of mammography, which is a projection of superimposed tissues consisting of different structures. In this paper, the authors present a novel segmentation scheme which incorporates the learned prior knowledge of experts into a level set framework for fully automated mammographic density estimation. The authors modeled the learned knowledge as a population-based tissue probability map (PTPM) designed to capture the classification behaviour of experts' visual systems. The PTPM was constructed using an image database of a selected population consisting of 297 cases. Three mammography experts extracted regions of dense and fatty tissue on digital mammograms, an independent subset used to create a tissue probability map for each ROI based on its local statistics. This tissue class probability was taken as a prior in the Bayesian formulation and incorporated into the level set framework as an additional term controlling the evolution, alongside an energy surface designed to reflect the experts' knowledge as well as the regional statistics inside and outside the evolving contour. A subset of 100 digital mammograms, not used in constructing the PTPM, was used to validate the performance. The energy was minimized when the initial contour reached the boundary of the dense and fatty tissues, as defined by experts. The correlation coefficient between mammographic density measurements made by experts and measurements by the proposed method was 0.93, while that with the conventional level set was 0.47. The proposed method showed a marked improvement over the conventional level set method in terms of accuracy and reliability. This result suggests that the proposed method successfully incorporated the learned knowledge of the experts' visual systems and has potential to be used as an automated and quantitative tool for estimation of mammographic breast density levels.
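To make the role of the prior term concrete, a generic Chan-Vese-style region energy augmented with a tissue-prior term can be written as follows (an illustrative form, not necessarily the authors' exact functional):

    E(\phi) = \int_{\Omega} (I - c_1)^2 H(\phi)\,dx
            + \int_{\Omega} (I - c_2)^2 \bigl(1 - H(\phi)\bigr)\,dx
            + \mu \int_{\Omega} |\nabla H(\phi)|\,dx
            - \lambda \int_{\Omega} \log p_{\mathrm{PTPM}}(x)\,H(\phi)\,dx

Here \phi is the level set function, H the Heaviside function, c_1 and c_2 the mean intensities inside and outside the contour, and p_PTPM(x) the population-based tissue probability at pixel x; the last term pulls the evolving contour toward regions that the experts' statistics label as dense tissue.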
Biophotons, coherence and photocount statistics: A critical review
NASA Astrophysics Data System (ADS)
Cifra, Michal; Brouder, Christian; Nerudová, Michaela; Kučera, Ondřej
2015-08-01
Biological samples continuously emit ultra-weak photon emission (UPE, or "biophotons"), which stems from electronic excited states generated chemically during oxidative metabolism and stress. Thus, UPE can potentially serve as a method for non-invasive diagnostics of oxidative processes or, if discovered, also of other processes capable of electron excitation. While the fundamental generating mechanisms of UPE are fairly well elucidated, together with their approximate ranges of intensities and spectra, the statistical properties of UPE are still a highly challenging topic. Here we review claims about nontrivial statistical properties of UPE, such as coherence and squeezed states of light. After an introduction to the necessary theory, we categorize the experimental works of all authors into those with solid, conventional interpretation and those with unconventional and even speculative interpretation. The conclusion of our review is twofold: while the phenomenon of UPE from biological systems can be considered experimentally well established, no reliable evidence for the coherence or nonclassicality of UPE has actually been achieved up to now. Furthermore, we propose perspective avenues in the research of statistical properties of biological UPE.
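For reference, the standard photocount-statistics criterion at stake in such claims is the Mandel Q parameter (a textbook definition, not a result of this review):

    Q = \frac{\langle (\Delta n)^2 \rangle - \langle n \rangle}{\langle n \rangle}

where n is the photocount in a fixed time window. Coherent (Poissonian) light gives Q = 0, classical fluctuating light gives Q > 0, and Q < 0 (sub-Poissonian statistics) is possible only for nonclassical states, which is why claims of UPE nonclassicality hinge on reliably measuring Q below zero.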
In vivo precision of conventional and digital methods for obtaining quadrant dental impressions.
Ender, Andreas; Zimmermann, Moritz; Attin, Thomas; Mehl, Albert
2016-09-01
Quadrant impressions are commonly used as an alternative to full-arch impressions. Digital impression systems provide the ability to take these impressions very quickly; however, few studies have investigated the accuracy of the technique in vivo. The aim of this study is to assess the precision of digital quadrant impressions in vivo in comparison to conventional impression techniques. Impressions were obtained via two conventional (metal full-arch tray, CI, and triple tray, T-Tray) and seven digital impression systems (Lava True Definition Scanner, T-Def; Lava Chairside Oral Scanner, COS; Cadent iTero, ITE; 3Shape Trios, TRI; 3Shape Trios Color, TRC; CEREC Bluecam, Software 4.0, BC4.0; CEREC Bluecam, Software 4.2, BC4.2; and CEREC Omnicam, OC). Impressions were taken three times for each of five subjects (n = 15). The impressions were then superimposed within the test groups. Differences from the model surfaces were measured using a normal surface distance method. Precision was calculated using the Perc90_10 value. The values for all test groups were statistically compared. The precision ranged from 18.8 μm (CI) to 58.5 μm (T-Tray), with the highest precision in the CI, T-Def, BC4.0, TRC, and TRI groups. The deviation pattern varied distinctly depending on the impression method. Impression systems with single-shot capture exhibited greater deviations at the tooth surface, whereas high-frame-rate impression systems differed more in gingival areas. Triple tray impressions displayed higher local deviation at the occlusal contact areas of the upper and lower jaw. Digital quadrant impression methods achieve a level of precision comparable to conventional impression techniques; however, there are significant differences in terms of absolute values and deviation pattern. With all tested digital impression systems, time-efficient capture of quadrant impressions is possible. The clinical precision of digital quadrant impression models is sufficient to cover a broad variety of restorative indications, yet the precision differs significantly between the digital impression systems.
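For clarity, a Perc90_10-style precision value summarizes the spread of the superimposition distances; one common convention is the span between the 90th and 10th percentiles (the exact definition should be checked against the original paper). A hypothetical Python illustration:

    # Percentile-span precision from signed surface distances after
    # superimposing two scans (synthetic distances; real values come from
    # the normal surface distance method named above).
    import numpy as np

    rng = np.random.default_rng(2)
    distances_um = rng.normal(loc=0.0, scale=15.0, size=50000)
    p10, p90 = np.percentile(distances_um, [10, 90])
    print(f"Perc90_10 = {p90 - p10:.1f} um")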
Accuracy of complete-arch dental impressions: a new method of measuring trueness and precision.
Ender, Andreas; Mehl, Albert
2013-02-01
A new approach to both 3-dimensional (3D) trueness and precision is necessary to assess the accuracy of intraoral digital impressions and compare them to conventionally acquired impressions. The purpose of this in vitro study was to evaluate whether a new reference scanner is capable of measuring conventional and digital intraoral complete-arch impressions for 3D accuracy. A steel reference dentate model was fabricated and measured with a reference scanner (digital reference model). Conventional impressions were made from the reference model, poured with Type IV dental stone, scanned with the reference scanner, and exported as digital models. Additionally, digital impressions of the reference model were made and the digital models were exported. Precision was measured by superimposing the digital models within each group. Superimposing the digital models on the digital reference model assessed the trueness of each impression method. Statistical significance was assessed with an independent sample t test (α=.05). The reference scanner delivered high accuracy over the entire dental arch with a precision of 1.6 ±0.6 µm and a trueness of 5.3 ±1.1 µm. Conventional impressions showed significantly higher precision (12.5 ±2.5 µm) and trueness values (20.4 ±2.2 µm) with small deviations in the second molar region (P<.001). Digital impressions were significantly less accurate with a precision of 32.4 ±9.6 µm and a trueness of 58.6 ±15.8 µm (P<.001). More systematic deviations of the digital models were visible across the entire dental arch. The new reference scanner is capable of measuring the precision and trueness of both digital and conventional complete-arch impressions. The digital impression is less accurate and shows a different pattern of deviation than the conventional impression. Copyright © 2013 The Editorial Council of the Journal of Prosthetic Dentistry. Published by Mosby, Inc. All rights reserved.
Charpentier, R.R.; Klett, T.R.
2005-01-01
During the last 30 years, the methodology for assessment of undiscovered conventional oil and gas resources used by the U.S. Geological Survey has undergone considerable change. This evolution has been based on five major principles. First, the U.S. Geological Survey has responsibility for a wide range of U.S. and world assessments and requires a robust methodology suitable for immaturely explored as well as maturely explored areas. Second, the assessments should be based on as comprehensive a set of geological and exploration history data as possible. Third, the perils of methods that solely use statistical methods without geological analysis are recognized. Fourth, the methodology and course of the assessment should be documented as transparently as possible, within the limits imposed by the inevitable use of subjective judgement. Fifth, the multiple uses of the assessments require a continuing effort to provide the documentation in such ways as to increase utility to the many types of users. Undiscovered conventional oil and gas resources are those recoverable volumes in undiscovered, discrete, conventional structural or stratigraphic traps. The USGS 2000 methodology for these resources is based on a framework of assessing numbers and sizes of undiscovered oil and gas accumulations and the associated risks. The input is standardized on a form termed the Seventh Approximation Data Form for Conventional Assessment Units. Volumes of resource are then calculated using a Monte Carlo program named Emc2, but an alternative analytic (non-Monte Carlo) program named ASSESS also can be used. The resource assessment methodology continues to change. Accumulation-size distributions are being examined to determine how sensitive the results are to size-distribution assumptions. The resource assessment output is changing to provide better applicability for economic analysis. The separate methodology for assessing continuous (unconventional) resources also has been evolving. Further studies of the relationship between geologic models of conventional and continuous resources will likely impact the respective resource assessment methodologies. © 2005 International Association for Mathematical Geology.
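The number-and-size framework described above lends itself to a compact Monte Carlo illustration. The sketch below is not the Emc2 or ASSESS code; the accumulation-count range, the lognormal size distribution, and every parameter value are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_assessment_unit(n_trials=10_000):
    """Toy number-and-size Monte Carlo: draw how many undiscovered
    accumulations exist, then draw each accumulation's size and sum.
    All distributions and parameters are illustrative only."""
    totals = np.empty(n_trials)
    for i in range(n_trials):
        n_accum = rng.integers(0, 21)                 # number of accumulations
        sizes = rng.lognormal(mean=3.0, sigma=1.0, size=n_accum)  # size of each
        totals[i] = sizes.sum()                       # total undiscovered volume
    return totals

totals = simulate_assessment_unit()
# Fractiles of the kind conventionally reported in assessments:
print(np.percentile(totals, [5, 50, 95]), totals.mean())
```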
Combining statistical inference and decisions in ecology.
Williams, Perry J; Hooten, Mevin B
2016-09-01
Statistical decision theory (SDT) is a sub-field of decision theory that formally incorporates statistical investigation into a decision-theoretic framework to account for uncertainties in a decision problem. SDT provides a unifying analysis of three types of information: statistical results from a data set, knowledge of the consequences of potential choices (i.e., loss), and prior beliefs about a system. SDT links the theoretical development of a large body of statistical methods, including point estimation, hypothesis testing, and confidence interval estimation. The theory and application of SDT have mainly been developed and published in the fields of mathematics, statistics, operations research, and other decision sciences, but have had limited exposure in ecology. Thus, we provide an introduction to SDT for ecologists and describe its utility for linking the conventionally separate tasks of statistical investigation and decision making in a single framework. We describe the basic framework of both Bayesian and frequentist SDT, its traditional use in statistics, and discuss its application to decision problems that occur in ecology. We demonstrate SDT with two types of decisions: Bayesian point estimation and an applied management problem of selecting a prescribed fire rotation for managing a grassland bird species. Central to SDT, and decision theory in general, are loss functions. Thus, we also provide basic guidance and references for constructing loss functions for an SDT problem. © 2016 by the Ecological Society of America.
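To make the role of loss functions concrete, here is a minimal sketch of Bayesian point estimation as an SDT problem, assuming a simple beta posterior for an occupancy-type probability; the counts, the flat prior, and the asymmetric loss are all invented for illustration and are not from the paper.

```python
import numpy as np
from scipy import stats

# Posterior for a probability after 7 successes in 10 trials with a flat
# Beta(1, 1) prior (illustrative numbers only).
posterior = stats.beta(1 + 7, 1 + 3)

# Under squared-error loss the Bayes estimator is the posterior mean;
# under absolute-error loss it is the posterior median.
est_squared = posterior.mean()
est_absolute = posterior.median()

def bayes_estimate(posterior, loss, grid=np.linspace(0, 1, 201)):
    """Choose the action minimizing posterior expected loss (brute force)."""
    theta = posterior.rvs(size=20_000, random_state=1)
    risk = [np.mean(loss(a, theta)) for a in grid]
    return grid[int(np.argmin(risk))]

# An asymmetric loss (overestimation three times as costly as
# underestimation) pulls the Bayes estimate below the posterior mean.
asym = bayes_estimate(posterior,
                      lambda a, t: np.where(a > t, 3.0, 1.0) * np.abs(a - t))
print(est_squared, est_absolute, asym)
```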
Idris A, Elbakri; Fessler, Jeffrey A
2003-08-07
This paper describes a statistical image reconstruction method for x-ray CT that is based on a physical model that accounts for the polyenergetic x-ray source spectrum and the measurement nonlinearities caused by energy-dependent attenuation. Unlike our earlier work, the proposed algorithm does not require pre-segmentation of the object into the various tissue classes (e.g., bone and soft tissue) and allows mixed pixels. The attenuation coefficient of each voxel is modelled as the product of its unknown density and a weighted sum of energy-dependent mass attenuation coefficients. We formulate a penalized-likelihood function for this polyenergetic model and develop an iterative algorithm for estimating the unknown density of each voxel. Applying this method to simulated x-ray CT measurements of objects containing both bone and soft tissue yields images with significantly reduced beam hardening artefacts relative to conventional beam hardening correction methods. We also apply the method to real data acquired from a phantom containing various concentrations of potassium phosphate solution. The algorithm reconstructs an image with accurate density values for the different concentrations, demonstrating its potential for quantitative CT applications.
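The measurement model at the heart of such an algorithm can be sketched as follows; this shows only the polyenergetic forward model (the penalized-likelihood iterations are omitted), and the spectrum and mass attenuation values are placeholders rather than calibrated data.

```python
import numpy as np

energies = np.array([40.0, 60.0, 80.0, 100.0])   # keV (illustrative)
spectrum = np.array([0.2, 0.4, 0.3, 0.1])        # normalized source weights
mass_atten = {                                   # cm^2/g, placeholder values
    "soft": np.array([0.27, 0.21, 0.18, 0.17]),
    "bone": np.array([0.67, 0.31, 0.22, 0.19]),
}

def expected_counts(lengths, density, frac_bone, i0=1e5):
    """Expected transmitted counts along one ray. Each voxel's attenuation
    is its unknown density times a weighted sum of energy-dependent mass
    attenuation coefficients, so mixed pixels are handled naturally.
    lengths: ray intersection length per voxel (cm);
    density: voxel density (g/cm^3); frac_bone: bone weight per voxel."""
    mu = (np.outer(density * frac_bone, mass_atten["bone"])
          + np.outer(density * (1 - frac_bone), mass_atten["soft"]))
    line_integral = lengths @ mu                 # one value per energy
    return i0 * np.sum(spectrum * np.exp(-line_integral))

# A ray crossing a soft-tissue voxel and a 60% bone mixed pixel:
print(expected_counts(np.array([1.0, 0.5]),
                      np.array([1.0, 1.4]),
                      np.array([0.0, 0.6])))
```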
Basaki, Kinga; Alkumru, Hasan; De Souza, Grace; Finer, Yoav
To assess the three-dimensional (3D) accuracy and clinical acceptability of implant definitive casts fabricated using a digital impression approach and to compare the results with those of a conventional impression method in a partially edentulous condition. A mandibular reference model was fabricated with implants in the first premolar and molar positions to simulate a patient with bilateral posterior edentulism. Ten implant-level impressions per method were made using either an intraoral scanner with scanning abutments for the digital approach or an open-tray technique and polyvinylsiloxane material for the conventional approach. 3D analysis and comparison of implant location on the resultant definitive casts were performed using a laser scanner and quality control software. The inter-implant distances and inter-implant angulations for each implant pair were measured for the reference model and for each definitive cast (n = 20 per group); these measurements were compared to calculate the magnitude of error in 3D for each definitive cast. The influence of implant angulation on definitive cast accuracy was evaluated for both digital and conventional approaches. Statistical analysis was performed using a t test (α = .05) for implant position and angulation. Clinical qualitative assessment of accuracy was done via the assessment of the passivity of a master verification stent for each implant pair, and significance was analyzed using a chi-square test (α = .05). A 3D error of implant positioning was observed for the two impression techniques vs the reference model, with mean ± standard deviation (SD) errors of 116 ± 94 μm and 56 ± 29 μm for the digital and conventional approaches, respectively (P = .01). In contrast, the inter-implant angulation errors were not significantly different between the two techniques (P = .83). Implant angulation did not have a significant influence on definitive cast accuracy within either technique (P = .64). The verification stent demonstrated acceptable passive fit for 11 out of 20 casts and 18 out of 20 casts for the digital and conventional methods, respectively (P = .01). Definitive casts fabricated using the digital impression approach were less accurate than those fabricated from the conventional impression approach for this simulated clinical scenario. A significant number of definitive casts generated by the digital technique did not meet clinically acceptable accuracy for the fabrication of a multiple implant-supported restoration.
Chen, Chih-Hao; Hsu, Chueh-Lin; Huang, Shih-Hao; Chen, Shih-Yuan; Hung, Yi-Lin; Chen, Hsiao-Rong; Wu, Yu-Chung
2015-01-01
Although genome-wide expression analysis has become a routine tool for gaining insight into molecular mechanisms, extraction of information remains a major challenge. It has been unclear why standard statistical methods, such as the t-test and ANOVA, often lead to low levels of reproducibility, how likely applying fold-change cutoffs to enhance reproducibility is to miss key signals, and how adversely using such methods has affected data interpretations. We broadly examined expression data to investigate the reproducibility problem and discovered that molecular heterogeneity, a biological property of genetically different samples, has been improperly handled by the statistical methods. Here we give a mathematical description of the discovery and report the development of a statistical method, named HTA, for better handling molecular heterogeneity. We broadly demonstrate the improved sensitivity and specificity of HTA over the conventional methods and show that using fold-change cutoffs loses much information. We illustrate the special usefulness of HTA for heterogeneous diseases, by applying it to existing data sets of schizophrenia, bipolar disorder and Parkinson’s disease, and show it can abundantly and reproducibly uncover disease signatures not previously detectable. Based on 156 biological data sets, we estimate that the methodological issue has affected over 96% of expression studies and that HTA can profoundly correct 86% of the affected data interpretations. The methodological advancement can better facilitate systems understandings of biological processes, render biological inferences that are more reliable than they have hitherto been and engender more robust translational medical applications, such as diagnostic biomarker identification and drug prediction. PMID:25793610
Zhou, Yunyi; Tao, Chenyang; Lu, Wenlian; Feng, Jianfeng
2018-04-20
Functional connectivity is among the most important tools for studying the brain. The correlation coefficient between time series of different brain areas is the most popular method to quantify functional connectivity. In practical use, the correlation coefficient assumes the data to be temporally independent. However, brain time series data can manifest significant temporal auto-correlation. A widely applicable method is proposed for correcting temporal auto-correlation. We considered two types of time series models: (1) the auto-regressive-moving-average model, and (2) the nonlinear dynamical system model with noisy fluctuations, and derived their respective asymptotic distributions of the correlation coefficient. These two types of models are the most commonly used in neuroscience studies. We show that the respective asymptotic distributions share a unified expression. We have verified the validity of our method and shown that it exhibits sufficient statistical power for detecting true correlations in numerical experiments. Employing our method on a real dataset yields a more robust functional network and higher classification accuracy than conventional methods. Our method robustly controls the type I error while maintaining sufficient statistical power for detecting true correlations in numerical experiments, where existing methods measuring association (linear and nonlinear) fail. In this work, we proposed a widely applicable approach for correcting the effect of temporal auto-correlation on functional connectivity. Empirical results favor the use of our method in functional network analysis. Copyright © 2018. Published by Elsevier B.V.
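The practical effect of such a correction can be illustrated with a standard effective-sample-size adjustment, a Bartlett-type correction in the same spirit as the paper, though not the authors' exact asymptotic derivation:

```python
import numpy as np
from scipy import stats

def autocorr(x, max_lag):
    """Sample auto-correlations at lags 1..max_lag."""
    x = x - x.mean()
    d = np.dot(x, x)
    return np.array([np.dot(x[:-k], x[k:]) / d for k in range(1, max_lag + 1)])

def corrected_corr_test(x, y, max_lag=10):
    """Pearson correlation tested against an effective sample size that
    discounts temporal auto-correlation in both series."""
    n = len(x)
    r = stats.pearsonr(x, y)[0]
    n_eff = n / (1 + 2 * np.sum(autocorr(x, max_lag) * autocorr(y, max_lag)))
    n_eff = min(max(n_eff, 3.0), float(n))       # keep degrees of freedom usable
    t = r * np.sqrt((n_eff - 2) / (1 - r ** 2))
    return r, n_eff, 2 * stats.t.sf(abs(t), df=n_eff - 2)

# Two independent AR(1) series: the naive test over-rejects, while the
# corrected test keeps the type I error near the nominal level.
rng = np.random.default_rng(0)
x = np.zeros(500); y = np.zeros(500)
for i in range(1, 500):
    x[i] = 0.8 * x[i - 1] + rng.normal()
    y[i] = 0.8 * y[i - 1] + rng.normal()
print(corrected_corr_test(x, y))
```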
FMRI group analysis combining effect estimates and their variances
Chen, Gang; Saad, Ziad S.; Nath, Audrey R.; Beauchamp, Michael S.; Cox, Robert W.
2012-01-01
Conventional functional magnetic resonance imaging (FMRI) group analysis makes two key assumptions that are not always justified. First, the data from each subject are condensed into a single number per voxel, under the assumption that the within-subject variance for the effect of interest is the same across all subjects or is negligible relative to the cross-subject variance. Second, it is assumed that all data values are drawn from the same Gaussian distribution with no outliers. We propose an approach that does not make such strong assumptions, and present a computationally efficient frequentist approach to FMRI group analysis, which we term mixed-effects multilevel analysis (MEMA), that incorporates both the variability across subjects and the precision estimate of each effect of interest from individual subject analyses. On average, the more accurate tests result in higher statistical power, especially when conventional variance assumptions do not hold, or in the presence of outliers. In addition, various heterogeneity measures are available with MEMA that may assist the investigator in further improving the modeling. Our method allows group effect t-tests and comparisons among conditions and among groups. In addition, it has the capability to incorporate subject-specific covariates such as age, IQ, or behavioral data. Simulations were performed to illustrate power comparisons and the capability of controlling type I errors among various significance testing methods, and the results indicated that the testing statistic we adopted struck a good balance between power gain and type I error control. Our approach is instantiated in an open-source, freely distributed program that may be used on any dataset stored in the Neuroimaging Informatics Technology Initiative (NIfTI) format. To date, the main impediment to more accurate testing that incorporates both within- and cross-subject variability has been the high computational cost. Our efficient implementation makes this approach practical. We recommend its use in lieu of the less accurate approach in the conventional group analysis. PMID:22245637
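The core idea of weighting each subject by both its within-subject variance and the cross-subject variance can be sketched with a simple moment-based random-effects combiner; MEMA itself uses a more sophisticated estimator and test statistic, so treat this only as an illustration of the weighting principle.

```python
import numpy as np
from scipy import stats

def random_effects_voxel(beta, var_within):
    """Combine per-subject effect estimates `beta` with their per-subject
    variances `var_within` at one voxel: estimate the cross-subject
    variance tau^2 by the moment (DerSimonian-Laird) method, then apply
    inverse-variance weights that include both variance components."""
    w = 1.0 / var_within
    b_fixed = np.sum(w * beta) / np.sum(w)
    q = np.sum(w * (beta - b_fixed) ** 2)             # heterogeneity statistic
    k = len(beta)
    tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
    w_star = 1.0 / (var_within + tau2)
    b = np.sum(w_star * beta) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return b, se, tau2, 2 * stats.norm.sf(abs(b / se))

# Subjects with noisier first-level estimates get down-weighted instead of
# counting the same as precise ones, unlike the conventional approach.
```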
NASA Astrophysics Data System (ADS)
Allawi, Mohammed Falah; Jaafar, Othman; Mohamad Hamzah, Firdaus; Mohd, Nuruol Syuhadaa; Deo, Ravinesh C.; El-Shafie, Ahmed
2017-10-01
Existing forecast models applied for reservoir inflow forecasting encounter several drawbacks, owing to the difficulty the underlying mathematical procedures have in coping with and mimicking the naturalized and stochastic patterns of the inflow data. In this study, appropriate adjustments to the conventional coactive neuro-fuzzy inference system (CANFIS) method are proposed to improve the mathematical procedure, thus enabling better detection of the highly nonlinear patterns found in the reservoir inflow training data. This modification includes the updating of the back-propagation algorithm, leading to a consequent update of the membership rules, and the induction of the centre-weighted set rather than the global weighted set used in feature extraction. The modification also aids in constructing an integrated model that is able to detect not only the nonlinearity in the training data but also the wide range of features within the training data records used to simulate the forecasting model. To demonstrate the model's efficacy, the proposed CANFIS method has been applied to forecast monthly inflow data at Aswan High Dam (AHD), located in southern Egypt. Comparative analyses of the forecasting skill of the modified CANFIS and the conventional ANFIS model are carried out with statistical score indicators to assess the reliability of the developed method. The statistical metrics support the better performance of the developed CANFIS model, which significantly outperforms the ANFIS model, attaining a low relative error value (23%), mean absolute error (1.4 BCM month-1), root mean square error (1.14 BCM month-1), and a relatively large coefficient of determination (0.94). The present study ascertains the better utility of the modified CANFIS model with respect to the traditional ANFIS model applied in reservoir inflow forecasting for a semi-arid region.
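For reference, skill metrics of the kind quoted above can be computed as follows; the exact definitions (particularly of the relative error) may differ from those used in the paper, so this is only a plausible reading.

```python
import numpy as np

def forecast_scores(obs, pred):
    """MAE, RMSE, coefficient of determination and mean relative error
    for a forecast evaluated against observed inflows."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    err = obs - pred
    return {
        "MAE": np.mean(np.abs(err)),
        "RMSE": np.sqrt(np.mean(err ** 2)),
        "R2": 1 - np.sum(err ** 2) / np.sum((obs - obs.mean()) ** 2),
        "RelErr_%": 100 * np.mean(np.abs(err) / np.abs(obs)),
    }
```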
Do flexible acrylic resin lingual flanges improve retention of mandibular complete dentures?
Ahmed Elmorsy, Ayman Elmorsy; Ahmed Ibraheem, Eman Mostafa; Ela, Alaa Aboul; Fahmy, Ahmed; Nassani, Mohammad Zakaria
2015-01-01
Objectives: The aim of this study was to compare the retention of conventional mandibular complete dentures with that of mandibular complete dentures having lingual flanges constructed with flexible acrylic resin “Versacryl.” Materials and Methods: The study sample comprised 10 completely edentulous patients. Each patient received one maxillary complete denture and two mandibular complete dentures. One mandibular denture was made of conventional heat-cured acrylic resin and the other had its lingual flanges made of flexible acrylic resin Versacryl. A digital force-meter was used to measure retention of the mandibular dentures at delivery and at 2 weeks and 45 days following denture insertion. Results: The statistical analysis showed that at baseline and follow-up appointments, retention of mandibular complete dentures with flexible lingual flanges was significantly greater than retention of conventional mandibular dentures (P < 0.05). In both types of mandibular dentures, retention increased significantly over the follow-up period (P < 0.05). Conclusions: The use of flexible acrylic resin lingual flanges in the construction of mandibular complete dentures improved denture retention. PMID:26539387
Simulation-based sensitivity analysis for non-ignorably missing data.
Yin, Peng; Shi, Jian Q
2017-01-01
Sensitivity analysis is popular in dealing with missing data problems, particularly for non-ignorable missingness, where a full-likelihood method cannot be adopted. It analyses how sensitively the conclusions (output) may depend on assumptions or parameters (input) about the missing data, i.e. the missing data mechanism. We call models subject to this uncertainty sensitivity models. To make conventional sensitivity analysis more useful in practice, we need to define some simple and interpretable statistical quantities to assess the sensitivity models and enable evidence-based analysis. In this paper we propose a novel approach that investigates the plausibility of each missing data mechanism model assumption by comparing the simulated datasets from various MNAR models with the observed data non-parametrically, using the K-nearest-neighbour distances. Some asymptotic theory is also provided. A key step of this method is to plug in a plausibility evaluation system for each sensitivity parameter, to select plausible values and reject unlikely values, instead of considering all proposed values of sensitivity parameters as in the conventional sensitivity analysis method. The method is generic and has been applied successfully to several specific models in this paper, including a meta-analysis model with publication bias, analysis of incomplete longitudinal data, and mean estimation with non-ignorable missing data.
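The key comparison step, measuring how close data simulated under a candidate missingness mechanism come to the observed data, can be sketched with a k-nearest-neighbour discrepancy; the paper's plausibility evaluation system and asymptotics are not reproduced here, and the function below is only a generic illustration.

```python
import numpy as np
from scipy.spatial import cKDTree

def knn_discrepancy(observed, simulated, k=5):
    """Mean distance from each observed point to its k nearest neighbours
    in a dataset simulated under one MNAR model; smaller values suggest a
    more plausible sensitivity-parameter value."""
    obs = np.asarray(observed, float).reshape(len(observed), -1)
    sim = np.asarray(simulated, float).reshape(len(simulated), -1)
    dists, _ = cKDTree(sim).query(obs, k=k)
    return dists.mean()

# Rank candidate sensitivity parameters by their discrepancy, keeping the
# plausible values instead of scanning all of them as in the conventional
# sensitivity analysis.
```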
Measurement of the $B^-$ lifetime using a simulation free approach for trigger bias correction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aaltonen, T.; Adelman, J.
2010-04-01
The collection of a large number of B hadron decays to hadronic final states at the CDF II detector is possible due to the presence of a trigger that selects events based on track impact parameters. However, the nature of the selection requirements of the trigger introduces a large bias in the observed proper decay time distribution. A lifetime measurement must correct for this bias, and the conventional approach has been to use a Monte Carlo simulation. The leading sources of systematic uncertainty in the conventional approach are due to differences between the data and the Monte Carlo simulation. In this paper they present an analytic method for bias correction without using simulation, thereby removing any uncertainty between data and simulation. This method is presented in the form of a measurement of the lifetime of the $B^-$ using the mode $B^- \to D^0\pi^-$. The $B^-$ lifetime is measured as $\tau_{B^-} = 1.663 \pm 0.023 \pm 0.015$ ps, where the first uncertainty is statistical and the second systematic. This new method results in a smaller systematic uncertainty in comparison to methods that use simulation to correct for the trigger bias.
InSAR Tropospheric Correction Methods: A Statistical Comparison over Different Regions
NASA Astrophysics Data System (ADS)
Bekaert, D. P.; Walters, R. J.; Wright, T. J.; Hooper, A. J.; Parker, D. J.
2015-12-01
Observing small magnitude surface displacements through InSAR is highly challenging, and requires advanced correction techniques to reduce noise. In fact, one of the largest obstacles facing the InSAR community is related to tropospheric noise correction. Spatial and temporal variations in temperature, pressure, and relative humidity result in a spatially-variable InSAR tropospheric signal, which masks smaller surface displacements due to tectonic or volcanic deformation. Correction methods applied today include those relying on weather model data, GNSS and/or spectrometer data. Unfortunately, these methods are often limited by the spatial and temporal resolution of the auxiliary data. Alternatively a correction can be estimated from the high-resolution interferometric phase by assuming a linear or a power-law relationship between the phase and topography. For these methods, the challenge lies in separating deformation from tropospheric signals. We will present results of a statistical comparison of the state-of-the-art tropospheric corrections estimated from spectrometer products (MERIS and MODIS), a low and high spatial-resolution weather model (ERA-I and WRF), and both the conventional linear and power-law empirical methods. We evaluate the correction capability over Southern Mexico, Italy, and El Hierro, and investigate the impact of increasing cloud cover on the accuracy of the tropospheric delay estimation. We find that each method has its strengths and weaknesses, and suggest that further developments should aim to combine different correction methods. All the presented methods are included in our new open source software package called TRAIN - Toolbox for Reducing Atmospheric InSAR Noise, which is available to the community. Reference: Bekaert, D., R. Walters, T. Wright, A. Hooper, and D. Parker (in review), Statistical comparison of InSAR tropospheric correction techniques, Remote Sensing of Environment.
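As an illustration of the empirical methods mentioned above, a minimal version of the conventional linear phase-topography correction could look like the sketch below; the power-law variant fits a different functional form, and choosing a mask that excludes deforming pixels is the user's responsibility. This is not the TRAIN implementation.

```python
import numpy as np

def linear_tropo_correction(phase, elevation, mask=None):
    """Fit phase = a * elevation + b over (presumed) deformation-free
    pixels of an unwrapped interferogram and subtract the trend."""
    ph = phase if mask is None else phase[mask]
    el = elevation if mask is None else elevation[mask]
    A = np.column_stack([el.ravel(), np.ones(el.size)])
    (a, b), *_ = np.linalg.lstsq(A, ph.ravel(), rcond=None)
    return phase - (a * elevation + b)
```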
NASA Astrophysics Data System (ADS)
Sobradelo, R.; Martí, J.; Mendoza-Rosas, A. T.; Gómez, G.
2012-04-01
The Canary Islands are a densely populated active volcanic region visited by several million tourists every year. Nearly twenty eruptions have been reported in written chronicles in the last 600 years, suggesting that the probability of a new eruption in the near future is far from zero. This shows the importance of assessing and monitoring the volcanic hazard of the region in order to reduce and manage its potential volcanic risk, and ultimately contribute to the design of appropriate preparedness plans. Hence, the probabilistic analysis of the volcanic eruption time series for the Canary Islands is an essential step for the assessment of volcanic hazard and risk in the area. Such a series describes complex processes involving different types of eruptions over different time scales. Here we propose a statistical method for calculating the probabilities of future eruptions which is most appropriate given the nature of the documented historical eruptive data. We first characterise the eruptions by their magnitudes, and then carry out a preliminary analysis of the data to establish the requirements for the statistical method. Past studies of eruptive time series used conventional statistics and treated the series as a homogeneous process. In this paper, we use a method that accounts for the time-dependence of the series and includes rare or extreme events, in the form of a few records of large eruptions, since these data require special methods of analysis. Hence, we use a statistical method from extreme value theory. In particular, we apply a non-homogeneous Poisson process to the historical eruptive data of the Canary Islands to estimate the probability of having at least one volcanic event of a magnitude greater than one in the upcoming years. Shortly after the publication of this method, an eruption took place on the island of El Hierro for the first time in historical times, supporting our method and contributing towards the validation of our results.
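For a non-homogeneous Poisson process the headline probability has a closed form: the chance of at least one event in a window is one minus the exponential of the integrated intensity. A minimal sketch follows, with a made-up intensity function rather than the fitted Canary Islands model.

```python
import numpy as np

def prob_at_least_one(rate_fn, t0, t1, n=10_000):
    """P(N >= 1 on [t0, t1]) = 1 - exp(-integral of the intensity)."""
    t = np.linspace(t0, t1, n)
    r = rate_fn(t)
    integral = np.sum(0.5 * (r[1:] + r[:-1]) * np.diff(t))  # trapezoid rule
    return 1.0 - np.exp(-integral)

# Illustrative intensity (events/year above magnitude 1), not fitted values:
print(prob_at_least_one(lambda t: 0.02 + 1e-4 * t, 0.0, 50.0))
```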
Reproducibility Between Brain Uptake Ratio Using Anatomic Standardization and Patlak-Plot Methods.
Shibutani, Takayuki; Onoguchi, Masahisa; Noguchi, Atsushi; Yamada, Tomoki; Tsuchihashi, Hiroko; Nakajima, Tadashi; Kinuya, Seigo
2015-12-01
The Patlak-plot and conventional methods of determining the brain uptake ratio (BUR) have some problems with reproducibility. We formulated a method of determining BUR using anatomic standardization (BUR-AS) in a statistical parametric mapping algorithm to improve reproducibility. The objective of this study was to demonstrate the inter- and intraoperator reproducibility of mean cerebral blood flow as determined using BUR-AS in comparison to the conventional-BUR (BUR-C) and Patlak-plot methods. The images of 30 patients who underwent brain perfusion SPECT were retrospectively used in this study. The images were reconstructed using ordered-subset expectation maximization and processed using an automated quantitative analysis tool for ECD cerebral blood flow. The mean SPECT count was calculated from axial basal ganglia slices of the normal side (slices 31-40) drawn using a 3-dimensional stereotactic region-of-interest template after anatomic standardization. The mean cerebral blood flow was calculated from the mean SPECT count. Reproducibility was evaluated using the coefficient of variation and Bland-Altman plotting. For both inter- and intraoperator comparisons, the BUR-AS method had the lowest coefficient of variation and the smallest error range in the Bland-Altman plot. Mean cerebral blood flow obtained using the BUR-AS method had the highest reproducibility. Compared with the Patlak-plot and BUR-C methods, the BUR-AS method provides greater inter- and intraoperator reproducibility of cerebral blood flow measurement. © 2015 by the Society of Nuclear Medicine and Molecular Imaging, Inc.
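The two agreement measures used here are standard and easy to reproduce. A sketch for paired readings from two operators (or two sessions), using the usual 1.96-sigma limits of agreement:

```python
import numpy as np

def agreement(readings_a, readings_b):
    """Coefficient of variation (%) for paired repeat measurements plus
    Bland-Altman bias and 95% limits of agreement."""
    a = np.asarray(readings_a, float)
    b = np.asarray(readings_b, float)
    within_sd = np.abs(a - b) / np.sqrt(2)        # SD of each pair of readings
    cv = 100 * np.mean(within_sd / ((a + b) / 2))
    diff = a - b
    bias, sd = diff.mean(), diff.std(ddof=1)
    return cv, bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```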
DOE Office of Scientific and Technical Information (OSTI.GOV)
Letant, S E; Kane, S R; Murphy, G A
2008-05-30
This note presents a comparison of Most-Probable-Number Rapid Viability (MPN-RV) PCR and traditional culture methods for the quantification of Bacillus anthracis Sterne spores in macrofoam swabs generated by the Centers for Disease Control and Prevention (CDC) for a multi-center validation study aimed at testing environmental swab processing methods for recovery, detection, and quantification of viable B. anthracis spores from surfaces. Results show that spore numbers provided by the MPN-RV PCR method were in statistical agreement with the CDC conventional culture method for all three levels of spores tested (10^4, 10^2, and 10 spores), even in the presence of dirt. In addition to detecting low levels of spores in environmental conditions, the MPN-RV PCR method is specific, and compatible with automated high-throughput sample processing and analysis protocols.
NASA Technical Reports Server (NTRS)
Benediktsson, J. A.; Swain, P. H.; Ersoy, O. K.
1993-01-01
Application of neural networks to classification of remote sensing data is discussed. Conventional two-layer backpropagation is found to give good results in classification of remote sensing data but is not efficient in training. A more efficient variant, based on conjugate-gradient optimization, is used for classification of multisource remote sensing and geographic data and very-high-dimensional data. The conjugate-gradient neural networks give excellent performance in classification of multisource data, but do not compare as well with statistical methods in classification of very-high-dimensional data.
The application of satellite data in monitoring strip mines
NASA Technical Reports Server (NTRS)
Sharber, L. A.; Shahrokhi, F.
1977-01-01
Strip mines in the New River Drainage Basin of Tennessee were studied through use of Landsat-1 imagery and aircraft photography. A multilevel analysis, involving conventional photo interpretation techniques, densitometric methods, multispectral analysis and statistical testing, was applied to the data. The Landsat imagery proved adequate for monitoring large-scale change resulting from active mining and land-reclamation projects. However, the spatial resolution of the satellite imagery rendered it inadequate for assessment of many smaller strip mines in the region, which may be as small as a few hectares.
NASA Astrophysics Data System (ADS)
ten Veldhuis, Marie-Claire; Schleiss, Marc
2017-04-01
In this study, we introduce an alternative approach for the analysis of hydrological flow time series, using an adaptive sampling framework based on inter-amount times (IATs). The main difference from conventional flow time series is the rate at which low and high flows are sampled: the unit of analysis for IATs is a fixed flow amount, instead of a fixed time window. We analysed statistical distributions of flows and IATs across a wide range of sampling scales to investigate the sensitivity of statistical properties such as quantiles, variance, skewness, scaling parameters and flashiness indicators to the sampling scale. We did this based on streamflow time series for 17 (semi)urbanised basins in North Carolina, US, ranging from 13 km² to 238 km² in size. Results showed that adaptive sampling of flow time series based on inter-amounts leads to a more balanced representation of low flow and peak flow values in the statistical distribution. While conventional sampling gives a lot of weight to low flows, as these are the most common values in flow time series, IAT sampling gives relatively more weight to high flow values, which accumulate a given flow amount in a shorter time. As a consequence, IAT sampling gives more information about the tail of the distribution associated with high flows, while conventional sampling gives relatively more information about low flow periods. We present results of statistical analyses across a range of subdaily to seasonal scales and highlight some interesting insights that can be derived from IAT statistics with respect to basin flashiness and the impact of urbanisation on hydrological response.
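Computing IATs from a conventional discharge record reduces to inverting the cumulative volume curve. A sketch, assuming positive flows sampled at known times (this is a generic construction, not the authors' code):

```python
import numpy as np

def inter_amount_times(t, q, amount):
    """Times needed to accumulate each successive fixed flow `amount`.
    t: sample times; q: flow rates (assumed positive, so the cumulative
    volume is strictly increasing and can be inverted by interpolation)."""
    dt = np.diff(t)
    volume = np.concatenate([[0.0], np.cumsum(q[:-1] * dt)])
    targets = np.arange(amount, volume[-1], amount)
    crossings = np.interp(targets, volume, t)    # when each target is reached
    return np.diff(np.concatenate([[t[0]], crossings]))

# Short IATs correspond to peak-flow periods and long IATs to low flows, so
# the IAT distribution weights the high-flow tail more than fixed-dt sampling.
```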
Leite-Ribeiro, Patrícia; de Oliveira, Thais Feitosa Leitão; Mathias, Paula; Campo, Elisângela de Jesus; Sarmento, Viviane Almeida
2014-01-01
This study aimed to compare digital techniques for evaluating dental enamel de-/remineralization. Sixty extracted molars were subjected to a process of de- and remineralization. Radiographs were taken before and after each stage. These radiographs were evaluated by the conventional method and were then scanned and analyzed either with or without the use of image enhancement. Moreover, the gray levels (GLs) of the affected areas were measured. All methods exhibited low sensitivity and identical levels of specificity (99.4%). Analysis of the grayscale levels found statistically significant differences between the initial radiographs (P < 0.05). The mean GL of the carious group was significantly lower than that of the remineralized group. The GL did not differ significantly between the initial and final radiographs of the remineralized group, although the mean of the first group was lower than that of the second, which demonstrated that the remineralization process restored the normal density of the dental enamel. Measurement of the mean GL was sufficiently sensitive to detect small alterations in the surface of the enamel.
NASA Technical Reports Server (NTRS)
Schweikhard, W. G.; Dennon, S. R.
1986-01-01
A review of the Melick method of inlet flow dynamic distortion prediction by statistical means is provided. The developments reviewed include the general Melick approach with full dynamic measurements, a limited dynamic measurement approach, and a turbulence modelling approach which requires no dynamic rms pressure fluctuation measurements. These modifications are evaluated by comparing predicted and measured peak instantaneous distortion levels from provisional inlet data sets. A nonlinear mean-line-following vortex model is proposed and evaluated as a potential criterion for improving the peak instantaneous distortion map generated from the conventional linear vortex of the Melick method. The model is simplified to a series of linear vortex segments which lie along the mean line. Maps generated with this new approach are compared with conventionally generated maps, as well as measured peak instantaneous maps. Inlet data sets include subsonic, transonic, and supersonic inlets under various flight conditions.
Li, Changyang; Wang, Xiuying; Eberl, Stefan; Fulham, Michael; Yin, Yong; Dagan Feng, David
2015-01-01
Automated and general medical image segmentation can be challenging because the foreground and the background may have complicated and overlapping density distributions in medical imaging. Conventional region-based level set algorithms often assume piecewise-constant or piecewise-smooth intensities for segments, which is implausible for general medical image segmentation. Furthermore, low contrast and noise make identification of the boundaries between foreground and background difficult for edge-based level set algorithms. Thus, to address these problems, we suggest a supervised variational level set segmentation model that harnesses a statistical region energy functional with a weighted probability approximation. Our approach models the region density distributions by using the mixture-of-mixtures Gaussian model to better approximate real intensity distributions and distinguish statistical intensity differences between foreground and background. The region-based statistical model in our algorithm can intuitively provide better performance on noisy images. We constructed a weighted probability map on graphs to incorporate spatial indications from user input with a contextual constraint based on the minimization of a contextual graph energy functional. We measured the performance of our approach on ten noisy synthetic images and 58 medical datasets with heterogeneous intensities and ill-defined boundaries and compared our technique to the Chan-Vese region-based level set model, the geodesic active contour model with distance regularization, and the random walker model. Our method consistently achieved the highest Dice similarity coefficient when compared to the other methods.
Li, Haoxi; Huang, Yufeng; Cheng, Changzhi; Lin, Zhoudan; Wu, Desheng
2017-04-01
To analyze and confirm the advantages of anterior cervical distraction and screw elevating-pulling reduction that are absent in conventional anterior cervical reduction for traumatic cervical spine fractures and dislocations. A retrospective study was conducted on 86 patients with traumatic cervical spine fractures and dislocations who received one-stage anterior approach treatment for a distraction-flexion injury with bilateral locked facet joints between January 2010 and June 2015. They were 54 males and 32 females with ages ranging from 20 to 73 years (average age, 40.1 ± 5.6 years). These patients were distributed into group A and group B in the sequence of visits, with 44 cases of conventional anterior cervical reduction (group A) and 42 cases of anterior cervical distraction and screw elevating-pulling reduction (group B). Comparison of intraoperative blood loss, operation duration and vertebral reduction rate was made between the two groups. The follow-up time was 12-18 months, and the clinical outcomes of surgery were evaluated according to ASIA, VAS and JOA scores. A statistically significant difference was revealed between group A and group B in the surgical time and the correction rate of cervical spine dislocation (p < 0.05), with the results of group B better than those of group A. For both groups, a statistically significant difference was shown in the ASIA, VAS and JOA scores before and after operation (p < 0.05), with better results after operation, while no statistically significant difference was revealed in such scores between the two groups (p > 0.05), the therapeutic effect of group A being the same as that of group B. Anterior cervical distraction and screw elevating-pulling reduction is simple, with low risk, short operation duration, good intraoperative vertebral reduction and well-recovered function after the operation. Meanwhile, as a safe and effective operation method for cervical spine fractures and dislocations, it can reduce postoperative complications and the risk of iatrogenic cervical spinal cord injury caused by prying or facet joint springing during conventional reduction, having more obvious advantages compared to the conventional surgical reduction adopted in group A, with good cervical spine stability shown in long-term follow-up. Therefore, it is suitable for clinical promotion and application. Copyright © 2017. Published by Elsevier Ltd.
An adaptive two-stage dose-response design method for establishing proof of concept.
Franchetti, Yoko; Anderson, Stewart J; Sampson, Allan R
2013-01-01
We propose an adaptive two-stage dose-response design where a prespecified adaptation rule is used to add and/or drop treatment arms between the stages. We extend the multiple comparison procedures-modeling (MCP-Mod) approach into a two-stage design. In each stage, we use the same set of candidate dose-response models and test for a dose-response relationship or proof of concept (PoC) via model-associated statistics. The stage-wise test results are then combined to establish "global" PoC using a conditional error function. Our simulation studies showed good and more robust power for our design method compared with conventional, fixed designs.
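One standard way to combine stage-wise evidence, shown here only for orientation, is the weighted inverse-normal rule; the paper instead uses a conditional error function, which compares the stage-two p-value against a boundary computed from stage one.

```python
import numpy as np
from scipy import stats

def inverse_normal_combination(p1, p2, w1=0.5):
    """Combine one-sided stage-wise p-values with prespecified weights
    (w1 + w2 = 1) into a single p-value for global proof of concept."""
    z = np.sqrt(w1) * stats.norm.isf(p1) + np.sqrt(1 - w1) * stats.norm.isf(p2)
    return stats.norm.sf(z)

print(inverse_normal_combination(0.04, 0.10))   # illustrative stage p-values
```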
UWB communication receiver feedback loop
Spiridon, Alex; Benzel, Dave; Dowla, Farid U.; Nekoogar, Faranak; Rosenbury, Erwin T.
2007-12-04
A novel technique and structure that maximizes the extraction of information from reference pulses for UWB-TR receivers is introduced. The scheme efficiently processes an incoming signal to suppress different types of UWB as well as non-UWB interference prior to signal detection. Such a method and system adds a feedback loop mechanism to enhance the signal-to-noise ratio of reference pulses in a conventional TR receiver. Moreover, sampling a second-order statistical function such as the autocorrelation function (ACF) of the received signal and matching it to the ACF samples of the original pulses for each transmitted bit provides a more robust UWB communications method and system in the presence of channel distortions.
Jun, Zhang; Juntao, Ge; Shuli, Liu; Li, Long
2016-01-01
PURPOSE: The purpose of this study is to determine whether single-port laparoscopic repair (SLR) for incarcerated inguinal hernia in children is superior to conventional repair (CR) approaches. METHOD: Between March 2013 and September 2013, 126 infants and children treated were retrospectively reviewed. All the patients were divided into three groups. Group A (48 patients) underwent trans-umbilical SLR, group B (36 patients) was subjected to trans-umbilical conventional two-port laparoscopic repair (TLR), while conventional open surgery repair (COR) was performed in group C (42 patients). Data regarding the operating time, bleeding volume, post-operative hydrocele formation, testicular atrophy, cosmetic results, recurrence rate, and duration of hospital stay of the patients were collected. RESULT: All the cases were completed successfully without conversion. The mean operative time for group A was 15 ± 3.9 min and 24 ± 7.2 min for unilateral and bilateral hernia respectively, whereas for group B, it was 13 ± 6.7 min and 23 ± 9.2 min. The mean duration of surgery in group C was 35 ± 5.2 min for unilateral hernia. The recurrence rate was 0% in all three groups. There were statistically significant differences in the operating time, bleeding volume, post-operative hydrocele formation, cosmetic results and duration of hospital stay between the three groups (P < 0.001). No statistically significant differences between SLR and TLR were observed except for the better cosmetic result with SLR. CONCLUSION: SLR is safe and effective, minimally invasive, and is a new technology worth promoting. PMID:27073306
Ratcliffe, J; Van Haselen, R; Buxton, M; Hardy, K; Colehan, J; Partridge, M
2002-01-01
Background: A study was undertaken to investigate the preferences of patients with asthma for attributes or characteristics associated with treatment for their asthma and to investigate the extent to which such preferences may differ between patient subgroups. Methods: The economic technique of conjoint analysis (CA) was used to investigate patients' strength of preference for several key attributes associated with services for the treatment of asthma. A CA questionnaire was administered to two groups of asthma outpatients aged 18 years or older, 150 receiving conventional treatment at Whipps Cross Hospital (WC) and 150 receiving homeopathic treatment at the Royal London Homoeopathic Hospital (RL). Results: An overall response rate of 47% (n=142) was achieved. Statistically significant attributes in influencing preferences for both the WC and RL respondents were (1) the extent to which the doctor gave sufficient time to listen to what the patient has to say, (2) the extent to which the treatment seemed to relieve symptoms, and (3) the travel costs of attending for an asthma consultation. The extent to which the doctor treated the patient as a whole person was also a statistically significant attribute for the RL respondents. Conclusions: This study has shown that aspects associated with the process of delivery of asthma services are important to patients in addition to treatment outcomes. The homeopathic respondents expressed stronger preferences for the doctor to treat them as a whole person than the patients receiving conventional treatment. Overall, the preferences for the attributes included in the study were similar for both groups. PMID:12037224
Minoda, Yukihide; Hata, Kanako; Ikebuchi, Mitsuhiko; Mizokawa, Shigekazu; Ohta, Yoichi; Nakamura, Hiroaki
2017-09-01
Polyethylene wear particle generation is one of the most important factors that affects the mid- to long-term results of total knee arthroplasties (TKA). Mobile-bearing total knee prostheses were developed to reduce polyethylene wear generation. However, whether mobile-bearing prostheses actually generate fewer polyethylene wear particles than fixed-bearing prostheses remains controversial. The aim of this study was to compare, within individual patients, the in vivo polyethylene wear particles created by a newly introduced mobile-bearing prosthesis in one knee and a conventional fixed-bearing prosthesis in the other knee. Eighteen patients receiving bilateral TKAs to treat osteoarthritis were included. The synovial fluid was obtained from 36 knees at an average of 3.5 years after the operation. The in vivo polyethylene wear particles were isolated from the synovial fluid using a previously validated method and examined using a scanning electron microscope and an image analyser. The size and shape of the polyethylene wear particles from the mobile-bearing prostheses were similar to those from the conventional fixed-bearing prostheses. Although the number of wear particles from the mobile-bearing prosthesis (1.63 × 10^7 counts/knee) appeared smaller than that from the fixed-bearing prosthesis (2.16 × 10^7 counts/knee), the difference was not statistically significant. The current in vivo study shows that no statistically significant differences were found between the polyethylene wear particles generated by a newly introduced mobile-bearing PS prosthesis and a conventional fixed-bearing PS prosthesis during the early clinical stage after implantation. Therapeutic study, Level III.
Piezosurgery versus Rotatory Osteotomy in Mandibular Impacted Third Molar Extraction
Bhati, Bharat; Kukreja, Pankaj; Kumar, Sanjeev; Rathi, Vidhi C.; Singh, Kanika; Bansal, Shipra
2017-01-01
Aim: The aim of this study is to compare the piezoelectric surgery versus rotatory osteotomy technique in removal of mandibular impacted third molars. Materials and Methods: The sample comprised 30 patients (18 males, 12 females) with a mean age of 27.43 ± 5.27 years. Bilateral extractions were required in all patients. All the patients were randomly allocated to two groups: in one group, namely the control group, surgical extraction of the mandibular third molar was done using conventional rotatory osteotomy, and in the other group, namely the test group, extraction of the lower third molar was done using a Piezotome. Results: Parameters assessed in this study were mouth opening (interincisal opening), pain (visual analog scale [VAS] score), swelling, incidence of dry socket, paresthesia and duration of surgery in both groups at baseline and on the 1st, 3rd, and 7th postoperative days. Comparing the pain scores of both groups, a statistically significant difference was found between the two groups (P < 0.05). Mean surgical time was longer for the piezosurgery group (51.40 ± 17.9 minutes) compared to the conventional rotatory group (37.33 ± 15.5 minutes), showing a statistically significant difference (P = 0.002). Conclusion: The main advantages of piezosurgery include soft tissue protection, optimal visibility in the surgical field, decreased blood loss, less vibration and noise, increased comfort for the patient, and protection of tooth structures. Therefore, the piezoelectric device was efficient in decreasing the short-term outcomes of pain and swelling; although taking longer than the conventional rotatory technique, it significantly reduces the associated postoperative sequelae of third molar surgery. PMID:28713729
Przylipiak, Andrzej Feliks; Galicka, Elżbieta; Donejko, Magdalena; Niczyporuk, Marek; Przylipiak, Jerzy
2013-01-01
Liposuction is a type of aesthetic surgery that has been performed on humans for decades. There is not much literature addressing pre- and post-surgery blood parameters, although this information is rather interesting. Documentation on patients who received laser-assisted liposuction treatment is particularly scarce. Until now, there has been no literature showing values of platelets, lymphocytes, and neutrophils after liposuction. The aim of this work is to analyze and interpret values of platelets, lymphocytes and neutrophils in patient blood before and after liposuction, a surgery in which an extraordinarily large amount of potent drugs is used. Moreover, the aim is to compare changes in these values between conventional and laser-assisted liposuction patients. We evaluated standard blood samples in patients prior to and after liposuction. This paper covers the number of platelets, lymphocytes, and neutrophils. A total of 54 patients were examined. Moreover, we compared the change in postoperative values in laser-assisted liposuction patients with the change in values in conventional liposuction patients. A paired two-sided Student's t-test was used for statistical evaluation. P < 0.005 was considered statistically significant. Values of platelets were raised both in conventional and in laser-assisted liposuction patients, but this difference was statistically non-significant and platelet levels remained normal, within the range of blood levels in healthy patients. Values of neutrophils rose by up to 79.49% ± 7.74% standard deviation (SD) and values of lymphocytes dropped by up to 12.68% ± 5.61% SD. The before/after variances of conventional tumescent local anesthesia liposuction and the variations in laser-assisted liposuction were similar for all measured parameters; they also showed no statistically significant differences between before and after surgery. The mean value of total operation time without laser assistance was 3 hours 42 minutes (± 57 minutes SD, range 2 hours 50 minutes to 5 hours 10 minutes). Surgeries with laser assistance were on average 16 minutes shorter, with a mean duration of 3 hours 26 minutes (± 45 minutes SD, range 2 hours 40 minutes to 4 hours 10 minutes). The difference was not statistically significant (P < 0.06). The mean value of aspirate volume for liposuctions performed without laser support was 2,618 mL (± 633.7 SD, range 700 mL to 3,500 mL). Mean aspirate volume for liposuctions with laser assistance was increased by up to 61 mL (2,677 mL ± 499.5 SD, range 1,800 mL to 3,500 mL). The difference was not statistically significant (P < 0.71). We conclude that conventional liposuction and laser-assisted liposuction have a similar influence on platelets, lymphocytes, and neutrophils in patients. Moreover, laser-assisted liposuction seems to be less time consuming than conventional liposuction.
Using conventional F-statistics to study unconventional sex-chromosome differentiation.
Rodrigues, Nicolas; Dufresnes, Christophe
2017-01-01
Species with undifferentiated sex chromosomes emerge as key organisms to understand the astonishing diversity of sex-determination systems. Whereas new genomic methods are widening opportunities to study these systems, the difficulty of separately characterizing their X and Y homologous chromosomes poses limitations. Here we demonstrate that two simple F-statistics calculated from sex-linked genotypes, namely the genetic distance (Fst) between sexes and the inbreeding coefficient (Fis) in the heterogametic sex, can be used as reliable proxies to compare sex-chromosome differentiation between populations. We correlated these metrics using published microsatellite data from two frog species (Hyla arborea and Rana temporaria), and show that they intimately relate to the overall amount of X-Y differentiation in populations. However, the fits for individual loci appear highly variable, suggesting that a dense genetic coverage will be needed for inferring fine-scale patterns of differentiation along sex chromosomes. The application of these F-statistics, which imply minimal sampling requirements, significantly facilitates population analyses of sex chromosomes.
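Both proxies are simple to compute from biallelic sex-linked genotypes coded 0/1/2. A minimal sketch for an XY system, where X-Y differentiation shows up as excess male heterozygosity (this is a textbook-style calculation, not the authors' code, and real analyses would average over many loci):

```python
import numpy as np

def sex_linked_fstats(geno_m, geno_f):
    """Fst between sexes and Fis in the heterogametic sex (males, XY)
    for one biallelic locus; genotypes are 0/1/2 reference-allele counts.
    X-Y differentiation drives male Fis negative and between-sex Fst up."""
    geno_m, geno_f = np.asarray(geno_m), np.asarray(geno_f)
    p_m, p_f = np.mean(geno_m) / 2, np.mean(geno_f) / 2
    he = lambda p: 2 * p * (1 - p)               # expected heterozygosity
    hs = (he(p_m) + he(p_f)) / 2                 # mean within-sex diversity
    ht = he((p_m + p_f) / 2)                     # pooled diversity
    fst = 0.0 if ht == 0 else (ht - hs) / ht
    he_m = he(p_m)
    fis_m = 0.0 if he_m == 0 else 1 - np.mean(geno_m == 1) / he_m
    return fst, fis_m
```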
NASA Astrophysics Data System (ADS)
Yi, Yong; Chen, Zhengying; Wang, Liming
2018-05-01
Corona-originated discharge on DC transmission lines is the main reason for the radiated electromagnetic interference (EMI) field in the vicinity of transmission lines. A joint time-frequency analysis technique is proposed to extract the radiated EMI current (excitation current) of DC corona based on statistical measurements of the corona current. A reduced-scale experimental platform was set up to measure the statistical distributions of the current waveform parameters of an aluminum conductor steel-reinforced (ACSR) conductor. Based on the measured results, the peak, root-mean-square and average values of the 0.5 MHz radiated EMI current with 9 kHz and 200 Hz bandwidths were calculated by the proposed technique and validated against the conventional excitation function method. Radio interference (RI) was calculated based on the radiated EMI current, and a wire-to-plate platform was built to validate the RI computation results. The reason for the certain deviation between the computations and measurements was analyzed in detail.
Clark, D Angus; Bowles, Ryan P
2018-04-23
In exploratory item factor analysis (IFA), researchers may use model fit statistics and commonly invoked fit thresholds to help determine the dimensionality of an assessment. However, these indices and thresholds may mislead as they were developed in a confirmatory framework for models with continuous, not categorical, indicators. The present study used Monte Carlo simulation methods to investigate the ability of popular model fit statistics (chi-square, root mean square error of approximation, the comparative fit index, and the Tucker-Lewis index) and their standard cutoff values to detect the optimal number of latent dimensions underlying sets of dichotomous items. Models were fit to data generated from three-factor population structures that varied in factor loading magnitude, factor intercorrelation magnitude, number of indicators, and whether cross loadings or minor factors were included. The effectiveness of the thresholds varied across fit statistics, and was conditional on many features of the underlying model. Together, results suggest that conventional fit thresholds offer questionable utility in the context of IFA.
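For context, the indices studied are simple functions of the model and baseline chi-squares, and the conventional cutoffs the study questions (roughly RMSEA < .06 and CFI/TLI > .95) were calibrated on continuous-indicator models. A sketch of the standard definitions:

```python
import numpy as np

def rmsea(chi2, df, n):
    """Root mean square error of approximation."""
    return np.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

def cfi(chi2_m, df_m, chi2_0, df_0):
    """Comparative fit index of a model against the baseline (null) model."""
    d_m, d_0 = max(chi2_m - df_m, 0.0), max(chi2_0 - df_0, 0.0)
    denom = max(d_m, d_0)
    return 1.0 if denom == 0 else 1.0 - d_m / denom

def tli(chi2_m, df_m, chi2_0, df_0):
    """Tucker-Lewis index."""
    return ((chi2_0 / df_0) - (chi2_m / df_m)) / ((chi2_0 / df_0) - 1.0)
```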
NASA Astrophysics Data System (ADS)
Mandelis, Andreas; Zhang, Yu; Melnikov, Alexander
2012-09-01
A solar cell lock-in carrierographic image generation theory based on the concept of non-equilibrium radiation chemical potential was developed. An optoelectronic diode expression was derived linking the emitted radiative recombination photon flux (current density), the solar conversion efficiency, and the external load resistance via the closed- and/or open-circuit photovoltage. The expression was shown to be of a structure similar to the conventional electrical photovoltaic I-V equation, thereby allowing the carrierographic image to be used in a quantitative statistical pixel brightness distribution analysis, with the outcome being the non-contacting measurement of mean values of these important parameters averaged over the entire illuminated solar cell surface. This is the optoelectronic equivalent of the electrical (contacting) measurement method using an external resistor circuit and the outputs of the solar cell electrode grid, the latter acting as an averaging distribution network over the surface. The statistical theory was confirmed using multi-crystalline Si solar cells.
López-Jornet, Pía
2013-01-01
Objective: The aim of this study was to compare conventional surgery with carbon dioxide (CO2) laser in patients with oral leukoplakia, and to evaluate postoperative pain and swelling. Study design: A total of 48 patients (27 males and 21 females) with a mean age of 53.7 ± 11.7 years and diagnosed with oral leukoplakia were randomly assigned to receive treatment either with conventional surgery using a cold knife or with a CO2 laser technique. A visual analog scale (VAS) was used to score pain and swelling at different postoperative time points. Results: Pain and swelling reported by the patients were greater with the conventional cold knife than with the CO2 laser; statistically significant differences for pain and swelling were observed between the two techniques during the first three days after surgery, followed by a gradual decrease over one week. In neither group was granuloma formation observed, and none of the patients showed malignant transformation during the period of follow-up. Conclusions: The CO2 laser causes only minimal pain and swelling, thus suggesting that it may be an alternative method to conventional surgery in treating patients with oral leukoplakia. Key words: Oral leukoplakia, treatment, laser surgery, cold knife, pain, swelling. PMID:23229239
Mokhtari, Negar; Shirazi, Alireza-Sarraf; Ebrahimi, Masoumeh
2017-11-01
Techniques with adequate accuracy of working length determination, along with shorter treatment duration, seem essential in pediatric pulpectomy procedures. The aim of the present study was to evaluate the accuracy of root canal length measurement with the Root ZX II apex locator and a rotary system in pulpectomy of primary teeth. In this randomized controlled clinical trial, complete pulpectomy was performed on 80 mandibular primary molars in 80 children aged 4-6 years. The study population was randomly divided into case and control groups. In the control group conventional pulpectomy was performed, and in the case group the working length was determined with the electronic apex locator Root ZX II and the canals were instrumented with Mtwo rotary files. Statistical evaluation was performed using Mann-Whitney and Chi-Square tests (P < 0.05). There were no significant differences between the electronic apex locator Root ZX II and the conventional method in the accuracy of root canal length determination. However, significantly less time was needed for instrumentation with rotary files (P = 0.000). Considering the comparable accuracy of root canal length determination and the considerably shorter instrumentation time, the Root ZX II apex locator and rotary system may be suggested for pulpectomy in primary molar teeth. Key words: Rotary technique, conventional technique, pulpectomy, primary teeth.
Is the authoritative parenting model effective in changing oral hygiene behavior in adolescents?
Brukienė, Vilma; Aleksejūnienė, Jolanta
2012-12-01
This study examined whether the authoritative parenting model (APM) is more effective than conventional approaches for changing adolescent oral hygiene behavior. A total of 247 adolescents were recruited using a cluster random-sampling method. Subject groups were randomly allocated into an intervention group (APM-based interventions), a Control Group 1 (conventional dental education and behavior modification) or a Control Group 2 (conventional behavior modification). The results were assessed after 3 and 12 months. Oral hygiene level was assessed as percent dental plaque and the ratio of plaque percent change (RPC). At the 3-month follow-up, there were significant differences among the groups; the APM group had the largest decrease in plaque levels (24.5%), Control Group 1 showed a decrease in plaque levels of 15.4% and Control Group 2 showed an increase in plaque levels of 2.8%. At the 12-month follow-up, an improvement was observed in all groups, but there were no statistically significant differences among the groups. In the short term, the intervention based on the APM was more effective in changing adolescent oral hygiene behavior compared with the conventional approaches. The reasons for long-term positive change after discontinued interventions in control groups need to be explored in future studies.
NASA Astrophysics Data System (ADS)
Sudhakar, Beeravelli; Krishna, Mylangam Chaitanya; Murthy, Kolapalli Venkata Ramana
2016-01-01
The aim of the present study was to formulate and evaluate ritonavir-loaded stealth liposomes using a 3² factorial design, intended for parenteral delivery. Liposomes were prepared by the ethanol injection method according to the 3² factorial design and characterized for various physicochemical parameters such as drug content, size, zeta potential, entrapment efficiency and in vitro drug release. The optimization process was carried out using desirability and overlay plots. The selected formulation was subjected to PEGylation using 10 % PEG-10000 solution. Stealth liposomes were characterized for the above-mentioned parameters along with surface morphology, Fourier transform infrared spectroscopy, differential scanning calorimetry, stability and in vivo pharmacokinetic studies in rats. Stealth liposomes showed better results compared to conventional liposomes due to the effect of PEG-10000. The in vivo studies revealed that stealth liposomes had a longer residence time than conventional liposomes and pure drug solution. The conventional liposomes and pure drug showed dose-dependent pharmacokinetics, whereas stealth liposomes showed a long circulation half-life compared to conventional liposomes and pure ritonavir solution. Statistical analysis by one-way ANOVA showed a significant difference (p < 0.05). The results of the present study revealed that stealth liposomes are a promising tool in antiretroviral therapy.
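As a rough illustration of the experimental layout, a 3² full factorial design simply crosses two factors at three levels each, giving nine formulation runs; the factor names and levels below are hypothetical, since the abstract does not specify them.

```python
from itertools import product

# Hypothetical factors and levels for a 3^2 full factorial design:
# two factors at three levels each give 3 * 3 = 9 runs.
factors = {
    "lipid_conc_pct": [2.0, 4.0, 6.0],       # assumed levels, % w/v
    "drug_lipid_ratio": [0.05, 0.10, 0.15],  # assumed levels
}
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
for i, run in enumerate(runs, 1):
    print(f"run {i}: {run}")   # 9 runs in total
```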
Möltgen, C-V; Herdling, T; Reich, G
2013-11-01
This study demonstrates an approach, using science-based calibration (SBC), for direct coating thickness determination on heart-shaped tablets in real-time. Near-Infrared (NIR) spectra were collected during four full industrial pan coating operations. The tablets were coated with a thin hydroxypropyl methylcellulose (HPMC) film up to a film thickness of 28 μm. The application of SBC permits the calibration of the NIR spectral data without costly determined reference values, because SBC combines classical methods to estimate the coating signal with statistical methods to estimate the noise. The approach enabled the use of NIR for the real-time measurement of the film thickness increase from around 8 to 28 μm in four independent batches. The developed model provided a spectroscopic limit of detection for the coating thickness of 0.64 ± 0.03 μm root-mean square (RMS). In commonly used statistical calibration methods, such as Partial Least Squares (PLS), sufficiently varying reference values are needed; for thin non-functional coatings this is a challenge because the quality of the model depends on the accuracy of the selected calibration standards. The straightforward SBC approach eliminates many of the problems associated with conventional statistical methods and offers an alternative for multivariate calibration. Copyright © 2013 Elsevier B.V. All rights reserved.
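A minimal sketch of the SBC idea, assuming the regression-vector form reported in the SBC literature (a physically estimated signal spectrum combined with a statistically estimated noise covariance); variable names and the prediction step are illustrative, not this study's exact implementation.

```python
import numpy as np

def sbc_vector(g, Sigma):
    """Science-based calibration regression vector (assumed form).

    g     : (p,) expected spectral response per unit coating thickness,
            estimated from first principles / pure-component spectra.
    Sigma : (p, p) noise covariance, estimated from spectra acquired with
            no thickness variation (e.g., repeated scans of one state).
    """
    Sigma_inv_g = np.linalg.solve(Sigma, g)
    return Sigma_inv_g / (g @ Sigma_inv_g)   # unit response to the signal g

# Illustrative prediction: thickness_hat = (spectrum - mean_spectrum) @ b
```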
Hybrid perturbation methods based on statistical time series models
NASA Astrophysics Data System (ADS)
San-Juan, Juan Félix; San-Martín, Montserrat; Pérez, Iván; López, Rosario
2016-04-01
In this work we present a new methodology for orbit propagation, the hybrid perturbation theory, based on the combination of an integration method and a prediction technique. The former, which can be a numerical, analytical or semianalytical theory, generates an initial approximation that contains some inaccuracies, because, in order to simplify the expressions and subsequent computations, not all the involved forces are taken into account and only low-order terms are considered; moreover, mathematical models of perturbations do not always reproduce physical phenomena with absolute precision. The prediction technique, which can be based on either statistical time series models or computational intelligence methods, is aimed at modelling and reproducing the missing dynamics in the previously integrated approximation. This combination improves the precision of conventional numerical, analytical and semianalytical theories for determining the position and velocity of any artificial satellite or space debris object. In order to validate this methodology, we present a family of three hybrid orbit propagators formed by the combination of three different orders of approximation of an analytical theory and a statistical time series model, and analyse their capability to process the effect produced by the flattening of the Earth. The three considered analytical components are the integration of the Kepler problem, a first-order and a second-order analytical theory, whereas the prediction technique is the same in all three cases, namely an additive Holt-Winters method.
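A minimal sketch of the prediction component under stated assumptions: the error series between a reference propagation and the analytical approximation is modelled with an additive Holt-Winters method, and the forecast is added back to the analytical output. The synthetic error series and the seasonal period (32 samples) are purely illustrative.

```python
import numpy as np
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Toy stand-in for the error series (reference minus analytical theory),
# sampled at a fixed step: slow drift + periodic term + noise.
rng = np.random.default_rng(0)
t = np.arange(600)
eps = 0.002 * t + 0.5 * np.sin(2 * np.pi * t / 32) + rng.normal(0, 0.05, t.size)

# Additive trend and seasonality, matching the paper's Holt-Winters choice;
# the seasonal period here is an assumption.
fit = ExponentialSmoothing(eps, trend="add", seasonal="add",
                           seasonal_periods=32).fit()
eps_forecast = fit.forecast(100)   # predicted "missing dynamics"
# Hybrid prediction = analytical propagation + eps_forecast.
```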
DOE Office of Scientific and Technical Information (OSTI.GOV)
Josse, Florent; Lefebvre, Yannick; Todeschini, Patrick
2006-07-01
Assessing the structural integrity of a nuclear Reactor Pressure Vessel (RPV) subjected to pressurized-thermal-shock (PTS) transients is extremely important to safety. In addition to conventional deterministic calculations to confirm RPV integrity, Electricite de France (EDF) carries out probabilistic analyses. Probabilistic analyses are interesting because some key variables, albeit conventionally taken at conservative values, can be modeled more accurately through statistical variability. One variable which significantly affects RPV structural integrity assessment is cleavage fracture initiation toughness. The reference fracture toughness method currently in use at EDF is the RCC-M and ASME Code lower-bound K_IC based on the indexing parameter RT_NDT. However, in order to quantify the toughness scatter for probabilistic analyses, the master curve method is being analyzed at present. Furthermore, the master curve method is a direct means of evaluating fracture toughness based on K_JC data. In the framework of the master curve investigation undertaken by EDF, this article deals with the following two statistical items: building a master curve from an extract of a fracture toughness dataset (from the European project 'Unified Reference Fracture Toughness Design curves for RPV Steels') and controlling statistical uncertainty for both mono-temperature and multi-temperature tests. Concerning the first point, master curve temperature dependence is empirical in nature. To determine the 'original' master curve, Wallin postulated that a unified description of fracture toughness temperature dependence for ferritic steels is possible, and used a large number of data corresponding to nuclear-grade pressure vessel steels and welds. Our working hypothesis is that some ferritic steels may behave in slightly different ways. Therefore we focused exclusively on the basic French reactor vessel metals of types A508 Class 3 and A533 Grade B Class 1, taking the sampling level and direction into account as well as the test specimen type. As for the second point, the emphasis is placed on the uncertainties in applying the master curve approach. For a toughness dataset based on different specimens of a single product, application of the master curve methodology requires the statistical estimation of one parameter: the reference temperature T_0. Because of the limited number of specimens, estimation of this temperature is uncertain. The ASTM standard provides a rough evaluation of this statistical uncertainty through an approximate confidence interval. In this paper, a thorough study is carried out to build more meaningful confidence intervals (for both mono-temperature and multi-temperature tests). These results ensure better control over uncertainty, and allow rigorous analysis of the impact of its influencing factors: the number of specimens and the temperatures at which they have been tested. (authors)
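For orientation, the standard Wallin master curve that the reference temperature T_0 indexes (as standardized in ASTM E1921) describes the median fracture toughness as

$$ K_{Jc,\mathrm{med}}(T) = 30 + 70\,\exp\!\left[0.019\,(T - T_0)\right] \quad \left[\mathrm{MPa}\sqrt{\mathrm{m}},\ T\ \text{in}\ ^\circ\mathrm{C}\right], $$

so the statistical work described above reduces to quantifying the uncertainty of the single estimated parameter T_0.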
Statistical modeling of yield and variance instability in conventional and organic cropping systems
USDA-ARS?s Scientific Manuscript database
Cropping systems research was undertaken to address declining crop diversity and verify competitiveness of alternatives to the predominant conventional cropping system in the northern Corn Belt. To understand and capitalize on temporal yield variability within corn and soybean fields, we quantified ...
Added, Marco Aurélio Nemitalla; Costa, Leonardo Oliveira Pena; Fukuda, Thiago Yukio; de Freitas, Diego Galace; Salomão, Evelyn Cassia; Monteiro, Renan Lima; Costa, Lucíola da Cunha Menezes
2013-10-24
Chronic nonspecific low back pain is a significant health condition with high prevalence worldwide, and it is associated with enormous costs to society. Clinical practice guidelines show that many interventions are available to treat patients with chronic low back pain, but the vast majority of these interventions have a modest effect in reducing pain and disability. An intervention that has become widespread in recent years is the use of elastic bandages called Kinesio Taping. Although Kinesio Taping has been used extensively in clinical practice, current evidence does not support the use of this intervention; however, these conclusions are based on a small number of underpowered studies. Therefore, questions remain about the effectiveness of the Kinesio Taping method as an addition to interventions, such as conventional physiotherapy, that have already been recommended by current clinical practice guidelines, and these questions call for robust and high-quality randomised controlled trials. We aim to determine the effectiveness of adding Kinesio Taping for patients with chronic nonspecific low back pain who receive guideline-endorsed conventional physiotherapy. One hundred and forty-eight patients will be randomly allocated to receive either conventional physiotherapy, which consists of a combination of manual therapy techniques, general exercises, and specific stabilisation exercises (Guideline-Endorsed Conventional Physiotherapy Group), or conventional physiotherapy with the addition of Kinesio Taping applied to the lumbar spine (Conventional Physiotherapy plus Kinesio Taping Group), over a period of 5 weeks (10 sessions of treatment). Clinical outcomes (pain intensity, disability and global perceived effect) will be collected at baseline and at 5 weeks, 3 months, and 6 months after randomisation. We will also collect satisfaction with care and adverse effects after treatment. Data will be collected by a blinded assessor. All statistical analyses will be conducted following the principles of intention to treat, and the effects of treatment will be calculated using linear mixed models. The results of this study will provide new information about the usefulness of Kinesio Taping as an additional component of a guideline-endorsed physiotherapy program in patients with chronic nonspecific low back pain.
Paavolainen, Lassi; Acar, Erman; Tuna, Uygar; Peltonen, Sari; Moriya, Toshio; Soonsawad, Pan; Marjomäki, Varpu; Cheng, R Holland; Ruotsalainen, Ulla
2014-01-01
Electron tomography (ET) of biological samples is used to study the organization and the structure of the whole cell and subcellular complexes in great detail. However, projections cannot be acquired over full tilt angle range with biological samples in electron microscopy. ET image reconstruction can be considered an ill-posed problem because of this missing information. This results in artifacts, seen as the loss of three-dimensional (3D) resolution in the reconstructed images. The goal of this study was to achieve isotropic resolution with a statistical reconstruction method, sequential maximum a posteriori expectation maximization (sMAP-EM), using no prior morphological knowledge about the specimen. The missing wedge effects on sMAP-EM were examined with a synthetic cell phantom to assess the effects of noise. An experimental dataset of a multivesicular body was evaluated with a number of gold particles. An ellipsoid fitting based method was developed to realize the quantitative measures elongation and contrast in an automated, objective, and reliable way. The method statistically evaluates the sub-volumes containing gold particles randomly located in various parts of the whole volume, thus giving information about the robustness of the volume reconstruction. The quantitative results were also compared with reconstructions made with widely-used weighted backprojection and simultaneous iterative reconstruction technique methods. The results showed that the proposed sMAP-EM method significantly suppresses the effects of the missing information producing isotropic resolution. Furthermore, this method improves the contrast ratio, enhancing the applicability of further automatic and semi-automatic analysis. These improvements in ET reconstruction by sMAP-EM enable analysis of subcellular structures with higher three-dimensional resolution and contrast than conventional methods.
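A moment-based ellipsoid fit of the kind described can be sketched as follows; this simplified version (covariance eigenvalues of thresholded voxel coordinates) is an assumption about the implementation, not the authors' exact algorithm.

```python
import numpy as np

def elongation(subvolume, threshold):
    """Ratio of minor to major ellipsoid axis for one bright particle.

    An isotropically reconstructed spherical gold particle gives a value
    near 1; missing-wedge smearing along the beam axis pushes it below 1.
    """
    coords = np.argwhere(subvolume > threshold)        # (n, 3) voxel indices
    cov = np.cov((coords - coords.mean(axis=0)).T)     # 3x3 second moments
    eigvals = np.sort(np.linalg.eigvalsh(cov))         # ascending
    return np.sqrt(eigvals[0] / eigvals[-1])           # axis lengths ~ sqrt(eig)
```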
Aparecida de Oliveira, Maria; Abeid Ribeiro, Eliana Guimarães; Morato Bergamini, Alzira Maria; Pereira De Martinis, Elaine Cristina
2010-02-01
Modern lifestyle has markedly changed eating habits worldwide, with an increasing demand for ready-to-eat foods, such as minimally processed fruits and leafy greens. Packaging and storage conditions of those products may favor the growth of psychrotrophic bacteria, including the pathogen Listeria monocytogenes. In this work, minimally processed leafy vegetable samples (n = 162) from the retail market of Ribeirão Preto, São Paulo, Brazil, were tested for the presence or absence of Listeria spp. by the immunoassay Listeria Rapid Test, Oxoid. Two L. monocytogenes-positive and six artificially contaminated samples of minimally processed leafy vegetables were evaluated by the Most Probable Number (MPN) technique, with detection by the classical culture method and also by the culture method combined with real-time PCR (RTi-PCR) for 16S rRNA genes of L. monocytogenes. Positive MPN enrichment tubes were analyzed by RTi-PCR with primers specific for L. monocytogenes using the commercial preparation ABSOLUTE QPCR SYBR Green Mix (ABgene, UK). The real-time PCR assay presented good exclusivity and inclusivity results, and no statistically significant difference was found in comparison with the conventional culture method (p < 0.05). Moreover, RTi-PCR was fast and easy to perform, with MPN results obtained in ca. 48 h for RTi-PCR in comparison to 7 days for the conventional method.
Franco, Érika Mendonça Fernandes; Valarelli, Fabrício Pinelli; Fernandes, João Batista; Cançado, Rodrigo Hermont; de Freitas, Karina Maria Salvatore
2015-01-01
Objective: The aim of this study was to compare torque expression in active and passive self-ligating and conventional brackets. Methods: A total of 300 segments of stainless steel wire 0.019 x 0.025-in and six different brands of brackets (Damon 3MX, Portia, In-Ovation R, Bioquick, Roth SLI and Roth Max) were used. Torque moments were measured at 12°, 24°, 36° and 48°, using a wire torsion device associated with a universal testing machine. The data obtained were compared by analysis of variance followed by the Tukey test for multiple comparisons. Regression analysis was performed by the least-squares method to generate the mathematical equation of the optimal curve for each brand of bracket. Results: Statistically significant differences were observed in the expression of torque among all evaluated bracket brands at all evaluated torsions (p < 0.05). Bioquick presented the lowest torque expression at all tested torsions; in contrast, the Damon 3MX bracket presented the highest torque expression up to 36° torsion. Conclusions: The connection system between wire and bracket (active or passive self-ligating, or conventional with elastic ligature) seems not to interfere with the final torque expression, the latter probably depending on the interaction between the wire and the bracket chosen for orthodontic mechanics. PMID:26691972
Effectiveness of a modified tutorless problem-based learning method in dermatology - a pilot study.
Kaliyadan, F; Amri, M; Dhufiri, M; Amin, T T; Khan, M A
2012-01-01
Problem-Based Learning (PBL) is a student-centred instructional strategy in which students learn in a collaborative manner, the learning process being guided by a facilitator. One of the limitations of conventional PBL in medical education is the need for adequate resources in terms of faculty and time. Our study aimed to compare conventional PBL in dermatology with a modified tutorless PBL in which pre-listed cues and the use of digital media help direct the learning process. Thirty-one fifth-year medical students were divided into two groups: the study group, comprising 16 students, was exposed to the modified PBL, whereas the control group, comprising 15 students, was given the same scenarios and triggers in a conventional tutor-facilitated PBL. Knowledge acquisition and student feedback were assessed using a post-test and a Likert scale-based questionnaire, respectively. The post-test marks showed no statistically significant differences between the two groups. The general feedback regarding the modified PBL was positive and the students felt comfortable with the module. The learning objectives were met satisfactorily in both groups. Modified tutorless PBL modules might be an effective method to incorporate student-centred learning in dermatology without constraints in terms of faculty resources or time. © 2011 The Authors. Journal of the European Academy of Dermatology and Venereology © 2011 European Academy of Dermatology and Venereology.
Kwon, Chang-Il; Kim, Gwangil; Kim, Won Hee; Ko, Kwang Hyun; Hong, Sung Pyo; Jeong, Seok; Lee, Don Haeng
2014-01-01
Background/Aims: In order to reduce the procedure time and the number of accessory changes during endoscopic submucosal dissection (ESD), we developed a novel versatile knife, which combines the advantages of several conventional knives. The aim of this study was to compare the efficacy, safety, and histological quality of ESD performed using this novel versatile knife and a combination of several conventional knives. Methods: This was an in vivo animal study comparing two different modalities of ESD in mini-pigs. The completion time of each resection was documented, and the resected specimens were retrieved and evaluated for completeness. To assess the quality control of the procedures and adverse events, detailed histopathological examinations were performed. Results: A total of 18 specimens were dissected by ESD safely and easily (nine specimens using the new versatile knife; nine specimens using a mix of conventional knives). All resections were completed as en bloc resections. There was no significant difference between the two modalities in procedure time (456 seconds vs. 355 seconds, p=0.258) or cutting speed (1.983 mm²/sec vs. 1.57 mm²/sec, p=1.000). The rate of adverse events and histological quality did not differ statistically between the modalities. Conclusions: ESD with a versatile knife appeared to be an easy, safe, and technically efficient method. PMID:25505721
NASA Technical Reports Server (NTRS)
Trosin, Barry James
2007-01-01
Active flow control was applied at the point of separation of an axisymmetric, backward-facing-step flow. The control was implemented by employing a Helmholtz resonator that was externally driven by an amplitude-modulated, acoustic disturbance from a speaker located upstream of the wind tunnel. The velocity field of the separating/reattaching flow region downstream of the step was characterized using hotwire velocity measurements with and without flow control. Conventional statistics of the data reveal that the separating/reattaching flow is affected by the imposed forcing. Triple decomposition along with conditional averaging was used to distinguish periodic disturbances from random turbulence in the fluctuating velocity component. A significant outcome of the present study is that it demonstrates that amplitude-modulated forcing of the separated flow alters the flow in the same manner as the more conventional method of periodic excitation.
Combining remotely sensed and other measurements for hydrologic areal averages
NASA Technical Reports Server (NTRS)
Johnson, E. R.; Peck, E. L.; Keefer, T. N.
1982-01-01
A method is described for combining measurements of hydrologic variables of various sampling geometries and measurement accuracies to produce an estimated mean areal value over a watershed and a measure of the accuracy of the mean areal value. The method provides a means to integrate measurements from conventional hydrological networks and remote sensing. The resulting areal averages can be used to enhance a wide variety of hydrological applications including basin modeling. The correlation area method assigns weights to each available measurement (point, line, or areal) based on the area of the basin most accurately represented by the measurement. The statistical characteristics of the accuracy of the various measurement technologies and of the random fields of the hydrologic variables used in the study (water equivalent of the snow cover and soil moisture) required to implement the method are discussed.
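An illustrative numerical reading of the weighting idea, under the assumption that each weight grows with the basin area a measurement represents and shrinks with its error variance; the exact published weight definition may differ from this sketch.

```python
import numpy as np

def areal_average(values, rep_areas, error_vars):
    """Weighted mean areal value and a simple accuracy measure.

    values     : measurements (point gauges, flight lines, imagery).
    rep_areas  : basin area each measurement is taken to represent.
    error_vars : error variance of each measurement technology.
    The weight choice (area / variance) is illustrative, not the
    published correlation area scheme.
    """
    w = np.asarray(rep_areas) / np.asarray(error_vars)
    w = w / w.sum()
    mean = w @ np.asarray(values)
    var = np.sum(w**2 * np.asarray(error_vars))   # ignores cross-correlation
    return mean, var

print(areal_average([12.0, 15.0, 13.5], [0.2, 0.5, 0.3], [4.0, 1.0, 2.25]))
```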
Zhu, Xudong; Arman, Bessembayev; Chu, Ju; Wang, Yonghong; Zhuang, Yingping
2017-05-01
To develop an efficient, cost-effective screening process to improve the production of glucoamylase in Aspergillus niger. The cultivation of A. niger was achieved with well-dispersed morphology in 48-deep-well microtiter plates, which increased sample throughput compared to traditional flask cultivation. There was a close negative correlation between glucoamylase activity and the pH of the fermentation broth. A novel high-throughput analysis method using Methyl Orange was developed. When compared to the conventional analysis method using 4-nitrophenyl α-D-glucopyranoside as substrate, a correlation coefficient of 0.96 was obtained by statistical analysis. Using this novel screening method, we acquired a strain with an activity of 2.2 × 10³ U ml⁻¹, a 70% higher yield of glucoamylase than its parent strain.
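Validating such a surrogate assay against the reference method amounts to a correlation check over paired measurements; a minimal sketch with made-up numbers (the arrays are hypothetical, not the study's data):

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical paired readings for the same culture supernatants:
mo_signal = np.array([0.12, 0.31, 0.45, 0.58, 0.74, 0.90])      # Methyl Orange assay
pnpg_units = np.array([210., 540., 800., 1010., 1330., 1600.])  # reference assay

r, p = pearsonr(mo_signal, pnpg_units)
print(f"r = {r:.2f}, p = {p:.3g}")   # an r near 0.96 would match the paper
```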
Zhao, Bo; Yang, Lanju; Xiao, Lei; Sun, Baoquan; Zou, Xianbao; Gao, Dongmei; Jian, Xiandong
2016-01-01
To observe the effect of sodium bicarbonate combined with ulinastatin on cholinesterase activity in patients with acute phoxim pesticide poisoning. A total of 67 eligible patients with acute phoxim pesticide poisoning, who were admitted to the emergency department of the hospital from March 2011 to February 2014, were randomly divided according to treatment into the conventional treatment group (n=34) and the sodium bicarbonate+ulinastatin group (n=35). The conventional treatment group was given thorough gastric lavage with water, while the sodium bicarbonate+ulinastatin group was given gastric lavage with 2% sodium bicarbonate solution. Both groups were given treatments such as catharsis, administration of oxygen, fluid infusion, diuresis, and antidotes such as atropine and pralidoxime methylchloride. On the basis of this comprehensive treatment, patients in the sodium bicarbonate+ulinastatin group were given 5% sodium bicarbonate injection and ulinastatin. The clinical effects of the two groups were compared. The serum cholinesterase activity of the sodium bicarbonate+ulinastatin group was significantly higher than that of the conventional treatment group from the 5th day, and the difference was statistically significant (P<0.05). The total atropine dosage, total pralidoxime methylchloride dosage and hospitalization days were better than those of the conventional treatment group, and the differences were statistically significant (P<0.05). The difference in the time to atropinization between the two groups was not statistically significant (P>0.05). The arterial blood pH and HCO3- of the sodium bicarbonate+ulinastatin group were higher than those of the conventional treatment group, and the difference in HCO3- at the 10th day was statistically significant (P<0.05). Sodium bicarbonate combined with ulinastatin can improve the therapeutic effect and reduce complications in the treatment of acute phoxim pesticide poisoning, and has beneficial effects on the recovery of cholinesterase activity.
Sadoughi, Mohammad Mehdi; Einollahi, Bahram; Baradaran-Rafii, Alireza; Roshandel, Danial; Hasani, Hamidreza; Nazeri, Mehrdad
2018-02-01
To compare the outcomes of conventional and accelerated corneal collagen cross-linking (CXL) in patients with bilateral progressive keratoconus (KC). Fifteen consecutive patients with bilateral progressive KC were enrolled. In each patient, the fellow eyes were randomly assigned to the conventional CXL group (3 mW/cm² for 30 min) or the accelerated CXL (ACXL) group (9 mW/cm² for 10 min). Manifest refraction; uncorrected and corrected distant visual acuity; maximum and mean keratometry; corneal hysteresis (CH) and corneal resistance factor (CRF); endothelial cell density and morphology; central corneal thickness; and wavefront aberrations were measured before and 12 months after CXL. Manifest refraction spherical equivalent and refractive cylinder improved significantly only in the conventional group. Uncorrected and corrected distant visual acuity did not change significantly in either group. There was also no significant change in maximum and mean keratometry after 12 months. There was a significant decrease in central corneal thickness in both groups, which was more prominent in the conventional group. Endothelial cell density was reduced only in the conventional group, but the change was not statistically significant (P = 0.147). CH, CRF, and wavefront aberrations did not change significantly in either group. We did not observe any significant difference in the changes of the variables between the two groups. Accelerated CXL with 9 mW/cm² irradiation for 10 min had similar refractive, visual, keratometric, and aberrometric results, and fewer adverse effects on corneal thickness and endothelial cells, compared with the conventional method after 12 months of follow-up. However, randomized clinical trials with longer follow-up and larger sample sizes are needed.
Seruya, Mitchel; Fisher, Mark; Rodriguez, Eduardo D
2013-11-01
There has been rising interest in computer-aided design/computer-aided manufacturing for preoperative planning and execution of osseous free flap reconstruction. The purpose of this study was to compare outcomes between computer-assisted and conventional fibula free flap techniques for craniofacial reconstruction. A two-center, retrospective review was carried out on patients who underwent fibula free flap surgery for craniofacial reconstruction from 2003 to 2012. Patients were categorized by the type of reconstructive technique: conventional (between 2003 and 2009) or computer-aided design/computer-aided manufacturing (from 2010 to 2012). Demographics, surgical factors, and perioperative and long-term outcomes were compared. A total of 68 patients underwent microsurgical craniofacial reconstruction: 58 conventional and 10 computer-aided design and manufacturing fibula free flaps. By demographics, patients undergoing the computer-aided design/computer-aided manufacturing method were significantly older and had a higher rate of radiotherapy exposure compared with conventional patients. Intraoperatively, the median number of osteotomies was significantly higher (2.0 versus 1.0, p=0.002) and the median ischemia time was significantly shorter (120 minutes versus 170 minutes, p=0.004) for the computer-aided design/computer-aided manufacturing technique compared with conventional techniques; operative times were shorter for patients undergoing the computer-aided design/computer-aided manufacturing technique, although this did not reach statistical significance. Perioperative and long-term outcomes were equivalent for the two groups, notably, hospital length of stay, recipient-site infection, partial and total flap loss, and rate of soft-tissue and bony tissue revisions. Microsurgical craniofacial reconstruction using a computer-assisted fibula flap technique yielded significantly shorter ischemia times amidst a higher number of osteotomies compared with conventional techniques. Therapeutic, III.
Oh, Pyung Chun; Suh, Soon Yong; Kang, Woong Chol; Lee, Kyounghoon; Han, Seung Hwan; Ahn, Taehoon; Shin, Eak Kyun
2016-01-01
Background/Aims: Treatment of coronary in-stent restenosis (ISR) is still associated with a high incidence of recurrence. We aimed to compare the efficacy and safety of drug-eluting balloons (DEB) for the treatment of ISR with those of conventional balloon angioplasty (BA) and drug-eluting stents (DES). Methods: Between January 2006 and May 2012, a total of 177 patients (188 lesions, 64.1 ± 11.7 years old) who underwent percutaneous coronary intervention for ISR were retrospectively enrolled. Clinical outcomes were compared between patients treated with DEB (n = 58, 32.8%), conventional BA (n = 65, 36.7%), or DES (n = 54, 30.5%). The primary end point was a major adverse cardiac event (MACE), defined as a composite of cardiac death, myocardial infarction, and target lesion revascularization (TLR). Results: Baseline characteristics were not different except for a history of previous MI, which was more frequent in patients treated by conventional BA or DES than in patients treated by DEB (40.0% vs. 48.1% vs. 17.2%, respectively, p = 0.002). The total incidences of MACEs were 10.7%, 7.4%, and 15.4% in patients treated by DEB, DES, or conventional BA, respectively (p > 0.05). TLR was more frequent in patients treated by conventional BA than in patients treated by DEB or DES, but this was not statistically significant (10.8% vs. 6.9% vs. 3.7%, p > 0.05 between all group pairs). Conclusions: This study showed that percutaneous coronary intervention using DEB might be a feasible alternative to conventional BA or DES implantation for the treatment of coronary ISR. Further large-scale, randomized studies assessing long-term clinical and angiographic outcomes are needed. PMID:26951915
Vexler, Albert; Yu, Jihnhee
2018-04-13
A common statistical doctrine supported by many introductory courses and textbooks is that t-test type procedures based on normally distributed data points are anticipated to provide a standard in decision-making. In order to motivate scholars to examine this convention, we introduce a simple approach based on graphical tools of receiver operating characteristic (ROC) curve analysis, a well-established biostatistical methodology. In this context, we propose employing a p-value-based method that takes into account the stochastic nature of p-values. Following the modern statistical literature, we address the expected p-value (EPV) as a measure of the performance of decision-making rules. We extend the EPV concept to be considered in terms of the ROC curve technique, which provides expressive evaluations and visualizations of a wide spectrum of properties of testing mechanisms. We show that the conventional power characterization of tests is a partial aspect of the presented EPV/ROC technique. We hope that this explanation convinces researchers of the usefulness of the EPV/ROC approach for depicting different characteristics of decision-making procedures, in light of the growing interest in correct p-value-based applications.
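Since the EPV is simply the mean p-value under a specified alternative, it is easy to approximate by simulation; the effect size and sample size below are assumptions for illustration, not values from the paper.

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(7)
n, effect, reps = 25, 0.6, 20_000   # assumed sample size and effect size

pvals = np.empty(reps)
for i in range(reps):
    x = rng.normal(0.0, 1.0, n)          # control sample under H1
    y = rng.normal(effect, 1.0, n)       # shifted sample under H1
    pvals[i] = ttest_ind(x, y).pvalue

print("EPV:", pvals.mean())                     # expected p-value under H1
print("power at 0.05:", (pvals < 0.05).mean())  # one point of the ROC-type curve
```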
Adjustment of geochemical background by robust multivariate statistics
Zhou, D.
1985-01-01
Conventional analyses of exploration geochemical data assume that the background is a constant or slowly changing value, equivalent to a plane or a smoothly curved surface. However, it is better to regard the geochemical background as a rugged surface, varying with changes in geology and environment. This rugged surface can be estimated from observed geological, geochemical and environmental properties by using multivariate statistics. A method of background adjustment was developed and applied to groundwater and stream sediment reconnaissance data collected from the Hot Springs Quadrangle, South Dakota, as part of the National Uranium Resource Evaluation (NURE) program. Source-rock lithology appears to be a dominant factor controlling the chemical composition of groundwater or stream sediments. The most efficacious adjustment procedure is to regress uranium concentration on selected geochemical and environmental variables for each lithologic unit, and then to delineate anomalies by a common threshold set as a multiple of the standard deviation of the combined residuals. Robust versions of regression and RQ-mode principal components analysis were used rather than ordinary techniques to guard against distortion caused by outliers. Anomalies delineated by this background adjustment procedure correspond with uranium prospects much better than do anomalies delineated by conventional procedures. The procedure should be applicable to geochemical exploration at different scales and for other metals. © 1985.
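A hedged sketch of the adjustment procedure, substituting scikit-learn's Huber estimator for the paper's robust regression; the covariates, threshold multiple, and residual-pooling choice are illustrative.

```python
import numpy as np
from sklearn.linear_model import HuberRegressor

def flag_anomalies(X, u, lithology, k=2.5):
    """Regress U concentration on covariates within each lithologic unit,
    pool the residuals, and flag samples exceeding k * pooled std.

    X : (n, p) geochemical/environmental covariates; u : (n,) U values;
    lithology : (n,) unit labels; k : threshold multiple (assumed).
    """
    resid = np.empty_like(u, dtype=float)
    for unit in np.unique(lithology):
        m = lithology == unit
        model = HuberRegressor().fit(X[m], u[m])   # robust to outliers
        resid[m] = u[m] - model.predict(X[m])
    return resid > k * resid.std()                 # boolean anomaly mask
```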
Taylor, Lisa; Poland, Fiona; Harrison, Peter; Stephenson, Richard
2011-01-01
To evaluate a systematic treatment programme developed by the researcher that targeted aspects of visual functioning affected by visual field deficits following stroke. The study design was a non-equivalent control (conventional) group pretest-posttest quasi-experimental feasibility design, using multisite data collection methods at specified stages. The study was undertaken within three acute hospital settings as outpatient follow-up sessions. Individuals who had visual field deficits three months post stroke were studied. A treatment group received routine occupational therapy and an experimental group received, in addition, a systematic treatment programme. The treatment phase of both groups lasted six weeks. The Nottingham Adjustment Scale, a measure developed specifically for visual impairment, was used as the primary outcome measure. The change in Nottingham Adjustment Scale score was compared between the experimental (n = 7) and conventional (n = 8) treatment groups using the Wilcoxon signed ranks test. The result of Z = -2.028 (P = 0.043) showed that there was a statistically significant difference between the change in Nottingham Adjustment Scale score between both groups. The introduction of the systematic treatment programme resulted in a statistically significant change in the scores of the Nottingham Adjustment Scale.
Latha, Selvanathan; Sivaranjani, Govindhan; Dhanasekaran, Dharumadurai
2017-09-01
Among diverse actinobacteria, Streptomyces is a renowned ongoing source for the production of a large number of secondary metabolites, furnishing immeasurable pharmacological and biological activities. Hence, to meet the demand of new lead compounds for human and animal use, research is constantly targeting the bioprospecting of Streptomyces. Optimization of media components and physicochemical parameters is a plausible approach for the exploration of intensified production of novel as well as existing bioactive metabolites from various microbes, which is usually achieved by a range of classical techniques including one factor at a time (OFAT). However, the major drawbacks of conventional optimization methods have directed the use of statistical optimization approaches in fermentation process development. Response surface methodology (RSM) is one of the empirical techniques extensively used for modeling, optimization and analysis of fermentation processes. To date, several researchers have implemented RSM in different bioprocess optimization accountable for the production of assorted natural substances from Streptomyces in which the results are very promising. This review summarizes some of the recent RSM adopted studies for the enhanced production of antibiotics, enzymes and probiotics using Streptomyces with the intention to highlight the significance of Streptomyces as well as RSM to the research community and industries.
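As a worked illustration of what RSM-based optimization involves, the sketch below fits a second-order response surface to a hypothetical two-factor design and locates its stationary point; all design points and yields are made up, not taken from any cited study.

```python
import numpy as np

# Hypothetical coded design points (x1, x2) and measured yields y,
# e.g. from a central composite design for two medium components.
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1], [0, 0], [0, 0],
              [-1.41, 0], [1.41, 0], [0, -1.41], [0, 1.41]])
y = np.array([62., 71., 65., 80., 85., 84., 60., 78., 66., 74.])

x1, x2 = X[:, 0], X[:, 1]
# Second-order RSM model: y ~ b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
b, *_ = np.linalg.lstsq(A, y, rcond=None)

# Stationary point of the fitted surface (candidate optimum in coded units):
# solve the gradient equations H x = -[b1, b2].
H = np.array([[2 * b[3], b[5]], [b[5], 2 * b[4]]])
x_opt = np.linalg.solve(H, -b[1:3])
print("coefficients:", b.round(2), "stationary point:", x_opt.round(2))
```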
Pore fluids and the LGM ocean salinity-Reconsidered
NASA Astrophysics Data System (ADS)
Wunsch, Carl
2016-03-01
Pore fluid chlorinity/salinity data from deep-sea cores related to the salinity maximum of the last glacial maximum (LGM) are analyzed using estimation methods deriving from linear control theory. With conventional diffusion coefficient values and no vertical advection, results show a very strong dependence upon initial conditions at -100 ky. Earlier inferences that the abyssal Southern Ocean was strongly salt-stratified in the LGM with a relatively fresh North Atlantic Ocean are found to be consistent within uncertainties of the salinity determination, which remain of order ±1 g/kg. However, an LGM Southern Ocean abyss with an important relative excess of salt is an assumption, one not required by existing core data. None of the present results show statistically significant abyssal salinity values above the global average, and results remain consistent, apart from a general increase owing to diminished sea level, with a more conventional salinity distribution having deep values lower than the global mean. The Southern Ocean core does show a higher salinity than the North Atlantic one on the Bermuda Rise at different water depths. Although much more sophisticated models of the pore-fluid salinity can be used, they will only increase the resulting uncertainties, unless considerably more data can be obtained. Results are consistent with complex regional variations in abyssal salinity during deglaciation, but none are statistically significant.
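The forward problem underlying such pore-fluid reconstructions is essentially one-dimensional diffusion of salinity in the sediment column; a minimal sketch under stated assumptions (explicit finite differences, fixed bottom-water salinity at the interface, no vertical advection, and a typical diffusivity supplied by the caller):

```python
import numpy as np

def diffuse_salinity(S0, D, dz, dt, n_steps, s_top):
    """Explicit FTCS integration of dS/dt = D * d2S/dz2 for a pore-fluid
    salinity profile S(z). Top boundary is pinned to the bottom-water
    salinity s_top; the deep boundary is no-flux. Stable for D*dt/dz**2 <= 0.5.
    """
    S = S0.astype(float).copy()
    r = D * dt / dz**2
    assert r <= 0.5, "reduce dt for numerical stability"
    for _ in range(n_steps):
        S[1:-1] += r * (S[2:] - 2 * S[1:-1] + S[:-2])
        S[0] = s_top          # Dirichlet at the sediment-water interface
        S[-1] = S[-2]         # no-flux at depth
    return S
```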
NASA Astrophysics Data System (ADS)
Matsuda, Takashi S.; Nakamura, Takuji; Ejiri, Mitsumu K.; Tsutsumi, Masaki; Shiokawa, Kazuo
2014-08-01
We have developed a new analysis method for obtaining the power spectrum in the horizontal phase velocity domain from airglow intensity image data to study atmospheric gravity waves. This method can deal with extensive amounts of imaging data obtained on different years and at various observation sites without bias caused by different event extraction criteria for the person processing the data. The new method was applied to sodium airglow data obtained in 2011 at Syowa Station (69°S, 40°E), Antarctica. The results were compared with those obtained from a conventional event analysis in which the phase fronts were traced manually in order to estimate horizontal characteristics, such as wavelengths, phase velocities, and wave periods. The horizontal phase velocity of each wave event in the airglow images corresponded closely to a peak in the spectrum. The statistical results of spectral analysis showed an eastward offset of the horizontal phase velocity distribution. This could be interpreted as the existence of wave sources around the stratospheric eastward jet. Similar zonal anisotropy was also seen in the horizontal phase velocity distribution of the gravity waves by the event analysis. Both methods produce similar statistical results about directionality of atmospheric gravity waves. Galactic contamination of the spectrum was examined by calculating the apparent velocity of the stars and found to be limited for phase speeds lower than 30 m/s. In conclusion, our new method is suitable for deriving the horizontal phase velocity characteristics of atmospheric gravity waves from an extensive amount of imaging data.
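The core transform can be sketched as a 3D FFT of the image stack followed by binning of spectral power into horizontal phase-speed bins; this simplified version keeps speed only and omits the paper's preprocessing (star removal, windowing, mapping to geographic coordinates, and directional spectra).

```python
import numpy as np

def phase_velocity_spectrum(images, dx, dt, v_bins):
    """images: (nt, ny, nx) airglow perturbation stack; dx in m, dt in s."""
    nt, ny, nx = images.shape
    P = np.abs(np.fft.fftn(images))**2               # 3D power spectrum
    f = np.fft.fftfreq(nt, dt)                       # temporal frequency (Hz)
    ky = np.fft.fftfreq(ny, dx)                      # cycles per metre
    kx = np.fft.fftfreq(nx, dx)
    F, KY, KX = np.meshgrid(f, ky, kx, indexing="ij")
    kh = np.hypot(KX, KY)                            # horizontal wavenumber
    valid = (kh > 0) & (F > 0)                       # one temporal half-plane
    c = F[valid] / kh[valid]                         # horizontal phase speed (m/s)
    spec, _ = np.histogram(c, bins=v_bins, weights=P[valid])
    return spec                                      # power per phase-speed bin

# e.g. spec = phase_velocity_spectrum(stack, dx=500.0, dt=60.0,
#                                     v_bins=np.linspace(0, 150, 31))
```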
Duque, Jussaro Alves; Duarte, Marco Antonio Hungaro; Canali, Lyz Cristina Furquim; Zancan, Rafaela Fernandes; Vivan, Rodrigo Ricci; Bernardes, Ricardo Affonso; Bramante, Clovis Monteiro
2017-02-01
The aim of this study was to compare the effectiveness of Easy Clean (Easy Dental Equipment, Belo Horizonte, MG, Brazil) in continuous and reciprocating motion, passive ultrasonic irrigation (PUI), the Endoactivator system (Dentsply Maillefer, Ballaigues, Switzerland), and conventional irrigation for debris removal from root canals and isthmuses. Fifty mesial roots of mandibular molars were embedded in epoxy resin using a metal muffle; afterward, the blocks containing the roots were sectioned at 2, 4, and 6 mm from the apex. After instrumentation, the roots were divided into 5 groups (n = 10) for application of the final irrigation protocol using Easy Clean in continuous rotation, Easy Clean in reciprocating motion, PUI, Endoactivator, or conventional irrigation. Scanning electron microscopic images were taken after instrumentation and after the first, second, and third activation of the irrigating solution to evaluate the area of remaining debris with ImageJ software (National Institutes of Health, Bethesda, MD). The protocol of 3 irrigating solution activations for 20 seconds provided better cleaning of the canal and isthmus. On conclusion of all procedures, analysis of the canals showed a statistically significant difference only at 2 mm; Easy Clean in continuous rotation was more efficient than conventional irrigation (P < .05). The largest difference was observed in the isthmus, in which Easy Clean in continuous rotation was more effective than conventional irrigation at all 3 levels analyzed and than the Endoactivator at 4 mm (P < .05). PUI promoted greater cleaning than conventional irrigation at 6 mm (P < .05). There was no statistical difference between Easy Clean in continuous rotation, Easy Clean in reciprocating motion, and PUI (P > .05). Irrigating solution activation methods provided better cleaning of the canal and isthmus, especially Easy Clean used in continuous rotation, and the protocol of 3 activations for 20 seconds favored better cleaning. Copyright © 2016 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
The possibility of application of spiral brain computed tomography to traumatic brain injury.
Lim, Daesung; Lee, Soo Hoon; Kim, Dong Hoon; Choi, Dae Seub; Hong, Hoon Pyo; Kang, Changwoo; Jeong, Jin Hee; Kim, Seong Chun; Kang, Tae-Sin
2014-09-01
Spiral computed tomography (CT), with the advantages of a low radiation dose, shorter scan time, and multidimensional reconstruction, is accepted as an essential diagnostic method for evaluating the degree of injury in severe trauma patients and establishing therapeutic plans. However, conventional sequential CT is preferred over spiral CT for the evaluation of traumatic brain injury (TBI) because of image noise and artifacts. We aimed to compare the diagnostic power of spiral facial CT for TBI with that of conventional sequential brain CT. We retrospectively evaluated the images of 315 trauma patients who underwent both brain CT and facial CT simultaneously. Hemorrhagic traumatic brain injuries such as epidural hemorrhage, subdural hemorrhage, subarachnoid hemorrhage, and contusional hemorrhage were evaluated in both image sets. Statistical analysis used Cohen's κ to compare the agreement between the 2 imaging modalities, along with the sensitivity, specificity, positive predictive value, and negative predictive value of spiral facial CT relative to conventional sequential brain CT. Almost perfect agreement was noted regarding hemorrhagic traumatic brain injuries between spiral facial CT and conventional sequential brain CT (Cohen's κ coefficient, 0.912). Relative to conventional sequential brain CT, the sensitivity, specificity, positive predictive value, and negative predictive value of spiral facial CT were 92.2%, 98.1%, 95.9%, and 96.3%, respectively. In TBI, the diagnostic power of spiral facial CT was equal to that of conventional sequential brain CT. Therefore, expanded spiral facial CT covering the whole frontal lobe can be applied to evaluate TBI in the future. Copyright © 2014 Elsevier Inc. All rights reserved.
Many-body formalism for fermions: The partition function
NASA Astrophysics Data System (ADS)
Watson, D. K.
2017-09-01
The partition function, a fundamental tenet in statistical thermodynamics, contains in principle all thermodynamic information about a system. It encapsulates both microscopic information through the quantum energy levels and statistical information from the partitioning of the particles among the available energy levels. For identical particles, this statistical accounting is complicated by the symmetry requirements of the allowed quantum states. In particular, for Fermi systems, the enforcement of the Pauli principle is typically a numerically demanding task, responsible for much of the cost of the calculations. The interplay of these three elements—the structure of the many-body spectrum, the statistical partitioning of the N particles among the available levels, and the enforcement of the Pauli principle—drives the behavior of mesoscopic and macroscopic Fermi systems. In this paper, we develop an approach for the determination of the partition function, a numerically difficult task, for systems of strongly interacting identical fermions and apply it to a model system of harmonically confined, harmonically interacting fermions. This approach uses a recently introduced many-body method that is an extension of the symmetry-invariant perturbation method (SPT) originally developed for bosons. It uses group theory and graphical techniques to avoid the heavy computational demands of conventional many-body methods which typically scale exponentially with the number of particles. The SPT application of the Pauli principle is trivial to implement since it is done "on paper" by imposing restrictions on the normal-mode quantum numbers at first order in the perturbation. The method is applied through first order and represents an extension of the SPT method to excited states. Our method of determining the partition function and various thermodynamic quantities is accurate and efficient and has the potential to yield interesting insight into the role played by the Pauli principle and the influence of large degeneracies on the emergence of the thermodynamic behavior of large-N systems.
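For orientation, the canonical partition function referred to here is

$$ Z(N,T) = \sum_i g_i\, e^{-E_i/k_B T}, \qquad F = -k_B T \ln Z, $$

where the sum runs over the Pauli-allowed N-fermion energy levels \(E_i\) with degeneracies \(g_i\); the combinatorial growth of this sum with N is what makes direct evaluation numerically demanding.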
Conventional versus computer-navigated TKA: a prospective randomized study.
Todesca, Alessandro; Garro, Luca; Penna, Massimo; Bejui-Hugues, Jacques
2017-06-01
The purpose of this study was to assess the midterm results of total knee arthroplasty (TKA) implanted with a specific computer navigation system in one group of patients (NAV) and the same prosthesis implanted with the conventional technique in another group (CON); we hypothesized that computer-navigated surgery would improve implant alignment, functional scores and survival of the implant compared to the conventional technique. From 2008 to 2009, 225 patients were enrolled in the study and randomly assigned to the CON and NAV groups; 240 consecutive mobile-bearing ultra-congruent score (Amplitude, Valence, France) TKAs were performed by a single surgeon, 117 using the conventional method and 123 using the computer-navigated approach. Clinical outcome assessment was based on the Knee Society Score (KSS), the Hospital for Special Surgery Knee Score and the Western Ontario and McMaster Universities Index score. Component survival was calculated by Kaplan-Meier analysis. Median follow-up was 6.4 years (range 6-7 years). Two patients were lost to follow-up. No differences were seen between the two groups in age, sex, BMI or side of implantation. Three patients in the CON group reported feelings of instability during walking, but clinical tests were all negative. The NAV group showed statistically significantly better KSS scores, wider ROM, and fewer outliers from the neutral mechanical axis, lateral distal femoral angle, medial proximal tibial angle and tibial slope in the post-operative radiographic assessment. There was one case of early post-operative superficial infection (caused by Staph. aureus) successfully treated with antibiotics. No mechanical loosening, mobile-bearing dislocation or patellofemoral complication was seen. At 7 years of follow-up, component survival in relation to the risk of aseptic loosening or other complications was 100%. There were no implant revisions. This study demonstrates superior accuracy in implant positioning and statistically significantly better functional outcomes of computer-navigated TKA. Computer navigation for TKAs should be used routinely in primary implants. II.
Rana, Majeed; Gellrich, Nils-Claudius; Rana, Madiha; Piffkó, Jozsef; Kater, Wolfgang
2013-02-17
Ultrasonic bone-cutting surgery has been introduced as a feasible alternative to the conventional sharp instruments used in craniomaxillofacial surgery because of its precision and safety. The piezosurgery medical device allows the efficient cutting of mineralized tissues with minimal trauma to soft tissues. The piezoelectric osteotome has found a role in surgically assisted rapid maxillary expansion (SARME), a procedure well established to correct transverse maxillary discrepancies, where its advantages include minimal risk to critical anatomic structures. The purpose of this clinical comparative study (CIS 2007-237-M) was to present the advantages of the piezoelectric cut as a minimally invasive device in surgically assisted rapid maxillary expansion by protecting the maxillary sinus mucosal lining. Thirty patients (18 females and 12 males) aged 18 to 54 underwent surgically assisted palatal expansion of the maxilla with a combined orthodontic and surgical approach. The patients were randomly divided into two separate treatment groups: Group 1 received conventional surgery using an oscillating saw, while Group 2 was treated with piezosurgery. The following parameters were examined: blood pressure, blood values, required medication, bleeding level in the maxillary sinus, duration of inpatient stay, duration of surgery and body temperature. The results displayed no statistically significant differences between the two groups regarding laboratory blood values and inpatient stay. The duration of surgery revealed a significant discrepancy: deploying piezosurgery took the surgeon an average of 10 minutes longer than working with the conventional saw technique. However, observation of the bleeding level in the paranasal sinus presented a major and statistically significant advantage of piezosurgery: on average, the bleeding level was one category better than in the conventional group. This method of piezoelectric surgery, with all its advantages, is likely to replace many conventional operating procedures in oral and maxillofacial surgery. CIS 2007-237-M.
NASA Astrophysics Data System (ADS)
Rubel, Aleksey S.; Lukin, Vladimir V.; Egiazarian, Karen O.
2015-03-01
Results of denoising based on discrete cosine transform for a wide class of images corrupted by additive noise are obtained. Three types of noise are analyzed: additive white Gaussian noise and additive spatially correlated Gaussian noise with middle and high correlation levels. TID2013 image database and some additional images are taken as test images. Conventional DCT filter and BM3D are used as denoising techniques. Denoising efficiency is described by PSNR and PSNR-HVS-M metrics. Within hard-thresholding denoising mechanism, DCT-spectrum coefficient statistics are used to characterize images and, subsequently, denoising efficiency for them. Results of denoising efficiency are fitted for such statistics and efficient approximations are obtained. It is shown that the obtained approximations provide high accuracy of prediction of denoising efficiency.
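A minimal sketch of the hard-thresholding DCT filter described, assuming non-overlapping 8×8 blocks and a threshold proportional to the noise standard deviation; practical filters typically use overlapping blocks and tuned constants, so treat the settings here as assumptions.

```python
import numpy as np
from scipy.fft import dctn, idctn

def dct_denoise(img, sigma, block=8, k=2.7):
    """Non-overlapping blockwise DCT hard-thresholding for AWGN of std sigma."""
    out = np.zeros_like(img, dtype=float)
    H, W = img.shape
    for i in range(0, H - block + 1, block):
        for j in range(0, W - block + 1, block):
            coef = dctn(img[i:i + block, j:j + block], norm="ortho")
            dc = coef[0, 0]                       # preserve the block mean
            coef[np.abs(coef) < k * sigma] = 0.0  # hard threshold
            coef[0, 0] = dc
            out[i:i + block, j:j + block] = idctn(coef, norm="ortho")
    return out
```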
Kazemi Yazdi, Haleh; Van Noort, Richard; Mansouri, Mona
2016-01-01
Statement of the Problem: The usage of glass ionomer cements (GICs) restorative materials are very limited due to lack of flexural strength and toughness. Purpose: The aim of this study was to investigate the effect of using a leucite glass on a range of mechanical and optical properties of commercially available conventional glass ionomer cement. Materials and Method: Ball milled 45μm leucite glass particles were incorporated into commercial conventional GIC, Ketac-Molar Easymix (KMEm). The characteristics of the powder particles were observed under scanning electron microscopy. The samples were made for each experimental group; KMEm and lucite- modified Ketac-Molar easy Mix (LMKMEm) according to manufacturer’s instruction then were collected in damp tissue and stored in incubator for 1 hour. The samples were divided into two groups, one stored in distilled water for 24 hours and the others for 1 week.10 samples were made for testing biaxial flexural strength after 1 day and 1 week, with a crosshead speed of 1mm/min, calculated in MPa. The hardness (Vickers hardness tester) of each experimental group was also tested. To evaluate optical properties, 3 samples were made for each experimental group and evaluated with a spectrophotometer. The setting time of modified GIC was measured with Gillmore machine. Result: The setting time in LMKMEm was 8 minutes. The mean biaxial flexural strength was LMKMEm/ 1day: 24.13±4.14 MPa, LMKMEm/ 1 week: 24.22±4.87 MPa KMEm/1day:28.87±6.31 MPa and KMEm/1 week: 26.65±5.82 MPa which were not statistically different from each other. The mean Vickers hardness was LMKMEm: 403±66 Mpa and KMEm: 358±22 MPa; though not statistically different from each other. The mean total transmittance (Tt) was LMKMEm: 15.9±0.7, KMEm: 22.3±1.2, the mean diffuse transmittance (Td) was LMKMEm: 12.2±0.5, KMEm: 18.0±0.5 which were statistically different from each other. Conclusion: Leucite glass can be incorporated with a conventional GIC without interfering with setting time. Yet, it did not improve the mechanical and optical properties of the GIC. PMID:27942546
A Doubly Stochastic Change Point Detection Algorithm for Noisy Biological Signals.
Gold, Nathan; Frasch, Martin G; Herry, Christophe L; Richardson, Bryan S; Wang, Xiaogang
2017-01-01
Experimentally and clinically collected time series data are often contaminated with significant confounding noise, creating short, noisy time series. This noise, due to natural variability and measurement error, poses a challenge to conventional change point detection methods. We propose a novel and robust statistical method for change point detection for noisy biological time sequences. Our method is a significant improvement over traditional change point detection methods, which only examine a potential anomaly at a single time point. In contrast, our method considers all suspected anomaly points and considers the joint probability distribution of the number of change points and the elapsed time between two consecutive anomalies. We validate our method with three simulated time series, a widely accepted benchmark data set, two geological time series, a data set of ECG recordings, and a physiological data set of heart rate variability measurements from a fetal sheep model of human labor, comparing it to three existing methods. Our method demonstrates significantly improved performance over the existing point-wise detection methods.
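For contrast with the joint-distribution approach above, a conventional point-wise baseline such as a two-sided CUSUM flags a change as soon as a cumulative deviation exceeds a threshold; the drift and threshold parameters below are illustrative, and this is the kind of single-point rule the paper improves upon, not the paper's method.

```python
import numpy as np

def cusum(x, drift=0.2, threshold=5.0):
    """Two-sided CUSUM on a standardized series; returns the first alarm
    index or -1. Point-wise rules like this degrade on short, noisy series,
    which motivates methods that model anomalies jointly."""
    z = (x - x[:20].mean()) / x[:20].std()   # baseline from early samples
    g_pos = g_neg = 0.0
    for t, v in enumerate(z):
        g_pos = max(0.0, g_pos + v - drift)
        g_neg = max(0.0, g_neg - v - drift)
        if g_pos > threshold or g_neg > threshold:
            return t
    return -1
```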
Adwan, Ghaleb; Salameh, Yousef; Adwan, Kamel; Barakat, Ali
2012-05-01
To detect the anticandidal activity of nine toothpastes containing sodium fluoride, sodium monofluorophosphate or herbal extracts as active ingredients against 45 oral and non-oral Candida albicans (C. albicans) isolates. The antifungal activity of these toothpaste formulations was determined using a standard agar well diffusion method. Statistical analysis was performed with SPSS for Windows version 15, comparing mean values by one-way ANOVA with the post-hoc least significant difference (LSD) method. A P value of less than 0.05 was considered significant. All toothpastes studied in our experiments were effective in inhibiting the growth of all C. albicans isolates. The highest anticandidal activity was obtained from toothpaste containing both herbal extracts and sodium fluoride as active ingredients, while the lowest activity was obtained from toothpaste containing sodium monofluorophosphate as an active ingredient. The antifungal activity of Parodontax toothpaste against C. albicans isolates differed significantly (P < 0.001) from that of toothpastes containing sodium fluoride or herbal products alone. The present study demonstrates that toothpastes containing both herbal extracts and sodium fluoride as active ingredients are more effective in controlling C. albicans, while toothpaste containing sodium monofluorophosphate as an active ingredient is less effective. Some herbal toothpaste formulations studied in our experiments appear to be as effective as the fluoride formulations and can be used as an alternative to conventional formulations by individuals interested in naturally-based products. Our results may provide invaluable information for dental professionals.
NASA Astrophysics Data System (ADS)
Russo, C.; Palaia, G.; Loskutova, E.; Libotte, F.; Kornblit, R.; Gaimari, G.; Tenore, G.; Romeo, U.
2016-03-01
Introduction: Periodontitis is a chronic inflammatory disease due to exposure to plaque and tartar. Conventional treatment consists of scaling and root planing (SRP) and antibiotic administration. Encouraging results have also been obtained with alternative protocols, such as antimicrobial photodynamic therapy (PDT). Aim of the Study: Evaluation of the effects of PDT added to conventional methods. Materials and Methods: 11 patients (4M/7F, aged 37-67 years, non-smoking) affected by untreated chronic periodontal disease, with >3 mm pockets in at least 4 teeth, were divided into a test and a control group. Each patient underwent a full intraoral examination before and after treatment. The test group received SRP+PDT, while the control group received SRP alone. PDT was performed with the HELBO® TheraLite (Bredent Medical), a battery-powered 670 nm diode laser with an output of 75 mW/cm², using the Helbo Blue photosensitizer containing methylene blue. Laser exposure was 10 s per site, for a total of 60 s at 3 J/cm². Results: Both groups showed a significant improvement in the reduction of pocket depth (PD), above all the test group. Statistical analysis was performed with the t-test; the difference in PD between the two groups was not statistically significant (p = 0.96). Conclusion: PDT is a promising adjunct to SRP, achieving a significant reduction in pocket depth, but more cases are needed to confirm the validity of the protocol used.
Morlock, Scott E.; Nguyen, Hieu T.; Ross, Jerry H.
2002-01-01
It is feasible to use acoustic Doppler velocity meters (ADVMs) installed at U.S. Geological Survey (USGS) streamflow-gaging stations to compute records of river discharge. ADVMs are small acoustic current meters that use the Doppler principle to measure water velocities in a two-dimensional plane. Records of river discharge can be computed from stage and ADVM velocity data using the 'index velocity' method: the ADVM-measured velocities serve as an estimator, or 'index', of the mean velocity in the channel. To evaluate ADVMs for the computation of discharge records, the USGS installed ADVMs at three streamflow-gaging stations in Indiana: Kankakee River at Davis, Fall Creek at Millersville, and Iroquois River near Foresman. The evaluation study period was from June 1999 to February 2001. Discharge records were computed using ADVM data from each station and also using conventional USGS stage-discharge methods, and the two sets of records were compared using hydrographs and statistics. Overall, the records from the Kankakee River and Fall Creek stations compared closely. For the Iroquois River station, variable backwater was present and affected the comparison; because the ADVM record compensates for backwater, it may be superior to the conventional record. For all three stations, the ADVM records were judged to be of a quality acceptable to USGS standards for publication, and near real-time ADVM-computed discharges are served on USGS real-time data World Wide Web pages.
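A schematic sketch of the index-velocity method described above: a stage-area rating converts stage to channel cross-section area, an index-velocity rating converts ADVM velocity to mean channel velocity, and discharge is their product. The rating functions and coefficients below are invented placeholders, not USGS ratings.

```python
import numpy as np

def channel_area(stage_m):
    """Stage-area rating: wetted cross-section area (m^2) from stage (m)."""
    return 25.0 * stage_m + 4.0 * stage_m ** 2   # illustrative rating

def mean_velocity(v_index):
    """Index-velocity rating: mean channel velocity from ADVM index velocity,
    as would be fitted to direct discharge measurements."""
    return 0.05 + 0.92 * v_index                 # illustrative linear rating

stage = np.array([1.2, 1.5, 2.1])     # m, from the stage sensor
v_adv = np.array([0.35, 0.48, 0.70])  # m/s, ADVM index velocity

discharge = channel_area(stage) * mean_velocity(v_adv)  # Q = A * V_mean
for s, v, q in zip(stage, v_adv, discharge):
    print(f"stage={s:.1f} m  v_index={v:.2f} m/s  Q={q:.1f} m^3/s")
```

Because the mean velocity comes from the ADVM rather than from a stage-discharge curve, the computed record remains valid under variable backwater, which is the advantage noted for the Iroquois River station.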
Xiao, Di; You, Yuanhai; Bi, Zhenwang; Wang, Haibin; Zhang, Yongchan; Hu, Bin; Song, Yanyan; Zhang, Huifang; Kou, Zengqiang; Yan, Xiaomei; Zhang, Menghan; Jin, Lianmei; Jiang, Xihong; Su, Peng; Bi, Zhenqiang; Luo, Fengji; Zhang, Jianzhong
2013-03-01
There was a dramatic increase in scarlet fever cases in China from March to July 2011. Group A Streptococcus (GAS) is the only pathogen known to cause scarlet fever. Matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) coupled to the Biotyper system was used for GAS identification in 2011. A local reference database (LRD) was constructed, evaluated and used to identify GAS isolates. The 75 GAS strains used to evaluate the LRD were all identified correctly. Of the 157 suspected β-hemolytic strains isolated from 298 throat swab samples, 127 (100%) and 120 (94.5%) isolates were identified as GAS by the MALDI-TOF MS system and by the conventional bacitracin sensitivity test, respectively. All 202 (100%) isolates were identified at the species level by searching the LRD, while 182 (90.1%) were identified by searching the original reference database (ORD). The difference between the LRD and the ORD at the species level was statistically significant (χ²=6.052, P<0.05). The test turnaround time was shortened by 36-48 h, and the cost per sample is one-tenth that of conventional methods. Establishing a domestic database is the most effective way to improve identification efficiency with a MALDI-TOF MS system. MALDI-TOF MS is a viable alternative to conventional methods and may aid in the diagnosis and surveillance of GAS. Copyright © 2013 Elsevier B.V. All rights reserved.
Advances in Bayesian Modeling in Educational Research
ERIC Educational Resources Information Center
Levy, Roy
2016-01-01
In this article, I provide a conceptually oriented overview of Bayesian approaches to statistical inference and contrast them with frequentist approaches that currently dominate conventional practice in educational research. The features and advantages of Bayesian approaches are illustrated with examples spanning several statistical modeling…
Rahana, A R; Ng, S P; Leong, C F; Rahimah, M D
2011-10-01
This study evaluated the effect of human semen cryopreservation using an ultra-low temperature technique with a mechanical freezer at -85°C as an alternative to the conventional liquid nitrogen technique at -196°C. This was a prospective experimental study conducted in the Medically Assisted Conception unit, Department of Obstetrics and Gynaecology, National University Hospital, Malaysia, from January 1, 2006 to April 30, 2007. All normozoospermic semen samples were included in the study. The concentration, motility and percentage of intact DNA of each semen sample were assessed before and after freezing and thawing on Days 7 and 30 post freezing. Sperm cryopreservation at -85°C was comparable to the conventional liquid nitrogen technique for a period of up to 30 days in normozoospermic samples. There was no statistical difference in concentration (p = 0.1 on Day 7, p = 0.2 on Day 30), motility (p = 0.9 on Day 7, p = 0.5 on Day 30) or proportion of intact DNA (p = 0.1 on Day 7, p = 0.2 on Day 30) between the ultra-low temperature technique and conventional liquid nitrogen cryopreservation at Days 7 and 30 post thawing. This study clearly demonstrates that short-term storage of sperm at -85°C could be a viable alternative to conventional liquid nitrogen cryopreservation at -196°C, given their comparable post-thaw results.
Soft-tissue imaging with C-arm cone-beam CT using statistical reconstruction
NASA Astrophysics Data System (ADS)
Wang, Adam S.; Webster Stayman, J.; Otake, Yoshito; Kleinszig, Gerhard; Vogt, Sebastian; Gallia, Gary L.; Khanna, A. Jay; Siewerdsen, Jeffrey H.
2014-02-01
The potential for statistical image reconstruction methods such as penalized-likelihood (PL) to improve C-arm cone-beam CT (CBCT) soft-tissue visualization for intraoperative imaging over conventional filtered backprojection (FBP) is assessed in this work by making a fair comparison in relation to soft-tissue performance. A prototype mobile C-arm was used to scan anthropomorphic head and abdomen phantoms as well as a cadaveric torso at doses substantially lower than typical values in diagnostic CT, and the effects of dose reduction via tube current reduction and sparse sampling were also compared. Matched spatial resolution between PL and FBP was determined by the edge spread function of low-contrast (˜40-80 HU) spheres in the phantoms, which were representative of soft-tissue imaging tasks. PL using the non-quadratic Huber penalty was found to substantially reduce noise relative to FBP, especially at lower spatial resolution where PL provides a contrast-to-noise ratio increase up to 1.4-2.2× over FBP at 50% dose reduction across all objects. Comparison of sampling strategies indicates that soft-tissue imaging benefits from fully sampled acquisitions at dose above ˜1.7 mGy and benefits from 50% sparsity at dose below ˜1.0 mGy. Therefore, an appropriate sampling strategy along with the improved low-contrast visualization offered by statistical reconstruction demonstrates the potential for extending intraoperative C-arm CBCT to applications in soft-tissue interventions in neurosurgery as well as thoracic and abdominal surgeries by overcoming conventional tradeoffs in noise, spatial resolution, and dose.
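A small sketch of the non-quadratic Huber penalty used in the penalized-likelihood objective above: quadratic for small neighbor differences (smoothing noise) and linear for large ones (preserving edges). The delta value and the neighbor-difference roughness form are generic PL conventions assumed for illustration, not parameters from the study.

```python
import numpy as np

def huber(t, delta):
    """Huber penalty: 0.5*t^2 for |t| <= delta, linear growth beyond."""
    a = np.abs(t)
    return np.where(a <= delta, 0.5 * t ** 2, delta * a - 0.5 * delta ** 2)

def huber_roughness(img, delta=10.0):
    """Roughness term R(x): Huber penalties summed over horizontal and
    vertical neighbor differences, as in penalized-likelihood objectives."""
    dx = np.diff(img, axis=0)
    dy = np.diff(img, axis=1)
    return huber(dx, delta).sum() + huber(dy, delta).sum()

rng = np.random.default_rng(2)
flat = 40.0 * np.ones((64, 64))
edge = flat.copy()
edge[:, 32:] = 80.0                      # a 40 HU soft-tissue-like step
noisy = flat + rng.normal(0, 5, flat.shape)
# Beyond delta the penalty grows only linearly, so a genuine edge is
# penalized far less than a quadratic penalty would demand:
print("R(edge) =", huber_roughness(edge), " R(noisy) =", huber_roughness(noisy))
```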
Shah, Farhan Khalid; Gebreel, Ashraf; Elshokouki, Ali hamed; Habib, Ahmed Ali
2012-01-01
PURPOSE To compare the changes in occlusal vertical dimension, masseter muscle activity and biting force after insertion of immediate mandibular complete dentures constructed as conventional, tooth-supported and implant-supported prostheses. MATERIALS AND METHODS Patients were selected and treated with the three different concepts, i.e., conventional (Group A), tooth-supported (Group B) and implant-supported (Group C) immediate mandibular complete dentures. The parameters of evaluation and comparison were occlusal vertical dimension measured on radiographs (at three time intervals), masseter muscle electromyographic (EMG) measurement by EMG analysis (at three jaw positions) and bite force measured by a force transducer (at two time intervals). The obtained data were statistically analyzed using the ANOVA F-test at the 5% level of significance. If the F-test was significant, the Least Significant Difference test was performed to test further differences between variables. RESULTS The mean differences in occlusal vertical dimension between the tested groups were statistically significant only at 1 year after denture insertion. The mean differences in wavelet packet coefficients of the masseter EMG signals between the tested groups were not significant at rest position, but were significant at initial contact and maximum voluntary clench. The mean differences in maximum biting force between the tested groups were not statistically significant at the 5% level. CONCLUSION Immediate complete overdentures, whether tooth- or implant-supported, are recommended over totally mucosa-supported prostheses. PMID:22737309
Reflexion on linear regression trip production modelling method for ensuring good model quality
NASA Astrophysics Data System (ADS)
Suprayitno, Hitapriya; Ratnasari, Vita
2017-11-01
Transport modelling is important. For certain cases the conventional model still has to be used, and having a good trip production model is then essential. A good model can only be obtained from a good sample. Two of the basic principles of good sampling are that the sample must be capable of representing the population characteristics and of producing an acceptable error at a certain confidence level. These principles do not yet seem to be well understood and applied in trip production modelling. Therefore, it is necessary to investigate trip production modelling practice in Indonesia and to formulate a better modelling method that ensures model quality. The results are as follows. Statistics provides a method to calculate the span of a predicted value at a certain confidence level for linear regression, called the confidence interval of the predicted value. Common modelling practice uses R² as the principal quality measure, while sampling practice varies and does not always conform to sampling principles. An experiment indicates that a small sample can already give an excellent R² value and that sample composition can significantly change the model. Hence, a good R² value does not always mean good model quality. This leads to three basic ideas for ensuring good model quality: reformulating the quality measure, the calculation procedure, and the sampling method. The quality measure is defined as having both a good R² value and a good confidence interval of the predicted value. The calculation procedure must incorporate statistical calculation methods and the appropriate statistical tests. A good sampling method must incorporate random, well-distributed, stratified sampling with a certain minimum number of samples. These three ideas need to be developed and tested further.
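A sketch of the confidence interval of a predicted value that the paper advocates alongside R², for a simple linear regression trip production model; the household and trip data below are synthetic placeholders.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
households = rng.uniform(50, 500, size=25)             # zone size (synthetic)
trips = 1.8 * households + rng.normal(0, 60, size=25)  # observed trip ends

n = len(households)
b1, b0 = np.polyfit(households, trips, 1)              # fitted trip rate model
resid = trips - (b0 + b1 * households)
s = np.sqrt(resid @ resid / (n - 2))                   # residual std error
xbar = households.mean()
sxx = ((households - xbar) ** 2).sum()

x0 = 300.0                                             # zone to predict
y0 = b0 + b1 * x0
# Standard error of a *new* predicted value (note the +1 term):
se_pred = s * np.sqrt(1 + 1 / n + (x0 - xbar) ** 2 / sxx)
t_crit = stats.t.ppf(0.975, df=n - 2)                  # 95% confidence level
r2 = 1 - (resid @ resid) / ((trips - trips.mean()) ** 2).sum()
print(f"R^2 = {r2:.3f}")
print(f"prediction at x0={x0:.0f}: {y0:.0f} trips, 95% interval ± {t_crit * se_pred:.0f}")
```

A model can show an excellent R² from a small or skewed sample yet still carry a wide prediction interval, which is exactly the gap between the two quality measures that the paper highlights.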
Rai, Arpita; Acharya, Ashith B.; Naikmasur, Venkatesh G.
2016-01-01
Background: Age estimation of living or deceased individuals is an important aspect of forensic sciences. Conventionally, the pulp-to-tooth area ratio (PTR) measured from periapical radiographs has been utilized as a nondestructive method of age estimation. Cone-beam computed tomography (CBCT) is a new method to acquire three-dimensional images of the teeth in living individuals. Aims: The present study investigated age estimation based on the PTR of the maxillary canines measured in three planes obtained from CBCT image data. Settings and Design: Sixty subjects aged 20–85 years were included in the study. Materials and Methods: For each tooth, the mid-sagittal and mid-coronal sections and three axial sections were assessed: at the cementoenamel junction (CEJ), at one-fourth root level from the CEJ, and at mid-root. PTR was calculated using AutoCAD software after outlining the pulp and tooth. Statistical Analysis Used: All statistical analyses were performed using SPSS 17.0. Results and Conclusions: Linear regression analysis showed that only the PTR in the axial plane at the CEJ had a significant age correlation (r = 0.32; P < 0.05), probably because of the clearer demarcation of the pulp and tooth outlines at this level. PMID:28123269
Rodríguez-Arias, Miquel Angel; Rodó, Xavier
2004-03-01
Here we describe a practical, step-by-step primer to scale-dependent correlation (SDC) analysis. The analysis of transitory processes is an important but often neglected topic in ecological studies, because few statistical techniques detect temporary features accurately enough. We introduce the SDC analysis, a statistical and graphical method to study transitory processes at any temporal or spatial scale. Thanks to its combination of conventional procedures and simple, well-known statistical techniques, SDC analysis becomes an improved time-domain analogue of wavelet analysis. We use several simple synthetic series to describe the method, a more complex example full of transitory features to compare SDC and wavelet analysis, and finally some selected ecological series to illustrate the methodology. The SDC analysis of time series of copepod abundances in the North Sea indicates that ENSO is the main climatic driver of short-term changes in population dynamics. SDC also uncovers some long-term, unexpected features in the population. Similarly, the SDC analysis of Nicholson's blowflies data locates where the proposed models fail and provides new insights into the mechanism that drives the apparent vanishing of the population cycle during the second half of the series.
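A toy sketch of the core SDC computation: correlate the two series within sliding windows of a fixed size (the scale) over a range of lags and keep the significant windows. Naive per-window p-values are used here for brevity, whereas the full SDC procedure handles multiple comparisons with randomization tests.

```python
import numpy as np
from scipy import stats

def sdc(x, y, scale, max_lag, alpha=0.05):
    """Windowed correlations at one scale: returns (start, lag, r) for
    windows whose Pearson correlation is significant at level alpha."""
    hits = []
    n = len(x)
    for lag in range(-max_lag, max_lag + 1):
        lo, hi = max(0, -lag), n - scale - max(0, lag)
        for start in range(lo, hi + 1):
            r, p = stats.pearsonr(x[start:start + scale],
                                  y[start + lag:start + lag + scale])
            if p < alpha:
                hits.append((start, lag, r))
    return hits

rng = np.random.default_rng(4)
x = rng.normal(size=200)
y = rng.normal(size=200)
y[80:120] = x[80:120] + 0.3 * rng.normal(size=40)  # transitory coupling episode
hits = sdc(x, y, scale=25, max_lag=3)
strong = [h for h in hits if h[2] > 0.7]
print(f"{len(strong)} windows with r > 0.7, e.g. {strong[:3]}")
```

Unlike a global correlation, the windowed scan localizes the coupling episode in time, which is what makes the method suitable for transitory processes.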
Lancaster, Cady; Espinoza, Edgard
2012-05-15
International trade of several Dalbergia wood species is regulated by the Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES). In order to supplement morphological identification of these species, a rapid chemical method of analysis was developed. Using Direct Analysis in Real Time (DART) ionization coupled with Time-of-Flight Mass Spectrometry (TOFMS), selected Dalbergia and common trade species were analyzed. Each of the 13 wood species was classified using principal component analysis and linear discriminant analysis (LDA), and these statistical data clusters served as reliable anchors for species identification of unknowns. Analysis of 20 or more samples from each of the 13 species indicates that the DART-TOFMS results are reproducible, and statistical analysis of the most abundant ions gave good classifications that allowed the correct assignment of unknown wood samples. This method is rapid and can be useful when anatomical identification is difficult but needed in order to support CITES enforcement. Published 2012. This article is a US Government work and is in the public domain in the USA.
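A sketch of the classification pipeline described (dimension reduction followed by LDA, then prediction for unknowns), run on synthetic stand-in spectra; the number of species, samples per species, and m/z bins are invented for illustration.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
n_species, n_per, n_mz = 5, 20, 300            # stand-in species and m/z bins
centers = rng.normal(0, 1, (n_species, n_mz))  # species-specific ion profiles
X = np.vstack([c + 0.5 * rng.normal(0, 1, (n_per, n_mz)) for c in centers])
y = np.repeat(np.arange(n_species), n_per)

clf = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated species assignment accuracy: {scores.mean():.2f}")

# An "unknown" sample is assigned to the nearest statistical cluster:
clf.fit(X, y)
unknown = centers[2] + 0.5 * rng.normal(0, 1, n_mz)
print("unknown assigned to species", clf.predict(unknown[None, :])[0])
```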
Williams, L. Keoki; Buu, Anne
2017-01-01
We propose a multivariate genome-wide association test for mixed continuous, binary, and ordinal phenotypes. A latent response model is used to estimate the correlation between phenotypes with different measurement scales, so that the empirical distribution of the Fisher's combination statistic under the null hypothesis is estimated efficiently. The simulation study shows that our proposed correlation estimation methods have high levels of accuracy. More importantly, our approach conservatively estimates the variance of the test statistic so that the type I error rate is controlled. The simulation also shows that the proposed test maintains power at a level very close to that of the ideal analysis based on known latent phenotypes while controlling the type I error. In contrast, conventional approaches that dichotomize all observed phenotypes or treat them as continuous variables could either reduce the power or employ a linear regression model unfit for the data. Furthermore, the statistical analysis of the database of the Study of Addiction: Genetics and Environment (SAGE) demonstrates that conducting a multivariate test on multiple phenotypes can increase the power of identifying markers that may not otherwise be chosen by marginal tests. The proposed method also offers a new approach to analyzing the Fagerström Test for Nicotine Dependence as multivariate phenotypes in genome-wide association studies. PMID:28081206
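For reference, the Fisher combination statistic named above aggregates the per-phenotype p-values as T = -2 * sum(log p_i); under independent tests T is chi-square with 2k degrees of freedom, while the paper estimates the null distribution empirically to handle correlated phenotypes. A sketch of the independence-case computation, with illustrative p-values:

```python
import numpy as np
from scipy import stats

def fisher_combination(pvals):
    """Fisher's method: T = -2 * sum(log p_i); chi-square with 2k degrees
    of freedom when the individual tests are independent."""
    p = np.asarray(pvals, dtype=float)
    T = -2.0 * np.log(p).sum()
    return T, stats.chi2.sf(T, df=2 * len(p))

# p-values of one SNP against three phenotypes (continuous, binary, ordinal):
T, p_comb = fisher_combination([0.04, 0.11, 0.02])
print(f"T = {T:.2f}, combined p = {p_comb:.4f}")
# With correlated phenotypes the chi-square reference is anti-conservative,
# which is why the paper calibrates T against an estimated null distribution.
```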
Makeyev, Oleksandr; Joe, Cody; Lee, Colin; Besio, Walter G
2017-07-01
Concentric ring electrodes have shown promise in non-invasive electrophysiological measurement, demonstrating superiority to conventional disc electrodes, in particular in the accuracy of Laplacian estimation. Recently, we proposed novel variable inter-ring distances concentric ring electrodes. Analytic and finite element method modeling results for linearly increasing distances electrode configurations suggested they may decrease the truncation error, resulting in more accurate Laplacian estimates compared to the currently used constant inter-ring distances configurations. This study assesses the statistical significance of the Laplacian estimation accuracy improvement due to the novel variable inter-ring distances concentric ring electrodes. A full factorial analysis of variance design was used with one categorical and two numerical factors: the inter-ring distances, the electrode diameter, and the number of concentric rings in the electrode. The response variables were the Relative Error and the Maximum Error of Laplacian estimation, computed using a finite element method model for each combination of levels of the three factors. The effects of the main factors and their interactions on Relative Error and Maximum Error were assessed, and the results suggest that all three factors have statistically significant effects in the model, confirming the potential of using inter-ring distances as a means of improving the accuracy of Laplacian estimation.
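A sketch of the full factorial ANOVA described above, with one categorical and two numerical factors, using statsmodels; the simulated response merely stands in for the finite element model's Relative Error, and the factor levels are illustrative.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(6)
rows = []
for dist in ["constant", "linearly_increasing"]:   # categorical factor
    for diam in [1.0, 2.0, 3.0]:                   # electrode diameter (numeric)
        for rings in [2, 3, 4]:                    # number of rings (numeric)
            # Stand-in for the FEM-computed Relative Error of the Laplacian:
            rel_err = (0.30 - 0.05 * (dist == "linearly_increasing")
                       - 0.03 * rings + 0.02 * diam + rng.normal(0, 0.01))
            rows.append((dist, diam, rings, rel_err))
df = pd.DataFrame(rows, columns=["dist", "diam", "rings", "rel_err"])

model = ols("rel_err ~ C(dist) * diam * rings", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))   # main effects and interactions
```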
Dentascan – Is the Investment Worth the Hype ???
Shah, Monali A; Shah, Sneha S; Dave, Deepak
2013-01-01
Background: Open Bone Measurement (OBM) and Bone Sounding (BS) are the most reliable but invasive clinical methods for Alveolar Bone Level (ABL) assessment, causing discomfort to the patient. Routinely, IOPAs and OPGs are the commonest radiographic techniques used, but they tend to underestimate bone loss and obscure buccal/lingual defects. A novel technique like Dentascan (CBCT) eliminates this limitation by giving images in 3 planes: sagittal, coronal and axial. Aim: To compare and correlate the non-invasive 3D radiographic technique of Dentascan with BS and OBM, and with IOPA and OPG, in assessing the ABL. Settings and Design: Cross-sectional diagnostic study. Material and Methods: Two hundred and five sites were subjected to the clinical and radiographic diagnostic techniques. The relative distance between the alveolar bone crest and a reference wire was measured. All measurements were compared and tested against the OBM. Statistical Analysis: Student's t-test, ANOVA, Pearson correlation coefficient. Results: There was a statistically significant difference between Dentascan and OBM; only BS showed agreement with OBM (p < 0.05). Dentascan correlated weakly with OBM and BS lingually. All other techniques showed statistically significant differences between them (p = 0.00). Conclusion: Within the limitations of this study, only BS seems to be comparable with OBM, with no superior result of Dentascan over the conventional techniques except for lingual measurements. PMID:24551722
Tupinambá, Rogerio Amaral; Claro, Cristiane Aparecida de Assis; Pereira, Cristiane Aparecida; Nobrega, Celestino José Prudente; Claro, Ana Paula Rosifini Alves
2017-01-01
Plasma-polymerized film deposition was used to modify the surface properties of metallic orthodontic brackets in order to inhibit bacterial adhesion. Hexamethyldisiloxane (HMDSO) polymer films were deposited on conventional (n = 10) and self-ligating (n = 10) stainless steel orthodontic brackets using the plasma-enhanced chemical vapor deposition (PECVD) radio frequency technique. The samples were divided into two groups according to the kind of bracket, and into two subgroups according to surface treatment. Scanning electron microscopy (SEM) was performed to assess bacterial adhesion on the sample surfaces (slot and wing regions) and the integrity of the film layer. Surface roughness was assessed by confocal interferometry (CI) and surface wettability by goniometry. For the bacterial adhesion analysis, samples were exposed for 72 hours to a Streptococcus mutans solution for biofilm formation. Surface roughness values were analyzed using the Mann-Whitney test, and biofilm adhesion by the Kruskal-Wallis and SNK tests. Statistically significant differences (p < 0.05) in surface roughness and bacterial adhesion reduction were observed in conventional brackets after surface treatment, and between conventional and self-ligating brackets; no significant differences were observed between the self-ligating groups (p > 0.05). Plasma-polymerized film deposition was effective in reducing surface roughness and bacterial adhesion only in conventional brackets. It was also noted that conventional brackets showed lower biofilm adhesion than self-ligating brackets despite the absence of the film.
Yi, Jianru; Li, Meile; Li, Yu; Li, Xiaobing; Zhao, Zhihe
2016-11-21
The aim of this study was to compare the external apical root resorption (EARR) in patients receiving fixed orthodontic treatment with self-ligating or conventional brackets. Studies comparing EARR between orthodontic patients using self-ligating and conventional brackets were identified through electronic searches in the CENTRAL, PubMed, EMBASE, China National Knowledge Infrastructure (CNKI) and SIGLE databases, and through manual searches in relevant journals and the reference lists of the included studies, up to April 2016. Data extraction and risk of bias evaluation were conducted by two investigators independently. The original outcomes were statistically pooled using Review Manager 5. Seven studies were included in the systematic review, of which five were pooled in the meta-analysis. The EARR of maxillary central incisors in the self-ligating bracket group was significantly lower than that in the conventional bracket group (SMD -0.31; 95% CI: -0.60 to -0.01). No significant differences for the other incisors were observed between self-ligating and conventional brackets. Current evidence suggests that self-ligating brackets do not outperform conventional brackets in reducing EARR in maxillary lateral incisors, mandibular central incisors and mandibular lateral incisors. However, self-ligating brackets appear to have an advantage in protecting the maxillary central incisors from EARR, which still needs to be confirmed by more high-quality studies.
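For orientation, a minimal sketch of the inverse-variance pooling behind a meta-analytic summary effect such as the SMD reported above; the per-study values are invented, and Review Manager's actual computation includes heterogeneity handling not shown here.

```python
import numpy as np

# Hypothetical per-study standardized mean differences and standard errors:
smd = np.array([-0.45, -0.20, -0.35, -0.10, -0.40])
se = np.array([0.20, 0.15, 0.25, 0.18, 0.22])

w = 1.0 / se ** 2                       # inverse-variance weights
pooled = np.sum(w * smd) / np.sum(w)    # fixed-effect pooled estimate
se_pooled = np.sqrt(1.0 / np.sum(w))
lo, hi = pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled
print(f"pooled SMD = {pooled:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```

A confidence interval that excludes zero, as for the maxillary central incisors above, is what makes the pooled difference statistically significant.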
Visualizing Similarity of Appearance by Arrangement of Cards
Nakatsuji, Nao; Ihara, Hisayasu; Seno, Takeharu; Ito, Hiroshi
2016-01-01
This study proposes a novel method to extract the configuration of a psychological space by directly measuring subjects' similarity ratings without computational work. Although multidimensional scaling (MDS) is well known as a conventional method for extracting psychological spaces, it requires many pairwise evaluations: the time taken for evaluations increases in proportion to the square of the number of objects. The proposed method instead asks subjects to arrange cards on a poster sheet according to the degree of similarity of the objects. To compare the performance of the proposed method with the conventional one, we developed similarity maps of typefaces through the proposed method and through non-metric MDS. We calculated the trace correlation coefficient among all combinations of the configurations from both methods to evaluate the degree of similarity of the obtained configurations. The threshold value of the trace correlation coefficient for statistically discriminating similar configurations was decided based on random data. The ratio of trace correlation coefficients exceeding the threshold was 62.0%, indicating that the configurations of the typefaces obtained by the proposed method closely resembled those obtained by non-metric MDS. The required duration for the proposed method was approximately one third of that for non-metric MDS. In addition, all distances between objects in all the data for both methods were calculated. The frequency of short distances in the proposed method was lower than in non-metric MDS, so relatively small differences among objects are likely to be emphasized in configurations from the proposed method. The card arrangement method we propose here thus serves as an easier and time-saving tool for obtaining psychological structures in fields related to similarity of appearance. PMID:27242611
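A sketch of the MDS side of the comparison: embed a dissimilarity matrix with non-metric MDS and quantify agreement between two configurations. scikit-learn's MDS is used, and a Procrustes disparity stands in for the trace correlation coefficient used in the paper; the "typeface" positions and ratings are synthetic.

```python
import numpy as np
from sklearn.manifold import MDS
from scipy.spatial import procrustes

rng = np.random.default_rng(7)
true = rng.normal(size=(12, 2))                     # latent typeface positions
d = np.linalg.norm(true[:, None] - true[None, :], axis=-1)
rated = np.abs(d * (1 + 0.1 * rng.normal(size=d.shape)))  # noisy ratings
rated = (rated + rated.T) / 2
np.fill_diagonal(rated, 0.0)

emb = MDS(n_components=2, metric=False, dissimilarity="precomputed",
          random_state=0)
conf = emb.fit_transform(rated)

# Configuration agreement: Procrustes disparity (0 = identical up to
# rotation, reflection, and scale); the paper uses the trace correlation.
_, _, disparity = procrustes(true, conf)
print(f"Procrustes disparity between configurations: {disparity:.3f}")
```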
Directions for new developments on statistical design and analysis of small population group trials.
Hilgers, Ralf-Dieter; Roes, Kit; Stallard, Nigel
2016-06-14
Most statistical design and analysis methods for clinical trials have been developed and evaluated in settings where at least several hundred patients could be recruited. These methods may not be suitable for evaluating therapies if the sample size is unavoidably small, a setting usually termed small populations. The specific sample size cut-off where standard methods fail needs to be investigated. In this paper, the authors present their view on new developments for the design and analysis of clinical trials in small population groups, where conventional statistical methods may be inappropriate, e.g., because of lack of power or poor adherence to asymptotic approximations due to sample size restrictions. Following the EMA/CHMP guideline on clinical trials in small populations, we consider directions for new developments in statistical methodology for the design and analysis of small population clinical trials. We relate the findings to the research activities of three projects, Asterix, IDeAl, and InSPiRe, which have received funding since 2013 within the FP7-HEALTH-2013-INNOVATION-1 framework of the EU. As not all aspects of the wide research area of small population clinical trials can be addressed, we focus on areas where we feel advances are needed and feasible. The general framework of the EMA/CHMP guideline on small population clinical trials stimulates a number of research areas. These serve as the basis for the three projects, which use various approaches to develop new statistical methodology for the design and analysis of small population clinical trials. Small population clinical trials refer to trials with a limited number of patients. Small populations may result from rare diseases or specific subtypes of more common diseases. New statistical methodology needs to be tailored to these specific situations. The main results from the three projects will constitute a useful toolbox for improved design and analysis of small population clinical trials. They address various challenges presented by the EMA/CHMP guideline as well as recent discussions about extrapolation. There is a need to involve the patients' perspective in the planning and conduct of small population clinical trials for successful therapy evaluation.
Research Design and Statistics for Applied Linguistics.
ERIC Educational Resources Information Center
Hatch, Evelyn; Farhady, Hossein
An introduction to the conventions of research design and statistical analysis is presented for graduate students of applied linguistics. The chapters cover such concepts as the definition of research, variables, research designs, research report formats, sorting and displaying data, probability and hypothesis testing, comparing means,…
Du, Yiping P; Jin, Zhaoyang
2009-10-01
To develop a robust algorithm for tissue-air segmentation in magnetic resonance imaging (MRI) using the statistics of the phase and magnitude of the images. A multivariate measure based on the statistics of phase and magnitude was constructed for tissue-air volume segmentation. The standard deviation of the first-order phase difference and the standard deviation of the magnitude were calculated in a 3 x 3 x 3 kernel in the image domain. To improve differentiation accuracy, the uniformity of the phase distribution in the kernel was also calculated, and the linear background phase introduced by field inhomogeneity was corrected. The effectiveness of the proposed volume segmentation technique was compared to that of a conventional approach using the magnitude data alone. The proposed algorithm was shown to be more effective and robust for volume segmentation in both a synthetic phantom and susceptibility-weighted images of the human brain. Using the proposed method, veins in the peripheral regions of the brain were well depicted in minimum-intensity projections of the susceptibility-weighted images. Using the additional statistics of phase, tissue-air volume segmentation can be substantially improved compared to using the statistics of the magnitude data alone. (c) 2009 Wiley-Liss, Inc.
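A sketch of the feature computation described above: local standard deviations of the magnitude and of the first-order phase difference in a 3 x 3 x 3 kernel, combined into a tissue-air rule. The synthetic volume, thresholds, and the simple conjunction rule are illustrative assumptions; the background-phase correction and phase-uniformity term of the paper are omitted.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_std(vol, size=3):
    """Standard deviation over a size^3 neighborhood via E[x^2] - E[x]^2."""
    m = uniform_filter(vol, size)
    m2 = uniform_filter(vol * vol, size)
    return np.sqrt(np.maximum(m2 - m * m, 0.0))

rng = np.random.default_rng(8)
shape = (32, 32, 32)
tissue = np.zeros(shape, dtype=bool)
tissue[8:24, 8:24, 8:24] = True
mag = np.where(tissue, 100.0, 5.0) + rng.normal(0, 3, shape)
phase = np.where(tissue, 0.3, rng.uniform(-np.pi, np.pi, shape))  # air: noise

dphase = np.diff(phase, axis=0, append=phase[-1:])  # first-order phase diff
mag_std = local_std(mag)       # high at tissue-air boundaries
phs_std = local_std(dphase)    # high wherever the phase is incoherent (air)
# Simple conjunction standing in for the paper's multivariate measure:
air = (phs_std > 1.0) & (mag < 50.0)
print(f"phase-diff std, air vs tissue: {phs_std[~tissue].mean():.2f} "
      f"vs {phs_std[tissue].mean():.2f}")
print(f"magnitude std, air vs tissue: {mag_std[~tissue].mean():.2f} "
      f"vs {mag_std[tissue].mean():.2f}")
print(f"estimated air fraction: {air.mean():.2f} (true {(~tissue).mean():.2f})")
```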
Statistical models for the analysis and design of digital polymerase chain reaction (dPCR) experiments
Dorazio, Robert; Hunter, Margaret
2015-01-01
Statistical methods for the analysis and design of experiments using digital PCR (dPCR) have received only limited attention and have been misused in many instances. To address this issue and to provide a more general approach to the analysis of dPCR data, we describe a class of statistical models for the analysis and design of experiments that require quantification of nucleic acids. These models are mathematically equivalent to generalized linear models of binomial responses that include a complementary log-log link function and an offset that is dependent on the dPCR partition volume. These models are both versatile and easy to fit using conventional statistical software. Covariates can be used to specify different sources of variation in nucleic acid concentration, and a model's parameters can be used to quantify the effects of these covariates. For purposes of illustration, we analyzed dPCR data from different types of experiments, including serial dilution, evaluation of copy number variation, and quantification of gene expression. We also showed how these models can be used to help design dPCR experiments, as in selection of sample sizes needed to achieve desired levels of precision in estimates of nucleic acid concentration or to detect differences in concentration among treatments with prescribed levels of statistical power.
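A sketch of fitting the model class described above with conventional statistical software: a binomial GLM with a complementary log-log link and an offset of log(partition volume), so the intercept recovers the log concentration. The serial-dilution data, the 0.85 nL partition volume, and the statsmodels link spelling (CLogLog, the 0.14 naming) are assumptions for illustration.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(9)
v = 0.85e-3            # partition volume in microliters (illustrative)
lam0 = 2000.0          # copies per microliter in the undiluted sample
dilution = np.array([1, 1, 2, 2, 4, 4, 8, 8, 16, 16], dtype=float)
n = 20000              # partitions per reaction
p_true = 1 - np.exp(-(lam0 / dilution) * v)   # Poisson partition occupancy
k = rng.binomial(n, p_true)                   # observed positive partitions

X = sm.add_constant(np.log(1 / dilution))     # dilution slope should be ~1
model = sm.GLM(np.column_stack([k, n - k]), X,
               family=sm.families.Binomial(link=sm.families.links.CLogLog()),
               offset=np.full(len(k), np.log(v)))
fit = model.fit()
# On the cloglog scale the intercept is log(lambda0), copies per microliter:
print(f"estimated concentration: {np.exp(fit.params[0]):.0f} copies/uL "
      f"(true {lam0:.0f}); dilution slope: {fit.params[1]:.3f}")
```

The offset is what makes the fit report concentration per unit volume rather than per partition, which is the key convenience of this formulation.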
NASA Astrophysics Data System (ADS)
Kolokythas, Kostantinos; Vasileios, Salamalikis; Athanassios, Argiriou; Kazantzidis, Andreas
2015-04-01
Wind results from complex interactions of numerous mechanisms taking place at small or large scales, so better knowledge of its behavior is essential in a variety of applications, especially in the field of power production from wind turbines. The literature contains a considerable number of models, both physical and statistical, dealing with the simulation and prediction of wind speed. Among others, Artificial Neural Networks (ANNs) are widely used for wind forecasting and, in the great majority of cases, outperform other conventional statistical models. In this study, a number of ANNs with different architectures, created and applied to a dataset of wind time series, are compared to Auto Regressive Integrated Moving Average (ARIMA) statistical models. The data consist of mean hourly wind speeds from a wind farm in a hilly Greek region and cover a period of one year (2013). The main goal is to evaluate the models' ability to simulate successfully the wind speed at a significant point (target). Goodness-of-fit statistics are used to compare the different methods. In general, the ANNs showed the best performance in the estimation of wind speed, prevailing over the ARIMA models.
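A sketch of the kind of comparison described: a small feed-forward ANN trained on lagged wind speeds versus an ARIMA model, both evaluated one step ahead on a held-out stretch of a synthetic hourly series. The series, network architecture, and ARIMA order are placeholders, not the study's configurations.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(10)
t = np.arange(2000)
wind = 8 + 2 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 0.8, t.size)  # hourly

lags = 6
X = np.column_stack([wind[i:i - lags] for i in range(lags)])  # lagged inputs
y = wind[lags:]
split = 1600 - lags                     # train on the first 1600 hours
ann = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
ann.fit(X[:split], y[:split])
rmse_ann = np.sqrt(np.mean((ann.predict(X[split:]) - y[split:]) ** 2))

arima = ARIMA(wind[:1600], order=(2, 0, 1)).fit()
# One-step-ahead predictions over the held-out hours, with fixed parameters:
fc = arima.apply(wind).get_prediction(start=1600).predicted_mean
rmse_arima = np.sqrt(np.mean((fc - wind[1600:]) ** 2))
print(f"RMSE  ANN: {rmse_ann:.3f}   ARIMA(2,0,1): {rmse_arima:.3f}")
```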
NASA Astrophysics Data System (ADS)
Nallala, Jayakrupakar; Gobinet, Cyril; Diebold, Marie-Danièle; Untereiner, Valérie; Bouché, Olivier; Manfait, Michel; Sockalingum, Ganesh Dhruvananda; Piot, Olivier
2012-11-01
Innovative diagnostic methods that could complement conventional histopathology for cancer diagnosis are urgently needed. In this perspective, we propose a new concept based on spectral histopathology, using IR spectral micro-imaging applied directly to paraffinized colon tissue arrays stabilized in an agarose matrix, without any chemical pre-treatment. A mathematical procedure is implemented to correct spectral interferences from paraffin and agarose. The corrected spectral images are then processed by a multivariate clustering method to automatically recover, on the basis of their intrinsic molecular composition, the main histological classes of normal and tumoral colon tissue. The spectral signatures of the different histological classes are analyzed using statistical methods (Kruskal-Wallis test and principal component analysis) to identify the most discriminant IR features. These features allow some of the biomolecular alterations associated with malignancy to be characterized. Thus, via a single analysis, in a label-free and nondestructive manner, the main changes in nucleotide, carbohydrate, and collagen features can be identified simultaneously between normal and cancerous tissues. The present study demonstrates the potential of IR spectral imaging as a modern tool, complementary to conventional histopathology, for objective cancer diagnosis directly from paraffin-embedded tissue arrays.
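A sketch of the two analysis stages described above: unsupervised clustering of spectra into tissue classes, then a per-channel Kruskal-Wallis test to rank discriminant features. Synthetic spectra stand in for the corrected IR image pixels, and the class structure is invented.

```python
import numpy as np
from sklearn.cluster import KMeans
from scipy.stats import kruskal

rng = np.random.default_rng(11)
n_wn = 200                                 # wavenumber channels
base = np.abs(rng.normal(1, 0.2, n_wn))
classes = [base, base * 1.1, base.copy()]
classes[2][80:90] *= 1.5                   # a band altered in "tumor" spectra
spectra = np.vstack([c + rng.normal(0, 0.05, (300, n_wn)) for c in classes])

# Stage 1: recover histological classes by multivariate clustering.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(spectra)

# Stage 2: Kruskal-Wallis statistic per channel across recovered classes.
H = np.array([kruskal(*(spectra[labels == k][:, j] for k in range(3))).statistic
              for j in range(n_wn)])
print("most discriminant channels:", np.argsort(H)[-5:])
```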
A comment on "Novel scavenger removal trials increase wind turbine-caused avian fatality estimates"
Huso, Manuela M.P.; Erickson, Wallace P.
2013-01-01
In a recent paper, Smallwood et al. (2010) conducted a study to compare their “novel” approach to conducting carcass removal trials with what they term the “conventional” approach, and to evaluate the effects of the different methods on estimated avian fatality at a wind power facility in California. A quick glance at Table 3, which succinctly summarizes their results and provides estimated fatality rates and 80% confidence intervals calculated using the two methods, reveals a surprising result: the confidence intervals of all of their estimates, and of most of the conventional estimates, extend below 0. These results imply that wind turbines may have the capacity to create live birds. A more likely interpretation is that a serious error occurred in the calculation of the average fatality rate, its standard error, or both. Further evaluation of their methods reveals that the scientific basis for concluding that “many estimates of scavenger removal rates prior to [their] study were likely biased low due to scavenger swamping” and that “previously reported estimates of avian fatality rates … should be adjusted upwards” was not evident in their analysis and results. Their comparison to conventional approaches was not applicable, their statistical models were questionable, and the conclusions they drew were unsupported.
Two-dimensional replica exchange approach for peptide-peptide interactions
NASA Astrophysics Data System (ADS)
Gee, Jason; Shell, M. Scott
2011-02-01
The replica exchange molecular dynamics (REMD) method has emerged as a standard approach for simulating proteins and peptides with rugged underlying free energy landscapes. We describe an extension to the original methodology—here termed umbrella-sampling REMD (UREMD)—that offers specific advantages in simulating peptide-peptide interactions. This method is based on the use of two dimensions in the replica cascade, one in temperature as in conventional REMD, and one in an umbrella sampling coordinate between the center of mass of the two peptides that aids explicit exploration of the complete association-dissociation reaction coordinate. To mitigate the increased number of replicas required, we pursue an approach in which the temperature and umbrella dimensions are linked at only fully associated and dissociated states. Coupled with the reweighting equations, the UREMD method aids accurate calculations of normalized free energy profiles and structural or energetic measures as a function of interpeptide separation distance. We test the approach on two families of peptides: a series of designed tetrapeptides that serve as minimal models for amyloid fibril formation, and a fragment of a classic leucine zipper peptide and its mutant. The results for these systems are compared to those from conventional REMD simulations, and demonstrate good convergence properties, low statistical errors, and, for the leucine zippers, an ability to sample near-native structures.
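For reference, a minimal sketch of the exchange moves underlying the two replica dimensions discussed above: a Metropolis criterion on the energy difference for the temperature dimension, and on the harmonic umbrella bias difference for the distance dimension. The constants and energies are illustrative, and this is a generic illustration, not the UREMD implementation.

```python
import numpy as np

rng = np.random.default_rng(12)
kB = 0.0019872  # kcal/(mol K)

def attempt_temperature_swap(E_i, E_j, T_i, T_j):
    """Swap replicas i (at T_i) and j (at T_j) with probability
    min(1, exp[(beta_i - beta_j) * (E_i - E_j)])."""
    d_beta = 1.0 / (kB * T_i) - 1.0 / (kB * T_j)
    return rng.random() < np.exp(min(0.0, d_beta * (E_i - E_j)))

# Hypothetical potential energies (kcal/mol) of two neighboring replicas:
print("T swap accepted:",
      attempt_temperature_swap(E_i=-120.0, E_j=-115.0, T_i=300.0, T_j=320.0))

def attempt_umbrella_swap(r_i, r_j, r0_i, r0_j, k=2.0, T=300.0):
    """Swap along the center-of-mass distance dimension: with harmonic
    biases w_k(r) = 0.5*k*(r - r0_k)^2, accept with probability
    min(1, exp[-beta * (cross-bias minus same-bias energies)])."""
    beta = 1.0 / (kB * T)
    w = lambda r, r0: 0.5 * k * (r - r0) ** 2
    d = (w(r_i, r0_i) + w(r_j, r0_j)) - (w(r_i, r0_j) + w(r_j, r0_i))
    return rng.random() < np.exp(min(0.0, beta * d))

print("umbrella swap accepted:", attempt_umbrella_swap(8.2, 10.1, 8.0, 10.0))
```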
Accelerated Compressed Sensing Based CT Image Reconstruction
Hashemi, SayedMasoud; Beheshti, Soosan; Gill, Patrick R.; Paul, Narinder S.; Cobbold, Richard S. C.
2015-01-01
In X-ray computed tomography (CT) an important objective is to reduce the radiation dose without significantly degrading the image quality. Compressed sensing (CS) enables the radiation dose to be reduced by producing diagnostic images from a limited number of projections. However, conventional CS-based algorithms are computationally intensive and time-consuming. We propose a new algorithm that accelerates the CS-based reconstruction by using a fast pseudopolar Fourier based Radon transform and rebinning the diverging fan beams to parallel beams. The reconstruction process is analyzed using a maximum-a-posterior approach, which is transformed into a weighted CS problem. The weights involved in the proposed model are calculated based on the statistical characteristics of the reconstruction process, which is formulated in terms of the measurement noise and rebinning interpolation error. Therefore, the proposed method not only accelerates the reconstruction, but also removes the rebinning and interpolation errors. Simulation results are shown for phantoms and a patient. For example, a 512 × 512 Shepp-Logan phantom when reconstructed from 128 rebinned projections using a conventional CS method had 10% error, whereas with the proposed method the reconstruction error was less than 1%. Moreover, computation times of less than 30 sec were obtained using a standard desktop computer without numerical optimization. PMID:26167200
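A generic sketch of the weighted compressed-sensing formulation described above, minimizing 0.5*||W(Ax - b)||^2 + lambda*||x||_1, solved with plain iterative soft thresholding (ISTA). The random measurement matrix stands in for the rebinned pseudopolar projector, the inverse-noise weights follow the abstract's statistical weighting idea, and all sizes and parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(13)
n, m = 200, 80                        # unknowns (sparse image) vs measurements
x_true = np.zeros(n)
idx = rng.choice(n, 10, replace=False)
x_true[idx] = rng.normal(0, 3, 10)    # sparse ground truth
A = rng.normal(0, 1 / np.sqrt(m), (m, n))   # stand-in for the projector
noise_sd = 0.01 + 0.04 * rng.random(m)      # per-ray noise level (illustrative)
b = A @ x_true + noise_sd * rng.normal(size=m)

W = 1.0 / noise_sd                    # statistical weights: inverse noise std
Aw, bw = A * W[:, None], b * W
L = np.linalg.norm(Aw, 2) ** 2        # Lipschitz constant of the gradient
lam = 0.05 * np.abs(Aw.T @ bw).max()  # regularization strength (heuristic)
x = np.zeros(n)
for _ in range(500):                  # ISTA: gradient step + soft threshold
    z = x - Aw.T @ (Aw @ x - bw) / L
    x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
print(f"relative error: {np.linalg.norm(x - x_true) / np.linalg.norm(x_true):.3f}")
```

Weighting each ray by its inverse noise level is what folds the measurement-noise statistics into the reconstruction, mirroring the maximum-a-posteriori derivation sketched in the abstract.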